Industries
November 7, 2025

We used our own platform to build, iterate, and launch a production-ready robotic solution

Building our robotic sanding solution on Viam eliminated traditional hardware integration complexity, enabling us to swap and benchmark components without rewriting application code.
Simone Kalmakis
VP of Engineering
Jalen Geason
Technical Product Marketing

When Viam’s engineering team set out to build a robotic solution for fiberglass sanding—one of the toughest jobs in marine manufacturing—we needed to iterate rapidly without being slowed down by the typical hardware challenges that come with prototyping robots. In traditional robotics development, switching a single component like a camera mid-project can mean weeks of rewriting drivers, updating interfaces, and refactoring control logic. Your engineering team ends up spending time debugging hardware integrations instead of solving your actual business problem. Vendor lock-in compounds the issue, forcing teams to stick with suboptimal hardware because the barrier to swapping out, or even testing, a better option is too high.


Viam’s robotic fiberglass sanding solution working on a boat hardtop, featured in the Professional Boatbuilder article 'Viking Takes First Steps Toward Sanding Automation'

Building and iterating faster with Viam 

Our sanding solution needed to handle the complex and variable geometry of fiberglass boat parts after injection molding while achieving high coverage with consistent pressure—requirements that demanded both precise depth perception and sophisticated motion planning and control. Here, we leveraged two key features of the Viam platform:

  1. Hardware abstraction layer: this allowed us to treat any depth camera or robotic arm as a standardized component with a common API. This meant we could swap between arms from different manufacturers, or different cameras, by simply updating the configuration in Viam to point to the new component. Downstream code never has to know which hardware is actually running under the hood.
  2. Built-in motion service: this provided the complex, domain-specific inverse kinematics and motion planning out of the box so that we could focus our energy on developing the specific sanding patterns that would deliver a quality finish.

This combination gave us the flexibility to iterate quickly on hardware selection while our application logic remained stable. The Viam platform handled the complexity, letting us focus on higher-level business logic to create the sanding solution.
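
To make the abstraction concrete, here is a minimal sketch of how application code talks to hardware through Viam's standardized component APIs. The machine address, API key placeholders, and component names ("depth-cam", "sanding-arm") are illustrative assumptions; which camera or arm model actually sits behind each name lives in the machine's configuration, not in this code.

import asyncio

from viam.robot.client import RobotClient
from viam.components.arm import Arm
from viam.components.camera import Camera

async def main():
    # Connect to the machine (address and API key are placeholders)
    opts = RobotClient.Options.with_api_key(
        api_key="<API-KEY>",
        api_key_id="<API-KEY-ID>",
    )
    machine = await RobotClient.at_address("<MACHINE-ADDRESS>", opts)

    # Components are looked up by their configured names; the hardware
    # behind each name is defined in the machine's configuration
    camera = Camera.from_robot(machine, "depth-cam")
    arm = Arm.from_robot(machine, "sanding-arm")

    # The same calls work regardless of the specific camera or arm model
    point_cloud, mime_type = await camera.get_point_cloud()
    joints = await arm.get_joint_positions()

    await machine.close()

if __name__ == "__main__":
    asyncio.run(main())

Swapping a camera or arm is then a configuration change in Viam, not a code change.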

Optimizing hardware choices without rewriting code

When we started building the prototype, we used a popular off-the-shelf depth camera for robotic vision. With the camera mounted to the robot arm, we scanned the boat surface to create a 3D model on which to base the sanding plan. But as we pushed toward production-quality sanding, we discovered its limitations: noisy point clouds, poor performance beyond a distance of 50cm, and susceptibility to surface glare—all critical issues when scanning the complex curves of fiberglass boat parts.

This is where Viam's hardware abstraction proved invaluable. Instead of being locked into our initial camera choice, we evaluated and tested alternatives without touching our application code. We identified the Orbbec Astra 2 as a promising candidate. With Viam, swapping the cameras in software was even easier than swapping them on the robot. All of the control logic and planning algorithms we wrote with the first depth camera worked out of the box with the Orbbec because they interacted with Viam’s camera API, which both cameras implement.

The ease of swapping also allowed us to run side-by-side tests with both cameras and gather quantitative data on how they stack up against each other. The results were dramatic. At 80cm distance—critical for capturing larger surface areas in fewer scans—the Astra 2 delivered smooth, accurate point clouds while the original camera produced wavy textures with significant noise. Most importantly for our reflective fiberglass surfaces, the Astra 2's depth sensing remained unaffected by glare that created large holes in the alternative camera's point cloud.

Side-by-side point cloud comparison: Orbbec Astra 2 (left) vs the other camera (right) at 95 cm. The Astra 2’s point cloud is much tighter and unaffected by the glare visible in the top right.
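
For anyone curious how such a comparison can be scripted, here is a minimal sketch, assuming both cameras are configured on the same machine under the hypothetical names "astra-2" and "original-depth-cam". It captures a point cloud from each camera through the identical Camera API call and saves the raw data for offline analysis; the actual noise and glare evaluation happened separately.

from pathlib import Path

from viam.components.camera import Camera

async def capture_comparison(machine, out_dir="pcd_captures"):
    # Grab one point cloud per camera through the same API and write the
    # raw PCD data to disk for later comparison
    Path(out_dir).mkdir(exist_ok=True)
    for name in ("astra-2", "original-depth-cam"):  # hypothetical names
        camera = Camera.from_robot(machine, name)
        pcd_bytes, mime_type = await camera.get_point_cloud()
        Path(out_dir, f"{name}.pcd").write_bytes(pcd_bytes)
        print(f"{name}: {len(pcd_bytes)} bytes ({mime_type})")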

As Olivia Miller, a Viam engineer, put it: "The Astra 2's ability to maintain accurate depth perception at 180cm completely changed our scanning strategy. We can now capture entire boat sections from a single vantage point instead of stitching together dozens of close-range scans." This rapid testing, data gathering and data-driven decision making would have been impossible without Viam's abstraction layer eliminating the traditional hardware integration burden.


A motion service that manages complexity so we can focus on quality 

While hardware abstraction accelerated our depth camera testing, we still needed to solve the fundamental problem of making the robot arm sand the way a human would. This is where Viam's built-in motion service came in to handle the complex math required. The motion service's Move command handles constructing the full kinematic chain, solving inverse kinematics to determine joint angles, and moving the arm to a destination pose—all while respecting joint limits, dynamic and static obstacles, and configured safety constraints. This meant our team could iterate on sanding patterns using high-level commands rather than managing coordinate transforms and collision matrices.

Here's a simplified example of our sanding logic. We interact with both the camera and the arm through Viam’s standardized APIs. To move the arm, we simply give the motion service a destination pose and it handles the rest.

# Define sanding pattern for a boat hull section
async def sand_surface(motion_service, arm, camera):
    # Get point cloud of surface
    point_cloud = await camera.get_point_cloud()
    surface_mesh = meshify(point_cloud)
    
    # Generate sanding paths from mesh
    sanding_paths = generate_sanding_plan(surface_mesh)
    
    for path in sanding_paths:
        # Move to start position
        await motion_service.move(arm, path.start_pose)
        
        # Execute sanding motion
        for waypoint in path.waypoints:
            await motion_service.move(
                arm, 
                waypoint.pose,
                linear_speed=150  # mm/s
            )

This abstraction meant our software engineers could directly implement and test aspects like coverage strategy, direction of travel, and sanding block orientation, adapting them dynamically to the geometry of the surface, without needing deep robotics programming expertise. When we needed to adjust the sanding pattern based on real-world testing, iteration was fast. We developed a rapid feedback loop: make a sanding plan change, test it on actual fiberglass, analyze the surface quality, and repeat. The motion service handled the complex math and safety checks while we iterated on the sanding solution.
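
For reference, here is a rough sketch of what a single move in the simplified example corresponds to when written directly against the Viam Python SDK's motion service. The component name "sanding-arm" and the 5 mm line tolerance are illustrative assumptions; the linear constraint is what keeps the tool path close to a straight line between waypoints.

from viam.components.arm import Arm
from viam.proto.common import Pose, PoseInFrame
from viam.proto.service.motion import Constraints, LinearConstraint
from viam.services.motion import MotionClient

async def move_linearly(motion: MotionClient, pose: Pose):
    # Express the destination as a pose in the world reference frame;
    # the motion service solves the kinematics and collision checks
    destination = PoseInFrame(reference_frame="world", pose=pose)
    constraints = Constraints(
        linear_constraint=[LinearConstraint(line_tolerance_mm=5.0)]
    )
    await motion.move(
        component_name=Arm.get_resource_name("sanding-arm"),
        destination=destination,
        constraints=constraints,
    )

The motion service itself is obtained once, for example with MotionClient.from_robot(machine, "builtin"), and reused for every move.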


Acceleration through modularity

With Viam, our solution has truly become modular. The same code that operated one arm with an Orbbec Astra 2 camera attached can operate a different arm with a different camera. We are no longer bound by the limitations of any single piece of hardware.

Furthermore, Viam’s hardware agnosticism created a virtuous cycle of continuous testing without development delays. While one team member evaluated a new hardware component, others could simultaneously refine the sanding algorithms, confident that improvements would work across any hardware configuration. This parallel development path meant hardware evaluation never blocked software progress.

Most critically, we could make data-driven hardware decisions based on actual performance rather than vendor promises or integration complexity. When testing revealed the Orbbec Astra 2's strong performance for our application, we were able to make the switch immediately and move on to the next challenge. All of this was in service of a production-ready robotic solution that our team is proud of.

If you're building robotics solutions and want to focus on your application rather than hardware integration, try Viam. Whether you're automating manufacturing, deploying field robots, or prototyping new applications, starting with the right abstraction layer will accelerate your path from idea to deployment.

