Build and learn
April 2, 2026

Four robots you can build with a Raspberry Pi and Viam

Each one buildable in a day. Each one extensible into something real.
Jalen Geason
Technical Product Marketing

If you know how to write software, you already have the foundational knowledge required to build a robot: event-driven logic, API calls, data pipelines, and conditional control flow. The only thing missing is a layer that connects your code to hardware without making you learn an entirely new discipline first.

That's what Viam is for.

In this post, we walk through four robots you can build with a Raspberry Pi and off-the-shelf components. Each one is achievable in a day, and can be extended to new use cases.

The form factors and hardware may be different, but the pattern is identical across all four: declare your hardware in a config file, write your control logic against clean APIs, and extend by adding components or swapping modules.

What is Viam?

Viam is a software platform for building, deploying, and managing robots. It gives you consistent APIs for hardware, built-in services for robotics foundations like motion and navigation, a registry of hardware drivers and software modules (like ML models and control logic), and the infrastructure to scale your prototype into a fleet without major code rewrites.

If you've worked with cloud infrastructure, the mental model transfers well. Think of Viam as a container runtime for physical hardware: you declare what you want, Viam pulls the right drivers and initializes everything, and your application code talks to clean APIs rather than raw hardware.

How Viam handles hardware

Every Viam machine is defined by a JSON config file. You declare your components (motors, cameras, sensors, arms), specify which modules provide their drivers, and configure any services you want to run (vision, navigation, data capture). Viam reads the config, pulls the required modules from the registry, and initializes everything automatically.
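As a sketch of what that declaration looks like, here is a minimal, hypothetical machine config with one webcam and one GPIO motor. The overall shape follows Viam's JSON config, but the component names and attribute values are illustrative; each module's documentation defines its exact attribute schema.

```json
{
  "components": [
    {
      "name": "cam",
      "api": "rdk:component:camera",
      "model": "webcam",
      "attributes": { "video_path": "video0" }
    },
    {
      "name": "belt-motor",
      "api": "rdk:component:motor",
      "model": "gpio",
      "attributes": { "board": "pi", "pins": { "pwm": "12" } }
    }
  ]
}
```

Swapping the webcam for a different camera means changing the `model` (and its attributes) while the `name` and API stay the same, which is why application code doesn't need to change.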

No driver installation. No dependency management. Change the hardware, update the config, and your application code keeps working because the API stays the same.

How Viam handles software

Your control logic runs against Viam's SDKs in Python, Go, or C++. Every component type exposes the same interface regardless of the underlying hardware. This means that as you build your robot, you can swap out hardware as needed without changing your control logic.

When it comes to writing that logic, you have two options. You can develop in your IDE of choice using Viam's Python, Go, or C++ SDKs, then deploy your module to the machine. Or you can write and run code directly in the Viam app using inline modules, without leaving the browser. Either way, the APIs are the same.
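That uniform interface is what makes the logic portable. As a rough sketch, the function below is written against the shape of the motor API (a `set_power` and a `stop` call) rather than a specific device; `FakeMotor` is a stand-in used here so the logic can run without hardware, and a real motor client from the SDK would drop in unchanged. The class and function names are illustrative, not part of Viam's SDK.

```python
# Sketch: control logic written against an API shape, not a device.
# FakeMotor mimics the two calls the logic relies on; a real Viam motor
# component exposes the same calls, so pulse() works on either.
import asyncio

class FakeMotor:
    """Hardware-free stand-in exposing the interface pulse() expects."""
    def __init__(self):
        self.power = 0.0

    async def set_power(self, power: float):
        self.power = power

    async def stop(self):
        self.power = 0.0

async def pulse(motor, power: float, seconds: float):
    """Run the motor at `power` for `seconds`, then stop it."""
    await motor.set_power(power)
    await asyncio.sleep(seconds)
    await motor.stop()

motor = FakeMotor()
asyncio.run(pulse(motor, 0.5, 0.01))
```

Testing against a stand-in like this, then pointing the same function at real hardware, is the workflow the consistent APIs are meant to enable.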

The result is a stack that feels familiar. You write application logic, not driver code. You use version control, remote monitoring, and staged rollouts on physical machines the same way you would on cloud services.

Before you start

You'll need the following for all four projects:

  • Raspberry Pi 5 (4GB or 8GB)
  • A Viam account (free at viam.com)
  • viam-server installed on the Pi (installation guide)
  • Python 3.8 or later
  • Project-specific hardware listed in each section

Project 1: Pick-and-place arm

The build

A 6-DOF robotic arm and gripper that picks objects from a defined position and places them somewhere else. Start with fixed coordinates: they're straightforward to reason about, and deeply satisfying to watch in action.
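The fixed-coordinate cycle can be sketched as an ordered list of steps, independent of any SDK. The coordinates below are placeholders; in a real build, each pose would be sent to the arm's move-to-position call, with gripper grab/open actions between moves.

```python
# Sketch of one fixed-coordinate pick-and-place cycle. Poses are
# (x, y, z) placeholders in mm; with the Viam SDK each "move" would
# become an arm move-to-position call and "grab"/"release" would drive
# the gripper. Every value here is illustrative.

PICK = (250.0, 0.0, 50.0)    # where the object sits
PLACE = (0.0, 250.0, 50.0)   # where it should end up
HOVER_MM = 100.0             # approach clearance above each target

def above(pose, clearance=HOVER_MM):
    x, y, z = pose
    return (x, y, z + clearance)

def pick_and_place_steps(pick, place):
    """Return the ordered (action, pose) steps for one cycle."""
    return [
        ("move", above(pick)),    # hover over the object
        ("move", pick),           # descend to it
        ("grab", pick),           # close the gripper
        ("move", above(pick)),    # lift clear
        ("move", above(place)),   # carry over the drop zone
        ("move", place),          # descend
        ("release", place),       # open the gripper
        ("move", above(place)),   # retreat
    ]

steps = pick_and_place_steps(PICK, PLACE)
```

Keeping the sequence as data like this makes it easy to later replace the fixed `PICK` pose with one computed by a vision service.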

Hardware

Where to take it

Right now the arm moves to coordinates you give it. Add a camera and Viam's vision service and it can find the object on its own.

Add a model that classifies objects by type, condition, or label, and the arm starts making decisions. Your robotic arm has become a fulfillment system.

Project 2: Camera-tracked servo mount

The build

Two servos keep a camera pointed at a moving object in real time. The vision service detects the object; your code calculates how far it is from the center of the frame and adjusts the pan and tilt angles accordingly.
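The tracking math itself is a small proportional controller. The sketch below assumes a detection bounding box in pixels and returns updated pan and tilt angles; the gain, servo limits, and sign conventions are illustrative and depend on how the servos are mounted.

```python
# Sketch of the pan/tilt tracking step. In the real build, the bounding
# box would come from the vision service's detections and the returned
# angles would be sent to the two servos. Gain, limits, and signs are
# illustrative; flip the signs to match your mounting.

GAIN = 0.05                   # degrees of servo travel per pixel of error
SERVO_MIN, SERVO_MAX = 0, 180  # typical hobby-servo range

def clamp(angle):
    return max(SERVO_MIN, min(SERVO_MAX, angle))

def track_step(box, frame_w, frame_h, pan, tilt):
    """box = (x_min, y_min, x_max, y_max) in pixels. Returns (pan, tilt)."""
    cx = (box[0] + box[2]) / 2
    cy = (box[1] + box[3]) / 2
    err_x = cx - frame_w / 2   # positive: object right of center
    err_y = cy - frame_h / 2   # positive: object below center
    return clamp(pan - GAIN * err_x), clamp(tilt + GAIN * err_y)

# Object to the right of center: pan swings toward it.
pan, tilt = track_step((500, 220, 600, 320), 640, 480, 90, 90)
```

A proportional gain alone is usually enough for smooth tracking at these speeds; if the mount oscillates, lowering the gain is the first fix.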

Hardware

Where to take it

The color detector is a great starting point, but Viam's vision service can run any TFLite model, and you can use Viam's data capture and model training infrastructure to build your own. Train one on faces, products, barcodes, or equipment states, swap it into the config, and the mount tracks whatever you care about without touching the control code.

Mount it in a retail environment and it becomes a shelf-scanning or loss-prevention system. Mount it in a broadcast or event venue and it's an automated camera operator that never loses the subject. Viam allows you to use the same hardware with whatever intelligence you want.

Project 3: Obstacle-avoiding rover

The build

A wheeled robot that drives forward and steers away when it detects something in its path. In Viam, you configure one ultrasonic sensor, two motors, and a simple control loop.
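The decision step of that control loop can be sketched without any hardware attached. The threshold and motor powers below are illustrative; in the real build, the distance would come from the ultrasonic sensor's readings and the returned powers would go to the two drive motors.

```python
# Sketch of the rover's per-tick decision, separated from the SDK.
# `distance_m` would come from the ultrasonic sensor; the returned
# (left, right) powers would be applied to the drive motors. The
# threshold and power values are illustrative.

STOP_DISTANCE_M = 0.30  # start turning when something is this close

def decide(distance_m):
    """Return (left, right) motor powers for the next control tick."""
    if distance_m < STOP_DISTANCE_M:
        return (0.5, -0.5)   # opposite powers: spin in place to steer away
    return (0.5, 0.5)        # clear path: drive forward

action = decide(0.12)        # e.g., an obstacle 12 cm ahead
```

The loop simply calls `decide` on every sensor reading, which is what makes the later upgrades drop-in: a vision-based check can feed the same decision function.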

Hardware

Where to take it

Add a camera and Viam's vision service and the rover can react to what it sees, not just what it's near. It goes from proximity-reactive to visually aware with one config change.

Take it further and add GPS and Viam's navigation service. Define a route. The rover follows it, logs what it encounters, and alerts when something is out of place. That's the foundation of an autonomous facility inspection robot, running the same code you wrote on day one.

Project 4: Conveyor + vision inspection

The build

A conveyor belt runs continuously while a camera watches objects on it. When the vision model flags something, the belt stops, the result is logged, and the belt resumes. It's the most system-level of the four projects: multiple components, a feedback loop, and a pattern that maps directly to real production environments.
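That stop-inspect-resume behavior is a two-state machine. Here is a minimal sketch with the vision result reduced to a boolean flag; in the real build, the transitions would stop and restart the belt motor and log the result through Viam's data capture. All names and messages are illustrative.

```python
# Sketch of the conveyor feedback loop as a two-state machine.
# `flagged` stands in for the vision service's verdict each tick;
# the transitions would drive the belt motor in the real build.

def step(state, flagged, log):
    """Advance one tick. States: 'running' or 'stopped'."""
    if state == "running" and flagged:
        log.append("defect flagged; belt stopped")
        return "stopped"
    if state == "stopped":
        log.append("result logged; belt resumed")
        return "running"
    return state

log = []
state = "running"
for flagged in [False, True, False, False]:
    state = step(state, flagged, log)
```

Modeling it this way keeps the interesting part (what happens on a flag) in one place, so adding a diverter gate later is a new transition, not a rewrite.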

Hardware

Mount the webcam above the belt, pointed straight down, centered over your inspection zone. A fixed mount works best: consistent framing means more consistent detections.

To get started without a trained model, use Viam's built-in color detector to flag objects by color. Once you have real images from the belt, train a custom TFLite model directly in Viam and drop it into the config.
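As a sketch, the color detector's service entry in the config might look like the following. The attribute names shown here are representative of Viam's built-in color detector, but treat them as assumptions and confirm the exact schema against the vision service documentation.

```json
{
  "name": "red-detector",
  "type": "vision",
  "model": "color_detector",
  "attributes": {
    "detect_color": "#ff0000",
    "hue_tolerance_pct": 0.05,
    "segment_size_px": 100
  }
}
```

When you later swap in a trained TFLite model, only this service entry changes; the control loop keeps asking the same service for detections.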

Where to take it

Train a custom vision model in Viam using images captured directly from the belt. No external ML pipeline needed: Viam handles data capture, labeling, training, and deployment in one place.

Add a second motor controlling a diverter gate and the system physically routes rejected items into a separate bin instead of just stopping. Deploy the same config to multiple belts and manage them as a fleet from the Viam dashboard. At that point you have a production-grade quality control line, built on the same code from the start.

What these four projects have in common

These projects, different as they are, all follow the same workflow: configure hardware, write control logic, extend, repeat.

That pattern doesn't change when you scale. The code you write for a single rover is the same code that runs on a fleet of inspection robots. The arm logic you test on your desk is the same logic running on a production line. Viam handles the infrastructure so that the gap between prototype and deployment is a config change, not a rewrite.

Pick one of these projects, get it running, and then see what you add next. The registry has hundreds of modules. The APIs are the same. The ceiling is yours to find.

Ready to start? Create a free Viam account and begin building.
