Build and learn
January 14, 2026

Controlling a robot arm with an iPhone using Flutter, ARKit, and Viam

How our mobile team built an intuitive robot arm control app without becoming roboticists
Viam Mobile Team
Jalen Geason
Technical Product Marketing
As mobile engineers, we have years of experience building iOS and Android applications. We know Swift, Kotlin, mobile UI frameworks, and the intricacies of app deployment. What we don't know is robotics. We've never studied inverse kinematics, implemented motor control systems, or debugged robot motion planning algorithms.

So when our founder and CEO Eliot Horowitz presented us with a challenge—make operating a robot arm as intuitive as moving your phone through space—we weren't sure where to start. Could mobile engineers with no robotics background actually build this?

As it turned out, we could. All we needed was Flutter, ARKit, and Viam.

Version 1: Pose tracking using raw sensor data

Our initial approach used the phone's built-in sensors to determine its position and orientation. The accelerometer tracked the phone's movement, and the gyroscope tracked its rotation. We used these measurements to perform a chain of calculations: first, to get the phone's velocity, we integrated acceleration over time; next, to get its position, we integrated velocity; finally, to get its orientation, we integrated the angular velocity reported by the gyroscope. We then combined the position and orientation into a "pose" that we sent to Viam's moveToPosition API to move the robot in real time. We spent a couple of days on this approach, and by the end we had a working solution. But it was faulty.
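
In code, the idea looked roughly like this. This is a simplified sketch in Dart using the vector_math package; the sensor plumbing and names are illustrative rather than our exact implementation:

```dart
import 'package:vector_math/vector_math_64.dart';

/// Dead-reckons a pose by integrating raw IMU samples, as in our first
/// approach. The caller feeds in accelerometer and gyroscope readings as
/// they arrive from the phone's motion APIs.
class DeadReckonedPose {
  Vector3 velocity = Vector3.zero();
  Vector3 position = Vector3.zero();
  Quaternion orientation = Quaternion.identity();

  /// [accel] is linear acceleration in m/s², [gyro] is angular velocity
  /// in rad/s, and [dt] is the time since the previous sample in seconds.
  void update(Vector3 accel, Vector3 gyro, double dt) {
    // First integration: acceleration -> velocity.
    velocity += accel * dt;
    // Second integration: velocity -> position. This is where any bias
    // in [accel] starts to grow quadratically with time.
    position += velocity * dt;
    // Single integration of angular velocity -> orientation.
    final angle = gyro.length * dt;
    if (angle > 0) {
      orientation =
          (orientation * Quaternion.axisAngle(gyro.normalized(), angle))
              .normalized();
    }
  }
}
```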

The core issue was sensor noise. Double integration of the accelerometer readings meant that errors compounded quadratically. For example, say the accelerometer has a bias error of 0.01 m/s² (which is not uncommon). After 10 seconds, the velocity error is 0.01 × 10 = 0.1 m/s, and the position error is ½ × 0.01 × 10² = 0.5 m. After 60 seconds, the position error is ½ × 0.01 × 60² = 18 m, making the calculations essentially unusable. We experimented with adding a deadband filter, which helped a little, but couldn't reduce the sensor noise enough for reliable arm control.
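
You can verify that drift with a few lines of arithmetic:

```dart
/// Position drift caused by a constant accelerometer bias after double
/// integration: error(t) = ½ · bias · t².
double positionDrift(double biasMetersPerSec2, double seconds) =>
    0.5 * biasMetersPerSec2 * seconds * seconds;

void main() {
  print(positionDrift(0.01, 10)); // 0.5 (meters after 10 s)
  print(positionDrift(0.01, 60)); // 18.0 (meters after 60 s)
}
```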

Steps 1-5 of the first approach, which used the phone's built-in sensors to determine its position and orientation. The math was technically correct, but in practice sensor noise made the accumulated error too large to be useful.

Version 2: Pose tracking using ARKit’s camera transform

After reconsidering our approach, we realized that we could avoid integration entirely by using ARKit. Our next iteration was more elegant: ARKit provides a camera transform matrix representing the phone's current pose in 3D space. We designed a reference system where users hold down on the screen to capture both the phone's current pose and the arm's current position. As the phone moves, we calculate how much it has moved from that reference point (the delta), then apply that delta to the arm's reference position to create a new target pose. This new pose is what we send to Viam's moveToPosition API.
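
The delta logic amounts to a few matrix operations. Here's a sketch in Dart using vector_math, assuming the camera transform is available as a Matrix4 (the function and parameter names are illustrative):

```dart
import 'package:vector_math/vector_math_64.dart';

/// Computes a new target pose for the arm from the reference-based scheme.
/// [refPhone] is the camera transform captured when the user pressed down,
/// [refArm] is the arm's pose captured at the same moment, and
/// [currentPhone] is the latest camera transform from ARKit.
Matrix4 targetArmPose(Matrix4 refPhone, Matrix4 refArm, Matrix4 currentPhone) {
  // The delta: how far the phone has moved since the reference was set,
  // expressed relative to the reference pose.
  final delta = Matrix4.inverted(refPhone).multiplied(currentPhone);
  // Apply the same motion to the arm's reference pose. (Coordinate-frame
  // conversion between ARKit and Viam is handled separately; see below.)
  return refArm.multiplied(delta);
}
```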

This reference-based approach meant users could begin controlling the arm from any position, with phone movements translating proportionally to arm movements relative to its current location. Tracking with ARKit performed excellently—responsive, accurate, and without the drift issues we'd encountered with raw sensor data.

Steps 1-5 of the second approach, which used ARKit's camera transform to get accurate position and orientation values.
Solving the coordinate system problem

The remaining technical challenge involved translating between different coordinate systems. ARKit uses a right-handed coordinate system where Y points up and Z points toward the user. Viam's robot control system uses a different convention: it is still a right-handed coordinate system, but Z points up. We needed to transform ARKit's orientation data into Viam's coordinate frame.

There's no industry-wide consensus on which coordinate system convention to use, so we had to handle the translation between ARKit's frame and Viam's ourselves.

We implemented this transformation using quaternion mathematics: first rotating 90° around the X-axis, then -90° around the Z-axis. This properly converted the phone's orientation from ARKit's reference frame into Viam's. We also implemented a deadband filter that ignores rotations smaller than 0.25 radians. This eliminated jitter from minor hand movements while maintaining natural, responsive control.
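
Both pieces are compact with a quaternion library. The sketch below uses Dart's vector_math package; the multiplication order shown is one convention, and the real composition depends on how the rotation is applied downstream:

```dart
import 'dart:math' as math;
import 'package:vector_math/vector_math_64.dart';

/// Fixed correction between the two frames described above:
/// 90° around X, then -90° around Z. With the convention that the
/// rightmost rotation is applied first, the X rotation goes on the right.
final Quaternion arkitToViam =
    Quaternion.axisAngle(Vector3(0, 0, 1), -math.pi / 2) *
        Quaternion.axisAngle(Vector3(1, 0, 0), math.pi / 2);

/// Re-expresses an ARKit orientation in Viam's coordinate frame.
Quaternion toViamFrame(Quaternion arkitOrientation) =>
    (arkitToViam * arkitOrientation).normalized();

/// Deadband filter: returns [last] unchanged when the rotation from
/// [last] to [next] is below the threshold, so minor hand jitter never
/// reaches the arm.
Quaternion applyDeadband(Quaternion last, Quaternion next,
    {double thresholdRadians = 0.25}) {
  final relative = last.inverted() * next;
  // The rotation angle encoded in a unit quaternion is 2·acos(|w|).
  final angle = 2 * math.acos(relative.w.abs().clamp(0.0, 1.0));
  return angle < thresholdRadians ? last : next;
}
```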

The experience of building a robot as a mobile engineer

To get this right, we had to teach ourselves a lot of new things. But the most significant aspect of this project was what we didn't need to learn. We didn't need to study inverse kinematics algorithms, motor control logic, or joint angle calculations. Viam's platform abstracted away the robotics complexity entirely.

Our work focused exclusively on the mobile implementation: integrating with ARKit, building the user interface, and managing data flow from phone to robot. We worked within our existing expertise as mobile developers rather than spending weeks learning robotics fundamentals. And we can iterate at the same pace we would on a mobile development project to make our robot do new and unexpected things—like picking up and sorting small blocks remotely, entirely through the mobile app. 

In a traditional robotics development workflow, we would have needed to understand kinematics libraries and debug low-level motor controllers before even beginning the mobile interface. Instead, we concentrated on what makes this experience uniquely mobile—touch interactions, AR tracking, and real-time responsiveness. The robot arm control? That was all handled by a simple API call.
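
For a sense of what "a simple API call" means in practice, here is roughly what commanding the arm looks like with the Viam Flutter SDK. The address, API key, and component name are placeholders, and exact signatures may differ slightly from this sketch:

```dart
import 'package:viam_sdk/viam_sdk.dart';

Future<void> main() async {
  // Connect to the machine. The address and API key are placeholders.
  final robot = await RobotClient.atAddress(
    '<machine-address>',
    RobotClientOptions.withApiKey('<api-key-id>', '<api-key>'),
  );

  // Get a handle to the arm component ('myArm' is a placeholder name).
  final arm = Arm.fromRobot(robot, 'myArm');

  // One call with a target pose (millimeters). Kinematics, motion
  // planning, and motor control all happen on the other side of it.
  await arm.moveToPosition(Pose(x: 300, y: 0, z: 300, oZ: 1));

  await robot.close();
}
```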

Lead Mobile Engineer Clint Purser (pictured) partnered with mobile engineers Julie Krasnick and Martha Johnston to control a robot arm with an iPhone using Flutter, ARKit, and Viam.

Build your own mobile robotics application

At Viam, we're constantly innovating to make it easier for developers to build robots and turn them into robotics companies. Our platform is designed to abstract away the complexity of robotics development, allowing engineers from any discipline to focus on building solutions rather than wrestling with low-level systems. From mobile engineers to first-time founders, Viam makes it easier to build your first robot. 

If you have an idea for a mobile-first robotics application—whether it's controlling a rover with a smartphone, implementing computer vision on tablets, or monitoring a fleet of robot arms from a single mobile device—Viam's Flutter SDK provides the tools to build it without requiring deep robotics expertise. We're also open-source, meaning you can explore how everything works under the hood and contribute to the platform. 

We're interested to hear what other mobile developers create when the robotics complexity is abstracted away. Share your projects with us on Discord.
