Sensor-Based SLAM Navigation — iRobot Create
From raw IR range data to a navigable map — full SLAM pipeline on embedded hardware.
360°
Sensor Coverage
RRT
Path Planner
University of Utah
Research Lab
The Challenge
The iRobot Create platform has no built-in mapping capability. The challenge was to mount IR rangefinders on servo motors to achieve 360° coverage, build an occupancy grid map from the noisy sensor readings in real-time, and use that map to plan collision-free paths through a maze — all within the compute and power constraints of an embedded system.
Architecture & System Design

Rotating sensors build a real-time 2D map of the environment from range data. An occupancy grid algorithm marks free and occupied space. A path-planning algorithm searches the map for collision-free navigation routes. The robot executes waypoints using heading-control feedback.
Two servo motors rotated IR rangefinder sensors to sweep the environment and collect range data at discrete angular increments. An occupancy grid algorithm converted range readings to a 2D binary map (occupied/free). The map was iteratively updated as the robot moved, with odometry used for dead-reckoning position estimates. A Rapidly-exploring Random Tree (RRT) planner searched the map for collision-free paths to the goal, with the robot executing waypoints via a proportional heading controller.
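The pipeline above can be sketched as one sense → map → plan → act cycle. This is a hypothetical glue function, not the project's actual main loop; the callable parameters stand in for the stages detailed in the code walkthrough:

```python
def slam_cycle(grid, pose, goal, sense, update_grid, plan, follow):
    """One sense -> map -> plan -> act iteration (illustrative glue code).

    pose:          (x, y, heading) from dead-reckoning odometry
    sense():       list of (servo_angle, range) from one IR sweep
    update_grid(): fuse a single beam into the occupancy grid
    plan():        waypoint list from start to goal, or None
    follow():      drive the waypoints, return the updated pose
    """
    # Fuse every beam of the sweep; beam angle is servo angle plus heading
    for angle, rng in sense():
        update_grid(grid, pose[:2], pose[2] + angle, rng)
    # Re-plan on the freshest map, then execute if a path exists
    path = plan(grid, pose[:2], goal)
    return follow(path, pose) if path else pose
```

Re-planning on every cycle keeps the path consistent with the map as new obstacles are swept in, at the cost of extra planner calls.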
Code Walkthrough
3-step walk-through of the production implementation — file paths and intent shown above each block.
Step 1 of 3
IR range sweep → probabilistic occupancy grid
slam/occupancy.py
Each IR rangefinder reading gives a range and an angle relative to the robot. Bresenham's line algorithm traces the beam through the grid: every cell along the ray gets a small free-space credit, and the terminal cell gets a large occupied hit. Probabilistic updates (add/subtract, clamp to 0–1) mean a single noisy reading doesn't immediately flip a cell — the map stabilises over repeated sweeps.
```python
import numpy as np

def update_occupancy_grid(grid, robot_pos, angle, distance, resolution=0.05):
    """
    Bresenham ray-cast to update cells along a single IR beam.
    resolution: metres per grid cell.
    """
    x0, y0 = world_to_grid(robot_pos, resolution)
    end_x = robot_pos[0] + distance * np.cos(angle)
    end_y = robot_pos[1] + distance * np.sin(angle)
    xn, yn = world_to_grid((end_x, end_y), resolution)

    # Cells along ray → free evidence (subtract 0.1);
    # skip the terminal cell so it only receives the occupied hit
    for (cx, cy) in list(bresenham_line(x0, y0, xn, yn))[:-1]:
        grid[cy, cx] = max(0.0, grid[cy, cx] - 0.1)

    # Terminal cell → occupied evidence (add 0.9, clamp to 1)
    grid[yn, xn] = min(1.0, grid[yn, xn] + 0.9)

def world_to_grid(pos, resolution):
    """Convert world coordinates [m] to integer grid indices."""
    return int(pos[0] / resolution), int(pos[1] / resolution)
```

Takeaway
Probabilistic updates tolerate sensor noise — a single return doesn't hard-set a cell; the map stabilises after 3–4 overlapping sweeps, matching what was observed in practice.
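The walkthrough calls `bresenham_line` without showing it. A standard integer Bresenham traversal looks like the following; this is a sketch of the textbook error-accumulator form, not necessarily the project's exact implementation:

```python
def bresenham_line(x0, y0, x1, y1):
    """Yield the integer grid cells on the segment (x0, y0) -> (x1, y1),
    endpoints included, using the classic error-accumulator form."""
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1   # step direction in x
    sy = 1 if y0 < y1 else -1   # step direction in y
    err = dx - dy
    while True:
        yield (x0, y0)
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:            # error favours a horizontal step
            err -= dy
            x0 += sx
        if e2 < dx:             # error favours a vertical step
            err += dx
            y0 += sy
```

Because it is pure integer arithmetic, this traversal is cheap enough to run per beam, per sweep, on the embedded CPU.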
Step 2 of 3
RRT path planner on the occupancy grid
slam/rrt.py
Grid-based search (Dijkstra/A*) is complete but slow on a 200×200 grid with a 3-DOF robot. RRT builds a random tree that biases growth toward unexplored free space, finding a path in milliseconds. A 5% goal bias nudges the tree toward the target without the greediness that causes local minima in pure goal-directed search.
```python
import random

def rrt_plan(grid, start, goal, max_iter=2000, step=0.1):
    """RRT path planner operating on a probabilistic occupancy grid."""
    tree = [start]
    parent = {start: None}
    for _ in range(max_iter):
        # 95% random sample, 5% goal bias
        rand = random_free_point(grid) if random.random() > 0.05 else goal
        nearest = min(tree, key=lambda n: dist(n, rand))
        new = steer(nearest, rand, step)
        if is_free(grid, nearest, new):
            tree.append(new)
            parent[new] = nearest
            if dist(new, goal) < step:
                return reconstruct_path(parent, new)
    return None  # No path found within iteration budget

def is_free(grid, a, b, threshold=0.5):
    """Check all cells along segment a→b are below the occupancy threshold."""
    for (cx, cy) in bresenham_line(*world_to_grid(a, 0.05), *world_to_grid(b, 0.05)):
        if grid[cy, cx] > threshold:
            return False
    return True
```

Takeaway
The 5% goal bias is the key tuning parameter — too low and the tree explores randomly forever; too high and the greedy growth gets stuck in narrow passages between obstacles.
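`rrt_plan` leans on four helpers (`dist`, `steer`, `random_free_point`, `reconstruct_path`) that the walkthrough references but doesn't show. These are minimal sketches under the Step 1 conventions (NumPy grid, row = y, 0.05 m resolution), not the project's exact implementations:

```python
import math
import random

import numpy as np  # grid is a NumPy occupancy array, as in Step 1

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def steer(src, dst, step):
    """Move from src toward dst by at most `step` metres; returns a tuple
    so tree nodes stay hashable for the parent dict."""
    d = dist(src, dst)
    if d <= step:
        return tuple(dst)
    t = step / d
    return (src[0] + t * (dst[0] - src[0]), src[1] + t * (dst[1] - src[1]))

def random_free_point(grid, resolution=0.05, threshold=0.5):
    """Rejection-sample a world-frame point whose cell reads as free."""
    rows, cols = grid.shape
    while True:
        cx, cy = random.randrange(cols), random.randrange(rows)
        if grid[cy, cx] < threshold:
            # Centre of the cell, back in world coordinates
            return ((cx + 0.5) * resolution, (cy + 0.5) * resolution)

def reconstruct_path(parent, node):
    """Walk parent links from the goal-adjacent node back to the root."""
    path = []
    while node is not None:
        path.append(node)
        node = parent[node]
    return path[::-1]
```

Rejection sampling is adequate here because maze maps are mostly free space; a densely cluttered map would want a precomputed list of free cells instead.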
Step 3 of 3
Waypoint follower — iRobot Create serial execution
slam/execute_path.py
The RRT output is a list of (x, y) waypoints in world coordinates. The robot executes them via a proportional heading controller: the bearing error between the current heading and the next waypoint drives the turn radius, and odometry from wheel-encoder deltas keeps the dead-reckoning position updated without GPS.
```python
import serial, struct, math, time

DRIVE_DIRECT = 145  # iRobot Create OI opcode

def irobot_drive(ser, vel_l, vel_r):
    """Send DRIVE_DIRECT command: individual wheel velocities in mm/s."""
    cmd = struct.pack('>Bhh', DRIVE_DIRECT, int(vel_r), int(vel_l))
    ser.write(cmd)

def execute_path(ser, waypoints, velocity=150):
    """
    Drive through RRT waypoints using proportional heading control.
    Dead-reckoning position updated from encoder odometry each cycle.
    robot_pos / robot_heading are module-level state maintained by
    update_odometry().
    """
    for wp in waypoints:
        while True:
            dx, dy = wp[0] - robot_pos[0], wp[1] - robot_pos[1]
            if (dx**2 + dy**2) < 0.05**2:  # 50 mm arrival radius
                break
            bearing_err = math.atan2(dy, dx) - robot_heading
            # Wrap to [-π, π]
            bearing_err = (bearing_err + math.pi) % (2*math.pi) - math.pi
            # Proportional: large error → tight turn; small error → straight
            turn = bearing_err * 80  # gain: 80 mm/s per radian
            irobot_drive(ser, velocity - turn, velocity + turn)
            time.sleep(0.05)
            update_odometry(ser)
    irobot_drive(ser, 0, 0)  # Stop
```

Takeaway
Wrapping bearing error to [-π, π] before the proportional gain prevents the 2π discontinuity from commanding a full reverse spin when the robot overshoots 180°.
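`update_odometry` is called each cycle but not shown. Its core is a standard differential-drive dead-reckoning step; a minimal sketch, assuming the per-wheel distance deltas have already been decoded from the Create's encoder data, and using the Create's nominal 258 mm wheel separation:

```python
import math

WHEEL_BASE = 0.258  # m, nominal iRobot Create wheel separation

def dead_reckon(pose, d_left, d_right):
    """Advance (x, y, heading) from per-wheel distance deltas [m]
    over one short control cycle (unicycle approximation)."""
    x, y, th = pose
    d = (d_left + d_right) / 2.0           # forward distance of centre
    dth = (d_right - d_left) / WHEEL_BASE  # heading change
    # Midpoint integration: translate along the average heading
    x += d * math.cos(th + dth / 2.0)
    y += d * math.sin(th + dth / 2.0)
    return (x, y, th + dth)
```

Integrating along the average heading (rather than the starting heading) halves the discretisation error per cycle, which matters when pose errors compound over an entire maze run with no external position fix.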
Results
The robot successfully mapped lab maze environments and navigated from start to goal autonomously. The occupancy grid converged to an accurate representation within 3–4 passes of the environment. RRT path planning consistently found collision-free paths in under 500 ms on the embedded hardware.
Gallery & Demos
More from University of Utah
Vision-Based Autonomous Quadrotor
MS thesis project: a quadrotor capable of autonomously taking off, navigating, and perching on branch-like structures using only visual feedback — designed for autonomous crop monitoring in agricultural fields.
Multi-Arm Coordination — 2-DOF QUANSER
Dual-arm robotic manipulation system using 2-DOF QUANSER robots with a master-slave architecture — one arm controlling position, the other controlling force — to collaboratively manipulate objects with precision.
Adaptive Backstepping — Indoor Micro-Quadrotor
Nonlinear controller design for an indoor micro-quadrotor with a suspended pendulum mass — a highly unstable configuration. Adaptive backstepping outperformed classical PID in robustness tests across multiple flight regimes.
PUMA 6-DOF Robot Arm — Forward & Inverse Kinematics
Full forward and inverse kinematics solver for a 6-DOF PUMA 762 robot arm, built from scratch using Denavit-Hartenberg parameters — with an interactive 3D MATLAB GUI featuring joint sliders, motion trail, and collision detection.
Sampling-Based & Graph-Search Motion Planning
MATLAB implementations of four canonical path-planning algorithms — Dijkstra, A*, PRM, and RRT — applied to a differential-drive robot navigating bitmap maps in configuration space, with real hardware execution on an iRobot Create.
Monocular Depth Estimation for UAV Perch Landing
C++/OpenCV vision system that estimates the 3D position and orientation of a landing perch from a single monocular camera — using image moments, covariance eigendecomposition for attitude, and focal-length triangulation for depth — enabling closed-loop visual servoing on a quadrotor.
RC Fixed-Wing Glider — Servo Actuation & Aerodynamics
Fixed-wing RC glider designed and built from scratch — two servos providing roll and pitch authority via aileron and flap control surfaces, with a brushless DC motor and ESC delivering forward thrust. Flight-tested outdoors.
Interested in this work?
Full architecture walkthrough and code review available during interviews.




