Northeastern University · Electrical & Computer Engineering · Seattle
Mobile Robotics
Build a robot that maps its environment, knows where it is, and completes a mission — autonomously.
A hands-on graduate course in mobile robotics covering probabilistic localization, SLAM, autonomous navigation with Nav2, and real-time AI perception on edge hardware. Every algorithm taught in lecture runs on a physical TurtleBot 4 robot that you modify, configure, and program yourself.
Course Overview
The four core problems of mobile robotics — sensing, localization, mapping, and navigation — studied through both theory and implementation. Topics progress from sensor models and probabilistic estimation through SLAM and the Nav2 autonomous navigation stack, ending with real-time object detection using a neural processing unit mounted on the robot.
A stock TurtleBot 4 is not enough. Students upgrade the robot in Week 1 by mounting a Raspberry Pi 5 with an AI HAT+ accelerator, connecting it via Ethernet, and configuring a two-machine ROS2 distributed system. The TurtleBot 4 handles only sensing; the Pi 5 runs SLAM, Nav2, and YOLO — simultaneously, on-robot, without a laptop.
Every algorithm — EKF sensor fusion, SLAM Toolbox graph optimization, Nav2 behavior trees, Hailo YOLO inference — runs on the physical robot, not just in simulation. Students directly observe how theory translates to robot behavior: how odometry drift motivates SLAM, how particle filters localize a lost robot, and why edge AI hardware matters for battery-powered systems.
The semester ends with a live competition. A pre-set course is revealed 72 hours before competition day. Teams navigate five checkpoints using ArUco markers, detect a target object by color, avoid obstacles including a live person, and dock autonomously — all without touching the keyboard once the run begins. Fastest successful run wins.
Course Topics
Semester Roadmap
Wk 1–2
Platform Build
Pi 5 + AI HAT · Ethernet · CycloneDDS · multi-machine ROS2
Wk 3–5
Sensing
LiDAR · IMU · OAK-D · TF2 · ROS2 node writing
Wk 6–9
Locate & Map
EKF · particle filter · SLAM Toolbox · map quality
Wk 10–12
Navigate
Nav2 · behavior trees · waypoints · auto-docking
Wk 13–15
AI + Mission
YOLO on Hailo · perception loop · competition
What You Will Do
Hardware Platform
Intelligence Node (Raspberry Pi 5 + AI HAT+) ↔ Sensing Node (TurtleBot 4)
Linked over eth0 · Same ROS_DOMAIN_ID on both machines · Static IPs 192.168.11.1 and 192.168.11.2
Compute Capacity
The TurtleBot 4's onboard Raspberry Pi 4 runs out of CPU when SLAM, Nav2, and detection run simultaneously — messages are dropped and navigation fails. The Pi 5 has 2× the CPU performance and the AI HAT+ offloads inference entirely to the NPU.
Distributed Architecture as a Learning Goal
Real robot systems are almost always distributed — a vehicle's LiDAR unit, an edge server, and a motion controller are separate computers. Configuring multi-machine ROS2 with CycloneDDS, understanding DDS discovery, and managing network bandwidth are skills that transfer directly to industry.
Edge AI Without Compromise
Running YOLO on the Hailo-8L NPU consumes near-zero CPU, leaving the Pi 5's four cores fully available for SLAM and Nav2. Students learn the performance argument for dedicated AI hardware — not just in theory, but by watching CPU load in htop during full-stack operation.
ROS2 Jazzy on Ubuntu 24.04
Both machines run ROS2 Jazzy on Ubuntu 24.04 — the same OS and ROS2 version on the TurtleBot 4 Pi 4 and the Pi 5. All packages are installed via apt using the official ros2-apt-source package — no source compilation required for standard labs.
SLAM Toolbox + Nav2
SLAM Toolbox performs synchronous online mapping, producing a 5 cm resolution occupancy grid. Nav2 provides global path planning (NavFn/Dijkstra), local control (DWB), AMCL localization, and a Python Simple Commander API for programmatic navigation.
HailoRT + hailo-rpi5-examples
Hailo's official package for Raspberry Pi 5 installs via apt install hailo-all. Pre-compiled .hef model files for YOLOv8n, YOLOv8s, and YOLOv8m are downloaded in a one-time setup step. A ROS2 wrapper node publishes vision_msgs/Detection2DArray.
Python Throughout
All student-written code is Python: ROS2 nodes (rclpy), the particle filter (numpy), A* and path planning (heapq), color detection (opencv-python), and homework visualizations (matplotlib).
Core Curriculum
Sensor Modeling
Derive the noise model and measurement equation for 2D LiDAR, IMU gyroscope/accelerometer, and wheel encoders. Understand why no single sensor is sufficient and why sensor fusion is necessary for reliable state estimation.
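The "no single sensor is sufficient" argument can be demonstrated in a few lines. The sketch below (all noise figures are invented for illustration) integrates a biased gyro alone, then fuses it with a noisy but unbiased absolute-heading sensor using a simple complementary filter — a lighter-weight cousin of the EKF fusion done in lab:

```python
import numpy as np

# Illustrative noise numbers only — not real sensor specs.
rng = np.random.default_rng(0)
dt, n = 0.02, 2000                       # 50 Hz for 40 s
true_rate = 0.5                          # robot turns at 0.5 rad/s
true_yaw = true_rate * dt * np.arange(1, n + 1)

gyro = true_rate + 0.03 + rng.normal(0, 0.01, n)   # low noise, constant bias
mag = true_yaw + rng.normal(0, 0.1, n)             # unbiased, but noisy

yaw_gyro = np.cumsum(gyro * dt)          # dead reckoning: drifts without bound

alpha, yaw_fused = 0.98, 0.0
for g, m in zip(gyro, mag):
    # Trust the gyro over short horizons, the absolute sensor over long ones.
    yaw_fused = alpha * (yaw_fused + g * dt) + (1 - alpha) * m

err_gyro = abs(yaw_gyro[-1] - true_yaw[-1])
err_fused = abs(yaw_fused - true_yaw[-1])
print(f"gyro-only error: {err_gyro:.2f} rad, fused error: {err_fused:.2f} rad")
```

The gyro-only estimate drifts by roughly bias × elapsed time, while the fused estimate stays bounded — the same effect that motivates fusing wheel odometry and IMU with robot_localization.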
Probabilistic State Estimation
Build the Bayes filter prediction-update cycle from first principles. Implement a 1D Bayes filter and a 2D particle filter in Python. Derive the EKF linearization using Jacobians and configure robot_localization to fuse odometry and IMU on the physical robot.
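The 1D Bayes filter exercise boils down to alternating a measurement reweighting with a motion convolution. A minimal sketch (the corridor layout, sensor probabilities, and motion noise are hypothetical, not the homework values):

```python
import numpy as np

# Cyclic 10-cell corridor: 1 = a door is visible from that cell, 0 = wall.
world = np.array([1, 1, 0, 0, 1, 0, 0, 0, 0, 0])
belief = np.full(10, 0.1)                # uniform prior: the robot is lost
p_hit, p_miss = 0.75, 0.2                # assumed sensor model

def update(belief, z):
    """Measurement update: reweight by the likelihood of z, then normalize."""
    likelihood = np.where(world == z, p_hit, p_miss)
    b = likelihood * belief
    return b / b.sum()

def predict(belief):
    """Motion update: noisy one-cell-right move (80% exact, 10% short/long)."""
    return (0.8 * np.roll(belief, 1)
            + 0.1 * belief               # undershoot: stayed put
            + 0.1 * np.roll(belief, 2))  # overshoot: moved two cells

# Robot starts at cell 0; each cycle it senses, then moves one cell right.
for z in [1, 1, 0]:
    belief = update(belief, z)
    belief = predict(belief)

print(np.argmax(belief))                 # most probable cell after three moves
```

After three sense-move cycles the belief concentrates on the only cell consistent with the door sequence — the same prediction-update structure the 2D particle filter implements with samples instead of a histogram.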
Occupancy Grid Mapping
Represent 2D environments as probability maps using the log-odds update rule. Understand how LiDAR raycasting marks free and occupied cells. Evaluate map quality using coverage statistics on a real map built in the lab.
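The log-odds rule itself fits in a few lines. This sketch applies it along a single beam (real mapping raycasts over a full 2D grid; the inverse-sensor-model probabilities here are assumed values):

```python
import numpy as np

# Log-odds increments from an assumed inverse sensor model p(occ|hit) = 0.7.
l_occ = np.log(0.7 / 0.3)     # evidence that the endpoint cell is occupied
l_free = np.log(0.3 / 0.7)    # evidence that traversed cells are free

cells = np.zeros(10)          # log-odds per cell; 0.0 = unknown (p = 0.5)
hit_index = 6                 # beam endpoint: obstacle detected in cell 6

def update_ray(cells, hit_index):
    cells[:hit_index] += l_free   # cells the beam passed through: freer
    cells[hit_index] += l_occ     # endpoint cell: more likely occupied

for _ in range(3):            # three consistent scans reinforce the estimate
    update_ray(cells, hit_index)

prob = 1 - 1 / (1 + np.exp(cells))   # convert log-odds back to probability
print(prob.round(2))
```

Note how repeated consistent observations push cells toward 0 or 1 while cells beyond the hit stay at exactly 0.5 — additive evidence is the whole point of working in log-odds.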
Simultaneous Localization and Mapping
Formulate the joint SLAM posterior and explain why it is hard. Study graph-based SLAM — how pose nodes, odometry edges, and loop closure constraints form the optimization problem that SLAM Toolbox solves. Tune SLAM parameters to build a high-quality map of a real lab space.
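To make the graph structure concrete, here is a toy 1D pose graph solved by weighted least squares (all measurements are invented; SLAM Toolbox solves the 2D nonlinear version of this same problem): odometry edges claim each step is 1.1 m, while a trusted loop-closure-style constraint says pose 3 is only 3.0 m from pose 0.

```python
import numpy as np

# Pose x0 is anchored at 0; the unknowns are x1, x2, x3.
edges = [
    # (row over [x1, x2, x3], measurement, weight)
    (np.array([ 1.0,  0.0, 0.0]), 1.1,  1.0),   # x1 - x0 = 1.1 (odometry)
    (np.array([-1.0,  1.0, 0.0]), 1.1,  1.0),   # x2 - x1 = 1.1 (odometry)
    (np.array([ 0.0, -1.0, 1.0]), 1.1,  1.0),   # x3 - x2 = 1.1 (odometry)
    (np.array([ 0.0,  0.0, 1.0]), 3.0, 10.0),   # x3 - x0 = 3.0 (closure)
]
# Weighted least squares: scale each edge by sqrt(weight), then solve.
A = np.array([np.sqrt(w) * row for row, z, w in edges])
b = np.array([np.sqrt(w) * z for row, z, w in edges])
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)   # optimized poses: odometry drift pulled back toward the closure
```

The optimizer distributes the 0.3 m of accumulated drift across all three odometry edges rather than dumping it at the end — the same correction a loop closure triggers across a real pose graph.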
Path Planning & Navigation
Implement Dijkstra's algorithm and A* from scratch on a grid. Study costmap inflation and the role of the global vs local planner. Configure Nav2 — planners, behavior trees, controller tolerances, and AMCL parameters — on a real robot navigating to multiple waypoints.
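The from-scratch A* assignment is essentially a priority queue over grid cells. A compact sketch with heapq and a Manhattan heuristic (the grid below is a made-up example, not the homework map):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; grid cells: 0 = free, 1 = obstacle."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(h(start), 0, start, [start])]   # (f, g, cell, path)
    best_g = {}
    while open_set:
        f, g, cur, path = heapq.heappop(open_set)
        if cur == goal:
            return path
        if best_g.get(cur, float("inf")) <= g:
            continue                             # stale entry: skip
        best_g[cur] = g
        r, c = cur
        for nr, nc in [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]:
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                heapq.heappush(open_set,
                               (g + 1 + h((nr, nc)), g + 1, (nr, nc),
                                [*path, (nr, nc)]))
    return None                                  # goal unreachable

grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (4, 3))
print(len(path) - 1)   # number of moves in the shortest path
```

With an admissible heuristic like Manhattan distance on a unit-cost grid, the first time the goal is popped its path is optimal; setting `h = lambda p: 0` turns the same code into Dijkstra's algorithm.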
Edge AI Inference
Compare NPU, CPU, and GPU compute architectures. Study the YOLO single-pass detection architecture. Trace the PyTorch → ONNX → HEF compilation pipeline. Measure inference throughput (FPS) and power draw on the Hailo-8L during full-stack operation alongside SLAM and Nav2.
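Throughput measurement follows the same pattern regardless of the accelerator: warm up, then time a fixed number of frames. A generic sketch (the `infer` callable here is a stand-in sleep, not the HailoRT API; in lab you would pass your actual per-frame inference call):

```python
import time

def measure_fps(infer, frames=200, warmup=20):
    """Measure steady-state throughput of a per-frame callable."""
    for _ in range(warmup):          # warm-up: exclude one-time setup costs
        infer()
    t0 = time.perf_counter()
    for _ in range(frames):
        infer()
    elapsed = time.perf_counter() - t0
    return frames / elapsed

# Dummy "inference" that sleeps ~10 ms, mimicking a ~100 FPS device.
fps = measure_fps(lambda: time.sleep(0.010))
print(round(fps), "FPS")
```

Measuring with the full SLAM and Nav2 stack running is the point of the exercise: on the NPU the number should barely change, while a CPU-only pipeline slows visibly under load.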
End-of-Semester Competition
The course map is revealed 72 hours before competition day. Teams have that window to tune localization, verify their Nav2 configuration, and test their detection stack. When the run begins, the team member steps back from the keyboard — the robot must complete the entire mission on its own.
The robot must navigate five checkpoints in order (confirmed by ArUco marker IDs), detect and report the location of a color-coded target object, avoid dynamic obstacles including a live person standing in its path, and dock autonomously at the end. Every element of the course maps to a specific lab — nothing in the competition is a surprise.
| Checkpoint / Event | Points |
|---|---|
| Checkpoint 1 reached (stop within 0.3 m, ArUco confirmed) | 10 |
| Checkpoint 2 reached | 10 |
| Checkpoint 3 reached (narrower corridor) | 15 |
| Checkpoint 4 reached | 15 |
| Checkpoint 5 reached | 20 |
| Target object detected & coordinates reported | 15 |
| Successful autonomous dock | 10 |
| No collisions (5 pts deducted per collision) | 5 |
| Maximum score | 100 |
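The rubric above can be expressed as a small scoring helper (a hypothetical sketch, not official scoring code; it assumes the collision line is a 5-point bonus that cannot go negative):

```python
# Checkpoint values from the rubric: checkpoints 1-5, in order.
CHECKPOINT_POINTS = [10, 10, 15, 15, 20]

def score_run(checkpoints_reached, target_reported, docked, collisions):
    """Score a run: checkpoints must be reached in order."""
    pts = sum(CHECKPOINT_POINTS[:checkpoints_reached])
    pts += 15 if target_reported else 0           # target detected & reported
    pts += 10 if docked else 0                    # autonomous dock
    pts += max(0, 5 - 5 * collisions)             # assumed floor of 0
    return pts

print(score_run(5, True, True, 0))   # perfect run → 100
print(score_run(3, True, False, 1))  # partial run with one collision
```

A perfect run totals 100; the second call illustrates how missed elements and collisions compound.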
Why Take This Course
ROS2, Nav2, SLAM Toolbox, and Python-based robotics programming are the exact tools used at Boston Dynamics, iRobot, Amazon Robotics, Fetch Robotics, and nearly every autonomous vehicle company. Students who complete this course can contribute to a production robotics software stack on day one of an internship.
You will encounter every real-world robotics challenge: network latency dropping messages, DDS discovery on a congested network, SLAM drift from wheel slip, Nav2 getting stuck in a narrow corridor. Debugging these problems on physical hardware is irreplaceable preparation for a career in robotics.
Running YOLO on a dedicated NPU while SLAM and Nav2 run concurrently is a system-integration challenge most robotics engineers don't encounter until mid-career. This course gives you early exposure to the hardware-software co-design thinking required to deploy AI on resource-constrained mobile platforms.
A video of your robot completing an autonomous mission — navigating a real environment, stopping for a person, detecting an object, and docking — is something you can share with any employer. It demonstrates probabilistic reasoning, system integration, and embedded AI in one clip. Most graduate students don't have anything like it.
Course Materials & Repository
31 pages covering all prerequisites, lessons, labs, homework, and the competition specification. Every lab page includes numbered procedures, complete runnable Python code, expected output, and a troubleshooting table. Updated throughout the semester.
Open Wiki ↗
All course code lives here: ROS2 Python nodes, SLAM Toolbox configs, Nav2 parameter files, behavior tree XMLs, the YOLO Hailo ROS2 node, and the competition state machine scaffold. Clone the repo in Week 1.
Open Repository ↗
Three prerequisite pages walk through ROS2 Jazzy on Ubuntu 24.04, TurtleBot 4 stock bring-up and verification, and Hailo AI HAT+ driver installation. Complete these before the first lab session in Week 1.
Setup Guides ↗
Lab 0 is a 90-minute guided build session: mount the Pi 5 + AI HAT+ on the TurtleBot 4 top shelf, configure static IPs and CycloneDDS, wire the power banks, and verify cross-machine ROS2 topic visibility — 16-item checklist included.
Lab 0 Guide ↗
Course Information
| Course Number | EECE 5550 |
|---|---|
| Title | Mobile Robotics |
| Credits | 4 credit hours |
| Level | Graduate & Senior Undergraduate |
| Campus | Northeastern University · Seattle |
| Prerequisites | EECE 2520 (or equivalent) and Python programming experience, or graduate admission |
| Hardware | TurtleBot 4 (provided) · Raspberry Pi 5 8 GB (provided) · Raspberry Pi AI HAT+ Hailo-8L (provided) |
| Tools | ROS2 Jazzy · Python 3.10+ · SLAM Toolbox · Nav2 · robot_localization · HailoRT · OpenCV · Ubuntu 24.04 |
| Grading | Lab reports 30% · Homework 15% · Midterm exam 20% · Final project 35% |