Northeastern University  ·  Electrical & Computer Engineering  ·  Seattle

EECE 5550

Mobile Robotics

4 Credit Hours  ·  Graduate & Senior Undergraduate

Build a robot that maps its environment, knows where it is, and completes a mission — autonomously.

A hands-on graduate course in mobile robotics covering probabilistic localization, SLAM, autonomous navigation with Nav2, and real-time AI perception on edge hardware. Every algorithm taught in lecture runs on a physical TurtleBot 4 robot that you modify, configure, and program yourself.

TurtleBot 4  ·  Raspberry Pi 5  ·  Hailo AI HAT+  ·  ROS2 Jazzy  ·  SLAM Toolbox  ·  Nav2

Course Overview

What This Course Covers

The four core problems of mobile robotics — sensing, localization, mapping, and navigation — studied through both theory and implementation. Topics progress from sensor models and probabilistic estimation through SLAM and the Nav2 autonomous navigation stack, ending with real-time object detection using a neural processing unit mounted on the robot.

The Distributed Hardware Platform

A stock TurtleBot 4 is not enough. Students upgrade the robot in Week 1 by mounting a Raspberry Pi 5 with an AI HAT+ accelerator, connecting it via Ethernet, and configuring a two-machine ROS2 distributed system. The TurtleBot 4 handles only sensing; the Pi 5 runs SLAM, Nav2, and YOLO — simultaneously, on-robot, without a laptop.

Algorithms on Real Hardware

Every algorithm — EKF sensor fusion, SLAM Toolbox graph optimization, Nav2 behavior trees, Hailo YOLO inference — runs on the physical robot, not just in simulation. Students directly observe how theory translates to robot behavior: how odometry drift motivates SLAM, how particle filters localize a lost robot, and why edge AI hardware matters for battery-powered systems.

Autonomous Mission Competition

The semester ends with a live competition. A pre-set course is revealed 72 hours before competition day. Teams navigate five checkpoints using ArUco markers, detect a target object by color, avoid obstacles including a live person, and dock autonomously — all without touching the keyboard once the run begins. Fastest successful run wins.

Course Topics

Robotics Theory

  • Differential-drive kinematics
  • Sensor models — LiDAR, IMU, encoders
  • Bayes filter & log-odds mapping
  • Extended Kalman Filter (EKF)
  • Monte Carlo / particle filter
  • Occupancy grid mapping
  • SLAM — graph optimization, loop closure
  • Path planning — A*, Dijkstra, RRT

Distributed Systems & ROS2

  • Multi-machine ROS2 with CycloneDDS
  • TF2 coordinate frame transforms
  • Publisher / subscriber / action patterns
  • SLAM Toolbox on a remote computer
  • Nav2 behavior trees & planners
  • Nav2 Simple Commander Python API
  • ROS2 bags, parameters, launch files
  • AMCL particle-filter localization

Edge AI & Perception

  • NPU vs CPU vs GPU for inference
  • YOLO architecture & model deployment
  • Hailo-8L inference pipeline (HailoRT)
  • PyTorch → ONNX → HEF conversion
  • ROS2 object detection publishing
  • ArUco marker detection & pose estimation
  • HSV color-space object tracking
  • Perception-driven navigation decisions
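HSV thresholding is robust to lighting changes because brightness shifts mostly move the V channel while hue stays put. The labs use opencv-python for this; the sketch below uses only the standard-library `colorsys` module so it runs anywhere, and the hue window for a "red target" is an illustrative assumption (the function name `is_target_color` is ours, not a course API):

```python
import colorsys

def is_target_color(r, g, b, hue_center=0.0, hue_tol=0.05,
                    min_sat=0.5, min_val=0.3):
    """Return True if an RGB pixel (0-255 per channel) falls in an HSV window.

    hue_center/hue_tol define the hue band on a 0-1 scale; the saturation
    and value floors reject washed-out and near-black pixels.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Hue is circular: distance must wrap around 1.0 -> 0.0
    dh = min(abs(h - hue_center), 1.0 - abs(h - hue_center))
    return dh <= hue_tol and s >= min_sat and v >= min_val

# A bright red and a dim red share a hue and both match; blue does not.
print(is_target_color(220, 20, 20))   # bright red -> True
print(is_target_color(120, 15, 15))   # dim red, same hue -> True
print(is_target_color(20, 20, 220))   # blue -> False
```

The same window, expressed as OpenCV `inRange` bounds, applies per-pixel to a whole camera frame at once.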

Semester Roadmap

Wk 1–2

Platform Build

Pi 5 + AI HAT · Ethernet · CycloneDDS · multi-machine ROS2

Wk 3–5

Sensing

LiDAR · IMU · OAK-D · TF2 · ROS2 node writing

Wk 6–9

Locate & Map

EKF · particle filter · SLAM Toolbox · map quality

Wk 10–12

Navigate

Nav2 · behavior trees · waypoints · auto-docking

Wk 13–15

AI + Mission

YOLO on Hailo · perception loop · competition

What You Will Do

Mount a Raspberry Pi 5 + Hailo AI HAT+ on a TurtleBot 4 and configure a distributed ROS2 system over Ethernet
Implement an Extended Kalman Filter fusing wheel odometry and IMU data using the robot_localization package
Drive a TurtleBot 4 through a lab space to build a SLAM map, then tune SLAM Toolbox parameters for map quality
Navigate autonomously to four waypoints using Nav2, then write a custom behavior tree that adds pause and patrol behaviors
Run YOLOv8n on the Hailo-8L NPU at 30+ FPS while SLAM and Nav2 run concurrently — measure the latency budget of the full stack
Implement a safety behavior: robot stops for any detected person in its path and resumes when the path is clear
Detect ArUco markers and colored objects with OpenCV to confirm checkpoints and locate a target item by color
Compete in the autonomous mission challenge — no keyboard input once the run begins

Hardware Platform

Intelligence Node

Raspberry Pi 5 (8 GB) + AI HAT+

  • Runs SLAM Toolbox, Nav2, YOLO inference
  • Hailo-8L NPU — 13 TOPS at under 5 W
  • YOLOv8n at 30–80 FPS, CPU load: ~0%
  • Connected via PCIe Gen 3 × 1 M.2 interface
  • Powered by dedicated USB-C PD power bank

Sensing Node

TurtleBot 4 (iRobot Create 3 + RPi 4)

  • RPLIDAR A1 — 2D, 360°, 12 m, 10 Hz
  • OAK-D camera — RGB 1080p + stereo depth
  • Create 3 IMU + wheel encoders at 62 Hz
  • Built-in 26 Wh battery + external power bank
  • Publishes raw sensor data only — no compute

Why Two Computers?

Compute Capacity

The TurtleBot 4's onboard Raspberry Pi 4 runs out of CPU when SLAM, Nav2, and detection run simultaneously — messages are dropped and navigation fails. The Pi 5 has 2× the CPU performance and the AI HAT+ offloads inference entirely to the NPU.

Distributed Architecture as a Learning Goal

Production robot systems are almost always distributed — a vehicle's LiDAR unit, edge server, and motion controller are separate computers. Configuring multi-machine ROS2 with CycloneDDS, understanding DDS discovery, and managing network bandwidth are skills that transfer directly to industry.
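A two-machine CycloneDDS setup of this kind is typically configured with an XML file pointed to by the `CYCLONEDDS_URI` environment variable (with `RMW_IMPLEMENTATION=rmw_cyclonedds_cpp` set on both machines). The fragment below is a minimal sketch: the interface name and peer address are placeholders for whatever the lab network uses, and element names vary somewhat across Cyclone DDS versions:

```xml
<CycloneDDS>
  <Domain Id="any">
    <General>
      <!-- Bind DDS traffic to the robot's Ethernet link (name is a placeholder) -->
      <Interfaces>
        <NetworkInterface name="eth0"/>
      </Interfaces>
    </General>
    <Discovery>
      <!-- Static peer list: the other machine's address (placeholder IP) -->
      <Peers>
        <Peer Address="192.168.1.2"/>
      </Peers>
    </Discovery>
  </Domain>
</CycloneDDS>
```

With matching `ROS_DOMAIN_ID` values on both machines, `ros2 topic list` on the Pi 5 should show the TurtleBot 4's sensor topics — the cross-machine visibility check in Lab 0.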

Edge AI Without Compromise

Running YOLO on the Hailo-8L NPU consumes near-zero CPU, leaving the Pi 5's four cores fully available for SLAM and Nav2. Students learn the performance argument for dedicated AI hardware — not just in theory, but by measuring htop during full-stack operation.

Software Stack

ROS2 Jazzy on Ubuntu 24.04

Both machines run ROS2 Jazzy on Ubuntu 24.04 — the same OS and ROS2 version on the TurtleBot 4 Pi 4 and the Pi 5. All packages are installed via apt using the official ros2-apt-source package — no source compilation required for standard labs.

SLAM Toolbox + Nav2

SLAM Toolbox performs synchronous online mapping, producing a 5 cm resolution occupancy grid. Nav2 provides global path planning (NavFn/Dijkstra), local control (DWB), AMCL localization, and a Python Simple Commander API for programmatic navigation.

HailoRT + hailo-rpi5-examples

Hailo's official package for Raspberry Pi 5 installs via apt install hailo-all. Pre-compiled .hef model files for YOLOv8n, YOLOv8s, and YOLOv8m are downloaded in a one-time setup step. A ROS2 wrapper node publishes vision_msgs/Detection2DArray.

Python Throughout

All student-written code is Python: ROS2 nodes (rclpy), particle filter implementation (numpy), A* and Dijkstra path planning (heapq), color detection (opencv-python), and homework visualization (matplotlib).

Core Curriculum

Sensor Modeling

Derive the noise model and measurement equation for 2D LiDAR, IMU gyroscope/accelerometer, and wheel encoders. Understand why no single sensor is sufficient and why sensor fusion is necessary for reliable state estimation.
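Why encoders alone are not sufficient can be shown in a few lines: dead-reckoning integrates small measurement errors into unbounded pose drift. The sketch below propagates a differential-drive pose from noisy wheel arc lengths; the baseline, step size, and noise scale are illustrative values, not TurtleBot 4 specifications, and `propagate`/`noisy` are our own helper names:

```python
import math, random

def propagate(pose, d_left, d_right, baseline=0.235):
    """Dead-reckon a differential-drive pose from wheel arc lengths (m).

    pose = (x, y, theta); baseline is the wheel separation.
    """
    x, y, th = pose
    d = (d_left + d_right) / 2.0          # forward distance of the center
    dth = (d_right - d_left) / baseline   # heading change
    return (x + d * math.cos(th + dth / 2.0),
            y + d * math.sin(th + dth / 2.0),
            th + dth)

def noisy(dist, sigma_per_m=0.01):
    """Encoder measurement model: zero-mean noise that scales with distance."""
    return dist + random.gauss(0.0, sigma_per_m * abs(dist))

random.seed(0)
true_pose = est_pose = (0.0, 0.0, 0.0)
for _ in range(1000):                      # drive 10 m in 1 cm steps
    true_pose = propagate(true_pose, 0.01, 0.01)
    est_pose = propagate(est_pose, noisy(0.01), noisy(0.01))

drift = math.hypot(est_pose[0] - true_pose[0], est_pose[1] - true_pose[1])
print(f"drift after 10 m: {drift:.3f} m")
```

The drift never averages out: heading error is a random walk, and every subsequent meter of travel is projected through that wrong heading — the motivation for fusing the IMU and, ultimately, for SLAM.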

Probabilistic State Estimation

Build the Bayes filter prediction-update cycle from first principles. Implement a 1D Bayes filter and a 2D particle filter in Python. Derive the EKF linearization using Jacobians and configure robot_localization to fuse odometry and IMU on the physical robot.
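The prediction-update cycle fits in a page of Python. Below is a minimal discrete 1D Bayes filter on the classic cyclic-corridor toy problem (a robot that senses "door" or "wall"); the map and the sensor/motion probabilities are illustrative assumptions, not values from the labs:

```python
def predict(belief, shift=1, p_correct=0.8):
    """Motion update: shift belief by `shift` cells, with some chance of slip."""
    n = len(belief)
    out = [0.0] * n
    for i, p in enumerate(belief):
        out[(i + shift) % n] += p * p_correct   # moved as commanded
        out[i] += p * (1.0 - p_correct)         # wheel slip: stayed put
    return out

def update(belief, world, z, p_hit=0.9, p_miss=0.1):
    """Measurement update: reweight by sensor likelihood, then normalize."""
    w = [p * (p_hit if cell == z else p_miss) for p, cell in zip(belief, world)]
    total = sum(w)
    return [p / total for p in w]

world = ['door', 'wall', 'door', 'wall', 'wall']    # cyclic corridor map
belief = [1.0 / len(world)] * len(world)            # uniform prior: lost robot

for z in ['door', 'wall', 'door']:                  # move one cell, then sense
    belief = predict(belief)
    belief = update(belief, world, z)

print(belief.index(max(belief)))                    # -> 2: the second door
```

Three ambiguous measurements, chained through the motion model, collapse a uniform prior onto a single cell — the same mechanism, with thousands of weighted samples, drives the 2D particle filter and AMCL.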

Occupancy Grid Mapping

Represent 2D environments as probability maps using the log-odds update rule. Understand how LiDAR raycasting marks free and occupied cells. Evaluate map quality using coverage statistics on a real map built in the lab.
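The log-odds form turns the Bayesian occupancy update into simple addition per observation. A minimal sketch, assuming an inverse sensor model of 0.7 for a hit and 0.3 for a miss (illustrative values — real maps tune these):

```python
import math

def logit(p):
    """Probability -> log-odds."""
    return math.log(p / (1.0 - p))

L_OCC, L_FREE = logit(0.7), logit(0.3)   # assumed inverse sensor model

def update_cell(l, hit):
    """Add log-odds evidence for one LiDAR observation of a cell."""
    return l + (L_OCC if hit else L_FREE)

def prob(l):
    """Log-odds -> occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

l = 0.0                                       # prior p = 0.5, log-odds 0
for hit in [True, True, True, False, True]:   # 4 hits, 1 miss on this cell
    l = update_cell(l, hit)
print(round(prob(l), 3))                      # -> 0.927
```

One contradictory miss only partially cancels a hit, so the estimate stays robust to occasional LiDAR noise; raycasting applies exactly this update to every cell a beam crosses (free) or ends in (occupied).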

Simultaneous Localization and Mapping

Formulate the joint SLAM posterior and explain why it is hard. Study graph-based SLAM — how pose nodes, odometry edges, and loop closure constraints form the optimization problem that SLAM Toolbox solves. Tune SLAM parameters to build a high-quality map of a real lab space.
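The graph formulation can be seen in one dimension. The toy below builds the information matrix for four poses along a corridor: three odometry edges each report a 1 m step, while a higher-weight loop-closure edge insists the total is 2.7 m, so the least-squares solution redistributes the error across all poses. All measurements and weights are invented for illustration; SLAM Toolbox solves the 2D analogue of this problem with poses, odometry edges, and loop-closure constraints:

```python
def gauss_solve(A, b):
    """Solve a small dense linear system by Gaussian elimination with pivoting."""
    n = len(b)
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(A[r][k]))   # partial pivot
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for c in range(k, n):
                A[r][c] -= f * A[k][c]
            b[r] -= f * b[k]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (b[k] - sum(A[k][c] * x[c] for c in range(k + 1, n))) / A[k][k]
    return x

# Edges: (i, j, measured x_j - x_i, weight). Odometry claims 1 m per step,
# but the loop-closure edge reports x3 - x0 = 2.7 m and is trusted more.
edges = [(0, 1, 1.0, 1.0), (1, 2, 1.0, 1.0), (2, 3, 1.0, 1.0),
         (0, 3, 2.7, 10.0)]

n = 3                                 # unknowns x1, x2, x3 (x0 = 0 anchored)
H = [[0.0] * n for _ in range(n)]     # information matrix J^T W J
g = [0.0] * n                         # information vector J^T W z

for i, j, z, w in edges:
    # Residual r = x_j - x_i - z; the Jacobian is +1 on x_j, -1 on x_i.
    for a, sa in ((i, -1.0), (j, 1.0)):
        if a == 0:
            continue                  # the anchored pose is not a variable
        g[a - 1] += sa * w * z
        for bn, sb in ((i, -1.0), (j, 1.0)):
            if bn == 0:
                continue
            H[a - 1][bn - 1] += sa * sb * w

x = gauss_solve(H, g)
print([round(v, 3) for v in x])       # corrected poses x1, x2, x3
```

Each pose shrinks from its 1, 2, 3 m odometry estimate toward a consistent total of about 2.71 m — loop closure correcting drift everywhere at once, not just at the endpoint.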

Path Planning & Navigation

Implement Dijkstra's algorithm and A* from scratch on a grid. Study costmap inflation and the role of the global vs local planner. Configure Nav2 — planners, behavior trees, controller tolerances, and AMCL parameters — on a real robot navigating to multiple waypoints.
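A from-scratch A* on a 4-connected grid is only a priority queue plus an admissible heuristic — here the Manhattan distance. The grid below is a made-up example with one wall; the structure matches what the course asks students to implement with `heapq`:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; grid[r][c] == 1 marks an obstacle."""
    rows, cols = len(grid), len(grid[0])
    def h(p):                          # Manhattan distance: admissible on a grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), start)]    # entries are (f = g + h, cell)
    g = {start: 0}
    came_from = {start: None}
    closed = set()
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == goal:                # first pop of the goal is optimal
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        if cur in closed:
            continue                   # stale entry with an outdated cost
        closed.add(cur)
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g[cur] + 1
                if ng < g.get(nxt, float('inf')):
                    g[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt), nxt))
    return None                        # goal unreachable

grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],                  # wall forces a detour through column 3
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
print(path)
```

Setting `h` to zero turns this into Dijkstra's algorithm; Nav2's costmap adds non-uniform step costs (inflation) on top of the same search.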

Edge AI Inference

Compare NPU, CPU, and GPU compute architectures. Study the YOLO single-pass detection architecture. Trace the PyTorch → ONNX → HEF compilation pipeline. Measure inference throughput (FPS) and power draw on the Hailo-8L during full-stack operation alongside SLAM and Nav2.
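Throughput and worst-case latency are measured the same way regardless of backend: wrap the inference call with a monotonic clock. In the sketch below, `run_inference` is a hypothetical stand-in that sleeps for 2 ms — not the HailoRT API — so only the measurement harness carries over to the real pipeline:

```python
import time

def run_inference(frame):
    """Hypothetical stand-in for the real NPU call; pretends 2 ms per frame."""
    time.sleep(0.002)
    return []

def measure_fps(infer, n_frames=50):
    """Return (average FPS, worst-case per-frame latency in seconds)."""
    latencies = []
    t0 = time.perf_counter()
    for _ in range(n_frames):
        t = time.perf_counter()
        infer(None)                    # a real loop would pass camera frames
        latencies.append(time.perf_counter() - t)
    elapsed = time.perf_counter() - t0
    return n_frames / elapsed, max(latencies)

fps, worst = measure_fps(run_inference)
print(f"{fps:.0f} FPS, worst-case latency {worst * 1000:.1f} ms")
```

Reporting the worst case matters as much as the average: a navigation stack budgets for the slowest frame, and running the same harness with SLAM and Nav2 active shows how contention stretches the tail.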

End-of-Semester Competition

🏆 Autonomous Mission Challenge

The course map is revealed 72 hours before competition day. Teams have that window to tune localization, verify their Nav2 configuration, and test their detection stack. When the run begins, the team member steps back from the keyboard — the robot must complete the entire mission on its own.

The robot must navigate five checkpoints in order (confirmed by ArUco marker IDs), detect and report the location of a color-coded target object, avoid dynamic obstacles including a live person standing in its path, and dock autonomously at the end. Every element of the course maps to a specific lab — nothing in the competition is a surprise.

Checkpoint / Event  ·  Points
Checkpoint 1 reached (stop within 0.3 m, ArUco confirmed)  ·  10
Checkpoint 2 reached  ·  10
Checkpoint 3 reached (narrower corridor)  ·  15
Checkpoint 4 reached  ·  15
Checkpoint 5 reached  ·  20
Target object detected & coordinates reported  ·  15
Successful autonomous dock  ·  10
No collisions (5 pts deducted per collision)  ·  5
Maximum score  ·  100

Technical Requirements

  • Full stack runs on Pi 5 — no laptop control
  • Navigation via Nav2 + saved map
  • Person detection triggers safety stop
  • ArUco confirmation at each checkpoint
  • Color-based target object detection
  • Autonomous dock at run completion

Team Deliverables

  • 8 lab reports throughout semester
  • 2 homework problem sets
  • System architecture diagram
  • State machine diagram & code
  • Competition run video (both attempts)
  • Final technical report & post-mortem

Awards

  • Fastest Successful Run
  • Best System Architecture
  • Best Recovery Behavior
  • Most Creative Solution

Why Take This Course

Direct Industry Relevance

ROS2, Nav2, SLAM Toolbox, and Python-based robotics programming are the exact tools used at Boston Dynamics, iRobot, Amazon Robotics, Fetch Robotics, and nearly every autonomous vehicle company. Students who complete this course can contribute to a production robotics software stack on day one of an internship.

Real Hardware, Real Problems

You will encounter every real-world robotics challenge: network latency dropping messages, DDS discovery on a congested network, SLAM drift from wheel slip, Nav2 getting stuck in a narrow corridor. Debugging these problems on physical hardware is irreplaceable preparation for a career in robotics.

Edge AI as a First-Class Skill

Running YOLO on a dedicated NPU while SLAM and Nav2 run concurrently is a system-integration challenge most robotics engineers don't encounter until mid-career. This course gives you early exposure to the hardware-software co-design thinking required to deploy AI on resource-constrained mobile platforms.

A Portfolio Project That Shows

A video of your robot completing an autonomous mission — navigating a real environment, stopping for a person, detecting an object, and docking — is something you can share with any employer. It demonstrates probabilistic reasoning, system integration, and embedded AI in one clip. Most graduate students don't have anything like it.

Course Materials & Repository

📖 Course Wiki

31 pages covering all prerequisites, lessons, labs, homework, and the competition specification. Every lab page includes numbered procedures, complete runnable Python code, expected output, and a troubleshooting table. Updated throughout the semester.

Open Wiki ↗

💻 GitHub Repository

All course code lives here: ROS2 Python nodes, SLAM Toolbox configs, Nav2 parameter files, behavior tree XMLs, the YOLO Hailo ROS2 node, and the competition state machine scaffold. Clone the repo in Week 1.

Open Repository ↗

🤖 Platform Setup Guides

Three prerequisite pages walk through ROS2 Jazzy on Ubuntu 24.04, TurtleBot 4 stock bring-up and verification, and Hailo AI HAT+ driver installation. Complete these before the first lab session in Week 1.

Setup Guides ↗

🔧 Hardware Modification

Lab 0 is a 90-minute guided build session: mount the Pi 5 + AI HAT+ on the TurtleBot 4 top shelf, configure static IPs and CycloneDDS, wire the power banks, and verify cross-machine ROS2 topic visibility — 16-item checklist included.

Lab 0 Guide ↗

Course Information

Course Number: EECE 5550
Title: Mobile Robotics
Credits: 4 credit hours
Level: Graduate & Senior Undergraduate
Campus: Northeastern University  ·  Seattle
Prerequisites: EECE 2520 or equivalent  ·  Python programming experience  ·  or Graduate Admission
Hardware: TurtleBot 4 (provided)  ·  Raspberry Pi 5 8 GB (provided)  ·  Raspberry Pi AI HAT+ Hailo-8L (provided)
Tools: ROS2 Jazzy  ·  Python 3.10+  ·  SLAM Toolbox  ·  Nav2  ·  robot_localization  ·  HailoRT  ·  OpenCV  ·  Ubuntu 24.04
Grading: Lab reports 30%  ·  Homework 15%  ·  Midterm exam 20%  ·  Final project 35%