Northeastern University · Electrical & Computer Engineering · Seattle
Robotics Sensing and Navigation
A robot does not know where it is. Teach it — with GPS, IMU, LiDAR, cameras, Kalman filters, and SLAM.
Fuse multi-sensor data through probabilistic estimation to build a navigation stack that works — from a TurtleBot 4 in the hallway to a full-scale autonomous vehicle on the road. Every algorithm taught in lecture runs on real hardware you collect data from yourself.
Course Overview
The course is organized around four pillars of robot navigation: sensing (how robots perceive the world through cameras, LiDAR, GPS, and IMUs), estimation (extracting position and orientation from noisy data using Kalman and particle filters), mapping (building spatial representations via SLAM), and navigation (planning and executing paths through the environment).
You work with two platforms: a TurtleBot 4 for indoor experiments (odometry, SLAM, autonomous navigation) and the NUance autonomous vehicle — a Ford Mustang Mach-E equipped with Ouster LiDAR, Lucid cameras, and Continental radar — for outdoor multi-sensor data collection. You drive the car, collect the data, and build a complete navigation stack from it.
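At its core, the estimation pillar comes down to weighing a prediction against a measurement. A minimal, illustrative 1D sketch of that idea (not course code; the state, noise values, and units here are assumptions): fusing a dead-reckoned position with a noisy GPS fix via a scalar Kalman update.

```python
# Minimal 1D Kalman filter: fuse an odometry prediction with a GPS fix.
# Illustrative sketch only; noise values are made up.

def kf_predict(x, p, u, q):
    """Propagate the position estimate by odometry increment u; process noise q."""
    return x + u, p + q

def kf_update(x, p, z, r):
    """Correct the estimate with measurement z of variance r."""
    k = p / (p + r)                  # Kalman gain: trust z more when p >> r
    return x + k * (z - x), (1 - k) * p

x, p = 0.0, 1.0                          # initial position estimate and variance
x, p = kf_predict(x, p, u=1.0, q=0.25)   # robot commanded 1 m forward
x, p = kf_update(x, p, z=1.2, r=0.5)     # GPS reports 1.2 m
print(round(x, 3), round(p, 3))          # → 1.143 0.357
```

The corrected estimate lands between the odometry prediction and the GPS reading, and the variance shrinks after every update — exactly the behavior the EKF lab builds on in higher dimensions.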
Course Topics
Semester Roadmap
| Topics | Deliverables |
|---|---|
| ROS 2 environment, sensor overview (IMU, LiDAR, GPS, camera), TurtleBot 4 first drive, signal processing primer. | Lab 1 · HW 1 |
| Dead reckoning, odometry error, Bayes filter, Kalman filter, EKF. Implement EKF localization with LiDAR landmarks. | Lab 2–3 · HW 2–3 |
| Particle filter, GMapping, Cartographer, SLAM with TurtleBot 4. Build a map of a real indoor space. | Lab 4 · Midterm |
| GPS/RTK, visual odometry, LiDAR odometry, sensor fusion. Outdoor data collection with GPS/IMU. NUance vehicle multi-sensor navigation. | Lab 5–6 · HW 4 |
| Team-based autonomous navigation challenge. TurtleBot 4 navigates an unknown environment using everything you've built. | Project Demo |

Hardware & Software
| Hardware | Description |
|---|---|
| TurtleBot 4 | Create 3 + RPi 4 + RPLIDAR + OAK-D |
| NUance | Ford Mustang Mach-E autonomous platform |
| Ouster LiDAR | 128-channel 3D LiDAR on NUance |
| Lucid camera | Machine vision camera on NUance |
| GPS/RTK | Outdoor positioning + IMU fusion |
| IMU | Inertial sensing and heading |
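A planar LiDAR like the RPLIDAR reports ranges at evenly spaced bearing angles; the first step toward mapping is converting each return to Cartesian coordinates and binning it into grid cells. A hedged sketch (the 5 cm cell size and the sensor-frame convention are assumptions, not course settings):

```python
import math

def scan_to_cells(ranges, angle_min, angle_inc, resolution=0.05):
    """Convert a planar LiDAR scan (polar ranges) to occupied grid cells.

    ranges: range readings in metres (0 or inf = no return)
    resolution: cell edge length in metres (5 cm assumed here)
    Returns a set of (ix, iy) integer cell indices in the sensor frame.
    """
    cells = set()
    for i, r in enumerate(ranges):
        if not (0.0 < r < float("inf")):
            continue                               # skip dropouts
        theta = angle_min + i * angle_inc
        x, y = r * math.cos(theta), r * math.sin(theta)
        cells.add((int(round(x / resolution)), int(round(y / resolution))))
    return cells

# Two 1 m returns (straight ahead and 90° left) plus one dropout
print(scan_to_cells([1.0, 1.0, float("inf")], 0.0, math.pi / 2))
```

Real SLAM pipelines (GMapping, SLAM Toolbox) add ray-casting to mark the free space between sensor and hit, but the polar-to-grid conversion above is the common first step.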
| Layer | Technology |
|---|---|
| Robot OS | ROS 2 Jazzy |
| Navigation | Nav2 (AMCL, planners, costmaps, behavior trees) |
| SLAM | SLAM Toolbox, GMapping, Cartographer |
| Point Clouds | Open3D, PCL |
| Visualization | RViz2, Foxglove |
| Data Analysis | Python (NumPy, SciPy, Matplotlib) |
| Data Recording | ROS 2 bag (rosbag2) |
| Simulation | Gazebo |
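Using nothing beyond the NumPy listed above, the particle filter from the roadmap can be sketched in a dozen lines. Illustrative only — the 1D motion model, sensor model, and noise values are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, u, z, motion_std=0.1, meas_std=0.5):
    """One predict-weight-resample cycle for a 1D robot with a position sensor."""
    # Predict: apply motion command u with additive noise
    particles = particles + u + rng.normal(0.0, motion_std, particles.size)
    # Weight: Gaussian likelihood of measurement z at each particle
    w = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
    w /= w.sum()
    # Resample: draw particles in proportion to their weights
    idx = rng.choice(particles.size, particles.size, p=w)
    return particles[idx]

particles = rng.uniform(-5.0, 5.0, 500)   # start fully uncertain
for step in range(1, 6):                  # robot moves 1 m per step
    particles = pf_step(particles, u=1.0, z=float(step))
print(particles.mean())                   # cloud collapses near the true position, 5
```

The same cycle, run over an occupancy map with a LiDAR likelihood model, is essentially what Nav2's AMCL does for TurtleBot 4 localization.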
Six labs build skills progressively from sensor inspection to multi-sensor navigation.

Capstone

Teams build a complete autonomous navigation system for TurtleBot 4. The robot must navigate an unknown indoor environment, avoid obstacles, reach goal waypoints, and return to its starting position — using your own SLAM map, localization pipeline, and path planner.
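Before reaching for Nav2's planners, the capstone's go-to-waypoint step can be prototyped as a breadth-first search over an occupancy grid. A toy sketch, assuming 0 = free and 1 = occupied:

```python
from collections import deque

def bfs_plan(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 free, 1 occupied).

    Returns a list of (row, col) cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}                 # visited set doubling as back-pointers
    q = deque([start])
    while q:
        cell = q.popleft()
        if cell == goal:                 # reconstruct path by walking back-pointers
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                q.append((nr, nc))
    return None                          # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],                       # wall forces a detour
        [0, 0, 0]]
print(bfs_plan(grid, (0, 0), (2, 0)))
# → [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

Nav2's grid planners add cost-aware search (A*, inflated costmaps) on top of the same idea: expand free neighbors until the goal is reached, then walk the back-pointers.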
Why This Course
Every algorithm — Kalman filter, particle filter, SLAM — runs on a real robot you can touch.
Indoor TurtleBot 4 and a full-size autonomous vehicle. Few courses offer both.
Students collect their own multi-sensor data on the NUance vehicle — LiDAR, camera, GPS, IMU.
Labs use real sensor data with real noise, not clean textbook datasets.
Extensive hands-on with ROS 2, Nav2, TF2, and rosbag — the industry standard stack.
Sensor fusion, SLAM, and autonomous navigation are in high demand at robotics and AV companies.
Resources
Recommended textbook: S. Thrun, W. Burgard, D. Fox — Probabilistic Robotics (MIT Press, 2005)
Course Info
| Component | Weight |
|---|---|
| Lab reports (6 labs) | 25% |
| Homework (4 sets) | 25% |
| Midterm exam | 20% |
| Final project (robot + report + demo) | 30% |