Northeastern University  ·  Electrical & Computer Engineering  ·  Seattle

EECE 5554

Robotics Sensing and Navigation

4 Credit Hours  ·  Graduate & Senior Undergraduate

A robot does not know where it is. Teach it — with GPS, IMU, LiDAR, cameras, Kalman filters, and SLAM.

Fuse multi-sensor data through probabilistic estimation to build a navigation stack that works — from a TurtleBot 4 in the hallway to a full-scale autonomous vehicle on the road. Every algorithm taught in lecture runs on real hardware you collect data from yourself.

TurtleBot 4  ·  NUance Vehicle  ·  Ouster LiDAR  ·  GPS/IMU  ·  ROS 2 Jazzy  ·  Nav2  ·  SLAM Toolbox

Course Overview

From raw sensor data to autonomous navigation

What this course covers

The four pillars of robot navigation: sensing (how robots perceive the world through cameras, LiDAR, GPS, and IMUs), estimation (extracting position and orientation from noisy data using Kalman and particle filters), mapping (building spatial representations via SLAM), and navigation (planning and executing paths through the environment). Every algorithm taught in lecture runs on real hardware.
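The estimation pillar in miniature: a scalar Kalman filter that fuses repeated noisy range readings into a single position estimate. This is a minimal sketch in plain Python; the measurement values and variances below are illustrative, not taken from any course lab.

```python
# Minimal scalar Kalman filter: estimate a fixed position from noisy readings.
def kalman_1d(measurements, meas_var, init_est=0.0, init_var=1000.0):
    est, var = init_est, init_var
    for z in measurements:
        # Update step: blend the current estimate with the measurement,
        # weighted by their variances via the Kalman gain.
        k = var / (var + meas_var)      # gain -> 1 when estimate is uncertain
        est = est + k * (z - est)
        var = (1.0 - k) * var           # uncertainty shrinks with each update
    return est, var

# Four noisy range readings of a landmark at roughly 2.0 m.
est, var = kalman_1d([2.1, 1.9, 2.05, 1.95], meas_var=0.04)
```

The same gain-weighted blend, generalized to state vectors and a motion model, is the EKF covered in the estimation phase.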

What makes it different

You work with two platforms: a TurtleBot 4 for indoor experiments (odometry, SLAM, autonomous navigation) and the NUance autonomous vehicle — a Ford Mustang Mach-E equipped with Ouster LiDAR, Lucid cameras, and Continental radar — for outdoor multi-sensor data collection. You drive the car, collect the data, and build a complete navigation stack from it.

Course Topics

Sensors, estimation, mapping, and navigation

Kalman Filter & EKF · Particle Filter · SLAM · IMU & Gyroscope · GPS & RTK · LiDAR Point Clouds · Camera & Stereo Vision · Dead Reckoning · Odometry · Visual Inertial Odometry · Sensor Fusion · Bayes Filter · GMapping & Cartographer · Nav2 Stack · ROS 2 TF2 · Scan Matching / ICP · Signal Processing

Semester Roadmap

Five phases — from sensors to full navigation

Phase 1 · Wk 1–3

Sensors & Setup

ROS 2 environment, sensor overview (IMU, LiDAR, GPS, camera), TurtleBot 4 first drive, signal processing primer.

Lab 1 · HW 1
Phase 2 · Wk 4–6

Estimation

Dead reckoning, odometry error, Bayes filter, Kalman filter, EKF. Implement EKF localization with LiDAR landmarks.

Lab 2–3 · HW 2–3
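Dead reckoning is the starting point for this phase: integrate commanded (or measured) velocities to propagate a pose, and watch small errors accumulate without bound. A minimal sketch for a differential-drive robot like the TurtleBot 4; the velocities and time step are illustrative.

```python
import math

def dead_reckon(pose, v, w, dt):
    """Propagate pose (x, y, theta) with linear velocity v and angular velocity w."""
    x, y, th = pose
    x += v * math.cos(th) * dt
    y += v * math.sin(th) * dt
    th += w * dt
    return (x, y, th)

# Drive straight for 10 s at 0.1 m/s, then turn 90 degrees in place.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = dead_reckon(pose, v=0.1, w=0.0, dt=0.1)
for _ in range(100):
    pose = dead_reckon(pose, v=0.0, w=math.pi / 20, dt=0.1)
```

With perfect velocities this lands exactly at (1, 0) facing 90 degrees; with real wheel encoders, bias and slip make the error grow with distance traveled, which is what motivates the Bayes and Kalman filters in this phase.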
Phase 3 · Wk 7–9

Mapping & SLAM

Particle filter, GMapping, Cartographer, SLAM with TurtleBot 4. Build a map of real indoor space.

Lab 4 · Midterm
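The particle filter underlying GMapping can be sketched in one dimension: predict each particle forward with noisy motion, weight it by how well it explains the measurement, then resample in proportion to weight. The motion, measurement model, and numbers here are illustrative assumptions, not the course's actual lab setup.

```python
import random
import math

def pf_step(particles, z, meas_std, motion, motion_std):
    # Predict: apply the motion command with sampled noise.
    particles = [p + motion + random.gauss(0.0, motion_std) for p in particles]
    # Weight: Gaussian likelihood of measurement z given each particle.
    w = [math.exp(-0.5 * ((z - p) / meas_std) ** 2) for p in particles]
    total = sum(w)
    w = [wi / total for wi in w]
    # Resample: draw a new particle set in proportion to weight.
    return random.choices(particles, weights=w, k=len(particles))

random.seed(0)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]  # uniform prior
for z in (3.0, 3.5, 4.0):  # robot moves +0.5 m per step; sensor reads position
    particles = pf_step(particles, z, meas_std=0.5, motion=0.5, motion_std=0.1)
est = sum(particles) / len(particles)
```

After three updates the particle cloud collapses from a uniform prior onto the measured trajectory, which is the same mechanism a Rao-Blackwellized filter uses over robot trajectories in SLAM.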
Phase 4 · Wk 10–12

GPS & Multi-Sensor

GPS/RTK, visual odometry, LiDAR odometry, sensor fusion. Outdoor data collection with GPS/IMU. NUance vehicle multi-sensor navigation.

Lab 5–6 · HW 4
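One simple fusion idea from this phase: a complementary filter for heading, trusting the gyro over short horizons (smooth but drifting) and GPS course over long horizons (noisy but absolute). This is a sketch of that one technique, not the full EKF-based fusion stack used on the vehicle; the bias value and blend factor are illustrative.

```python
def complementary_heading(gyro_rates, gps_headings, dt, alpha=0.98):
    """Blend integrated gyro rate with an absolute GPS heading reference."""
    heading = gps_headings[0]
    out = []
    for w, h_gps in zip(gyro_rates, gps_headings):
        # High-pass the gyro (short-term), low-pass the GPS (long-term).
        heading = alpha * (heading + w * dt) + (1.0 - alpha) * h_gps
        out.append(heading)
    return out

# Stationary robot: gyro has a constant 0.01 rad/s bias; GPS reads true heading 0.
est = complementary_heading([0.01] * 200, [0.0] * 200, dt=0.1)
```

Integrating the gyro alone would drift by 0.2 rad over these 20 s; the GPS correction bounds the error below 0.05 rad, which is the essential benefit of fusing a drifting inertial sensor with an absolute one.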
Phase 5 · Wk 13–15

Final Project

Team-based autonomous navigation challenge. TurtleBot 4 navigates an unknown environment using everything you've built.

Project Demo

Hardware & Software

Two platforms — indoor and outdoor

🤖

TurtleBot 4

Create 3 + RPi 4 + RPLIDAR + OAK-D

🚗

NUance Vehicle

Ford Mustang Mach-E autonomous platform

📡

Ouster LiDAR

128-channel 3D LiDAR on NUance

📷

Lucid Camera

Machine vision camera on NUance

📍

GPS / RTK

Outdoor positioning + IMU fusion

🧭

IMU / Magnetometer

Inertial sensing and heading

Layer            Technology
Robot OS         ROS 2 Jazzy
Navigation       Nav2 (AMCL, planners, costmaps, behavior trees)
SLAM             SLAM Toolbox, GMapping, Cartographer
Point Clouds     Open3D, PCL
Visualization    RViz2, Foxglove
Data Analysis    Python (NumPy, SciPy, Matplotlib)
Data Recording   ROS 2 bag (rosbag2)
Simulation       Gazebo

Capstone

Labs and Final Project

Lab Progression

Six labs build skills progressively, from sensor inspection to multi-sensor navigation.

Final Project: Autonomous Navigation Challenge

Teams build a complete autonomous navigation system for TurtleBot 4. The robot must navigate an unknown indoor environment, avoid obstacles, reach goal waypoints, and return to its starting position — using your own SLAM map, localization pipeline, and path planner.
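In the project itself, planning runs through Nav2, but the core idea of a grid path planner can be shown standalone: search an occupancy grid for a shortest obstacle-free route. A minimal breadth-first sketch with a made-up 3x3 map; real costmaps add inflation and cost-aware search on top of this.

```python
from collections import deque

def grid_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}           # also serves as the visited set
    q = deque([start])
    while q:
        cur = q.popleft()
        if cur == goal:            # walk predecessors back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cur
                q.append((nr, nc))
    return None  # goal unreachable

# A wall across the middle row forces a detour around the right side.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = grid_path(grid, (0, 0), (2, 0))
```

Because breadth-first search expands cells in order of distance, the first time the goal is reached the path is guaranteed shortest on a uniform-cost grid.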

Why This Course

Six reasons to take Robotics Sensing and Navigation

Theory meets hardware

Every algorithm — Kalman filter, particle filter, SLAM — runs on a real robot you can touch.

Two scales of robotics

Indoor TurtleBot 4 and a full-size autonomous vehicle. Few courses offer both.

You drive the car

Students collect their own multi-sensor data on the NUance vehicle — LiDAR, camera, GPS, IMU.

Real field data

Labs use real sensor data with real noise, not clean textbook datasets.

ROS 2 proficiency

Extensive hands-on with ROS 2, Nav2, TF2, and rosbag — the industry standard stack.

Career-ready skills

Sensor fusion, SLAM, and autonomous navigation are in high demand at robotics and AV companies.

Resources

Wiki, textbook, and references

Recommended textbook: S. Thrun, W. Burgard, D. Fox — Probabilistic Robotics (MIT Press, 2005)

Course Info

Details and prerequisites

Course: EECE 5554 — Robotics Sensing and Navigation
Credits: 4
Campus: Northeastern University · Seattle
Instructor: Dr. Xian Li  ·  [email protected]
Prerequisites: Graduate standing or instructor permission
Grading:
Lab reports (6 labs)                     25%
Homework (4 sets)                        25%
Midterm exam                             20%
Final project (robot + report + demo)    30%