Northeastern University · Electrical & Computer Engineering · Seattle
Assistive Robotics
Build an exoskeleton. Simulate human muscles. Teleoperate a robot arm. Evaluate what helps people.
A hands-on graduate course that bridges human needs and robotic capabilities — from exoskeleton fabrication and musculoskeletal simulation in OpenSim to multi-modal teleoperation of a UR12e collaborative robot, evaluated with formal human factors metrics.
Course Overview
The three pillars of assistive robotics — manipulation, wearable exoskeletons, and human-robot teleoperation — are studied through both theory and hands-on implementation. You move from ROS2 fundamentals and robot kinematics to exoskeleton fabrication, musculoskeletal simulation, and multi-modal control of a collaborative robot arm.
The centerpiece of the course is a three-session lab where you 3D-print structural parts, sew a textile arm interface, wire sensors and a servo motor to an Arduino, then simulate the exoskeleton’s biomechanical effects in OpenSim. Static Optimization predicts how muscles respond to different assistance levels — informing real motor specifications.
OpenSim’s arm26_EduExo musculoskeletal model lets you compare muscle activations with and without exoskeleton assistance before touching hardware. You sweep assist torque levels, identify the optimal motor specification for a clinical version, and validate predictions against EMG sensor data.
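The core idea behind Static Optimization can be sketched with a toy two-muscle example: minimize the sum of squared activations subject to a single-joint torque balance, with the exoskeleton's assist torque subtracted from the demand. The muscle numbers below are illustrative placeholders, not the arm26 model's values, and the closed-form least-norm solution stands in for OpenSim's full constrained solver.

```python
import numpy as np

def static_optimization_toy(tau_required, tau_exo, moment_arms, f_max):
    """Minimize sum of squared muscle activations subject to a torque
    balance at one joint (closed-form least-norm solution).

    tau_required : net joint torque the muscles + exo must produce (N*m)
    tau_exo      : assistance torque supplied by the exoskeleton (N*m)
    moment_arms  : per-muscle moment arms (m)
    f_max        : per-muscle maximum isometric forces (N)
    """
    r = np.asarray(moment_arms, dtype=float)
    f = np.asarray(f_max, dtype=float)
    tau_net = tau_required - tau_exo   # torque left for the muscles
    gains = r * f                      # joint torque per unit activation
    # Minimizing sum(a_i^2) s.t. sum(gains_i * a_i) = tau_net gives
    # activations proportional to each muscle's torque-generating gain:
    return tau_net * gains / np.sum(gains ** 2)

# Illustrative elbow flexors: halving the muscles' torque demand
# halves every predicted activation.
arms = [0.04, 0.025]   # moment arms (m), placeholder values
fmax = [600.0, 300.0]  # max isometric forces (N), placeholder values
no_assist = static_optimization_toy(6.0, 0.0, arms, fmax)
with_assist = static_optimization_toy(6.0, 3.0, arms, fmax)
```

Sweeping `tau_exo` over candidate assistance levels and plotting the resulting activations is, in miniature, the motor-sizing workflow the lab performs in OpenSim.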
The semester ends with a team project where you design a multi-modal teleoperation interface for an Activity of Daily Living task on the physical UR12e. Projects are evaluated on task performance, NASA-TLX workload, and SUS usability — not just whether the robot moves, but whether the interface actually helps.
Course Topics
Semester Roadmap
Wk 1–3
Foundations
ROS2 · kinematics · MoveIt2 · Lab 1
Wk 4–6
Manipulation
UR12e control · PolyScope · sim → real · Lab 2
Wk 7–11
Exo + OpenSim
Build EduExo · EMG · Static Opt · Lab 3
Wk 11–13
Teleoperation
MediaPipe · mapping · gripper · Lab 4
Wk 14–15
Challenge
Multi-modal interfaces · ADL demo · Lab 5
What You Will Do
Hardware & Software Platform
UR12e + Robotiq Hand-E
6-DOF collaborative robot arm with 1300 mm reach and 12 kg payload. Adaptive parallel gripper with 50 mm stroke and 130 N grip force. Controlled via PolyScope, URScript, RTDE, and ROS2 Jazzy with MoveIt2.
EduExo (AUXIVO)
1-DOF elbow exoskeleton kit — 3D-printed structure, sewn textile interface, MG996R servo motor, rotary angle sensor, and FSR. Also modeled in OpenSim as arm26_EduExo.osim with a CoordinateActuator at the elbow joint.
PincherX 100
5-DOF desktop robot arm by Trossen Robotics. Serves as a leader arm for teleoperation in Lab 5 — move it by hand, and the UR12e mirrors your motions via delta-from-home joint mapping.
Arduino Mega + Sensors
Reads the EduExo angle sensor and FSR at 100 Hz, drives the servo motor, and streams data to the PC over serial. Optional MyoWare 2.0 surface EMG sensor for biceps activation measurement.
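On the PC side, each serial line must be parsed defensively, since opening the port mid-transmission routinely yields a truncated first line. The sketch below assumes a simple comma-separated format (`millis,angle_deg,fsr_raw`) — the actual firmware's format may differ.

```python
def parse_sample(line):
    """Parse one serial line of the (assumed) form 'millis,angle_deg,fsr_raw'.

    Returns (t_seconds, angle_deg, fsr_raw) or None for malformed lines.
    """
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None
    try:
        ms, angle, fsr = (float(p) for p in parts)
    except ValueError:
        return None
    return ms / 1000.0, angle, int(fsr)

# Typical use with pySerial (port name is machine-specific):
#   import serial
#   with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as ser:
#       sample = parse_sample(ser.readline().decode("ascii", "ignore"))
```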
OpenSim 4.5
Musculoskeletal simulation software. Inverse Dynamics computes required joint torques; Static Optimization decomposes torques into individual muscle forces. Used to predict the biomechanical effect of the EduExo.
MediaPipe + Webcam
Google’s real-time pose estimation detects 33 skeletal landmarks from a single RGB camera. Joint angles computed from landmarks drive the UR12e teleoperation pipeline at 15–30 FPS with One Euro filtering.
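The One Euro Filter mentioned above is a small, well-known algorithm (Casiez et al.) worth seeing in full: an adaptive low-pass whose cutoff rises with signal speed, so slow motion is smoothed aggressively while fast motion stays responsive. This is a minimal single-channel sketch; parameter defaults are illustrative.

```python
import math

class OneEuroFilter:
    """Adaptive low-pass for noisy real-time signals such as pose
    landmarks. min_cutoff trades jitter for lag at low speed; beta
    reduces lag during fast motion."""

    def __init__(self, min_cutoff=1.0, beta=0.01, d_cutoff=1.0):
        self.min_cutoff, self.beta, self.d_cutoff = min_cutoff, beta, d_cutoff
        self.t_prev = self.x_prev = self.dx_prev = None

    @staticmethod
    def _alpha(cutoff, dt):
        # Smoothing factor of a first-order low-pass at this cutoff (Hz)
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def __call__(self, t, x):
        if self.t_prev is None:           # first sample passes through
            self.t_prev, self.x_prev, self.dx_prev = t, x, 0.0
            return x
        dt = t - self.t_prev
        dx = (x - self.x_prev) / dt       # raw derivative estimate
        a_d = self._alpha(self.d_cutoff, dt)
        dx_hat = a_d * dx + (1 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)  # speed-adaptive
        a = self._alpha(cutoff, dt)
        x_hat = a * x + (1 - a) * self.x_prev
        self.t_prev, self.x_prev, self.dx_prev = t, x_hat, dx_hat
        return x_hat
```

In the teleoperation pipeline, one filter instance per joint angle (or per landmark coordinate) is enough; at 15–30 FPS the timestamps simply come from the frame clock.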
Full Python Stack
Python 3.10+ with NumPy, SciPy, OpenCV, MediaPipe, pySerial, python-rtde, and the OpenSim Python API. ROS2 Jazzy with MoveIt2 for robot control. Arduino IDE 2.x for EduExo firmware development.
Assistive Robotics Curriculum
Assistive Robot Taxonomy
Classification by function — manipulation aids, exoskeletons, prosthetics, mobility aids, social robots. The ICF disability framework. ISO 13482 safety standards for personal care robots.
Robot Kinematics
DH convention, forward and inverse kinematics for the UR12e. Singularity analysis. Workspace boundaries. Computed analytically and verified with robotics-toolbox-python.
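The analytic FK chain can be sketched as a product of standard DH transforms. The helper below is generic; the two-link table in the usage check uses placeholder values for a planar arm — the real UR12e DH parameters come from the manufacturer's datasheet and are not reproduced here.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard DH homogeneous transform from frame i-1 to frame i."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(q, dh_table):
    """Chain per-joint transforms; dh_table rows are (d, a, alpha)."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(q, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T   # 4x4 base-to-end-effector pose

# Sanity check on a planar 2-link arm with unit links (placeholder table):
planar = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
T = forward_kinematics([0.0, 0.0], planar)   # fully extended: x = 2
```

This is exactly the computation `robotics-toolbox-python` performs internally, which makes it a convenient cross-check for your hand-derived UR12e chain.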
Exoskeleton Design
Rigid vs. soft architectures. Joint alignment with human anatomy. Actuator selection — servo, SEA, cable-driven. Backdrivability, mechanical advantage, comfort, and don/doff time as design criteria.
EMG & Muscle Modeling
Motor unit physiology, surface EMG signal processing (band-pass → rectify → RMS → MVC normalize). The Hill muscle model — activation × force-length × force-velocity. MyoWare 2.0 sensor integration.
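The band-pass → rectify → RMS → normalize chain above maps directly onto a few lines of NumPy/SciPy. A hedged sketch — the band edges, filter order, and window length are typical textbook choices, not prescribed course values:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def emg_envelope(raw, fs, band=(20.0, 450.0), rms_win=0.1, mvc=1.0):
    """Surface-EMG pipeline: band-pass -> rectify -> moving RMS ->
    normalize to maximum voluntary contraction (MVC).

    raw : 1-D EMG samples; fs : sampling rate (Hz); rms_win : window (s).
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, raw)        # zero-phase band-pass
    rectified = np.abs(filtered)          # full-wave rectification
    n = max(1, int(rms_win * fs))
    kernel = np.ones(n) / n
    rms = np.sqrt(np.convolve(rectified ** 2, kernel, mode="same"))
    return rms / mvc                      # fraction of MVC
```

Dividing by the envelope value recorded during a maximum voluntary contraction puts every trial on the same 0–1 scale, which is what makes with-exo vs. without-exo activation comparisons meaningful.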
OpenSim Simulation
Musculoskeletal modeling with arm26_EduExo.osim. Inverse Dynamics for joint torque computation. Static Optimization for muscle force decomposition. Simulation-guided motor specification.
Teleoperation Theory
Master-slave architectures, unilateral vs. bilateral control. Joint-space and Cartesian-space motion mapping. Workspace scaling, deadband, and latency management. Stability under delay.
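Workspace scaling and deadband compose into one small mapping function. The sketch below is a hypothetical Cartesian-step mapper, not the course's reference implementation; note the re-referencing past the deadband edge, which keeps the output continuous at the threshold instead of jumping.

```python
import numpy as np

def map_motion(delta, scale=0.5, deadband=0.01, max_step=0.05):
    """Map an operator displacement (m) to a robot Cartesian step (m).

    deadband suppresses tremor and sensor noise near zero; scale shrinks
    (or magnifies) the workspace; max_step bounds per-cycle motion so a
    latency spike cannot command a large jump.
    """
    delta = np.asarray(delta, dtype=float)
    mag = np.linalg.norm(delta)
    if mag < deadband:
        return np.zeros_like(delta)
    direction = delta / mag
    out = direction * (mag - deadband) * scale   # continuous at the edge
    n = np.linalg.norm(out)
    if n > max_step:
        out *= max_step / n                      # clamp step magnitude
    return out
```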
Computer Vision for HRI
MediaPipe Pose — BlazePose architecture, 33 landmarks, joint angle computation from 3D vectors. Camera-to-robot frame alignment. One Euro Filter for real-time smoothing. Confidence gating.
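Joint angle computation from landmark vectors, with confidence gating, fits in one function. This is an illustrative sketch: the function name and gating threshold are hypothetical, and `visibilities` stands in for MediaPipe's per-landmark visibility scores.

```python
import numpy as np

def joint_angle(a, b, c, visibilities=(1.0, 1.0, 1.0), min_visibility=0.5):
    """Interior angle (degrees) at landmark b from three 3-D points,
    e.g. shoulder-elbow-wrist for the elbow. Returns None when any
    landmark's visibility falls below the gate (occluded landmarks
    should not drive the robot)."""
    if min(visibilities) < min_visibility:
        return None
    u = np.asarray(a, float) - np.asarray(b, float)
    v = np.asarray(c, float) - np.asarray(b, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # clip guards against tiny numerical overshoot outside [-1, 1]
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
```

Downstream, the returned angle would typically pass through a One Euro Filter before being mapped to a robot joint target.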
Interface Evaluation
Fitts’ law throughput for pointing performance. NASA-TLX for subjective workload (6 subscales). System Usability Scale (SUS). Task metrics — completion time, success rate, path efficiency.
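Fitts' law throughput, in the standard Shannon formulation, is a two-line computation; the numbers in the usage comment are an illustrative reach, not course data.

```python
import math

def fitts_throughput(distance, width, movement_time):
    """Shannon formulation of Fitts' law:
    ID = log2(D / W + 1) bits; throughput = ID / MT (bits/s)."""
    index_of_difficulty = math.log2(distance / width + 1.0)
    return index_of_difficulty / movement_time

# e.g. a 0.30 m reach to a 0.05 m target completed in 1.4 s:
# ID = log2(7) ~ 2.81 bits, throughput ~ 2.0 bits/s
tp = fitts_throughput(0.30, 0.05, 1.4)
```

Comparing throughput across interfaces (rather than raw completion time) controls for how hard each target was to hit, which is why it pairs well with NASA-TLX and SUS in the final evaluation.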
Final Project
Teams of 3–4 design a multi-modal assistive interface for an Activity of Daily Living task on the physical UR12e with Robotiq Hand-E. Choose from table setting, shelf retrieval, container manipulation, or block sorting. The interface must use at least two input modalities and incorporate simulation insights from Lab 3.
Evaluated on task success rate, completion time, NASA-TLX workload, and SUS usability scores. Live demonstration during the final week followed by an IEEE-format technical report.
Why Take This Course
Your exoskeleton physically contacts the human body — not metaphorically. Your teleoperation interface lets someone control a robot arm with their movements. This is design with consequence.
OpenSim predicts how muscles respond to your exoskeleton before you specify the motor. You derive clinical specifications from biomechanical data — the way real medical devices get designed.
UR12e for industrial manipulation, EduExo for wearable robotics, PincherX for leader-arm teleoperation, and a webcam for markerless control. Most graduate courses use one platform — you use four.
NASA-TLX and Fitts’ law reveal which interface actually works. Students are consistently surprised by the gap between what feels intuitive and what performs well under formal measurement.
Course Materials & Repository
30 pages: 11 lessons, 5 labs, 3 homework sets, project spec, and reference pages. Theory with equations, step-by-step procedures, runnable code, and expected outputs.
Open Wiki ↗
Lab starter code, OpenSim models (arm26_EduExo.osim), EduExo STL files, Arduino firmware, MediaPipe teleoperation scripts, and Python utilities.
The arm26_EduExo.osim model includes arm skeleton, 6 Hill-type muscles, EduExo STL geometry, and a CoordinateActuator. Prescribed motion files and tracking tasks included.
URSim Docker configuration, Arduino IDE, ROS2 Jazzy installation, MediaPipe environment, and OpenSim Python API. One-time setup takes 60–90 minutes.
Setup Guide ↗
Course Information
| Course Number | EECE 5552 |
| Title | Assistive Robotics |
| Credits | 4 credit hours |
| Level | Graduate & Senior Undergraduate |
| Campus | Northeastern University · Seattle |
| Prerequisites | Python programming · Basic linear algebra · or Graduate Admission |
| Tools | Python 3.10+ · ROS2 Jazzy · MoveIt2 · URSim · OpenSim 4.5 · MediaPipe · Arduino IDE 2.x |
| Grading | Labs 30% · Homework 15% · Midterm 20% · Final Project 35% |