Simulation to Reality
Week 14 • CMPSC 304 Robotic Agents
The Full Path
SLAM (build maps) → Nav2 (click goals) → Nodes (code goals) → Today (real robots)
Everything you have built works in simulation. What changes on a physical TurtleBot 4?
The Sim-to-Real Gap
Definition: The difference in behavior between a robot operating in simulation vs. the real world. Also known as the reality gap.
Simulation
- Perfect physics (no friction variance)
- Perfect sensors (no noise)
- Instant communication
- Unlimited battery
- No physical damage possible
Reality
- Uneven surfaces, wheel slip
- Sensor noise, reflections, dead zones
- Network latency (Wi-Fi)
- Battery drains, voltage drops
- Collisions cause real damage
TurtleBot 4 vs. TurtleBot 3
Course documentation and online tutorials often reference TurtleBot3. We use TurtleBot 4 — here is what is different.
| Feature | TurtleBot 3 Burger | TurtleBot 4 |
| --- | --- | --- |
| Base | Custom waffle plate | iRobot Create 3 |
| Microcontroller | OpenCR (ARM Cortex-M7) | Create 3 onboard MCU |
| Lidar | LDS-02 (5 Hz) | RPLIDAR A1 (~8 Hz) |
| Camera | None (Burger) | OAK-D Pro (standard) / None (Lite) |
| Max speed | 0.22 m/s | 0.31 m/s |
| Docking | Manual | Autonomous (IR beacon) |
| Launch package | turtlebot3_* | turtlebot4_* |
Many online guides say turtlebot3_cartographer or turtlebot3_navigation2. On TurtleBot 4, use turtlebot4_navigation instead.
TurtleBot 4: The Hardware We Use
| Component | Specification | Notes |
| --- | --- | --- |
| Mobile base | iRobot Create 3 | Differential drive, bump/cliff sensors, IMU, docking |
| Compute | Raspberry Pi 4 (4GB) | Runs ROS2 Jazzy on Ubuntu 24.04 |
| Lidar | RPLIDAR A1 (360°, ~8Hz) | USB serial, publishes /scan |
| Camera | OAK-D Pro (Luxonis) | Standard only — Lite has no camera |
| Battery | Create 3 Li-ion | ~2.5 hrs, charges on dock automatically |
| Display | 1.3" OLED + 4 buttons | Standard only — Lite has no display or buttons |
| Max speed | 0.31 m/s linear | Use 0.10–0.15 m/s for mapping |
We have two TurtleBot 4 Standard units (with camera) and two TurtleBot 4 Lite units (no camera) — both work identically for lidar-based SLAM.
Ring Light Indicators
The Create 3 base has an LED ring that communicates robot state at a glance.
| Light Pattern | Meaning |
| --- | --- |
| Solid green | Ready, battery > 50% |
| Solid orange/yellow | Battery 20–50%, still usable |
| Solid/pulsing red | Battery < 20% — charge soon |
| Rotating white/blue | Booting up or running action |
| Pulsing blue | Docked and charging |
| Solid white | Fully charged on dock |
| Flashing red | Error / hazard detected (picked up, cliff, stuck) |
Important: When docked (blue pulsing), the lidar motor is off and /cmd_vel is ignored. Always undock before mapping or driving.
Display & Button Navigation (TurtleBot 4 Standard only)
4 Buttons
- Button 1 (top-left): Scroll up / back
- Button 2 (top-right): Scroll down / next
- Button 3 (bottom-left): Select / confirm
- Button 4 (bottom-right): Home / cancel
Button layout may vary by firmware version — check the display label.
Menu Options
- Dock / Undock — sends the robot to or from the dock
- EStop — immediate motor stop
- Scroll: IP address — shows current WiFi IP
- Scroll: Battery — charge percentage
- Scroll: RPLIDAR — start/stop lidar motor
- Scroll: OAK-D — start/stop camera
Quick tip (standard only): If the lidar has stopped, navigate to RPLIDAR → Start in the menu instead of rebooting.
On the Lite: Restart the lidar via ros2 service call /start_motor std_srvs/srv/Empty or reboot the robot.
RPLIDAR A1: How It Works
Hardware
- Spinning laser (360° coverage)
- Range: 0.15m – 12m
- Angular resolution: ~1°
- Scan rate: ~7.9 Hz (7–8 scans/second)
- Connects via USB → /dev/ttyUSB0
- Driver: rplidar_composition node
ROS2 Topics
/scan — sensor_msgs/LaserScan
- Published by the RPi, visible to laptop over WiFi
- Cartographer and SLAM Toolbox both subscribe to /scan
- RViz shows it as red dots around the robot
# Verify lidar is running:
ros2 topic hz /scan
# Healthy: ~7.9 Hz
Lidar motor stops automatically when docked. If ros2 topic hz /scan shows 0 Hz, check dock status first.
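The timing check that ros2 topic hz performs can be sketched in plain Python. The timestamps below are synthetic, purely to illustrate the arithmetic — this is not the actual tool's implementation:

```python
# Estimate a topic's publish rate from message arrival times,
# mimicking what `ros2 topic hz /scan` reports. Timestamps are
# synthetic (hypothetical), not captured from a real robot.

def mean_rate_hz(stamps):
    """Average publish rate (Hz) from a list of arrival times in seconds."""
    if len(stamps) < 2:
        return 0.0  # no messages (e.g. robot docked) -> 0 Hz
    deltas = [b - a for a, b in zip(stamps, stamps[1:])]
    return 1.0 / (sum(deltas) / len(deltas))

# A healthy RPLIDAR A1 publishes roughly every 1/7.9 s:
healthy = [i * (1 / 7.9) for i in range(20)]
print(round(mean_rate_hz(healthy), 1))   # ~7.9

# A docked robot publishes nothing:
print(mean_rate_hz([]))                  # 0.0
```

A rate well below ~7.9 Hz on the real robot is worth investigating before you start mapping.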
OAK-D Pro Camera (TurtleBot 4 Standard only)
What It Does
- RGB camera: 1080p color image
- Stereo depth: 3D point cloud from two IR cameras
- On-chip AI: Intel Myriad X VPU for object detection
- IMU: 6-axis inertial measurement
- Connected via USB 3.0 SuperSpeed
ROS2 Topics
/color/preview/image — RGB preview
/color/image — full res RGB
/stereo/depth — depth image
/stereo/points — 3D point cloud
/imu — camera IMU data
# View camera in RViz:
# Add Image display, topic:
# /color/preview/image
For P4: The camera is not used for SLAM — we use lidar. The camera is available for future projects (visual detection, depth-based obstacle avoidance).
Other Sensors on the Create 3 Base
| Sensor | Topic | What It Detects |
| --- | --- | --- |
| IMU | /imu | Linear acceleration + angular velocity. Helps fuse odometry. |
| Wheel encoders | /odom | Position estimate from wheel rotations. Drifts over time. |
| Bump sensors | /hazard_detection | Front/side contact — robot stops and backs up automatically. |
| Cliff sensors | /hazard_detection | Downward IR — stops at stair edges, raised thresholds. |
| IR dock sensors | /dock_status | Detects dock IR beacon for autonomous docking. |
| Wheel drop | /hazard_detection | Detects if robot is picked up — stops motors immediately. |
Cliff sensors can trigger false positives on dark floor tiles or highly reflective surfaces — the robot may refuse to drive over them.
Docking: The Biggest "Gotcha"
The Create 3 base manages docking autonomously. This affects your ROS2 session in ways that are not obvious.
- Docked = lidar off: /scan stops publishing. SLAM and RViz show nothing.
- Docked = no movement: /cmd_vel commands are silently ignored.
- Auto-dock: If the robot wanders near the dock, IR sensors may trigger autonomous docking mid-session.
- After undocking: Wait 3–5 seconds for the lidar motor to spin up before starting SLAM.
- Between sessions: If teleop stops working after a break, check if the robot re-docked.
Diagnosis: ros2 topic hz /scan showing 0 Hz almost always means the robot is docked. Undock first, then retry.
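The troubleshooting logic on this slide can be written down as a tiny decision helper. The state names, threshold, and messages below are illustrative, not an official TurtleBot 4 API:

```python
# Toy diagnosis helper encoding the slide's troubleshooting order.
# Inputs: observed /scan rate and dock state. All strings and the
# 7.0 Hz threshold are illustrative assumptions.

def diagnose_scan(scan_hz, is_docked):
    """Return the most likely cause for the observed /scan rate."""
    if scan_hz == 0 and is_docked:
        return "docked: lidar motor is off — undock first"
    if scan_hz == 0:
        return "lidar stopped: restart the motor (display menu or /start_motor service)"
    if scan_hz < 7.0:
        return "slow scans: check the USB connection and battery level"
    return "healthy"

print(diagnose_scan(0.0, True))    # check dock status before anything else
print(diagnose_scan(7.9, False))   # healthy
```

Checking dock state first matches the slide's rule of thumb: 0 Hz almost always means docked.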
P4 Part 4: Mapping Workflow
- Undock the robot — wait for the lidar to spin up (~5 sec)
- On the RPi (SSH) or laptop: ros2 launch turtlebot4_navigation slam.launch.py
- On the laptop: ros2 launch turtlebot4_viz view_navigation.launch.py
- In another terminal: ros2 run teleop_twist_keyboard teleop_twist_keyboard
- Drive slowly (0.1–0.15 m/s) through the entire space; revisit the start for loop closure
- Save the map: ros2 run nav2_map_server map_saver_cli -f ~/real_map --ros-args -p save_map_timeout:=10.0
- Copy to repo: cp ~/real_map.pgm ~/real_map.yaml <your-repo>/maps/
Keep the dock out of the mapping area, or move the robot away from it — auto-dock will kill your session.
Lidar: Simulation vs. Reality
Gazebo Lidar
- Clean, consistent readings
- No reflective surface issues
- No interference from other robots
- Perfect geometry — walls are flat planes
- Updates at exactly the configured rate
Real RPLIDAR A1
- Noise in every scan (± a few cm)
- Glass, mirrors, dark objects may not reflect
- Crosstalk if multiple robots nearby
- Irregular surfaces (chair legs, cables)
- Speed varies with motor RPM
Impact on mapping: Real maps will have more noise. Walls may appear thicker or slightly uneven. This is normal — your navigation should still work.
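Why do real walls look thicker? A minimal noise sketch makes it concrete: the same wall measured many times with a few centimeters of Gaussian range noise (a rough, assumed model of RPLIDAR A1 behavior) smears out into a band:

```python
# Illustrative only: an ideal sensor vs one with ~2 cm Gaussian range
# noise measuring a wall 2 m away. The noise model and numbers are
# assumptions for teaching, not measured RPLIDAR A1 specs.
import random

random.seed(0)
true_range = 2.00   # wall 2 m away
sigma = 0.02        # assumed ~2 cm standard deviation of range noise

noisy = [random.gauss(true_range, sigma) for _ in range(1000)]
spread = max(noisy) - min(noisy)
print(f"apparent wall thickness across 1000 scans ≈ {spread * 100:.0f} cm")
```

An occupancy grid accumulates those scattered hits, so the wall's occupied cells span the whole band instead of a single crisp line.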
Odometry: Drift and Slip
Odometry estimates position from wheel rotations. In simulation it is nearly perfect. In reality, it drifts.
- Wheel slip: Smooth floors or fast turns cause wheels to lose traction
- Carpet vs. tile: Different surfaces change wheel dynamics
- Accumulated error: Small errors compound — after driving 10m, position may be off by 20+ cm
- Why AMCL matters: The particle filter localizer corrects odometry drift by matching lidar scans to the map
Key insight: This is exactly why we use SLAM and AMCL. Pure odometry navigation would fail on a real robot within minutes. The map-based approach you learned is the solution.
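The drift numbers above can be reproduced with a rough simulation: the encoders count every commanded step, but the floor steals a little distance each time. The 2% slip bias and the noise level are assumptions chosen for illustration:

```python
# Rough odometry-drift sketch: encoders integrate the commanded step,
# the true position loses ~2% per step to wheel slip (assumed figure),
# plus a little random noise. Not a calibrated TurtleBot 4 model.
import random

random.seed(42)
step = 0.05   # 5 cm per odometry update
bias = 0.02   # assumed 2% systematic slip on a smooth floor

true_x = 0.0
odom_x = 0.0
for _ in range(200):                 # 200 steps = 10 m commanded
    odom_x += step                                        # encoders count it all
    true_x += step * (1 - bias + random.gauss(0, 0.01))   # floor slips a bit

error_cm = (odom_x - true_x) * 100
print(f"after 10 m of commanded driving, odometry error ≈ {error_cm:.1f} cm")
```

The systematic bias dominates: the error grows linearly with distance, which is exactly the accumulation AMCL corrects by re-anchoring the pose to lidar scans.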
Networking with Physical Robots
In Docker, everything runs in one container. With a real TurtleBot 4, your laptop and the robot communicate over Wi-Fi — on a dedicated network set up specifically for the robots.
- Dedicated robot Wi-Fi: Plug in the router and connect your laptop to it. The robots connect to it automatically. Do not use the regular campus network.
ROS_DOMAIN_ID partitions ROS2 traffic so that multiple robots or groups do not interfere with each other. Only nodes sharing the same domain ID can see each other. The bash setup file on the robot's laptop already configures this — you do not need to export it manually.
RMW_IMPLEMENTATION is also set by the bash file. We use rmw_fastrtps_cpp (FastDDS) to match the firmware of the Create 3 base.
Latency matters: Wi-Fi adds 5–50 ms delay. RViz on your laptop may lag slightly behind the real robot.
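A quick back-of-envelope shows why latency matters at robot speeds. Using the 5–50 ms range from this slide and the TurtleBot 4's 0.31 m/s maximum:

```python
# How far the robot travels before a command sent over Wi-Fi takes
# effect: distance = speed x latency. Values from the slides; the
# calculation is a back-of-envelope sketch, ignoring processing time.

speed = 0.31   # m/s, TurtleBot 4 max linear speed
travel_cm = {ms: speed * (ms / 1000) * 100 for ms in (5, 50)}

for ms, cm in sorted(travel_cm.items()):
    print(f"{ms:>2} ms latency → {cm:.2f} cm of travel before the command lands")
```

Even worst-case Wi-Fi latency costs under 2 cm at full speed, so lag mostly shows up as RViz trailing reality rather than as a control problem — another reason to keep mapping speeds low.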
Mapping: Sim vs. Real
| Aspect | Simulation | Physical |
| --- | --- | --- |
| Environment | Gazebo world (fixed walls) | Lab/classroom (furniture, people) |
| Speed | Drive fast, no consequences | Drive slow (0.1-0.15 m/s recommended) |
| Map quality | Clean, sharp edges | Noisy, thicker walls, occasional gaps |
| Loop closure | Works reliably | May need slower driving to succeed |
| Dynamic obstacles | None (unless added) | People walking, doors opening |
| Time to map | 2-3 minutes | 5-10 minutes for a room |
Tip: Map when the space is empty, at a slow consistent speed, and make sure to revisit your starting area for good loop closure.
Navigation: What Changes?
- Initial pose matters more: Set it precisely — AMCL needs a good starting estimate on a real robot
- Costmap inflation: You may need to increase the inflation radius to keep the robot farther from real walls (it cannot clip through them)
- Recovery behaviors activate more often: The robot may need to spin or back up when it encounters unexpected obstacles
- Goal tolerance: On a real robot, reaching within 10-15 cm of the goal is typically acceptable
Your navigation node code does NOT change. The same NavigateToPose action works identically — you just need new coordinates for the real environment.
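One way to see "code stays the same, data changes" is to keep the waypoints as data the node consumes. All coordinates below are made up for illustration; the function name is hypothetical, not part of any ROS2 API:

```python
# Sketch of the sim-to-real handoff: the navigation logic consumes a
# list of (x, y, yaw) waypoints, so moving to the real robot swaps the
# data, not the code. Every coordinate here is invented for the example.

SIM_WAYPOINTS  = [(1.5, 0.5, 0.0), (2.0, -1.0, 1.57)]   # from the Gazebo map
REAL_WAYPOINTS = [(0.8, 0.3, 0.0), (1.2, -0.6, 1.57)]   # read off the real map in RViz

def goals_for(environment):
    """Pick the waypoint list; the NavigateToPose calls stay identical."""
    return REAL_WAYPOINTS if environment == "real" else SIM_WAYPOINTS

print(goals_for("real"))   # same structure, new coordinates
```

In practice you read the real coordinates off your saved map in RViz (Publish Point or the displayed pose) and substitute them into the same goal-sending loop you wrote for simulation.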
Safety with Physical Robots
Rules
- Always have a team member ready to pick up the robot
- Never run untested code at full speed
- Test in sim first every time you change waypoints
- Keep the area clear of trip hazards
- Low battery = unpredictable behavior
Emergency Stop
- Pick up the robot — the wheel-drop sensors stop the motors immediately
- Or publish a zero velocity: ros2 topic pub --once /cmd_vel geometry_msgs/msg/Twist "{}"
- Standard: Use the display menu → EStop
- Lite: SSH in and run ros2 topic pub --once /cmd_vel geometry_msgs/msg/Twist "{}"
Recommended Workflow for P4 Part 4
1. Map the real space → 2. Save map to robot → 3. Launch Nav2 with real map → 4. Test goals in RViz → 5. Update node coordinates → 6. Run node on real robot
Key principle: Get everything working in simulation first. Transfer to the real robot means only swapping the map file and updating waypoint coordinates.
Textbook Connection
Introduction to Autonomous Mobile Robots (Siegwart, Nourbakhsh, Scaramuzza)
Chapter 5: Perception — sensor models and noise characteristics relevant to understanding the sim-to-real gap in lidar and odometry.
Chapter 3: Mobile Robot Kinematics — differential drive model that explains why wheel slip affects odometry on real surfaces.
Summary
| Topic | Key Takeaway |
| --- | --- |
| Sim-to-real gap | Real sensors are noisy, surfaces cause drift, network adds latency |
| Lidar differences | Reflective surfaces, noise, and irregular objects affect scan quality |
| Odometry drift | AMCL corrects accumulated errors using the map |
| Networking | Dedicated robot Wi-Fi (plug in router); bash file handles ROS_DOMAIN_ID and rmw_fastrtps_cpp |
| Your code | Stays the same — only map and coordinates change |
| Safety | Test in sim first, drive slow, always be ready to stop |
The Full Course Arc
Everything we did this semester, in order
| Week(s) | Topic | Big Idea |
| --- | --- | --- |
| 1 | Locomotion & Actuators | How robots move — wheels, legs, motors, degrees of freedom |
| 2 | Motors & Arduino | Turning electrical signals into motion; PWM, H-bridges, microcontrollers |
| 3–4 | Sensors | How robots perceive the world — IR, ultrasonic, encoders, lidar, cameras |
| 5 | PLC Training System | Industrial control logic — ladder diagrams, relays, timers, I/O |
| 7 | Collaborative Robots | Safety-rated robots working alongside humans — teach pendants, motion types |
| 10 | Introduction to ROS2 | Nodes, topics, publishers, subscribers — the OS for robots |
| 11 | SLAM & Mapping | Building a map while localizing — occupancy grids, Cartographer |
| 11–12 | Navigation & Path Planning | Nav2, costmaps, AMCL, global and local planners |
| 12 | Writing ROS2 Nodes | Python nodes, action clients, sending goals programmatically |
| 14 | Simulation to Reality | Transferring everything above to a physical TurtleBot 4 |
How It All Fits Together
You started by understanding how robots move (locomotion, motors) and how they sense the world (sensors, lidar). You then learned how to control systems both industrially (PLCs) and collaboratively (cobots). In the second half you connected all of that to ROS2 — a real robotics software framework — building maps, planning paths, writing autonomous navigation nodes, and finally running everything on a physical robot.
Move → Sense → Control → Map & Navigate → Real Robot
Project 4 is the capstone of that journey.
Everything you built in simulation now runs on real hardware.