A comprehensive ROS2 mobile robot package featuring SLAM (Simultaneous Localization and Mapping) and Nav2 (Navigation2) integration for autonomous navigation. This package includes support for multiple sensors (LiDAR, Camera, Depth Camera) and provides complete simulation and visualization capabilities.
- Multi-Sensor Support: LiDAR, RGB Camera, Depth Camera
- SLAM Integration: Real-time mapping using slam_toolbox
- Autonomous Navigation: Nav2 stack for path planning and obstacle avoidance
- Simulation Ready: Gazebo simulation with custom worlds
- Visualization: Complete RViz configurations for all sensors
- Teleoperation: Manual control for testing and mapping
- GUI Control Interface: Web-based GUI for robot control (turtlebot3_gui)
Prerequisites:
- Ubuntu 22.04 (ROS2 Humble) or Ubuntu 20.04 (ROS2 Foxy)
- Gazebo (installed with ROS2)
Install the required dependencies:

```bash
# Update package list
sudo apt update

# Install ROS2 Navigation2 and SLAM Toolbox
sudo apt install ros-humble-navigation2 ros-humble-nav2-bringup ros-humble-slam-toolbox

# Install teleoperation tools
sudo apt install ros-humble-teleop-twist-keyboard

# Install visualization tools
sudo apt install ros-humble-rviz2
```

Then clone and build the package:

```bash
# Navigate to your ROS2 workspace
cd ~/ros/dev_ws/src

# Clone this repository
git clone <your-repo-url> mob_bot

# Navigate back to workspace root
cd ~/ros/dev_ws

# Install dependencies
rosdep install --from-paths src --ignore-src -r -y

# Build the package
colcon build --packages-select mob_bot

# Source the workspace
source install/setup.bash
```

Package structure:

```
mob_bot/
├── config/                  # RViz configurations and Nav2 parameters
│   ├── lidar.rviz
│   ├── camera_uncompressed.rviz
│   ├── depth_cam.rviz
│   ├── drive_bot.rviz
│   ├── nav2_params.yaml
│   └── view_bot.rviz
├── description/             # Robot URDF/XACRO files
│   ├── robot.urdf.xacro
│   ├── robot_core.xacro
│   ├── gazebo_control.xacro
│   ├── lidar.xacro
│   ├── camera.xacro
│   └── depth_camera.xacro
├── launch/                  # Launch files for different operations
│   ├── rsp.launch.py
│   ├── launch_sim.launch.py
│   ├── slam.launch.py
│   └── navigation.launch.py
├── maps/                    # Saved maps from SLAM
├── worlds/                  # Gazebo world files
│   ├── empty.world
│   └── obstacles.world
└── CMakeLists.txt
```
The robot supports three sensor configurations. Modify robot.urdf.xacro to enable or disable sensors.

LiDAR only:

```xml
<!-- Enable LiDAR -->
<xacro:include filename="lidar.xacro"/>

<!-- Comment out other sensors -->
<!-- <xacro:include filename="camera.xacro"/> -->
<!-- <xacro:include filename="depth_camera.xacro"/> -->
```

Camera only:

```xml
<!-- Comment out LiDAR -->
<!-- <xacro:include filename="lidar.xacro"/> -->

<!-- Enable Camera -->
<xacro:include filename="camera.xacro"/>
```

Depth camera only:

```xml
<!-- Comment out LiDAR -->
<!-- <xacro:include filename="lidar.xacro"/> -->

<!-- Enable Depth Camera -->
<xacro:include filename="depth_camera.xacro"/>
```

All sensors:

```xml
<xacro:include filename="lidar.xacro"/>
<xacro:include filename="camera.xacro"/>
<xacro:include filename="depth_camera.xacro"/>
```

LiDAR:
- Type: 2D Laser Scanner
- Range: 0.3 m - 12 m
- Angle: 360° (full circle)
- Resolution: 1° (360 samples)
- Topic: /scan
- Message Type: sensor_msgs/LaserScan
- Update Rate: 10 Hz

RGB Camera:
- Resolution: 640x480
- FOV: 108.9°
- Format: RGB8
- Topic: /camera/image_raw
- Message Type: sensor_msgs/Image

Depth Camera:
- Provides: depth and point-cloud data
- Topics: various depth-related topics
- Message Types: sensor_msgs/Image, sensor_msgs/PointCloud2
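With the 1° resolution above, each of the 360 range samples maps to a Cartesian point in the laser frame. A small illustrative sketch of that conversion (not part of the package; field names mirror sensor_msgs/LaserScan):

```python
# Illustrative sketch: convert LaserScan-style polar ranges to
# Cartesian (x, y) points in the laser frame. Field names mirror
# sensor_msgs/LaserScan (angle_min, angle_increment, ranges).
import math

def scan_to_points(angle_min, angle_increment, ranges,
                   range_min=0.3, range_max=12.0):
    """Return (x, y) tuples, dropping out-of-range readings."""
    points = []
    for i, r in enumerate(ranges):
        if not (range_min <= r <= range_max):
            continue  # invalid reading (e.g. inf for no return)
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# A reading of 2.0 m at sample 90 of a 360-sample, 1-degree scan
# lies 2 m to the robot's left (+y).
ranges = [float("inf")] * 360
ranges[90] = 2.0
pts = scan_to_points(0.0, math.radians(1.0), ranges)
```

This is the same math RViz applies when rendering the red LaserScan points.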
Running the simulation:

```bash
# Terminal 1
cd ~/ros/dev_ws
source install/setup.bash
ros2 launch mob_bot launch_sim.launch.py
```

The launch files automatically use obstacles.world, which contains:
- 5 cylindrical obstacles
- 4 box obstacles
- A realistic environment for testing
To publish the robot description only (without the full simulation):

```bash
ros2 launch mob_bot rsp.launch.py
```

LiDAR visualization:

```bash
# Launch simulation first, then in a new terminal:
rviz2 -d src/mob_bot/config/lidar.rviz
```

What you'll see:
- Robot Model: 3D visualization of the robot
- LaserScan: red points showing distance measurements
- TF Frames: coordinate frames for odom, base_link, and laser_frame
- Map: the map being built, if SLAM is running

Key RViz displays:
- LaserScan → Topic: /scan
- TF → Show all frames enabled
- RobotModel → Description Source: robot_description
Camera visualization:

```bash
# Launch simulation with camera enabled, then:
rviz2 -d src/mob_bot/config/camera_uncompressed.rviz
```

What you'll see:
- Camera Image: live RGB camera feed
- Camera Frustum: field-of-view visualization
- TF Frames: camera coordinate frames

Key RViz displays:
- Image → Topic: /camera/image_raw
- Camera → Topic: /camera/image_raw
Depth camera visualization:

```bash
# Launch simulation with depth camera enabled, then:
rviz2 -d src/mob_bot/config/depth_cam.rviz
```

What you'll see:
- Depth Cloud: point cloud from the depth sensor
- Depth Image: grayscale depth visualization

General monitoring and driving views:

```bash
rviz2 -d src/mob_bot/config/view_bot.rviz
rviz2 -d src/mob_bot/config/drive_bot.rviz
```

For a user-friendly desktop control interface, use the companion turtlebot3_gui repository. This PyQt5-based Python GUI provides:
- Real-time Odometry Display: Shows current robot position (x, y coordinates)
- Velocity Control: Interactive buttons for forward/backward movement and rotation
- Speed Control: Adjustable sliders for linear and angular velocities (0-1.0 m/s, 0-1.0 rad/s)
- Live Velocity Graphs: Real-time visualization of linear and angular velocities
- Trajectory Mapping: Interactive plot showing robot's movement path over time
- Dark Theme: Modern dark UI design for comfortable viewing
1. Clone the GUI repository:

   ```bash
   cd ~/ros/dev_ws/src
   git clone https://github.com/mrithip/turtlebot3_gui.git
   ```

2. Install Python dependencies:

   ```bash
   pip install PyQt5 matplotlib numpy
   ```

3. Install ROS2 dependencies and build the GUI package:

   ```bash
   cd ~/ros/dev_ws
   rosdep install --from-paths src --ignore-src -r -y
   colcon build --packages-select tb3_gui
   source install/setup.bash
   ```

4. Launch the robot simulation:

   ```bash
   # Terminal 1: Launch robot
   ros2 launch mob_bot launch_sim.launch.py
   ```

5. Launch the GUI:

   ```bash
   # Terminal 2: Launch GUI
   ros2 run tb3_gui tb3_gui_node
   ```
- Forward: Move robot forward at selected linear speed
- Backward: Move robot backward at selected linear speed
- Left: Rotate robot counter-clockwise at selected angular speed
- Right: Rotate robot clockwise at selected angular speed
- Stop: Immediately halt all robot movement
- Linear Speed Slider: Controls forward/backward movement speed (0-1.0 m/s)
- Angular Speed Slider: Controls rotation speed (0-1.0 rad/s)
- Velocity Graph: Shows real-time linear and angular velocity
- Trajectory Plot: Displays robot's path over time
- Odometry Display: Current position coordinates (x, y)
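The trajectory plot can be built by accumulating positions from successive /odom updates. A minimal illustrative sketch of that bookkeeping (not the GUI's actual implementation; class and method names are assumptions):

```python
# Illustrative sketch: accumulate (x, y) positions from successive
# odometry updates for a trajectory plot, keeping a bounded history.
from collections import deque

class TrajectoryBuffer:
    def __init__(self, max_points=1000):
        # Bounded history so memory stays constant during long runs
        self.points = deque(maxlen=max_points)

    def on_odom(self, x, y, min_step=0.01):
        """Record a pose, skipping jitter smaller than min_step metres."""
        if self.points:
            px, py = self.points[-1]
            if (x - px) ** 2 + (y - py) ** 2 < min_step ** 2:
                return  # robot effectively stationary; don't log
        self.points.append((x, y))

buf = TrajectoryBuffer()
for pose in [(0.0, 0.0), (0.0, 0.0), (0.5, 0.0), (0.5, 0.2)]:
    buf.on_odom(*pose)
# Duplicate stationary pose is dropped; three points remain.
```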
ROS2 interface:
- Subscribed: /odom (nav_msgs/Odometry) - robot odometry information
- Published: /cmd_vel (geometry_msgs/Twist) - velocity commands to the robot
The GUI works seamlessly with the mob_bot package and provides an intuitive alternative to keyboard teleoperation for robot control and monitoring.
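Commanded speeds stay within the slider limits above (0-1.0 m/s and 0-1.0 rad/s). A minimal sketch of that clamping, with a stand-in for geometry_msgs/Twist (names are illustrative, not the GUI's actual code):

```python
# Illustrative sketch: clamp commanded velocities to the GUI's slider
# limits before publishing. TwistCommand mirrors geometry_msgs/Twist's
# linear.x and angular.z fields.
from dataclasses import dataclass

LINEAR_MAX = 1.0   # m/s, upper slider limit
ANGULAR_MAX = 1.0  # rad/s, upper slider limit

@dataclass
class TwistCommand:
    linear_x: float = 0.0   # forward (+) / backward (-)
    angular_z: float = 0.0  # counter-clockwise (+) / clockwise (-)

def clamp_command(cmd: TwistCommand) -> TwistCommand:
    """Limit magnitudes so the robot never exceeds the slider maxima."""
    clamp = lambda v, lim: max(-lim, min(lim, v))
    return TwistCommand(
        linear_x=clamp(cmd.linear_x, LINEAR_MAX),
        angular_z=clamp(cmd.angular_z, ANGULAR_MAX),
    )

# An over-range forward command is capped; in-range values pass through.
capped = clamp_command(TwistCommand(linear_x=2.5, angular_z=-0.3))
```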
SLAM workflow:
- Mapping Phase: drive the robot around to build a map
- Localization Phase: Use existing map for navigation
- Save/Load: Persist maps for later use
```bash
# Terminal 1: Launch SLAM with simulation and teleop
cd ~/ros/dev_ws
source install/setup.bash
ros2 launch mob_bot slam.launch.py

# Terminal 2 (optional): Visualize in RViz
rviz2 -d src/mob_bot/config/lidar.rviz
```

Teleoperation controls (in the teleop terminal):
```
Moving around:
   u    i    o
   j    k    l
   m    ,    .

For Holonomic mode (strafing), hold down the shift key:
---------------------------
   U    I    O
   J    K    L
   M    <    >

t : up (+z)
b : down (-z)

anything else : stop

q/z : increase/decrease max speeds by 10%
w/x : increase/decrease only linear speed by 10%
e/c : increase/decrease only angular speed by 10%

CTRL-C to quit
```
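The speed keys apply multiplicative 10% steps to the current limits. A sketch of that rule, mirroring teleop_twist_keyboard's bindings (illustrative, not the tool's actual source):

```python
# Illustrative sketch of teleop_twist_keyboard's speed-step behaviour:
# q/z scale both limits by +/-10%, w/x only linear, e/c only angular.
UP, DOWN = 1.1, 0.9  # multiplicative 10% step per keypress

def adjust(linear, angular, key):
    """Return updated (linear, angular) max speeds for one keypress."""
    if key == "q": return linear * UP, angular * UP
    if key == "z": return linear * DOWN, angular * DOWN
    if key == "w": return linear * UP, angular
    if key == "x": return linear * DOWN, angular
    if key == "e": return linear, angular * UP
    if key == "c": return linear, angular * DOWN
    return linear, angular  # other keys leave the limits unchanged

lin, ang = adjust(0.5, 1.0, "q")  # both limits scaled up by 10%
```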
Mapping Tips:
- Drive systematically covering all areas
- Go slow for better accuracy
- Cover overlapping areas for loop closure
- Map in both directions for better coverage
```bash
# In a new terminal (while SLAM is running)
cd ~/ros/dev_ws
source install/setup.bash
ros2 run nav2_map_server map_saver_cli -f my_map
```

This creates:
- my_map.pgm - occupancy grid image
- my_map.yaml - map metadata and configuration
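The .yaml file relates the .pgm image to world coordinates through its resolution (metres per pixel) and origin fields. A sketch of that mapping, assuming the usual map_saver output (function name is illustrative):

```python
# Illustrative sketch: convert a world coordinate (metres, map frame)
# to a pixel index in the saved .pgm, using the resolution and origin
# that map_saver_cli writes into my_map.yaml.
def world_to_pixel(wx, wy, resolution, origin, image_height):
    """Map a world (x, y) to (col, row); row 0 is the image's top."""
    ox, oy = origin[0], origin[1]  # origin = [x, y, yaw] in the yaml
    col = int((wx - ox) / resolution)
    row = image_height - 1 - int((wy - oy) / resolution)
    return col, row

# With a 0.05 m/px map whose origin is (-10, -10) and a 400-px-tall
# image, the world point (0, 0) lands at column 200.
col, row = world_to_pixel(0.0, 0.0, 0.05, [-10.0, -10.0, 0.0], 400)
```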
To navigate with a saved map, modify the launch file or create a localization launch:

```python
# Change mode from 'mapping' to 'localization' in slam.launch.py
# Add a map_file_name parameter pointing to your saved map
```

Before launching navigation you need:
- A completed SLAM map (saved as my_map.yaml and my_map.pgm)
- The map files placed in the src/mob_bot/maps/ directory
```bash
# Terminal 1: Launch navigation stack
cd ~/ros/dev_ws
source install/setup.bash
ros2 launch mob_bot navigation.launch.py

# Terminal 2: Launch RViz for navigation
rviz2 -d src/mob_bot/config/view_bot.rviz
```

Setting a navigation goal:
1. Select Tool: click the "2D Goal Pose" button in the RViz toolbar
2. Set Goal: click and drag in the map to set:
   - Position: where you want the robot to go
   - Orientation: the direction the robot should face when it arrives
3. Watch Navigation: the robot will:
   - plan a path (green line)
   - avoid obstacles
   - navigate autonomously to the goal
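The arrow you drag in RViz reaches the stack as a quaternion inside the goal pose message. For a planar robot only the yaw angle matters; a sketch of the conversion (illustrative, not package code):

```python
# Illustrative sketch: encode a planar heading (yaw, radians) as the
# quaternion used in a goal pose. For a rotation about z only:
# q = (x=0, y=0, z=sin(yaw/2), w=cos(yaw/2)).
import math

def yaw_to_quaternion(yaw):
    """Return (x, y, z, w) for a rotation of `yaw` about the z-axis."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

def quaternion_to_yaw(z, w):
    """Inverse for planar quaternions (x = y = 0)."""
    return 2.0 * math.atan2(z, w)

# Facing 90 degrees (the robot's left in the map frame):
qx, qy, qz, qw = yaw_to_quaternion(math.pi / 2)
```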
```bash
# Check navigation status
ros2 topic echo /navigate_to_pose/_action/status

# View current goal
ros2 topic echo /goal_pose

# Monitor velocity commands
ros2 topic echo /cmd_vel
```

Nav2 components:
- Global Costmap: static map layer for known obstacles
- Local Costmap: dynamic obstacle avoidance
- Controller: path following and velocity control
- Planner: global path planning
- Behavior Tree: navigation logic flow
RViz configurations:
- lidar.rviz: LiDAR scanning and mapping visualization
- camera_uncompressed.rviz: RGB camera feed
- depth_cam.rviz: depth/point-cloud visualization
- drive_bot.rviz: teleoperation interface
- view_bot.rviz: general robot monitoring
Troubleshooting:

```bash
# Ensure you're in the correct directory and sourced
cd ~/ros/dev_ws
source install/setup.bash
```

If the simulation won't start:
- Check that the world file exists: ls src/mob_bot/worlds/
- Ensure Gazebo is properly installed

If SLAM isn't mapping:
- Check the LiDAR topic: ros2 topic list | grep scan
- Verify the robot is moving: check the /odom topic
- Ensure a proper TF tree: ros2 run tf2_tools view_frames

If navigation fails:
- Verify the map exists in src/mob_bot/maps/
- Check the Nav2 parameters in config/nav2_params.yaml
- Ensure proper sensor data is publishing

If RViz shows nothing:
- Check that topic names match between publishers and RViz
- Verify QoS settings (reliable vs best effort)
- Ensure proper TF transforms are published
Useful debug commands:

```bash
# Check running nodes
ros2 node list

# Check topics
ros2 topic list

# Check TF tree
ros2 run tf2_tools view_frames

# Monitor specific topic
ros2 topic echo /scan

# Check service availability
ros2 service list

# View parameter values
ros2 param list /slam_toolbox
```

Create new worlds in the worlds/ directory and modify the launch files to use them.
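A minimal world skeleton to start from (illustrative; the ground_plane and sun includes assume Gazebo Classic's standard model database):

```xml
<?xml version="1.0"?>
<sdf version="1.6">
  <world name="my_world">
    <!-- Standard ground plane and light from Gazebo's model database -->
    <include><uri>model://ground_plane</uri></include>
    <include><uri>model://sun</uri></include>

    <!-- A single static cylindrical obstacle at (2, 0) -->
    <model name="pillar">
      <static>true</static>
      <pose>2 0 0.5 0 0 0</pose>
      <link name="link">
        <collision name="collision">
          <geometry><cylinder><radius>0.2</radius><length>1.0</length></cylinder></geometry>
        </collision>
        <visual name="visual">
          <geometry><cylinder><radius>0.2</radius><length>1.0</length></cylinder></geometry>
        </visual>
      </link>
    </model>
  </world>
</sdf>
```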
Combine multiple sensors by enabling them in robot.urdf.xacro and updating configurations.
Modify nav2_params.yaml to adjust:
- Robot speed limits
- Obstacle avoidance sensitivity
- Path planning algorithms
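As an example, speed limits usually live under the controller server's plugin parameters. A hedged fragment, assuming the stock DWB controller (verify the plugin and parameter names against this package's actual config/nav2_params.yaml):

```yaml
# Illustrative fragment only - parameter names follow stock DWB
# conventions and may differ in this package's nav2_params.yaml.
controller_server:
  ros__parameters:
    FollowPath:
      plugin: "dwb_core::DWBLocalPlanner"
      max_vel_x: 0.26       # forward speed limit (m/s)
      max_vel_theta: 1.0    # rotation speed limit (rad/s)
      acc_lim_x: 2.5        # linear acceleration limit (m/s^2)
```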
- Fork the repository
- Create a feature branch
- Make your changes
- Test thoroughly
- Submit a pull request
This project is licensed under the terms specified in the LICENSE file.
For issues and questions:
- Check the troubleshooting section
- Review ROS2 and Nav2 documentation
- Open an issue with detailed information about your setup and the problem
Note: This documentation assumes ROS2 Humble installation. Commands may vary slightly for different ROS2 distributions.