A differential drive mobile robot that performs SLAM (Simultaneous Localization and Mapping) and autonomous navigation using the ROS navigation stack, while avoiding obstacles in a room.
NOTE: SCROLL DOWN to view the Photo Gallery of the Robot and its functionalities.
- Navigation Simulation on Gazebo: https://youtu.be/JQFO9v_3PpE
- Performing SLAM with PEPPER: https://youtu.be/AsfRXxU7h94
- Autonomous Navigation Demonstration: https://youtu.be/RHS6B5DbNY4
Full Playlist (including old videos): https://youtube.com/playlist?list=PL44ElmNkyTvBqNAVxaPmKnhxJGcgDSsyj
- Ubuntu 18.04 (Bionic Beaver) OS
- ROS Melodic Morenia Framework
- Python 2.7
- Gazebo Simulator with ROS integration
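Before building, it can help to confirm your environment matches the list above. A quick sanity check (a sketch; assumes a standard desktop install of each component) is:

```shell
# Each command prints a version string for one prerequisite
lsb_release -ds      # expect an Ubuntu 18.04 (Bionic Beaver) release string
rosversion -d        # expect "melodic"
python --version     # expect Python 2.7.x
gazebo --version     # expect the Gazebo 9 series bundled with ROS Melodic
```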
Setting up the Packages:
Create a catkin workspace, clone the contents of this repository's /catkin_ws/src directory into the workspace's src folder, and then run the catkin_make command.
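The workspace setup above can be sketched as follows (the repository URL is a placeholder; substitute the actual clone URL of this repository):

```shell
# Create the catkin workspace and its source folder
mkdir -p ~/catkin_ws/src

# Clone this repository (placeholder URL) and copy its
# catkin_ws/src contents into the workspace's source folder
git clone https://github.com/user/pepper.git /tmp/pepper
cp -r /tmp/pepper/catkin_ws/src/. ~/catkin_ws/src/

# Build the workspace and overlay it on the current shell
cd ~/catkin_ws
catkin_make
source devel/setup.bash
```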
To run the simulations, follow the instructions in the official TurtleBot3 documentation:
- Visit the Turtlebot3 Simulation e-manual by clicking on the link: https://emanual.robotis.com/docs/en/platform/turtlebot3/simulation/#gazebo-simulation
- Run the commands present in the following sections:
- 6.1) Gazebo Simulation
- 6.2) SLAM Simulation
- 6.3) Navigation Simulation
- Make sure to select the "Melodic" tab at the top of each page so that you run the compatible ROS commands.
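For reference, the Melodic-tab commands from those three sections look roughly like the following (a sketch from the e-manual; consult it for the authoritative, current versions):

```shell
# Choose the simulated robot model (burger, waffle, or waffle_pi)
export TURTLEBOT3_MODEL=burger

# 6.1 Gazebo Simulation: launch an example world
roslaunch turtlebot3_gazebo turtlebot3_world.launch

# 6.2 SLAM Simulation (in new terminals): start SLAM and keyboard teleop
roslaunch turtlebot3_slam turtlebot3_slam.launch slam_methods:=gmapping
roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch

# 6.3 Navigation Simulation: localize and navigate on a saved map
roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$HOME/map.yaml
```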
- Connect the Arduino Mega in the Robot to the laptop via a USB cable and run the
```
chmod 777 (port name)
```
command to grant data transfer permissions on its serial port.
- Upload the following Arduino sketch to the Arduino Mega on the Robot:
```
/catkin_ws/src/ros_arduino/ros_arduino_base/firmware/two_wheel_base/two_wheel_base.ino
```
- Plug the USB cable of the on-board Microsoft Kinect into the laptop and run the same `chmod 777 (port name)` command on the Kinect's port.
- Run the following commands, each in its own terminal tab/window, to start the SLAM process:
```
roslaunch ros_arduino_base base_bringup.launch
roslaunch pepper urdf.launch
roslaunch pepper pepper_rtab.launch
roslaunch teleop_twist_joy teleop.launch
```
- Run the
```
rviz
```
command and select the ROS topics you would like to visualize during SLAM.
- Drive the Bot around using the joystick and map the entire room. The mapped areas can be visualized by adding the
```
/map
```
topic in the rviz window.
- Finally, after mapping, run the following command to save the map offline for future use, such as navigation:
```
rosrun map_server map_saver -f $(find pepper)/maps/map
```
- Close all the running ROS nodes, then run the following launch files to enable point-to-point navigation:
```
roslaunch ros_arduino_base base_bringup.launch
roslaunch pepper pepper_loc_rtab.launch
roslaunch pepper pepper_navigation.launch
roslaunch teleop_twist_joy teleop.launch
```
- Using the joystick, briefly drive the Bot while facing a wall, obstacle, or corner until the RTAB-Map package auto-localizes the Bot's position on the map using the scan data obtained from the Kinect.
- Now select the "Set Goal" option in the toolbar at the top of the rviz window, and click the point on the map that you want the Bot to navigate to autonomously.
- If you have followed the instructions correctly, the Bot will now plan a path and autonomously navigate to the goal point you chose.
- After the Bot reaches the destination, you can select your next desired goal and repeat Steps 3, 4 & 5 as many times as you want.
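For scripted testing, a goal can also be sent from the command line instead of clicking in rviz. This is a sketch that assumes the navigation launch file brings up a standard move_base node (which subscribes to the /move_base_simple/goal topic); the coordinates below are placeholders:

```shell
# Publish a single goal pose in the "map" frame (x/y are placeholder coordinates)
rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped \
  '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.5, z: 0.0}, orientation: {z: 0.0, w: 1.0}}}'
```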