This project implements an autonomous driving system for the Quanser QCar, along with a simulation environment for accurate testing.
The modules implemented in this project are:
- Hardware interface with camera, depth sensor, IMU, and motor control.
- Image processing pipeline to calibrate and synchronize data.
- Perception system including lane detection and object detection.
- Planning system to determine the vehicle's movements.
- Control system to actuate the throttle and steering.
- Simulation using CARLA and Gazebo.
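To illustrate the kind of computation the planning and control modules perform, below is a minimal, hypothetical sketch of pure-pursuit steering, a common geometric path-tracking method for car-like robots. The function name, the wheelbase value, and the coordinates are illustrative assumptions, not taken from this codebase.

```python
import math

def pure_pursuit_steering(target_x, target_y, wheelbase, lookahead):
    # Coordinates are in the vehicle frame: x forward, y to the left.
    # alpha is the angle between the heading and the lookahead point.
    alpha = math.atan2(target_y, target_x)
    # Classic pure-pursuit curvature formula turned into a steering angle.
    return math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)

# A target directly ahead yields zero steering; a target to the
# left (positive y) yields a positive (left) steering angle.
print(pure_pursuit_steering(2.0, 0.0, 0.256, 2.0))
print(pure_pursuit_steering(2.0, 1.0, 0.256, 2.0) > 0.0)
```

The actual planning and control nodes in this repository may use a different law; this only sketches the steering geometry involved.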
This project is built on the Robot Operating System (ROS). The root of this repository acts as the catkin workspace, and the ROS packages are found in src/. Please make sure you have ROS Noetic or Melodic installed before building this repository, by following the official ROS installation guide.
Build the catkin workspace:
source /opt/ros/$ROS_DISTRO/setup.bash # melodic or noetic
catkin_make -DPYTHON_EXECUTABLE=/usr/bin/python3 # ensure it is built with python3
source devel/setup.bash # update the environment
Gazebo is used as the simulation environment for the system; it is installed as part of the 'Desktop-Full' ROS installation. URDF files are provided to accurately model the vehicle. More instructions for running the Gazebo simulation can be found in the qcar_gazebo package.
To run the simulation with all systems running:
roslaunch qcar_gazebo qcar_world.launch
To run the simulation with only perception systems running:
roslaunch qcar_gazebo qcar_perception.launch
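A common first stage of the lane detection the perception launch starts is locating lane-line base positions via a column histogram of a bird's-eye binary image. The following is a hedged, self-contained sketch of that idea; the function name and image layout are illustrative assumptions, not this repository's API.

```python
import numpy as np

def find_lane_bases(binary_warped):
    # Column-wise histogram over the lower half of a bird's-eye
    # binary image: lane pixels pile up at the lane-line columns.
    histogram = binary_warped[binary_warped.shape[0] // 2:, :].sum(axis=0)
    midpoint = histogram.shape[0] // 2
    left_base = int(np.argmax(histogram[:midpoint]))
    right_base = int(np.argmax(histogram[midpoint:])) + midpoint
    return left_base, right_base

# Synthetic image with lane pixels at columns 20 and 80.
img = np.zeros((40, 100), dtype=np.uint8)
img[:, 20] = 1
img[:, 80] = 1
print(find_lane_bases(img))  # (20, 80)
```

In a full pipeline these base columns would seed a sliding-window or polynomial fit over the lane pixels; this sketch only covers the histogram step.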
CARLA is an open-source simulation platform built on Unreal Engine. To use CARLA, first install it here, then install the ROS-CARLA-Bridge. Once that is complete, the perception systems can be run in the CARLA world using:
roslaunch qcar_carla qcar_perception.launch
This project is licensed under the MIT License - see the LICENSE file for details.