Pylot is an autonomous vehicle platform for developing and testing autonomous vehicle components (e.g., perception, prediction, planning) on the CARLA simulator and real-world cars.
The easiest way to get Pylot running is to use our Docker image. Please ensure
you have nvidia-docker on your machine before you start installing Pylot.
In case you do not have nvidia-docker, please run:
./scripts/install-nvidia-docker.sh
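To check that nvidia-docker can access your GPU, you can run nvidia-smi inside a CUDA container. This is only a suggested sanity check, not part of the official setup, and the nvidia/cuda image tag you need may differ depending on your driver:
nvidia-docker run --rm nvidia/cuda nvidia-smi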
We provide a Docker image with both Pylot and CARLA already set up.
docker pull erdosproject/pylot
nvidia-docker run -itd --name pylot -p 20022:22 erdosproject/pylot /bin/bash
Next, start the simulator in the container:
nvidia-docker exec -i -t pylot /home/erdos/workspace/pylot/scripts/run_simulator.sh
Finally, start Pylot in the container:
nvidia-docker exec -i -t pylot /bin/bash
cd workspace/pylot/
python3 pylot.py --flagfile=configs/detection.conf
In case you desire to visualize the outputs of different components (e.g., bounding boxes), you have to forward X from the container. The first step is to add your public ssh key to ~/.ssh/authorized_keys in the container.
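If you do not yet have an ssh key pair on the host, you can generate one first (the copy step below assumes the default public key path ~/.ssh/id_rsa.pub):
ssh-keygen -t rsa
Then copy the key into the container, give the erdos user ownership of the file, and start the ssh server: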
nvidia-docker cp ~/.ssh/id_rsa.pub pylot:/home/erdos/.ssh/authorized_keys
nvidia-docker exec -i -t pylot sudo chown erdos /home/erdos/.ssh/authorized_keys
nvidia-docker exec -i -t pylot sudo service ssh start
Finally, ssh into the container with X forwarding:
ssh -p 20022 -X erdos@localhost
cd /home/erdos/workspace/pylot/
python3 pylot.py --flagfile=configs/detection.conf --visualize_detected_obstacles
If everything worked correctly, you should see a visualization like the one below.
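If no visualization window appears, first check that X forwarding is active in your ssh session; the DISPLAY environment variable should be set in the container shell (e.g., to a value like localhost:10.0):
echo $DISPLAY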
Alternatively, you can install Pylot on your base system by executing the following steps:
./install.sh
pip install -e ./
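The commands that follow assume the PYLOT_HOME environment variable points to the root of your Pylot checkout. If the install script has not already set it in your shell, export it yourself (the path below is a placeholder):
export PYLOT_HOME=/path/to/pylot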
Next, start the simulator:
export CARLA_HOME=$PYLOT_HOME/dependencies/CARLA_0.9.8/
./scripts/run_simulator.sh
In a different terminal, set up the paths:
export CARLA_HOME=$PYLOT_HOME/dependencies/CARLA_0.9.8/
cd $PYLOT_HOME/scripts/
source ./set_pythonpath.sh
Finally, run Pylot:
cd $PYLOT_HOME/
python3 pylot.py --flagfile=configs/detection.conf
Pylot comprises several components: obstacle detection, traffic light detection, lane detection, obstacle tracking, localization, segmentation, fusion, prediction, planning, and control. Each component is implemented using one or more ERDOS operators and can be executed in isolation or with the entire Pylot application. Please read the Documentation for a more in-depth description.
Run the following command to see a demo of all the components, together with the Pylot driving policy:
python3 pylot.py --flagfile=configs/demo.conf
The demo will execute obstacle detection, traffic light detection, segmentation, prediction, planning, and the driving policy.
You can also run components in isolation:
Pylot supports three object detection models: frcnn_resnet101,
ssd-mobilenet-fpn-640 and ssdlite-mobilenet-v2. The following command runs
a detector in isolation:
python3 pylot.py --flagfile=configs/detection.conf
In case you want to evaluate the detector (i.e., compute mAP), you can run:
python3 pylot.py --flagfile=configs/detection.conf --evaluate_obstacle_detection
In case you are not satisfied with the accuracy of our obstacle detector, you can run a perfect version of it:
python3 pylot.py --flagfile=configs/perfect_detection.conf
If the detector does not run at your desired frequency, or if you want to track obstacles across frames, you can use a mix of detector plus tracker by running:
python3 pylot.py --flagfile=configs/tracking.conf
Pylot uses a separate component for traffic light detection and classification. The following command runs the component in isolation:
python3 pylot.py --flagfile=configs/traffic_light.conf
In case you require higher accuracy, you can run perfect traffic light detection
by passing the --perfect_traffic_light_detection flag.
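For example, combining the traffic light configuration and the flag shown above:
python3 pylot.py --flagfile=configs/traffic_light.conf --perfect_traffic_light_detection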
To run the lane detection component in isolation, execute:
python3 pylot.py --flagfile=configs/lane_detection.conf
To run the obstacle tracking component in isolation, execute:
python3 pylot.py --flagfile=configs/tracking.conf
To run Pylot's segmentation component in isolation, execute the following command:
python3 pylot.py --flagfile=configs/segmentation.conf
Similarly, pass --perfect_segmentation if you desire ideal pixel semantic
segmentation.
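For example, combining the segmentation configuration and the flag:
python3 pylot.py --flagfile=configs/segmentation.conf --perfect_segmentation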
Pylot offers a simple linear prediction component:
python3 pylot.py --flagfile=configs/prediction.conf
The planning component provides several planning options, which can be specified
using the --planning_type flag:
waypoint: a simple planner that follows predefined waypoints. These waypoints can either be pre-specified or computed using the A* planner that is part of the CARLA simulator map. The planner ensures that the ego vehicle respects traffic lights and stops whenever there are obstacles in its path, but it does not implement obstacle avoidance.
frenet_optimal_trajectory: a Frenet Optimal Trajectory planner.
rrt_star: a Rapidly-exploring Random Tree planner.
hybrid_astar: a Hybrid A* planner.
# To run the Frenet Optimal Trajectory planner.
python3 pylot.py --flagfile=configs/frenet_optimal_trajectory_planner.conf
# To run the RRT* planner.
python3 pylot.py --flagfile=configs/rrt_star_planner.conf
# To run the Hybrid A* planner.
python3 pylot.py --flagfile=configs/hybrid_astar_planner.conf
Pylot supports three controllers, which can be specified using the --control flag:
pid: follows the waypoints computed by the planning component using a PID controller.
mpc: uses model predictive control for speed and waypoint following.
carla_auto_pilot: uses the CARLA auto pilot to drive on predefined routes. This controller drives independently of the output of the other components.
You can run all the components, together with one of the two policies, by executing:
# Runs all components using the algorithms we implemented and the models we trained:
python3 pylot.py --flagfile=configs/e2e.conf
# Runs the MPC
python3 pylot.py --flagfile=configs/mpc.conf
# Runs the CARLA auto pilot.
python3 pylot.py --control=carla_auto_pilot
In case you want to debug the application, you can activate additional logging by passing --log_file_name=pylot.log --v=1 to your command.
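For example (the detection configuration is used here purely as an illustration):
python3 pylot.py --flagfile=configs/detection.conf --log_file_name=pylot.log --v=1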
Pylot also provides a script for collecting CARLA data such as: RGB images, segmented images, obstacle 2D bounding boxes, depth frames, point clouds, traffic lights, obstacle trajectories, and data in Chauffeur format.
Run python3 data_gatherer.py --help to see what data you can collect.
Alternatively, you can inspect a link for an example of a data collection setup.
In case you want to build your own images from the latest code, you can execute:
cd docker
./build_images.sh
The script creates two Docker images: one that contains the CARLA simulator and another one that contains ERDOS and Pylot.

