Repository containing the code for the Newcastle Racing AI team: the ROS nodes that control the car and the Docker environment that runs the simulation.
```mermaid
---
title: Starting from scratch
---
flowchart TD
    subgraph fsds[FSDS]
        direction LR
        install_fsds[Install the simulator]
        unzip_fsds[Unzip the simulator]
        run_fsds[Run the simulator]
    end
    subgraph docker[Docker environment]
        direction LR
        install_docker[Install Docker]
        clone[Clone the repository]
        run_docker[Run docker compose]
    end
    fsds -- Only after the simulator is running --> docker
    install_fsds --> unzip_fsds
    unzip_fsds --> run_fsds
    install_docker --> clone
    clone --> run_docker
    click install_docker "https://docs.docker.com/engine/install/"
    click clone "https://github.com/NewcastleRacingAI/eufs_sim"
    click run_docker "https://github.com/NewcastleRacingAI/eufs_sim?tab=readme-ov-file#Running-the-docker-environment"
    click install_fsds "https://github.com/NewcastleRacingAI/eufs_sim?tab=readme-ov-file#Installing-the-simulator"
    click run_fsds "https://github.com/NewcastleRacingAI/eufs_sim?tab=readme-ov-file#Running-the-simulator"
```
> [!IMPORTANT]
> If you get an error related to permissions when using any Docker commands,
> you will need to prefix all of them with `sudo`. Read the Docker
> documentation for more information and instructions on how to avoid it.

In short, you will need to run the following commands:

```shell
# Create the docker group
sudo groupadd docker

# Add your user to the docker group
sudo usermod -aG docker $USER
```

Log out and log back in afterwards, so that your group membership is re-evaluated.
We are using our fork of the Formula Student Driverless Simulator. Go to the releases page and download the latest release for Linux (tested) or Windows (not tested).
Unzip it and run either `FSDS.exe` on Windows or `FSDS.sh` on Linux.
You can customize the simulator by editing the `settings.json` file in the root folder.
To launch the environment, run the following command:

```shell
# Host machine
docker compose up
```

The first time it will take a while to configure everything. Subsequent launches will be much quicker.
> [!NOTE]
> If at any moment you want to clean the slate and start from scratch, run
> `docker compose down --volumes`.

Some parameters can be configured in the `docker-compose.yml` file, such as the
simulation timestep.
While the containers are running, you can attach a shell to the ROS container
`newcastle-racing-ai`. You can either do it from the terminal, or use VSCode
with the Dev Containers plugin installed, or a similar IDE.

```shell
# Host machine
docker exec -it newcastle-racing-ai /bin/bash
```

```mermaid
flowchart TD
    c[Controller]
    p[Perception]
    pl[Planner]
    subgraph FSDS
        ss[\Simulated sensors\]
        gt[\Ground Truth\]:::gt
        v[[Car]]
    end
    ss -- /nrfai/camera --> p
    ss -- /nrfai/depth --> p
    ss -- /nrfai/imu --> p
    ss -- /nrfai/lidar --> p
    gt -. /nrfai/odom .-> p
    gt -. /nrfai/track .-> p
    p -. /nrfai/cones .-> pl
    pl -- /nrfai/path --> c
    c -- /nrfai/control --> v
    c -. /nrfai/reset .-> v
    v -- UPDATE --> ss
    classDef gt stroke-dasharray: 5px, 5px;
```
> [!NOTE]
> The simulated sensors will eventually be replaced by real sensors on the car.

> [!NOTE]
> The Ground Truth is used for testing, but it will not be available on the real car.
The ROS nodes, the rectangles in the diagram above, mimic the team division of the Newcastle Racing AI team:

- Perception: receives the camera feed and other sensor data to determine the cone information
- Planner: receives the cone information to calculate the path to follow
- Controller: receives the path to follow and sends commands to the car
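The data flow between the three nodes can be sketched, outside ROS, as three plain functions. This is a simplified illustration only: the message shapes, function names, and the midpoint-based planning here are placeholders, not the actual node implementations (which exchange ROS 2 messages such as `ConeArrayWithCovariance`, `PoseArray` and `ControlCommand`).

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical, simplified message shapes.
Cone = Tuple[float, float, str]  # (x, y, colour) in car-relative coordinates


@dataclass
class Command:
    throttle: float
    steering: float


def perceive(detections: List[Cone]) -> List[Cone]:
    """Perception: turn raw detections into cone information (keep track cones)."""
    return [c for c in detections if c[2] in ("blue", "yellow")]


def plan(cones: List[Cone]) -> List[Tuple[float, float]]:
    """Planner: waypoints as midpoints between paired blue/yellow cones."""
    blues = sorted(c for c in cones if c[2] == "blue")
    yellows = sorted(c for c in cones if c[2] == "yellow")
    return [((bx + yx) / 2, (by + yy) / 2)
            for (bx, by, _), (yx, yy, _) in zip(blues, yellows)]


def control(path: List[Tuple[float, float]]) -> Command:
    """Controller: steer towards the first waypoint, clamped to [-1, 1]."""
    if not path:
        return Command(throttle=0.0, steering=0.0)
    x, y = path[0]
    steering = max(-1.0, min(1.0, math.atan2(y, x)))
    return Command(throttle=0.5, steering=steering)


# One tick of the pipeline: sensors -> Perception -> Planner -> Controller.
cones = perceive([(5.0, -1.5, "yellow"), (5.0, 1.5, "blue"), (4.0, 0.0, "orange")])
path = plan(cones)
cmd = control(path)
```

In the real system, each arrow in the diagram above is a ROS topic connecting these stages rather than a direct function call.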
| Topic Name | Type | From | To | Description |
|---|---|---|---|---|
| `/nrfai/camera` | Image | Sensors | Perception | Camera feed from the car (optional) |
| `/nrfai/depth` | Image | Sensors | Perception | Depth image from the camera feed |
| `/nrfai/imu` | Imu | Sensors | Perception | IMU data (orientation, angular velocity, linear acceleration) |
| `/nrfai/lidar` | PointCloud2 | Sensors | Perception | Lidar points |
| `/nrfai/odom` | Odometry | Ground Truth | Perception | Odometry information |
| `/nrfai/track` | Track | Ground Truth | Perception | Track information |
| `/nrfai/cones` | ConeArrayWithCovariance | Perception | Planner | Cones as detected by the Perception node (TBD) |
| `/nrfai/path` | PoseArray | Planner | Controller | List of waypoints calculated by the Planner node (TBD) |
| `/nrfai/control` | ControlCommand | Controller | Car | Command to move the car |
| `/nrfai/reset` | Bool | Controller | Car | Reset the simulation |
> [!NOTE]
> The depth topic returns an image with the depth information of each pixel.
> More precisely, the distance is capped at 40 m and the values are normalized to the range [0, 255].
> The returned image is in the mono8 format, so each pixel encodes the normalized depth rather than a raw distance in meters.
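Given that encoding, recovering an approximate distance from a pixel value is a linear rescale. A minimal sketch, assuming the 40 m cap and [0, 255] normalization described above (the function name is ours, not part of the repository):

```python
MAX_DEPTH_M = 40.0  # cap described in the note above


def pixel_to_depth(pixel: int) -> float:
    """Convert a mono8 pixel value in [0, 255] back to an approximate depth in meters."""
    if not 0 <= pixel <= 255:
        raise ValueError("mono8 pixel values must be in [0, 255]")
    return pixel / 255.0 * MAX_DEPTH_M
```

Note that a fully saturated pixel (255) means "40 m or further", since the simulator caps the distance before normalizing.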
> [!NOTE]
> The position of the cones in the `/nrfai/track` topic is relative to the car's initial position.
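Because `/nrfai/track` is expressed in the car's initial frame, a node that needs car-relative cone positions has to re-express them using the current pose from `/nrfai/odom`. A sketch of that 2D transform, using a simplified `(x, y, yaw)` pose instead of the actual Odometry message:

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]


def to_car_frame(cones: List[Point], car_xy: Point, car_yaw: float) -> List[Point]:
    """Re-express cone positions (given in the car's initial frame)
    in the car's current frame, using the pose from odometry."""
    cx, cy = car_xy
    cos_y, sin_y = math.cos(car_yaw), math.sin(car_yaw)
    out = []
    for x, y in cones:
        dx, dy = x - cx, y - cy  # translate to the car's current position
        # Rotate into the car's heading (inverse rotation by yaw).
        out.append((dx * cos_y + dy * sin_y,
                    -dx * sin_y + dy * cos_y))
    return out
```

For example, a cone 2 m ahead of the initial position is only 1 m ahead once the car has driven 1 m forward.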
By default, the ROS nodes will be launched automatically when the container
starts. If you want to make changes, stop the container, apply the changes to the nodes in the newcastle_racing_ai folder and then restart the container.
The manual process is shown in entrypoint.sh.
When the simulation is running, the simulated sensors will publish their data
on the `/nrfai/camera`, `/nrfai/imu` and `/nrfai/lidar` topics.
You can check the information being sent by using the `ros2 topic echo`
command, for example:

```shell
# Check the camera feed
ros2 topic echo /nrfai/camera

# Check the IMU data
ros2 topic echo /nrfai/imu

# Check the lidar points
ros2 topic echo /nrfai/lidar
```

Furthermore, for debugging purposes, the Perception node will also store the images it receives in the `newcastle_racing_ai/imgs` folder.
```shell
# Build all packages in the workspace
colcon build

# Build a specific package
colcon build --packages-select <package_name>

# Source the current workspace
source install/setup.bash

# Check the available nodes
ros2 node list

# Check the available topics
ros2 topic list

# Visualise the messages in a topic
ros2 topic echo <topic_name>

# Get information about a topic
ros2 topic info <topic_name>

# Send a message to a topic
ros2 topic pub <topic_name> <message_type> <message>
# e.g.
# ros2 topic pub /nrfai/control \
#   newcastle_racing_ai_msgs/ControlCommand \
#   '{ throttle: 1, brake: 0, steering: -1 }'
#
# ros2 topic pub /nrfai/reset \
#   std_msgs/Bool '{ data: true }'
```
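The example values above suggest bounded ranges for the control fields. A hedged sketch of clamping a command before publishing it; the ranges here are assumptions inferred from the examples (throttle/brake in [0, 1], steering in [-1, 1]), so check the actual `ControlCommand` message definition before relying on them:

```python
from dataclasses import dataclass


@dataclass
class ControlCommand:
    # Assumed ranges, inferred from the examples above; not taken
    # from the real message definition.
    throttle: float  # assumed [0, 1]
    brake: float     # assumed [0, 1]
    steering: float  # assumed [-1, 1]

    def clamped(self) -> "ControlCommand":
        """Return a copy with every field clipped to its assumed range."""
        def clip(v: float, lo: float, hi: float) -> float:
            return max(lo, min(hi, v))

        return ControlCommand(
            throttle=clip(self.throttle, 0.0, 1.0),
            brake=clip(self.brake, 0.0, 1.0),
            steering=clip(self.steering, -1.0, 1.0),
        )
```

Clamping on the publishing side keeps out-of-range values from a buggy controller from ever reaching the car or the simulator.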