To run the project, one system runs ROS and drives the robot using control signals produced by the lane and object detection model, which runs on an external GPU.
Connect to the robot via SSH and run the following commands:
rosrun tankbot_scripts differential_tank.py
rosrun lightDetection tst_server.py
Run the following script inside the Deployment Scripts folder:
python wheel_control_generation.py
After the above steps, on the ROS system, run:
rosrun lightDetection turtlebot3_pointop_key
Run:
python fusedControl_v1.py
rosrun lightDetection autoUI.py
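The launch sequence above can be sketched as a single helper script. This is only a sketch: the `DRY_RUN` flag and the grouping are assumptions, and the commands themselves mirror the steps listed above (the first two run on the robot over SSH, the third on the GPU machine inside the Deployment Scripts folder, and the rest on the ROS system).

```shell
#!/usr/bin/env bash
# Sketch of the launch order described above.
# DRY_RUN=1 (the default here) only prints each command;
# set DRY_RUN=0 on the actual systems to launch the nodes.

CMDS=(
  "rosrun tankbot_scripts differential_tank.py"   # robot, over SSH
  "rosrun lightDetection tst_server.py"           # robot, over SSH
  "python wheel_control_generation.py"            # GPU machine, Deployment Scripts folder
  "rosrun lightDetection turtlebot3_pointop_key"  # ROS system
  "python fusedControl_v1.py"                     # ROS system
  "rosrun lightDetection autoUI.py"               # ROS system
)

for cmd in "${CMDS[@]}"; do
  if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "$cmd"         # dry run: show the command plan
  else
    $cmd &              # launch each node in the background
  fi
done
```

Note that the first three commands must be started on their respective machines (robot and GPU host); the script is meant as a per-machine checklist rather than a one-shot launcher.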
Note: before running the scripts above, copy the resources folder into the Deployment Scripts folder of the repository.
For a video demonstration of the project, visit the webpage here.
If you find our work useful, please consider citing us!
@article{Gaikwad2023,
title={Developing a computer vision based system for autonomous taxiing of aircraft},
author={Gaikwad, P. and Mukhopadhyay, A. and Muraleedharan, A. and Mitra, M. and Biswas, P.},
journal={Aviation},
volume={27},
number={4},
pages={248--258},
year={2023},
month={Dec},
doi={10.3846/aviation.2023.20588}
}
Parts of this project page were adapted from the Nerfies page.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.