Stingray is a ROS-based framework for autonomous underwater vehicles (AUV)
- ROS2 iron - base framework
- YOLOv5 - for object detection
- pytransitions - for the state machine
- serial - for communication with the STM32 and other devices
- Initialize and update the git submodules used in the project:
git submodule update --init --recursive
- Install the requirements from yolov5
- Install ROS packages:
sudo apt-get install ros-iron-serial ros-iron-usb-cam ros-iron-rosbridge-server ros-iron-image-view ros-iron-zbar-ros
- Install pytransitions dependencies:
sudo apt install graphviz-dev
pip3 install pygraphviz transitions
- Build:
source /opt/ros/iron/setup.bash
colcon build
Set up the workspace before you start working:
source install/setup.bash
See the example of a custom launch file below.
- Clone and build our simulator (currently only for the SAUVC competition).
- Run the simulator:
ros2 launch stingray_startup main.launch simulation:=true
Use arg:
qr_launch:=true
Show your QR code to the vehicle camera. If a stop QR code has been detected, the running launch file will be stopped.
Other args:
hardware_connection:=false - disable the connection between the Jetson and the STM32 via serial (uart_driver)
stream:=true - enable web video stream from all cameras
debug:=true - enable image_view nodes and publishing output videos after object detection
file_cam:=true - provide input videos from file
record_raw:=true - enable recording video from all cameras
record_output:=true - enable recording video after object detection
stingray_startup package contains launch files for running the whole system. Use main.launch as the base for your custom launch file.
Include like this:
<!-- MAIN -->
<include file="$(find stingray_startup)/launch/main.launch">
<arg name="ROS_OUTPUT" value="$(arg ROS_OUTPUT)" />
<arg name="DEBUG" value="$(arg DEBUG)" />
<arg name="STREAM" value="$(arg STREAM)" />
<arg name="SIMULATION" value="$(arg SIMULATION)" />
<arg name="HARDWARE_CONNECTION" value="$(arg HARDWARE_CONNECTION)" />
<arg name="QR_LAUNCH" value="$(arg QR_LAUNCH)" />
<arg name="QR_CAMERA" value="$(arg QR_CAMERA)" />
<arg name="QR_LAUNCH_PACKAGE_NAME" value="$(arg QR_LAUNCH_PACKAGE_NAME)" />
<arg name="QR_NAME_PATTERN" value="$(arg QR_NAME_PATTERN)" />
</include>
*see args inside launch file or below
Add a camera to your custom launch file like this:
<!-- FRONT CAMERA -->
<include file="$(find stingray_startup)/launch/camera.launch">
<arg name="REAL_CAM" value="true" unless="$(arg SIMULATION)" />
<arg name="SIMULATION_CAM" value="true" if="$(arg SIMULATION)" />
<arg name="CAMERA_NAME" value="$(arg FRONT_CAMERA)" />
<arg name="CAMERA_PATH" value="$(arg FRONT_CAMERA_PATH)" />
<arg name="CAMERA_TOPIC" value="$(arg FRONT_CAMERA_TOPIC)" />
<arg name="ROS_OUTPUT" value="$(arg ROS_OUTPUT)" />
<arg name="SHOW" value="$(arg SHOW)" />
<arg name="DEBUG" value="$(arg DEBUG)" />
<arg name="RECORD_RAW" value="$(arg RECORD_RAW)" />
<arg name="RECORD_OUTPUT" value="$(arg RECORD_OUTPUT)" />
<arg name="RECORD_DIR" value="$(arg RECORD_DIR)" />
</include>
Specify camera args: CAMERA_NAME, CAMERA_PATH, CAMERA_TOPIC. Also, you can specify the camera type: REAL_CAM, SIMULATION_CAM or FILE_CAM. Use the RECORD args for recording video from the camera.
Add object detection to your custom launch file like this:
<!-- OBJECT DETECTION -->
<include file="$(find stingray_startup)/launch/object_detection.launch">
<arg name="IMAGE_TOPIC_LIST" value="$(arg FRONT_CAMERA_TOPIC) $(arg BOTTOM_CAMERA_TOPIC)" />
<arg name="WEIGHTS_PACKAGE_NAME" value="$(arg WEIGHTS_PACKAGE_NAME)" />
<arg name="ROS_OUTPUT" value="$(arg ROS_OUTPUT)" />
<arg name="DEBUG" value="$(arg DEBUG)" />
</include>
Provide the list of topics to subscribe to: IMAGE_TOPIC_LIST. Provide the name of the package with the weights: WEIGHTS_PACKAGE_NAME.
- Edit config.yaml to add your labels
- Put the best YOLOv5 checkpoint as best.pt in the weights folder
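For example, a minimal config.yaml in the usual YOLOv5 data-file style might list the labels like this (the class names and count below are placeholders for illustration, not the project's actual labels):

```yaml
# Hypothetical label config; replace with your own classes.
nc: 3                              # number of classes
names: ["gate", "flare", "drum"]   # label for each class index
```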
Nodes:
- hardware_bridge - abstract bridge node between hardware and ROS
- uart_driver - node for communication with the STM32 via UART
Contains the gazebo_bridge node for communication with the Gazebo simulator.
Contains nodes for working with the lifter device.
The FSM package allows you to create missions for the robot.
The AUVController class is a high-level mission controller.
AUVMission allows you to create custom missions.
SubscriptionEvent listens to a topic and triggers an event when a message is received; it allows you to create custom events.
Use ObjectDetectionEvent to trigger an event when an object is detected.
Contains C++ nodes for controlling robot movement: basic, common and patterns.
Contains config files and util libs.
Config files:
control.json - everything for control algorithms and the control system
hardware.json - everything for different devices, hardware communication, etc.
ros.json - names for ROS topics, services, actions
simulation.json - everything for simulation mode
Libs:
utils.py - utility Python methods
load_config - loads a config file with the given name
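As a sketch of what load_config does (the config directory path and function signature here are assumptions; the real helper lives in utils.py and resolves the path inside the package):

```python
import json
from pathlib import Path

# Assumed config location for this sketch; the real helper resolves
# the package's own config directory.
CONFIG_DIR = Path("configs")

def load_config(name: str) -> dict:
    """Load a JSON config file by name, e.g. load_config('ros.json')."""
    with open(CONFIG_DIR / name, encoding="utf-8") as f:
        return json.load(f)
```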
Contains launch files and qr trigger node.
Launch files:
main.launch
camera.launch
object_detection.launch
Contains util libs and nodes.
json.hpp
- cpp lib for json files
The QR trigger node has parameters:
launch_pkg_name - package name with a launch/ directory which contains the launch files you want to trigger
name_pattern - prefix used to trigger specific launch files
Generate the QR code from the launch file name without the custom prefix and the .launch extension.
Example: you have a stingray_qr_mission.launch file. stingray_qr_ is the prefix which you pass to the qr_trigger node as the name_pattern parameter. You also don't need .launch to generate the QR code. Eventually you need to encode mission into the QR code.
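The payload to encode can be derived by stripping the prefix and the extension; a small sketch using the names from the example above (the derivation is ours, any QR generator can then encode the resulting string):

```python
# Derive the QR payload from a launch file name, given the
# name_pattern prefix passed to the qr_trigger node.
name_pattern = "stingray_qr_"
launch_file = "stingray_qr_mission.launch"

payload = launch_file.removeprefix(name_pattern).removesuffix(".launch")
print(payload)  # mission
```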
Launch main.launch with the param:
qr_launch:=true
and you'll be able to trigger launch files with QR codes. If a stop QR code has been detected, the running launch file will be stopped.
Contains a node for recording video from a camera.