Description
Version: Apollo 9.0 Hardware: X86 and Orin
Issue Details: I am experiencing an issue where launching lidar and camera perception from the same launch file causes the lidar frame rate to drop from the original 10 FPS to only 2 or 3 FPS. However, if I use two separate launch files to start the lidar and camera simultaneously, the frame rate remains normal.
Does this indicate that running camera perception and lidar perception in the same process leads to GPU resource contention, whereas separate processes do not encounter this issue?
Currently, I want to perform multi-sensor fusion, e.g. with perception_all.launch, which starts lidar perception and camera perception within the same process. I would like to split these two into separate processes. However, the channels they publish are internal (inner) channels, which are not visible in cyber_monitor. So to run multi_sensor_fusion, it seems I have to start everything from the same launch file, where the frame rate is poor. What should I do? Could msg_adapter help here? Are there any examples available?
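To illustrate what I mean by splitting the modules, here is a rough sketch of a lidar-only launch file that also starts msg_adapter to republish the inner channels for a fusion component running in another process. The dag paths and channel names below are my guesses, not verified against the Apollo 9.0 source tree:

```xml
<!-- lidar_only.launch : hypothetical launch file starting only lidar
     perception plus msg_adapter (paths below are assumptions) -->
<cyber>
  <module>
    <name>perception_lidar</name>
    <dag_conf>/apollo/modules/perception/lidar_detection/dag/lidar_detection.dag</dag_conf>
    <process_name>perception_lidar</process_name>  <!-- its own process -->
  </module>
  <!-- msg_adapter would need to forward internal channels (e.g. something
       like /perception/inner/PrefusedObjects) to regular channels that the
       fusion component in the other process can subscribe to. -->
  <module>
    <name>msg_adapter</name>
    <dag_conf>/apollo/modules/perception/msg_adapter/dag/msg_adapter.dag</dag_conf>
    <process_name>msg_adapter</process_name>
  </module>
</cyber>
```

The camera pipeline plus multi_sensor_fusion would then go in a second launch file. Is this the intended way to use msg_adapter?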