@@ -63,7 +63,7 @@ If no CUDA enabled device is available, you can build without the GPU implementation
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off
```
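If you are unsure whether a usable CUDA device is present, a quick check before choosing the build flavor could look like this (not part of the original instructions; assumes the NVIDIA driver and CUDA toolkit are installed where applicable):

``` bash
# Lists visible GPUs and the driver version, if an NVIDIA driver is installed
nvidia-smi

# Prints the CUDA compiler version, if the CUDA toolkit is installed
nvcc --version
```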
- ### Install and run example
+ ### Install and run the example
The getting started repository contains a ROS bagfile (a depth image sequence of an object being moved),
and mesh models of some objects. Additionally, it contains launch files, which allow
@@ -90,6 +90,9 @@ If you did not install CUDA, you can run instead:
``` bash
roslaunch dbot_example launch_example_cpu.launch
```
+ Note that the tracking performance is significantly better with the GPU version.
+
+
As soon as you launch the example, an interactive marker should show up in
rviz. This is for initializing the tracker; you can move it to align it
with the point cloud, but it should already be approximately aligned. Once you
@@ -117,9 +120,10 @@ inproceedings{wuthrich-iros-2013,
## Robot Tracking
+ ### Workspace setup and compilation
The robot tracking setup builds on top of the object tracking, i.e. follow
- first the workspace setup and of the object tracking above. Then checkout
- the following package to the workspace
+ first the workspace setup of the object tracking above. Then continue
+ with the instructions below:
``` bash
cd $HOME
@@ -133,7 +137,7 @@ Again, if no CUDA enabled device is available, you can deactivate the GPU implementation
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off
```
- ### Example Robot Tracking Project using MPI Apollo Robot
+ ### Install and run the example
Add the following example project to the workspace
@@ -144,17 +148,26 @@ cd ..
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=On
source devel/setup.bash
```
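As an optional sanity check (not part of the original instructions), the freshly sourced workspace should now be able to resolve the example package:

``` bash
# Should print the path of the example package inside the workspace
rospack find dbrt_example
```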
- Once compile you can run the robot tracker along with the
+ Now you can run the robot tracker along with the
recorded sensory data:
``` bash
roslaunch dbrt_example launch_example_gpu.launch
```
+ If you built without CUDA, you can start the CPU-based setup instead:
+ ``` bash
+ roslaunch dbrt_example launch_example_cpu.launch
+ ```
+ Note that the tracking performance is significantly better with the GPU version.
+
This will start the data playback, the visualization and the robot tracker.
+ You should see a point cloud in white, the robot model obtained from the joint
+ encoders alone in red, and the corrected robot model in blue. The blue model
+ should be visibly better aligned with the point cloud than the red one.
+
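+ If the models do not show up, a quick way to confirm that the playback and the tracker are running is to inspect the active topics (standard ROS tooling; this check is not part of the original instructions):
+ ``` bash
+ # List the active topics to verify that playback and tracker are publishing
+ rostopic list
+ ```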
- If CUDA is not being used, you can start the CPU based setup by launching
- ` launch_example_cpu.launch ` instead. Note that the CPU version will run slower.
### Additional documentation