Commit 1e5a72c

Update README.md
1 parent 5259718 commit 1e5a72c

1 file changed: +20 -7 lines

README.md

@@ -63,7 +63,7 @@ If no CUDA enabled device is available, you can build without the GPU implementa
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off
```

-### Install and run example
+### Install and run the example

The getting started repository contains a ROS bagfile (a depth image sequence of an object being moved),
and mesh models of some objects. Additionally it contains launch files, which allow
@@ -90,6 +90,9 @@ If you did not install CUDA, you can run instead:
```bash
roslaunch dbot_example launch_example_cpu.launch
```
+Note that the tracking performance is significantly better with the GPU version.
+
+
As soon as you launch the example, an interactive marker should show up in
rviz. This is for initialization of the tracker, you can move it to align it
with the point cloud, but it should already be approximately aligned. Once you
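A note on choosing between the two variants above: since the example can run with either the GPU or the CPU pipeline, a small wrapper can pick one automatically. This is only a sketch; it assumes the GPU variant is named `launch_example_gpu.launch` (only the CPU variant appears in this diff) and uses `nvidia-smi` to detect a usable CUDA device.

```bash
# Sketch: launch the GPU example if an NVIDIA device is usable, otherwise
# fall back to the CPU example. The GPU launch file name is assumed here.
if command -v nvidia-smi >/dev/null 2>&1 && nvidia-smi >/dev/null 2>&1; then
    roslaunch dbot_example launch_example_gpu.launch
else
    roslaunch dbot_example launch_example_cpu.launch
fi
```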
@@ -117,9 +120,10 @@ inproceedings{wuthrich-iros-2013,

## Robot Tracking

+### Workspace setup and compilation
The robot tracking setup builds on top of the object tracking, i.e. follow
-first the workspace setup and of the object tracking above. Then checkout
-the following package to the workspace
+first the workspace setup of the object tracking above. Then continue
+with the instructions below:

```bash
cd $HOME
@@ -133,7 +137,7 @@ Again, if no CUDA enabled device is available, you can deactivate the GPU implem
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=Off
```

-### Example Robot Tracking Project using MPI Apollo Robot
+### Install and run the example

Add the following example project to the workspace

@@ -144,17 +148,26 @@ cd ..
catkin_make -DCMAKE_BUILD_TYPE=Release -DDBOT_BUILD_GPU=On
source devel/setup.bash
```
-Once compile you can run the robot tracker along with the
+Now you can run the robot tracker along with the
recorded sensory data:

```bash
roslaunch dbrt_example launch_example_gpu.launch
```

+If CUDA is not being used, you can start the CPU based setup instead:
+```bash
+roslaunch dbrt_example launch_example_cpu.launch
+```
+Note that the tracking performance is significantly better with the GPU version.
+
This will start the data playback, the visualization and the robot tracker.
+You should see a point cloud in white, the robot model using only joint
+encoders in red, and the corrected robot model in blue. It should be visible
+that the blue robot model is significantly better aligned with the point cloud than
+the red one.
+

-If CUDA is not being used, you can start the CPU based setup by launching
-`launch_example_cpu.launch` instead. Note that the CPU version will run slower.

### Addition documentation

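A quick way to sanity-check that the robot tracking example in the last hunk is actually running is to use the standard ROS 1 command-line tools. This is only a sketch; it assumes the data playback and the trackers publish over ROS topics and tf, which the diff does not spell out.

```bash
# Sketch: verify that playback and the tracker are publishing.
rostopic list      # the bagfile topics and tracker outputs should appear
rostopic hz /tf    # tf should keep updating while the tracker runs
```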