Fixed normal accumulator vector normalization when there is no normal information (0,0,0). #1728
Conversation
Well, there is a note in the
This is an issue in a runtime-configurable pipeline in which the points may carry normal information or may keep the uninitialized value of (0,0,0). Since the pipeline may or may not use normal information for computing keypoints, I must compile it with point types that include normal information. Assume that the localization system mentioned above is in tracking mode, so there is no need to compute normals for the sensor data.
Long story short: if the normal fields are uninitialized, they should remain uninitialized.
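A minimal sketch of what is being proposed, using a hypothetical stand-alone accumulator (not PCL's actual class): divide by the norm only when it is non-zero, so uninitialized normals stay (0,0,0) instead of becoming NaN.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch of the guarded normalization step: divide only when the
// accumulated normal has a non-zero norm, otherwise leave it as (0,0,0).
struct NormalSum {
  double nx = 0.0, ny = 0.0, nz = 0.0;

  void normalize() {
    const double norm = std::sqrt(nx * nx + ny * ny + nz * nz);
    if (norm > 0.0) {  // guard against division by zero
      nx /= norm;
      ny /= norm;
      nz /= norm;
    }
    // else: uninitialized normals remain (0,0,0) instead of becoming NaN
  }
};
```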
Thanks for the explanation. I agree with what you propose. Do I understand correctly that upstream Eigen already has the zero check? If so, can we add conditional compilation so that with "new" Eigen we do not do the extra work?
Yes, please!
Force-pushed from e4079b7 to 06007dc
Force-pushed from 06007dc to 7197fed
I improved the previous code to perform the Eigen version check at compile time.
I'm pretty sure every modern compiler will be able to optimize
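Eigen exposes its version through the `EIGEN_WORLD_VERSION`, `EIGEN_MAJOR_VERSION`, and `EIGEN_MINOR_VERSION` macros, so a compile-time gate could look roughly like the sketch below. The macros are stubbed here so the sketch compiles without Eigen, and the 3.3.0 threshold is an assumption for illustration; the real cutoff should match the Eigen commit referenced in this thread.

```cpp
#include <cassert>

// Stand-in values for illustration only; in real code these macros come
// from <Eigen/Core> and must not be redefined.
#ifndef EIGEN_WORLD_VERSION
#define EIGEN_WORLD_VERSION 3
#define EIGEN_MAJOR_VERSION 2
#define EIGEN_MINOR_VERSION 9
#endif

// Mirrors Eigen's EIGEN_VERSION_AT_LEAST(x, y, z) helper macro.
#define EIGEN_VERSION_AT_LEAST(x, y, z)  \
  (EIGEN_WORLD_VERSION > (x) ||          \
   (EIGEN_WORLD_VERSION == (x) &&        \
    (EIGEN_MAJOR_VERSION > (y) ||        \
     (EIGEN_MAJOR_VERSION == (y) && EIGEN_MINOR_VERSION >= (z)))))

bool needs_manual_zero_check() {
#if EIGEN_VERSION_AT_LEAST(3, 3, 0)  // assumed threshold for the upstream fix
  return false;  // newer Eigen guards normalize() itself
#else
  return true;   // older Eigen: the caller must check the norm before dividing
#endif
}
```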
When accumulating points whose normal fields were never initialized (or used), the accumulator fills the normal fields with NaN values due to a division by zero when normalizing the normal vector.
This bug was fixed in recent versions of Eigen (they now check whether the divisor is > 0):
https://bitbucket.org/eigen/eigen/commits/12f866a74661131a38c71516007ebf6fc51abd3b
https://bitbucket.org/eigen/eigen/src/e8c837cc9c68df7675b6590e559d1636fb5d8205/Eigen/src/Core/Dot.h?at=default&fileviewer=file-view-default#Dot.h-139
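A self-contained illustration of the failure mode (plain C++ rather than Eigen, to keep it minimal): normalizing an all-zero vector without a guard divides 0 by 0 and yields NaN components.

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Unguarded normalization, mimicking the buggy behavior: dividing a zero
// vector by its (zero) norm produces NaN components, which downstream
// NaN-removal stages will then discard.
std::array<double, 3> unchecked_normalize(std::array<double, 3> v) {
  const double norm = std::sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
  return {v[0] / norm, v[1] / norm, v[2] / norm};  // no zero check
}
```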
However, even users of the latest Eigen version from the Ubuntu 16.04 PPA are still affected by this issue.
This can break a typical point cloud processing pipeline in the preprocessing stage, for example:
-> capture Kinect data
-> remove all points with nans
-> voxel grid downsampling
-> more preprocessing stages (such as normal estimation, that might be skipped depending on the runtime user configuration)
-> remove all points with nans
-> cloud registration
-> post processing
After the second NaN-removal stage there will be no points left, since all points had NaN values in their normals when the user configured the system to skip normal estimation.
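The failure in the pipeline above can be sketched with a hypothetical NaN-removal stage (modeled loosely on what `pcl::removeNaNNormalsFromPointCloud` does; the types and names here are illustrative, not PCL's API):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Illustrative point type with coordinates and a normal.
struct PointN {
  double x, y, z, nx, ny, nz;
};

// Hypothetical "remove all points with NaNs" stage: a point whose normal
// became NaN in the accumulator is dropped, so a cloud whose normals were
// never initialized ends up empty after this filter.
std::vector<PointN> remove_nan(const std::vector<PointN>& in) {
  std::vector<PointN> out;
  for (const auto& p : in) {
    if (!std::isnan(p.x) && !std::isnan(p.y) && !std::isnan(p.z) &&
        !std::isnan(p.nx) && !std::isnan(p.ny) && !std::isnan(p.nz)) {
      out.push_back(p);
    }
  }
  return out;
}
```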
Should this be fixed on the PCL side or should it be added to a warning list for the PCL users?