I am dealing with a point cloud, i.e. a vector of points, produced by a computation, which contains duplicate points (up to 10% of the size of the cloud).
My approach was to sort the points by their x, y, and z values and then apply std::unique. However, the resulting cloud still contains duplicates, even though the sorting itself appears to work.
Here's the crucial code:
bool comparePoint(pcl::PointXYZINormal p1, pcl::PointXYZINormal p2){
    if (p1.x != p2.x)
        return p1.x > p2.x;
    else if (p1.y != p2.y)
        return p1.y > p2.y;
    else
        return p1.z > p2.z;
}

bool equalPoint(pcl::PointXYZINormal p1, pcl::PointXYZINormal p2){
    if (p1.x == p2.x && p1.y == p2.y && p1.z == p2.z)
        return true;
    return false;
}

void KDsearch::cullDuplePoints(){
    std::sort(points.begin(), points.end(), comparePoint);
    std::unique(points.begin(), points.end(), equalPoint);
}
And here is a partial excerpt of the output point cloud (x, y, and z coordinates):
1.96828 -535.09515 2794.8391
1.96627 -636.95264 2914.0366
1.96627 -636.95264 2914.0366
1.9651 108.77433 2350.9841
1.9651 108.77433 2350.9841
1.9642299 -206.19427 5618.4629
1.9642299 -206.19427 5618.4629
1.96386 -1880.3784 1346.0654
So is std::unique not working properly, or is there a mistake in my equality condition?
The points themselves also contain normal coordinates, but they are not relevant for culling, so I did not use them in the code.
std::unique doesn't remove anything; it only reorders elements so that the unique ones come first, and returns an iterator "past the end" of that unique range in the modified collection.
(The contents of the collection past the returned iterator are unspecified.)
You need to remove the duplicates explicitly:
std::sort(points.begin(), points.end(), comparePoint);
auto unique_end = std::unique(points.begin(), points.end(), equalPoint);
points.erase(unique_end, points.end());
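Put back into your member function, that becomes (a sketch, assuming points is a std::vector<pcl::PointXYZINormal> member of KDsearch, as in your snippet):

void KDsearch::cullDuplePoints(){
    std::sort(points.begin(), points.end(), comparePoint);
    // std::unique only returns the new logical end; erase actually shrinks the vector
    points.erase(std::unique(points.begin(), points.end(), equalPoint),
                 points.end());
}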
You also need to be careful about floating point comparisons.
Your problem is that comparing floating point numbers for equality is always a difficult exercise. You will probably find that your points are (for example) actually:
1.965100000001 108.77433 2350.9841
1.965099999999 108.77433 2350.9841
... and those aren't equal.
If you want to treat points as "equal" when they are within 0.00001 of each other, you run into the problem that your "equality" relation is not transitive: (0.0000,0,0) is "close" to (0.000009999,0,0) and to (-0.000009999,0,0), but those latter two points are "far" from each other. This is a hard problem to solve in general. Good luck!
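To illustrate, here is a hypothetical tolerance-based comparison (the name almostEqual and the tolerance value are made up for this example). Because it is not transitive, it is not a valid equivalence predicate for std::unique: which duplicates get merged would depend on the order the points happen to land in after sorting.

#include <cmath>

// Hypothetical tolerance-based "equality"; NOT transitive, so it does not
// satisfy the equivalence-relation requirement of std::unique.
bool almostEqual(const pcl::PointXYZINormal &p1, const pcl::PointXYZINormal &p2){
    const float eps = 0.00001f;
    return std::fabs(p1.x - p2.x) < eps &&
           std::fabs(p1.y - p2.y) < eps &&
           std::fabs(p1.z - p2.z) < eps;
}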
If you know something about the values of the coordinates (for example, that they are in millimetres, and values are accurate to 100 nanometres), you could round to the nearest 100 nm, and store as long long. So:
#include <cmath>   // for llround

struct IntPoint {
    // 1 unit = 1/ScaleFactor of the input unit (100 nm for millimetre input)
    static constexpr double ScaleFactor = 10000.0;
    long long x, y, z;
    IntPoint(const pcl::PointXYZINormal &p)
        : x(llround(p.x * ScaleFactor))
        , y(llround(p.y * ScaleFactor))
        , z(llround(p.z * ScaleFactor))
    {}
};
Convert your point cloud to IntPoint (or compare the points through it), and then your sort + unique (+ erase) should work.
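For example, a minimal sketch of the deduplication step built on IntPoint (the comparator names and the member vector points are assumptions for illustration, not part of PCL):

#include <algorithm>
#include <tuple>

bool intLess(const pcl::PointXYZINormal &a, const pcl::PointXYZINormal &b){
    IntPoint ia(a), ib(b);
    // lexicographic comparison on the quantized coordinates
    return std::tie(ia.x, ia.y, ia.z) < std::tie(ib.x, ib.y, ib.z);
}

bool intEqual(const pcl::PointXYZINormal &a, const pcl::PointXYZINormal &b){
    IntPoint ia(a), ib(b);
    return ia.x == ib.x && ia.y == ib.y && ia.z == ib.z;
}

void KDsearch::cullDuplePoints(){
    std::sort(points.begin(), points.end(), intLess);
    points.erase(std::unique(points.begin(), points.end(), intEqual),
                 points.end());
}

Since equality on the quantized integers is exact and transitive, sort + unique now behaves predictably.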