I have to find all closest pairs of points in a plane from a given set.
I've successfully implemented a naive O(n²) algorithm similar to the pseudocode below, where set
is a list of all input points, count
is the number of input points, and the function returns dist,
the minimum distance found, together with result,
a list of all point pairs at that distance.
function naive(Points[] set, int count)
    result = []
    dist = INF
    for i = 0 to count-1:
        for j = i+1 to count-1:
            newDist = distance(set[i], set[j])
            if newDist < dist:
                dist = newDist            // remember the new minimum
                result = []               // discard pairs at the old distance
                result.append({set[i], set[j]})
            else if newDist == dist:
                result.append({set[i], set[j]})
    return (dist, result)
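For concreteness, a minimal runnable version of that pseudocode in Python might look like this (the function name and the tuple-based point representation are illustrative choices, not part of the original code):

    import math

    def naive(points):
        """Brute force: compare every pair, keep all pairs at the minimum distance."""
        best = math.inf
        result = []
        for i in range(len(points)):
            for j in range(i + 1, len(points)):
                d = math.dist(points[i], points[j])   # Euclidean distance (Python 3.8+)
                if d < best:
                    best = d                          # new minimum: reset the pair list
                    result = [(points[i], points[j])]
                elif d == best:                       # tie: keep this pair as well
                    result.append((points[i], points[j]))
        return best, result

    # The square from the example below: four pairs at distance 1
    print(naive([(0, 0), (1, 0), (1, 1), (0, 1)]))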
This solution works correctly, but because of its O(n²)
complexity it is very slow for larger inputs. I wanted a faster, optimised solution, so I implemented an O(n log n)
recursive divide-and-conquer algorithm based on this article. It works well for most inputs, but since that approach does not examine every pair of points, it fails on inputs like this one (the order of the pairs and the order of points inside each pair does not matter):
Input:
{A[0,0], B[1,0], C[1,1], D[0,1]}
Current (wrong) output:
{A[0,0], D[0,1]}, {B[1,0], C[1,1]}
Desired output:
{A[0,0], B[1,0]}, {A[0,0], D[0,1]}, {C[1,1], B[1,0]}, {C[1,1], D[0,1]}
Also, since it is recursive, it overflows the stack easily for larger inputs. What's a better way to solve this problem?
Thank you
But you don't need to compare everything to everything else. For any given pair of points, imagine they lie on the diagonal of a rectangle (the rectangle being aligned with the coordinate axes), the diagonal having length d. Then:

Any point more than d to the right of the rectangle (beyond x = x_max + d) will be farther and should not be considered.
Any point more than d above the rectangle (beyond y = y_max + d) will be farther and should not be considered.
Any point more than d to the left of the rectangle (beyond x = x_min - d) will be farther and should not be considered.
Any point more than d below the rectangle (beyond y = y_min - d) will be farther and should not be considered.

Similar constraints can be imagined to give you a bounding box in every case, which should eliminate a large percentage of the points considered in your inner loop unless you have a particularly dense constellation.
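As a rough sketch of that pruning idea (this is not the divide-and-conquer code from the question): sort the points by x, then in the inner loop stop as soon as the x-difference alone already exceeds the best distance found so far, and skip candidates whose y-difference exceeds it. The names and the tuple point representation are illustrative:

    import math

    def all_closest_pairs(points):
        """Brute force with bounding-box pruning on an x-sorted point list."""
        pts = sorted(points)                  # sort by x (then y)
        best = math.inf
        result = []
        for i in range(len(pts)):
            for j in range(i + 1, len(pts)):
                dx = pts[j][0] - pts[i][0]
                if dx > best:                 # every later point is even farther in x
                    break
                dy = abs(pts[j][1] - pts[i][1])
                if dy > best:                 # outside the bounding box in y
                    continue
                d = math.hypot(dx, dy)
                if d < best:
                    best = d
                    result = [(pts[i], pts[j])]
                elif d == best:               # exact tie; real data may need a tolerance
                    result.append((pts[i], pts[j]))
        return best, result

    # Unit square: returns distance 1 and all four adjacent-corner pairs
    print(all_closest_pairs([(0, 0), (1, 0), (1, 1), (0, 1)]))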
You definitely need quite a bit of dynamic programming here to give you reasonable runtimes for large sets.