 

known algorithm to reduce the number of points in a closed contour

Tags:

algorithm

I have a library that produces a contour as a list of coordinates, like:

  [[464.5, 551. ],
   [464.5, 550. ],
   [464. , 549.5],
   [463.5, 549. ],
   [463. , 548.5],
   [462. , 548.5],
   [461. , 548.5],
   [460.5, 549. ],
   [460. , 549.5],
   [459. , 549.5],
   [458. , 549.5],
   [457. , 549.5],
   ...

The coordinates are connected by straight lines, defining a closed, irregular, non-self-intersecting polygon.

From the example above, we can see that some points could be removed without losing any surface area. I don't mind if the algorithm introduces some loss, as long as it is configurable (e.g. intersection area over union area > x, or some other criterion).

Is there a known algorithm that will reduce the number of points of a closed contour?

PS: the naive algorithm is to test all subsets of points and take the smallest subset whose loss stays within the acceptable threshold. The issue is that I might have hundreds of coordinates, and the number of subsets is exponential (2^(coord_count)). Even computing the loss is expensive: I need to compute the intersection and union of two polygons and then compute their areas.
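For illustration, the configurable loss test itself can be written with a general polygon library; the sketch below assumes shapely is available, and the function name and the min_iou threshold are placeholders rather than anything from the question.

    # Minimal sketch of an IoU-based loss check, assuming shapely is installed.
    from shapely.geometry import Polygon

    def loss_is_acceptable(original_coords, simplified_coords, min_iou=0.99):
        """Compare two closed contours by intersection-over-union of their areas."""
        original = Polygon(original_coords)
        simplified = Polygon(simplified_coords)
        iou = original.intersection(simplified).area / original.union(simplified).area
        return iou >= min_iou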

EDIT:

  • Removing consecutive points that are aligned is easy and will certainly be the first step to decrease the time complexity of the following steps (see the sketch after this list).
  • What I wish for is a new polygon whose surface coverage is nearly the same but that has far fewer coordinates: I don't even care if the new polygon doesn't use any coordinates of the original one (but this seems even harder than removing some points of the original polygon).
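A minimal sketch of that first step (dropping points that are collinear with their neighbours), assuming the contour is an (N, 2) array of a closed polygon and that eps is an arbitrary floating-point tolerance:

    import numpy as np

    def drop_collinear(points, eps=1e-9):
        """Remove points lying on the straight line between their neighbours."""
        points = np.asarray(points, dtype=float)
        n = len(points)
        keep = []
        for i in range(n):
            prev_pt, cur, nxt = points[i - 1], points[i], points[(i + 1) % n]
            # Cross product of the two edges; (near) zero means the three points are aligned.
            cross = (cur[0] - prev_pt[0]) * (nxt[1] - prev_pt[1]) \
                  - (cur[1] - prev_pt[1]) * (nxt[0] - prev_pt[0])
            if abs(cross) > eps:
                keep.append(i)
        return points[keep]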
asked Sep 14 '25 by Thierry


2 Answers

I suggest the following procedure:

  1. For every 3 consecutive points, check that the line joining the two outer points does not intersect the polygon.


  2. Calculate the "area contribution" if the middle point is removed; this is negative if the corner at that point is convex and positive if it is concave.


  3. If you want the optimum result with the fewest points, always remove the point which minimizes the net change in area at any stage. Be careful with signs.


  4. Repeat this until the next optimal net change exceeds the specified tolerance.

  • The naive version of this algorithm is O(N^2) in the worst case. You can optimize it somewhat by using a BST / heap to keep track of the area deltas corresponding to each point, although the updates might be fiddly. A quadtree for intersection testing might also be useful, although it incurs a setup cost of O(N log N), which only pays off if a large number of points is removed.

  • Douglas-Peucker doesn't always produce the optimum result (as many points removed as possible without exceeding the area difference threshold); nor does the original algorithm take self-intersection into account.
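Not the answer's actual code, but a rough sketch of the naive version of this greedy procedure, under two assumptions: shapely's validity test stands in for the self-intersection check, and tolerance bounds the absolute net area change.

    from shapely.geometry import Polygon

    def signed_area_delta(a, b, c):
        # Signed area swept if the middle point b is removed; the sign
        # distinguishes convex from concave corners (orientation-dependent).
        return ((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])) / 2.0

    def simplify_greedy(points, tolerance):
        """Greedily remove points while |net area change| stays within tolerance."""
        pts = [tuple(p) for p in points]
        net = 0.0
        while len(pts) > 3:
            best = None  # (|net after removal|, signed delta, index)
            for i in range(len(pts)):
                a, b, c = pts[i - 1], pts[i], pts[(i + 1) % len(pts)]
                delta = signed_area_delta(a, b, c)
                candidate = pts[:i] + pts[i + 1:]
                # Reject removals that would make the polygon self-intersect.
                if not Polygon(candidate).is_valid:
                    continue
                score = abs(net + delta)
                if best is None or score < best[0]:
                    best = (score, delta, i)
            if best is None or best[0] > tolerance:
                break
            net += best[1]
            del pts[best[2]]
        return pts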

answered Sep 17 '25 by meowgoesthedog


Alex Kemper's comment on the question answered it:

  The Ramer-Douglas-Peucker algorithm is quite good in reducing points of a route. The desired accuracy (= max error) can be specified.

I used this algorithm, as implemented in the scikit-image library: skimage.measure.approximate_polygon.
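For reference, a minimal usage sketch; the tolerance value and the truncated contour array below are placeholders:

    import numpy as np
    from skimage.measure import approximate_polygon

    # contour stands for the (N, 2) coordinate array produced by the library
    # (truncated here); tolerance is the maximum allowed deviation from the
    # original contour, in the same units as the coordinates.
    contour = np.array([[464.5, 551.0], [464.5, 550.0], [464.0, 549.5], [463.5, 549.0]])
    simplified = approximate_polygon(contour, tolerance=1.0)
    print(len(contour), "->", len(simplified))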

answered Sep 17 '25 by Thierry