I am implementing a contour-finding algorithm for pixel-wide contours in binary images. It needs to be robust against deletion of individual pixels (i.e. pixel-wide gaps).
Various attempts at dilation & erosion kernels have not yielded a reliable solution.
Instead, the reliable solution I want to implement is to pass a pattern-matching kernel over the image, which can directly fill in the gaps based on the surrounding pixels. For example, when the exact pattern on the left is observed at a location, it is replaced with the pattern on the right (where * means wildcard):
[1 * *]      [1 * *]
[* 0 *] ==>  [* 1 *]
[* * 1]      [* * 1]

[1 0 *]      [1 0 *]
[* 0 1] ==>  [* 1 1]
[* * *]      [* * *]

[* 1 *]      [* 1 *]
[* 0 *] ==>  [* 1 *]
[* 1 *]      [* 1 *]
I would then define the ~14 or so replacement rules needed to cover the possible gaps in each 3x3 window.
This could be implemented in raw Python, but it would likely be extremely slow without low-level vectorized operations.
Can this be done through OpenCV or some other fast operation?
Thanks to @beaker's comment above I implemented a solution. Design a kernel with the neighbouring pixels of interest set to 0.5 and the centre set to 1: filtering then fills in the centre with a 1 if it is missing, although some other pixels will come out as 2. Clipping the values to 1 gives the desired result.
The kernel needs to be applied independently for each direction of gap, which isn't ideal, but it still works.
import cv2
import numpy as np

img_with_gap = np.array(
    [[1, 0, 0, 0, 0],
     [0, 1, 0, 0, 0],
     [0, 0, 0, 0, 0],
     [0, 0, 0, 1, 0],
     [0, 0, 0, 0, 1]], dtype=np.uint8)

# 0.5 on the two diagonal neighbours, 1 on the centre
kernel = np.array(
    [[0.5, 0, 0  ],
     [0,   1, 0  ],
     [0,   0, 0.5]])

connected_img = np.minimum(cv2.filter2D(img_with_gap, -1, kernel), 1)
connected_img
An even tighter implementation is to do an exact pattern match by penalizing the pixels that must be zero, clipping the response to {0, 1}, and taking the maximum with the original image to ensure nothing is deleted from it:
kernel = np.array([[  0.5, -10.0, -10.0],
                   [-10.0,   1.0, -10.0],
                   [-10.0, -10.0,   0.5]])

connected_img = np.maximum(img_with_gap,
                           np.clip(cv2.filter2D(img_with_gap, -1, kernel), 0, 1))