Finding matching submatrices inside a matrix

Tags:

python

numpy

I have a 100x200 2D numpy array of black (0) and white (255) cells, loaded from a bitmap file. I also have a set of smaller 2D shapes (it's easiest to think of them as letters) made of the same black and white cells, and I want to find every position at which a shape occurs exactly in the large array.

I know I can naively iterate through the matrix, but this will be a 'hot' portion of my code, so speed is a concern. Is there a fast way to do this in numpy/scipy?

I looked briefly at SciPy's correlate function. I am not interested in 'fuzzy' matches, only in exact matches. I also looked at some academic papers, but they are above my head.
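For reference, a minimal sketch of the naive sliding-window scan mentioned above (the function and variable names are mine, and it assumes the shapes use the same 0/255 encoding as the big array):

import numpy

def find_exact_matches(big, shape):
    # Compare the shape against every possible top-left offset in the
    # big array and record the offsets where every cell is identical.
    matches = []
    for i in range(big.shape[0] - shape.shape[0] + 1):
        for j in range(big.shape[1] - shape.shape[1] + 1):
            window = big[i:i + shape.shape[0], j:j + shape.shape[1]]
            if numpy.array_equal(window, shape):
                matches.append((i, j))
    return matches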

asked Jun 18 '12 by DaveO

1 Answer

You can use correlate. You'll need to set your black values to -1 and your white values to 1 (or vice versa) so that the peak of the correlation has a known value (the number of cells in the letter) and only occurs where the letter matches exactly.

The following code does what I think you want.

import numpy
from scipy import signal

# Set up the inputs
a = numpy.random.randn(100, 200)
a[a<0] = 0
a[a>0] = 255

b = numpy.random.randn(20, 20)
b[b<0] = 0
b[b>0] = 255

# put b somewhere in a
a[37:37+b.shape[0], 84:84+b.shape[1]] = b

# Now the actual solution...

# Set the black values to -1
a[a==0] = -1
b[b==0] = -1

# and the white values to 1
a[a==255] = 1
b[b==255] = 1

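# With values of +/-1, every matching cell contributes +1, so a
# perfect overlap sums to the number of cells in b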
max_peak = numpy.prod(b.shape)

# c will contain max_peak where the overlap is perfect
c = signal.correlate(a, b, 'valid')

overlaps = numpy.where(c == max_peak)

print(overlaps)

This outputs (array([37]), array([84])), matching the offsets at which b was inserted into a.
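If you would rather have the matches as (row, col) pairs than as parallel index arrays, a small convenience step (not part of the original answer) is to zip the output of numpy.where:

# Turn the parallel index arrays from numpy.where into (row, col) pairs
positions = list(zip(*overlaps))
print(positions)  # [(37, 84)]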

You will likely find that once the letter size multiplied by the big-array size exceeds roughly N*log(N), where N is the corresponding size of the big array you are searching (for each dimension), you will get a speed-up from an FFT-based algorithm such as scipy.signal.fftconvolve. Bear in mind that you'll need to flip each axis of one of the datasets when using a convolution rather than a correlation (flipud and fliplr). The only modification is to the assignment of c:

c = signal.fftconvolve(a, numpy.fliplr(numpy.flipud(b)), 'valid')

Comparing the timings on the sizes above:

In [5]: timeit c = signal.fftconvolve(a, numpy.fliplr(numpy.flipud(b)), 'valid')
100 loops, best of 3: 6.78 ms per loop

In [6]: timeit c = signal.correlate(a, b, 'valid')
10 loops, best of 3: 151 ms per loop
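As a side note that goes beyond the original answer: recent SciPy versions (0.19 and later, assuming you can use them) let signal.correlate pick the direct or FFT implementation itself via the method argument, which avoids the manual flipping. FFT results carry floating-point round-off, so compare against max_peak with a tolerance rather than exact equality:

# Assumes SciPy >= 0.19, which added the method argument to correlate
c = signal.correlate(a, b, 'valid', method='fft')  # or method='auto'

# FFT output is not exactly integral, so use a tolerance when matching
overlaps = numpy.where(numpy.isclose(c, max_peak))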
answered Sep 28 '22 by Henry Gomersall