I have a bottleneck in a 2D median filter (3x3 window) that I apply to a very large set of images, and I'd like to try to optimize it. I've tested scipy.ndimage.median_filter, as well as PIL, scipy.signal and scikits-image. However, browsing SO I've learned that there's a fast O(n) median filter out there in C ("Median Filtering in Constant Time"; see Rolling median algorithm in C), and I wondered whether I could implement it in Python using scipy.weave.inline.
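For reference, the scipy.ndimage baseline I'm benchmarking looks roughly like this (with a random placeholder standing in for one of the real images):
import numpy as np
from scipy import ndimage

img = np.random.rand(2048, 2048)               # placeholder for one image
filtered = ndimage.median_filter(img, size=3)  # the 3x3 median filter I'm timing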
Any suggestions on an alternative route?
Try this: Rolling median in C - Turlach implementation
http://ideone.com/8VVEa
Usage:
Mediator* m = MediatorNew(9);      /* rolling window of 9 samples = one 3x3 neighborhood */
for (...)
{
    MediatorInsert(m, value);      /* push the next pixel into the window */
    median = MediatorMedian(m);    /* median of the current window */
}
I believe this is the same as the R algo, but cleaner (amazingly so, in fact).
You can either wrap this, or port it and use Numba (or Cython). I'd recommend Numba over Cython, if nothing else because it stays plain old Python code.
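To give a feel for the Numba route, here's a minimal sketch of a jitted 3x3 filter (my own brute-force version, not a port of the constant-time code above; the name median3x3 is just for illustration, and it assumes a 2D float64 image and leaves the border pixels untouched):
import numpy as np
from numba import njit

@njit
def median3x3(img):
    h, w = img.shape
    out = img.copy()                      # border pixels keep their original values
    win = np.empty(9, np.float64)         # scratch buffer for one 3x3 neighborhood
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            k = 0
            for di in range(-1, 2):
                for dj in range(-1, 2):
                    win[k] = img[i + di, j + dj]
                    k += 1
            out[i, j] = np.sort(win)[4]   # middle of the 9 sorted values
    return out
The first call pays the compilation cost; after that the loops run as compiled code, so the same approach should work for prototyping a port of the constant-time algorithm too.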
I suggest contributing this to scikit-image, if it runs faster than the filter that's already in there :)
If you're still interested, I'd try numpy's reshape and median:
import numpy
a = ...                        # some big array
a = a.reshape(N, 3, 3)         # N being specific to your array; reshape returns a new array, so reassign
[numpy.median(m) for m in a]   # one median per 3x3 block
I don't know how this scales compared to the methods you've tested, but if you want to optimize with C you could speed up the for loop in the list comprehension...
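If the reshape route works for your data, the list comprehension can also be replaced by a single vectorized call, which removes the Python-level loop without dropping to C (a sketch, with a random placeholder array standing in for yours):
import numpy as np

a = np.random.rand(9000)                  # placeholder; stands in for your big array
blocks = a.reshape(-1, 3, 3)              # -1 lets numpy work out N for you
medians = np.median(blocks, axis=(1, 2))  # one median per 3x3 block, no Python-level loop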