How can I plot a 2D array as an image with Matplotlib, with the y scale of each row relative to a power of two?
For instance, the first row of my array will have a height of 1 in the image, the second row a height of 4, etc. (units are irrelevant). It's not simple to explain in words, so please look at this image (that's the kind of result I want):
(Image: example scalogram: http://support.sas.com/rnd/app/da/new/802ce/iml/chap1/images/wavex1k.gif)
As you can see, the first row is two times smaller than the one above it, and so on.
For those interested in why I am trying to do this:
I have a pretty big array (10, 700000) of floats, representing the discrete wavelet transform coefficients of a sound file. I am trying to plot the scalogram using those coefficients. I could duplicate each row until I get the desired image row heights, but so much data will not fit in memory...
Use the extent parameter of imshow to map the image buffer's pixel coordinates to a data-space coordinate system. Next, set the aspect ratio of the image manually by supplying a value such as aspect=4, or let it auto-scale by using aspect='auto'. This will prevent stretching of the image.
cmap: a colormap instance or registered colormap name. norm: a Normalize instance that scales the data values to the canonical colormap range [0, 1] for mapping to colors. vmin, vmax: optional parameters that define the color range.
To change the range of the X and Y axes, use the xlim() and ylim() methods.
Regarding interpolation methods for imshow: if interpolation is None, it defaults to rcParams["image.interpolation"] (default: 'antialiased'); if interpolation is 'none', then no interpolation is performed for the Agg, ps and pdf backends.
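A minimal sketch putting these parameters together (the array contents and sizes here are just placeholders):

import numpy as np
import matplotlib.pyplot as plt

data = np.random.rand(10, 700)   # stand-in for the coefficient array

fig, ax = plt.subplots()
# Map the 10x700 buffer onto x in [0, 1] and y in [1, 1024], and let the
# pixels stretch to fill the axes instead of keeping their own aspect ratio.
ax.imshow(data, extent=[0, 1, 1, 2 ** 10], aspect='auto', interpolation='none')
plt.show()

Note that extent only maps the buffer linearly, so on its own every row still gets the same height; it doesn't produce the power-of-two spacing asked about above.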
Have you tried to transform the axis? For example:
from pylab import *

ax = subplot(111)
ax.yaxis.set_ticks([0, 2, 4, 8])
imshow(data)
This means there must be gaps in the data for the non-existent coordinates, unless there is a way to provide a transform function instead of just lists (I've never tried).
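As a side note, a sketch of a different technique that avoids the gaps entirely: pcolormesh accepts explicit cell edges, so the row boundaries can be placed at powers of two directly (the sizes below are placeholders):

import numpy as np
import matplotlib.pyplot as plt

data = np.random.rand(10, 64)               # 10 levels, 64 columns (placeholder)
x = np.arange(data.shape[1] + 1)            # uniform column edges
y = 2.0 ** np.arange(data.shape[0] + 1)     # row edges at 1, 2, 4, ..., so each row is twice as tall as the previous one
fig, ax = plt.subplots()
ax.pcolormesh(x, y, data)
plt.show()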
Edit:
I admit it was just a lead, not a complete solution. Here is what I meant in more details.
Let's assume you have your data in an array, a. You can use a transform like this one:
class arr(object):
    # Home-made integer log2: count how many times x halves before reaching 1.
    @staticmethod
    def mylog2(x):
        lx = 0
        while x > 1:
            x >>= 1
            lx += 1
        return lx

    def __init__(self, array):
        self.array = array

    # Row i of this view maps to row floor(log2(i + 1)) of the wrapped array,
    # so row k of the original appears 2**k times.
    def __getitem__(self, index):
        return self.array[arr.mylog2(index + 1)]

    # 1 + 2 + 4 + ... + 2**(n-1) = 2**n - 1 rows in total.
    def __len__(self):
        return (1 << len(self.array)) - 1
Basically it transforms the first coordinate of an array or list with the mylog2 function (which you can change as you wish; it's home-made as a simplification of log2). The advantage is that you can re-use it for another transform should you need one, and you can easily control it too.
Then map your array to this one; it doesn't make a copy, only a reference held by the instance:
b = arr(a)
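A quick sanity check of the mapping, with made-up values:

a = [10, 20, 30, 40, 50]                      # five stand-in coefficient rows
b = arr(a)
print(len(b))                                 # 31 == 1 + 2 + 4 + 8 + 16
print(b[0], b[1], b[2], b[3], b[6], b[14])    # 10 20 20 30 30 40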
Now you can display it, for example:
ax = subplot(111)
ax.yaxis.set_ticks([16, 8, 4, 2, 1, 0])
axis([-0.5, 4.5, 31.5, 0.5])
imshow(b, interpolation="nearest")
Here is a sample (with an array containing random values):
(Image: sample output with random values: http://img691.imageshack.us/img691/8883/clipboard01f.png)
The best way I've found to make a scalogram using matplotlib is to use imshow, similar to the implementation of specgram. Using rectangles is slow, because you have to make a separate glyph for each value. Similarly, you don't want to bake things into a uniform NumPy array, because you'll probably run out of memory fast, since your highest level is going to be about as long as half your signal.
Here's an example using SciPy and PyWavelets:
from pylab import *
import pywt
import scipy.io.wavfile as wavfile

# Find the highest power of two less than or equal to the input.
def lepow2(x):
    return int(2 ** floor(log2(x)))

# Make a scalogram given an MRA tree.
def scalogram(data):
    bottom = 0

    vmin = min(map(lambda x: min(abs(x)), data))
    vmax = max(map(lambda x: max(abs(x)), data))

    gca().set_autoscale_on(False)

    for row in range(0, len(data)):
        scale = 2.0 ** (row - len(data))

        imshow(
            array([abs(data[row])]),
            interpolation = 'nearest',
            vmin = vmin,
            vmax = vmax,
            extent = [0, 1, bottom, bottom + scale])

        bottom += scale

# Load the signal, take the first channel, limit length to a power of 2 for simplicity.
rate, signal = wavfile.read('kitten.wav')
signal = signal[0:lepow2(len(signal)), 0]
tree = pywt.wavedec(signal, 'db5')

# Plotting.
gray()
scalogram(tree)
show()
You may also want to scale values adaptively per-level.
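For example, a sketch of one way to do that (a variant of the scalogram() above, reusing the same pylab imports): normalize each level by its own peak magnitude, so quiet levels stay visible.

def scalogram_per_level(data):
    bottom = 0

    gca().set_autoscale_on(False)

    for row in range(0, len(data)):
        scale = 2.0 ** (row - len(data))
        level = abs(data[row])
        peak = level.max() if level.max() > 0 else 1.0   # guard against an all-zero level

        imshow(
            array([level / peak]),
            interpolation = 'nearest',
            vmin = 0,
            vmax = 1,
            extent = [0, 1, bottom, bottom + scale])

        bottom += scale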
This works pretty well for me. The only problem I have is that matplotlib creates a hairline-thin space between levels. I'm still looking for a way to fix this.
P.S. - Even though this question is pretty old now, I figured I'd respond here, because this page came up on Google when I was looking for a method of creating scalograms using MPL.
You can look at matplotlib.image.NonUniformImage. But that only assists with having a nonuniform axis; I don't think you're going to be able to plot adaptively like you want to (each point in the image is always going to have the same area, so the wider rows would have to be repeated multiple times). Is there any reason you need to plot the full array? Obviously the full detail isn't going to show up in any plot, so I would suggest heavily downsampling the original matrix so you can copy rows as required to get the image without running out of memory.
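For what it's worth, a rough sketch of NonUniformImage with row centers at powers of two (the sizes and random data are placeholders, and the matrix is assumed to be downsampled already):

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.image import NonUniformImage

data = np.random.rand(10, 64)            # downsampled coefficient matrix (placeholder)
x = np.linspace(0, 1, data.shape[1])     # uniform column centers
y = 2.0 ** np.arange(data.shape[0])      # row centers at 1, 2, 4, ...

fig, ax = plt.subplots()
im = NonUniformImage(ax, interpolation='nearest', extent=(0, 1, y[0], y[-1]))
im.set_data(x, y, data)
ax.add_image(im)
ax.set_xlim(0, 1)
ax.set_ylim(y[0], y[-1])
plt.show()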