I want to plot a 2D array (roughly 1000x1000) with the values mapped to a color scale. So I used matplotlib's pcolor, which did just that, but for some reason it is super slow at those dimensions (around 2 minutes just to plot). What is the reason for that? Would converting the float values to int16 or so help? Are there any alternatives to pcolor?
from pylab import *
data = genfromtxt('data.txt', autostrip=True, case_sensitive=True)
pcolor(data,cmap='hot')
colorbar()
show()
data.txt contains the array. The loading process does take a few seconds, but the main computing time is definitely spent in BOTH the pcolor() and show() functions (roughly 60-90 secs each).
As a note for future googlers, there are also pcolormesh and pcolorfast. The documentation for pcolormesh states that:

pcolormesh is similar to pcolor(), but uses a different mechanism and returns a different object; pcolor returns a PolyCollection but pcolormesh returns a QuadMesh. It is much faster, so it is almost always preferred for large arrays.
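A minimal sketch of the swap suggested above, using a random array as a stand-in for the data loaded from data.txt (the filename and array contents here are placeholders, not from the original post):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import numpy as np
import matplotlib.pyplot as plt

# stand-in for the ~1000x1000 array the question loads from data.txt
data = np.random.rand(1000, 1000)

fig, ax = plt.subplots()
# pcolormesh returns a QuadMesh, which renders far faster than the
# PolyCollection that pcolor builds (one polygon per cell)
mesh = ax.pcolormesh(data, cmap="hot")
fig.colorbar(mesh, ax=ax)
fig.savefig("heatmap.png")
```

The call signature is otherwise the same as pcolor, so this is usually a drop-in replacement.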
imshow should be even faster, but is a little less flexible with regard to e.g. non-rectilinear axes. See this page for a nice comparison between pcolor, pcolormesh, and imshow.
imshow will be much faster. pcolor returns a PolyCollection, which is going to be fairly slow with a million elements, whereas imshow is just an image.

Note that the indexing in pcolor is slightly different from imshow, though you may not need to worry about it depending on how you used pcolor. Also, when going from pcolor to imshow one often wants to set interpolation="nearest" in imshow (but for such large images this may not matter either).
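A short sketch of the imshow route described above, again with a random placeholder array; origin="lower" is used here on the assumption that you want the same row orientation pcolor gives (imshow defaults to putting row 0 at the top):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import numpy as np
import matplotlib.pyplot as plt

# stand-in for the ~1000x1000 array from the question
data = np.random.rand(1000, 1000)

fig, ax = plt.subplots()
# imshow treats the array as a single image rather than a million polygons;
# interpolation="nearest" avoids smoothing between cells, and
# origin="lower" matches pcolor's bottom-up row indexing
im = ax.imshow(data, cmap="hot", interpolation="nearest", origin="lower")
fig.colorbar(im, ax=ax)
fig.savefig("heatmap.png")
```

If your axes are non-uniform or non-rectilinear, imshow's assumption of evenly spaced pixels no longer holds and pcolormesh is the safer choice.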