Currently I'm using PIL and NumPy. I have a colored png image and I want to read it in as grayscale, convert it to a NumPy array, and run a 2D FFT on the array.
This is what I'm trying (in IPython w/ the --pylab flag):
In [1]: import Image
In [2]: img = Image.open('ping.png').convert('LA')
In [3]: img_as_np = np.asarray(img)
In [4]: img_as_np
Out[4]: array(<Image.Image image mode=LA size=1000x1000 at 0x105802950>, dtype=object)
In [5]: img_fft = fft.fft2(img_as_np)  # raises IndexError: index out of range for array
Images are a convenient way to work with this kind of data. In machine learning work, Python libraries typically hold image data in Height, Width, Channel order; that is, images are converted into NumPy arrays with shape (height, width, channels).

Method 1: using cv2. Import OpenCV and read the original image with imread(), then convert it to grayscale with cv2.cvtColor(). If you display the result with cv2.imshow(), cv2.destroyAllWindows() lets you close all of the windows it opened once the script is done.
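A minimal sketch of that OpenCV route (using the 'ping.png' filename from the question; cv2.imread returns a BGR array, so COLOR_BGR2GRAY is the matching conversion code):

import cv2
import numpy as np

# Read the image as a BGR NumPy array with shape (height, width, 3)
img_bgr = cv2.imread('ping.png')

# Convert to a single-channel grayscale array with shape (height, width)
img_gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)

# Optionally display it; destroyAllWindows() closes any windows OpenCV opened
cv2.imshow('grayscale', img_gray)
cv2.waitKey(0)
cv2.destroyAllWindows()

# The plain 2-D array can go straight into the FFT
img_fft = np.fft.fft2(img_gray)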
You want to use the mode 'L' instead of 'LA' as the parameter to the convert() method. 'LA' keeps an alpha channel, and then numpy.asarray doesn't work as you intended. If you need the alpha channel, you will need a different method to convert to a NumPy array; otherwise, use mode 'L'.
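For example, a sketch along those lines (assuming a PIL/Pillow version recent enough that np.asarray understands Image objects, i.e. 1.1.6 or later as the next answer explains):

import numpy as np
from PIL import Image

img = Image.open('ping.png').convert('L')   # single-channel grayscale, no alpha
img_as_np = np.asarray(img)                 # 2-D uint8 array, shape (height, width)
img_fft = np.fft.fft2(img_as_np)            # 2-D FFT now works on the plain array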
It looks like you're using a version of PIL prior to 1.1.6, where they introduced the methods that let numpy know what to do with an Image. So you're just getting img_as_np as a one-element array containing an Image object (which is what Out[4] is showing you).

You instead need to do something like np.asarray(img.getdata()), which will give you a num_pixels x num_channels array of integers between 0 and 255 (at least for the png I tried). You may want to do

img_as_np = np.asarray(img.getdata()).reshape(img.size[1], img.size[0], -1)

to lay it out like the image (img.size is (width, height), hence the swapped order in the reshape). You might also want to divide by 255 to get float values between 0 and 1, if that's the format you're expecting (as does e.g. matplotlib's imshow).
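Putting that together, a sketch for the older-PIL case, keeping the 'LA' image and old-style import from the question:

import numpy as np
import Image  # old-style PIL import, as in the question

img = Image.open('ping.png').convert('LA')

# getdata() yields one (luminance, alpha) tuple per pixel, so this is a
# (num_pixels, num_channels) array of integers between 0 and 255
flat = np.asarray(img.getdata())

# img.size is (width, height), so reshape to (height, width, channels)
img_as_np = flat.reshape(img.size[1], img.size[0], -1)

# Optional: scale to floats in [0, 1], e.g. for matplotlib's imshow
img_float = img_as_np / 255.0

# Run the 2-D FFT on the luminance channel only
img_fft = np.fft.fft2(img_as_np[:, :, 0])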