 

Efficiently resize batch of np.array images

I have a 4D np.array of shape (10000, 32, 32, 3) that represents a set of 10000 RGB images.

How can I use skimage.transform.resize or some other function to resize all the images efficiently, so that the (32, 32) spatial dimensions are interpolated to (224, 224)? I'd prefer to do this with skimage, but I'm open to any solution that doesn't use tf.image.resize_images.

My current solution uses tf.image.resize_images, but it causes GPU memory issues further down my pipeline (the memory isn't freed after it finishes in a Jupyter notebook), so I'd like to replace it.

Example:

import tensorflow as tf

# X is the (10000, 32, 32, 3) array of images
X = tf.image.resize_images(X, [224, 224])
with tf.Session() as sess:
    X = X.eval()
Austin asked Oct 31 '18


1 Answer

I likely won't accept my own answer, but it seems that a simple for loop is actually fairly fast (top reports ~300% CPU utilization).

import numpy as np
from skimage.transform import resize

imgs_in = np.random.rand(100, 32, 32, 3)
imgs_out = np.zeros((100, 224, 224, 3))

# Resize each (32, 32, 3) image up to (224, 224, 3)
for n, img in enumerate(imgs_in):
    imgs_out[n] = resize(img, imgs_out.shape[1:], anti_aliasing=True)

print(imgs_out.shape)  # (100, 224, 224, 3)

This seems to be 7-8x faster than scipy.ndimage.zoom on my machine. Parallelizing it further with multiprocessing would likely be even better; a sketch follows below.
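For the parallelization idea, here is a minimal sketch using the standard library's ProcessPoolExecutor to fan the per-image resizes out across worker processes. The worker count and the use of np.stack to reassemble the batch are assumptions, not something I've benchmarked; pickling overhead per image may eat into the gains for small images.

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor
    from skimage.transform import resize

    def resize_one(img):
        # Resize a single (32, 32, 3) image to (224, 224, 3)
        return resize(img, (224, 224), anti_aliasing=True)

    if __name__ == "__main__":
        imgs_in = np.random.rand(100, 32, 32, 3)

        # Each worker handles one image at a time; tune max_workers
        # to your CPU count (4 here is just a placeholder)
        with ProcessPoolExecutor(max_workers=4) as pool:
            imgs_out = np.stack(list(pool.map(resize_one, imgs_in)))

        print(imgs_out.shape)  # (100, 224, 224, 3)

If the per-image pickling cost dominates, chunking the batch (passing slices of the array to each worker instead of single images) would be the next thing to try.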

Austin answered Sep 22 '22