I'm using a set of 32x32x32 grayscale images and I want to apply random rotations to the images as part of data augmentation while training a CNN with tflearn + TensorFlow. I was using the following code to do so:
import tensorflow as tf
from tflearn.data_preprocessing import ImagePreprocessing
from tflearn.data_augmentation import ImageAugmentation
from tflearn.layers.core import input_data

# Real-time data preprocessing
img_prep = ImagePreprocessing()
img_prep.add_featurewise_zero_center()
img_prep.add_featurewise_stdnorm()

# Real-time data augmentation
img_aug = ImageAugmentation()
img_aug.add_random_rotation(max_angle=360.)

# Input data
with tf.name_scope('Input'):
    X = tf.placeholder(tf.float32, shape=(None, image_size, image_size,
                                          image_size, num_channels), name='x-input')
    Y = tf.placeholder(tf.float32, shape=(None, label_cnt), name='y-input')

# Convolutional network building
network = input_data(shape=[None, 32, 32, 32, 1],
                     placeholder=X,
                     data_preprocessing=img_prep,
                     data_augmentation=img_aug)
(I'm using a combination of tensorflow and tflearn to be able to use the features from both, so please bear with me. Let me know if something is wrong with the way I'm using placeholders, etc.)
I found that using add_random_rotation (which itself uses scipy.ndimage.interpolation.rotate) treats the third dimension of my grayscale images as channels (like RGB channels) and rotates all 32 slices along the third dimension by the same random angle around the z-axis (i.e. it treats my 3D image as a 2D image with 32 channels). But I want the image to be rotated in space (around all three axes). Do you have any idea how I can do that? Is there a function or package for easily rotating 3D images in space?!
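For reference, here is a minimal sketch of the behaviour I'm describing, using a randomly generated stand-in for one of my volumes (the 45-degree angle is arbitrary):

import numpy as np
import scipy.ndimage

volume = np.random.rand(32, 32, 32)  # stand-in for one 32x32x32 grayscale image

# scipy's rotate defaults to axes=(1, 0): the rotation happens in the plane of
# the first two axes, so every slice along the third axis gets the same 2D
# rotation -- effectively a 2D image with 32 channels, rotated around the z-axis.
rotated = scipy.ndimage.rotate(volume, 45, reshape=False, mode='nearest')
print(rotated.shape)  # (32, 32, 32)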
import random

import numpy as np
import scipy.ndimage


def random_rotation_3d(batch, max_angle):
    """ Randomly rotate each 3D image in a batch by random angles in (-max_angle, max_angle).

    Arguments:
        batch: `ndarray`. Batch of 3D images, shape (N, depth, height, width, channels).
        max_angle: `float`. The maximum rotation angle in degrees.

    Returns:
        Batch of rotated 3D images with the same shape as the input.
    """
    size = batch.shape
    batch = np.squeeze(batch)
    batch_rot = np.zeros(batch.shape)
    for i in range(batch.shape[0]):
        # Rotate each volume with probability 1/2, otherwise keep it unchanged.
        if bool(random.getrandbits(1)):
            image1 = np.squeeze(batch[i])
            # rotate along z-axis
            angle = random.uniform(-max_angle, max_angle)
            image2 = scipy.ndimage.interpolation.rotate(image1, angle, mode='nearest', axes=(0, 1), reshape=False)
            # rotate along y-axis
            angle = random.uniform(-max_angle, max_angle)
            image3 = scipy.ndimage.interpolation.rotate(image2, angle, mode='nearest', axes=(0, 2), reshape=False)
            # rotate along x-axis
            angle = random.uniform(-max_angle, max_angle)
            batch_rot[i] = scipy.ndimage.interpolation.rotate(image3, angle, mode='nearest', axes=(1, 2), reshape=False)
        else:
            batch_rot[i] = batch[i]
    return batch_rot.reshape(size)
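A quick sanity check of the function above, assuming a hypothetical batch shaped like the network input (N, 32, 32, 32, 1):

import numpy as np

batch = np.random.rand(8, 32, 32, 32, 1).astype(np.float32)  # hypothetical batch of 8 volumes

rotated = random_rotation_3d(batch, max_angle=45.0)
print(rotated.shape)  # (8, 32, 32, 32, 1) -- same shape; a random subset of the volumes is rotated in 3D space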
It is more difficult to incorporate into the ImageAugmentation() pipeline, but the scipy.ndimage.rotate function handles 3D images correctly and takes an axes argument which specifies the plane of rotation (https://docs.scipy.org/doc/scipy-0.16.1/reference/generated/scipy.ndimage.interpolation.rotate.html). Rotating around the first axis (x) means you pass axes=(1, 2); to rotate around the second axis (y), use axes=(0, 2); and to rotate around the third axis (z), use axes=(0, 1).
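For example, a short sketch of those three calls on a dummy volume (the 30-degree angle is arbitrary):

import numpy as np
import scipy.ndimage

volume = np.random.rand(32, 32, 32)  # dummy 32x32x32 volume

# `axes` names the plane of rotation (the two axes that change);
# the remaining axis is the one being rotated around.
rot_x = scipy.ndimage.rotate(volume, 30, axes=(1, 2), reshape=False, mode='nearest')  # around the first axis (x)
rot_y = scipy.ndimage.rotate(volume, 30, axes=(0, 2), reshape=False, mode='nearest')  # around the second axis (y)
rot_z = scipy.ndimage.rotate(volume, 30, axes=(0, 1), reshape=False, mode='nearest')  # around the third axis (z)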