How do I save my augmented images with their class label in the filename? Or, is there a way it knows which class the new image belongs to?
EDIT:
from tensorflow.keras.preprocessing.image import ImageDataGenerator

batch_size = 32  # assumed value; batch_size was defined elsewhere in the original code
datagen = ImageDataGenerator(horizontal_flip=True, vertical_flip=True)

i = 0
for batch in datagen.flow_from_directory('data/train', target_size=(100, 100),
                                          shuffle=False, batch_size=batch_size,
                                          save_to_dir='data/train/'):
    i += 1
    if i > 20:  # stop after ~20 batches of saved images
        break   # otherwise the generator would loop indefinitely
print("Saved flipped images")
I have three class subdirectories inside of data/train. After running this, I can't tell which images have been augmented, although I do see that about a third of the total # of images have been saved. Is there something missing in my code to specify that the images be named by class, and that each class be looped through to create new images?
EDIT #2: Folder structure: data/train contains three class folders: n02088364, n02096585, n02108089. The newly created images are saved directly to data/train, not to the individual class folders.
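For what it's worth, flow_from_directory already infers each image's class from the subdirectory it lives in and keeps that bookkeeping on the iterator it returns; with shuffle=False the order of filenames and classes matches the order in which the images are yielded. A minimal sketch of inspecting this, reusing the datagen and batch_size from the snippet above:
gen = datagen.flow_from_directory('data/train', target_size=(100, 100),
                                  shuffle=False, batch_size=batch_size)
print(gen.class_indices)   # e.g. {'n02088364': 0, 'n02096585': 1, 'n02108089': 2}
print(gen.filenames[:3])   # paths relative to data/train, prefixed with the class folder
print(gen.classes[:3])     # integer label for each file, in the same order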
If you want to save the augmented images under a folder named after their label, you can loop over a list of labels and call the augmentation code inside the loop.
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Augmentation + save augmented images under the aug_images folder
IMAGE_SIZE = 224
BATCH_SIZE = 500
LABELS = ['lbl_a', 'lbl_b', 'lbl_c']

for label in LABELS:
    datagen_kwargs = dict(rescale=1./255)
    dataflow_kwargs = dict(target_size=(IMAGE_SIZE, IMAGE_SIZE),
                           batch_size=BATCH_SIZE, interpolation="bilinear")

    train_datagen = ImageDataGenerator(
        rotation_range=40,
        horizontal_flip=True,
        width_shift_range=0.1, height_shift_range=0.1,
        shear_range=0.1, zoom_range=0.1,
        **datagen_kwargs)

    # classes=[label] restricts the generator to a single class folder, and
    # save_to_dir writes the augmented copies under aug_images/<label>
    train_generator = train_datagen.flow_from_directory(
        'original_images', subset="training", shuffle=True,
        save_to_dir='aug_images/' + label, save_prefix='aug',
        classes=[label], **dataflow_kwargs)

    # The following line triggers execution of train_generator
    # and saves one batch of augmented images
    batch = next(train_generator)
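Note that a single next() call only materializes one batch (at most BATCH_SIZE files per label); if you want more augmented copies, next() can be called several times inside the same loop body, for example (N_BATCHES is just an illustrative name):
N_BATCHES = 5  # assumed number of augmented batches to write per label
for _ in range(N_BATCHES):
    next(train_generator)  # each call saves another batch under aug_images/<label>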
So why do this when the generator can be passed directly to the model? One case is when you want to use tflite-model-maker, which does not accept a generator and instead expects labelled data laid out in one folder per label:
from tflite_model_maker import ImageClassifierDataLoader
data = ImageClassifierDataLoader.from_folder('aug_images')
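From there the loader can be split and passed to the model-maker training API; a rough sketch along the lines of the tflite-model-maker examples (the 0.9 split ratio is only illustrative):
from tflite_model_maker import image_classifier

train_data, test_data = data.split(0.9)      # hold out 10% for evaluation
model = image_classifier.create(train_data)  # trains the default image classifier
model.evaluate(test_data)
model.export(export_dir='.')                 # writes model.tflite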
Result
aug_images
|
|__ lbl_a
| |
| |_____aug_img_a.png
|
|__ lbl_b
| |
| |_____aug_img_b.png
|
|__ lbl_c
| |
| |_____aug_img_c.png
Note: You need to ensure the aug_images/<label> output folders already exist; save_to_dir will not create them for you.
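A minimal sketch for creating them up front, assuming the same LABELS list as above:
import os

for label in LABELS:
    os.makedirs(os.path.join('aug_images', label), exist_ok=True)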
import numpy as np
import matplotlib.pyplot as plt

# Here datagen is the iterator returned by flow_from_directory
# (not the ImageDataGenerator itself), so it can be advanced with next()
# Map class indices back to class names (class_indices is {name: index})
index_to_classes = {v: k for k, v in datagen.class_indices.items()}

# We draw a batch of images from the generator
batch = next(datagen)
# batch[0] is the array of images
# batch[1] is the array of associated one-hot class vectors

# We display the batch (here the batch size is 16) and their classes
fig, m_axs = plt.subplots(1, 16, figsize=(26, 6))
for img, class_index_one_hot, ax1 in zip(batch[0], batch[1], m_axs.T):
    ax1.imshow(img)
    class_index = np.argmax(class_index_one_hot)
    ax1.set_title(str(class_index) + ':' + index_to_classes[class_index])
    ax1.axis('off')
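To tie this back to the original question, the same batch can also be written to disk with the class name embedded in each filename; a minimal sketch (the aug_named folder is just an example):
import os
from tensorflow.keras.preprocessing.image import array_to_img

os.makedirs('aug_named', exist_ok=True)
for i, (img, class_index_one_hot) in enumerate(zip(batch[0], batch[1])):
    class_name = index_to_classes[np.argmax(class_index_one_hot)]
    # array_to_img rescales the float array back to 0-255 before saving
    array_to_img(img).save(os.path.join('aug_named', class_name + '_' + str(i) + '.png'))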