The method I am aware of is something like this:
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(rescale=1./255)
val_datagen = ImageDataGenerator(rescale=1./255)

train_generator = train_datagen.flow_from_directory(
    train_dir,
    target_size=(150, 150),
    batch_size=20,  # <-- batch size is fixed here
    class_mode='binary')
But I want to change the batch size while training, in the model.fit() call, and that does not seem possible because batch_size has already been fixed in flow_from_directory().
So how do I load this dataset so that I have the freedom to change the batch_size while training?
Any help is highly appreciated.
You can change the batch_size attribute on the iterator returned by flow_from_directory() after it has been created:

train_generator.batch_size = 2

The batches it yields will then be of size 2.
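For example, here is a minimal sketch showing the batch size being changed between two fit() calls. The toy model, the epoch counts, and the new batch size of 64 are placeholders chosen only for illustration, and train_dir is assumed to point at your image folders:

from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(rescale=1./255)
train_generator = train_datagen.flow_from_directory(
    train_dir,                      # assumed to point at your training images
    target_size=(150, 150),
    batch_size=20,
    class_mode='binary')

# Hypothetical toy model, just so the sketch is runnable end to end.
model = models.Sequential([
    layers.Flatten(input_shape=(150, 150, 3)),
    layers.Dense(1, activation='sigmoid')
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

# First training phase: the iterator yields batches of 20, as set above.
model.fit(train_generator, epochs=5)

# Enlarge the batch size on the same iterator and keep training.
# Its length (steps per epoch) is recomputed from batch_size, so the
# second phase automatically runs with batches of 64.
train_generator.batch_size = 64
model.fit(train_generator, epochs=5)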