I have just started with Keras and was doing some image pre-processing, where I observed that the generator returned by ImageDataGenerator is iterated indefinitely in a for loop.
image_gen = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1/255, rotation_range=45)
train_data_gen = image_gen.flow_from_directory(train_dir,
                                               shuffle=True,
                                               target_size=(IMG_SHAPE, IMG_SHAPE),
                                               batch_size=batch_size)
print('Total number of batches - {}'.format(len(train_data_gen)))
for n, i in enumerate(train_data_gen):
    if n >= 30:
        # I have to add an explicit break statement to get out of the loop
        # once I am done iterating over all the items in the generator.
        break
    batch_data = i[0]
    print(n, batch_data[0].shape)
# Try to access an element out of bounds to see if there really are more than 30 elements.
print(''.format(train_data_gen[32]))
Output
Found 2935 images belonging to 5 classes.
Total number of batches - 30
0 (150, 150, 3)
1 (150, 150, 3)
2 (150, 150, 3)
.
.
.
29 (150, 150, 3)
---------------------------------------------------------------------------
ValueError: Traceback (most recent call last)
<ipython-input-20-aed377bb98f7> in <module>
13 batch_data = i[0]
14 print(n, batch_data[0].shape)
---> 15 print(''.format(train_data_gen[32]))
~/.virtualenvs/pan_demo/lib/python3.6/site-packages/keras_preprocessing/image/iterator.py in __getitem__(self, idx)
55 'but the Sequence '
56 'has length {length}'.format(idx=idx,
---> 57 length=len(self)))
58 if self.seed is not None:
59 np.random.seed(self.seed + self.total_batches_seen)
ValueError: Asked to retrieve element 32, but the Sequence has length 30
Question
Is this how ImageDataGenerator is meant to work? If so, can I avoid the if n >= 30 check somehow?
Keras version: tf.keras.__version__ ---> 2.2.4-tf
TensorFlow version: tf.VERSION ---> 1.13.1
Actually, train_data_gen will generate data batch by batch indefinitely. When we call model.fit_generator(), we pass train_data_gen as the generator and set steps_per_epoch (which should be len(train_data) / batch_size). The model then knows when a single epoch is finished.
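For example, here is a minimal sketch of that wiring, assuming a compiled model; total_train and epochs are hypothetical names for the number of training images and the number of epochs:
import numpy as np

# steps_per_epoch = number of batches that make up one epoch
steps = int(np.ceil(total_train / float(batch_size)))

model.fit_generator(train_data_gen,
                    steps_per_epoch=steps,
                    epochs=epochs)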
From the documentation:
for e in range(epochs):
    print('Epoch', e)
    batches = 0
    for x_batch, y_batch in datagen.flow(x_train, y_train, batch_size=32):
        model.fit(x_batch, y_batch)
        batches += 1
        if batches >= len(x_train) / 32:
            # we need to break the loop by hand because
            # the generator loops indefinitely
            break
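As for avoiding the if n >= 30 check in your own loop: the object returned by flow_from_directory is a keras.utils.Sequence (it supports len() and indexing, which is exactly what your traceback shows), so one option, sketched below, is to iterate over it by index for a fixed number of batches instead of breaking out of the endless generator:
# One full pass over the data, driven by the Sequence's length
# rather than by breaking out of an infinite loop.
for n in range(len(train_data_gen)):
    x_batch, y_batch = train_data_gen[n]   # each item is an (images, labels) batch
    print(n, x_batch[0].shape)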