During training, I'd like to change the batch size at each epoch (for experimental purposes).
Creating a custom Callback seems appropriate, but batch_size isn't a member of the Model class.
The only way I see would be to override fit_loop and expose batch_size to the callback at each loop. Is there a cleaner or faster way to do it, without using a callback?
For others who land here, I found the easiest way to do batch size adjustment in Keras is just to call fit more than once (with different batch sizes):
model.fit(X_train, y_train, batch_size=32, epochs=20)
# ...continue training with a larger batch size
model.fit(X_train, y_train, batch_size=512, epochs=10)
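To change the batch size on every single epoch rather than in two blocks, the same idea can be wrapped in a loop, calling fit with epochs=1 each time. Here is a minimal self-contained sketch; the toy model, data, and batch-size schedule are made up for illustration:

```python
import numpy as np
from tensorflow import keras

# Toy data and model so the example runs end to end (illustrative only).
X_train = np.random.rand(64, 4).astype("float32")
y_train = np.random.rand(64, 1).astype("float32")
model = keras.Sequential([keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

# One fit() call per epoch, each with its own batch size.
# The schedule below is an arbitrary example.
for epoch, batch_size in enumerate([16, 32, 64]):
    model.fit(X_train, y_train, batch_size=batch_size, epochs=1, verbose=0)
```

Note that each fit call continues training from the current weights, so this behaves like one run of three epochs with a varying batch size (optimizer state is also kept, since the model is compiled only once).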
I think it is better to use a custom data generator so you have full control over the data you pass to the training loop: you can generate batches of different sizes, process data on the fly, etc. Here is an outline:

def data_gen(data):
    while True:  # the generator yields forever
        # Slice `data` into a batch here; it can be any size.
        # It's your responsibility to construct the batch.
        yield x, y  # x and y form a single batch
Now you can train with model.fit_generator(data_gen(data), steps_per_epoch=100), which will consume 100 batches per epoch. (In recent TensorFlow/Keras versions, fit_generator is deprecated and model.fit accepts generators directly.) You can also use a Sequence if you want to encapsulate this inside a class.
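Filling in the outline, a runnable sketch might look like the following. The helper data_gen, the cycling batch-size schedule, and the toy model are all assumptions for illustration, not part of the original answer:

```python
import numpy as np
from tensorflow import keras

def data_gen(X, y, sizes=(16, 32, 64)):
    """Yield batches forever, cycling through different batch sizes."""
    i = 0
    while True:
        size = sizes[i % len(sizes)]
        # Sample a random batch of the current size.
        idx = np.random.choice(len(X), size=size, replace=False)
        yield X[idx], y[idx]
        i += 1

# Toy data and model (illustrative only).
X = np.random.rand(128, 4).astype("float32")
y = np.random.rand(128, 1).astype("float32")
model = keras.Sequential([keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

# model.fit accepts a generator directly in modern Keras;
# steps_per_epoch is required because the generator is infinite.
model.fit(data_gen(X, y), steps_per_epoch=10, epochs=2, verbose=0)
```

Because the generator never raises StopIteration, steps_per_epoch is what defines an "epoch" here; each step simply uses whatever batch the generator yields next.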