 

Is it possible to have dynamic batchsize in keras?

Keras codes I have looked or wrote have fixed batchsize during training (i.e. 32, 64, 128 ...). I am wondering if it is possible to have dynamic batchsize. (For example, 104 in the first iteration, 82 in the next iteration, 95 in next, and so on.)

I am currently using tensorflow backend.

user3377018 asked Nov 06 '17

1 Answer

It is possible if you train in a loop instead of making a single call to fit. For example:

from random import shuffle

# Each tuple is a (start, end) slice of the training data;
# the resulting batch sizes are 104, 82, and 32.
dataSlices = [(0, 104), (104, 186), (186, 218)]

for epoch in range(10):
    shuffle(dataSlices)
    for start, end in dataSlices:
        x, y = X[start:end, :], Y[start:end, :]
        model.fit(x, y, epochs=1, batch_size=x.shape[0])
        # OR, as suggested by Daniel Moller:
        # model.train_on_batch(x, y)

This assumes your data consists of 2D numpy arrays. The same idea can be extended to use a fit_generator() in place of the for loop if you so choose (see the Keras docs).
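A minimal sketch of that generator approach, assuming the same slice boundaries as above (the `X`/`Y` arrays here are random placeholder data, and `variable_batch_generator` is a hypothetical helper name):

```python
import numpy as np

# Placeholder data: 218 samples, matching the slice boundaries above.
X = np.random.rand(218, 10)
Y = np.random.rand(218, 1)

def variable_batch_generator(X, Y, slices):
    """Yield (x, y) batches of varying size, looping forever,
    which is what fit_generator expects."""
    while True:
        np.random.shuffle(slices)  # reshuffle slice order each epoch
        for start, end in slices:
            yield X[start:end], Y[start:end]

slices = [(0, 104), (104, 186), (186, 218)]
gen = variable_batch_generator(X, Y, slices)

# Each batch's size is set by its slice: 104, 82, or 32 samples.
batch_x, batch_y = next(gen)

# With a Keras model you would then call something like:
# model.fit_generator(gen, steps_per_epoch=len(slices), epochs=10)
```

Note that `steps_per_epoch` should equal the number of slices so that one "epoch" covers the whole dataset once.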

DJK answered Sep 28 '22