
Keras learning rate not changing despite decay in SGD

For some reason my learning rate does not appear to change, even though I set a decay factor. I added a callback to print the learning rate, and it appears to be the same after every epoch. Why is it not changing?

from keras.callbacks import Callback

class LearningRatePrinter(Callback):
    def __init__(self):
        super(LearningRatePrinter, self).__init__()

    def on_epoch_begin(self, epoch, logs={}):
        print('lr:', self.model.optimizer.lr.get_value())

lr_printer = LearningRatePrinter()

from keras.models import Sequential
from keras.layers import Flatten, Dense, Dropout
from keras.optimizers import SGD

model = Sequential()
model.add(Flatten(input_shape = (28, 28)))
model.add(Dense(200, activation = 'tanh'))
model.add(Dropout(0.5))
model.add(Dense(20, activation = 'tanh'))
model.add(Dense(10, activation = 'softmax'))

print('Compiling Model')
sgd = SGD(lr = 0.01, decay = 0.1, momentum = 0.9, nesterov = True)
model.compile(loss = 'categorical_crossentropy', optimizer = sgd)
print('Fitting Data')
model.fit(x_train, y_train, batch_size = 128, nb_epoch = 400, validation_data = (x_test, y_test), callbacks = [lr_printer])


lr: 0.009999999776482582
Epoch 24/400
60000/60000 [==============================] - 0s - loss: 0.7580 - val_loss: 0.6539
lr: 0.009999999776482582
Epoch 25/400
60000/60000 [==============================] - 0s - loss: 0.7573 - val_loss: 0.6521
lr: 0.009999999776482582
Epoch 26/400
60000/60000 [==============================] - 0s - loss: 0.7556 - val_loss: 0.6503
lr: 0.009999999776482582
Epoch 27/400
60000/60000 [==============================] - 0s - loss: 0.7525 - val_loss: 0.6485
lr: 0.009999999776482582
Epoch 28/400
60000/60000 [==============================] - 0s - loss: 0.7502 - val_loss: 0.6469
lr: 0.009999999776482582
Epoch 29/400
60000/60000 [==============================] - 0s - loss: 0.7494 - val_loss: 0.6453
lr: 0.009999999776482582
Epoch 30/400
60000/60000 [==============================] - 0s - loss: 0.7483 - val_loss: 0.6438
lr: 0.009999999776482582
Epoch 31/400
asked May 07 '16 by chasep255

1 Answer

The learning rate is changing just fine; the problem is that the field you are accessing stores the initial learning rate, not the current one. The current rate is recomputed from scratch on every iteration through the equation

lr = self.lr * (1. / (1. + self.decay * self.iterations))

and it is never stored back, so you cannot monitor it this way; you have to calculate it on your own, using this equation.
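Note that iterations counts gradient updates (batches), not epochs. With 60000 training samples and batch_size=128, that is about 469 updates per epoch, so with decay=0.1 the effective rate after just the first epoch is already roughly 0.01 / (1 + 0.1 * 469) ≈ 0.0002, even though the printed value stays at 0.01.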

See line 126 of https://github.com/fchollet/keras/blob/master/keras/optimizers.py
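If you want to log the decayed rate per epoch, here is a minimal sketch of such a callback, assuming the same Theano-backend Keras API as in the question (where lr, decay, and iterations are shared variables exposing get_value()); the class name EffectiveLearningRatePrinter is just illustrative:

from keras.callbacks import Callback

class EffectiveLearningRatePrinter(Callback):
    def on_epoch_begin(self, epoch, logs={}):
        opt = self.model.optimizer
        # reproduce the decay formula from keras/optimizers.py
        lr = opt.lr.get_value()
        decay = opt.decay.get_value()
        iterations = opt.iterations.get_value()
        print('effective lr:', lr * (1. / (1. + decay * iterations)))

Pass it to fit in place of the original printer, e.g. callbacks=[EffectiveLearningRatePrinter()].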

answered Sep 22 '22 by lejlot