 

How can I print the Learning Rate at each epoch with Adam optimizer in Keras?

Because online learning does not work well in Keras with an adaptive optimizer (the learning rate schedule resets when calling .fit()), I want to see if I can simply set the learning rate manually. To do that, though, I need to know what the learning rate was at the end of the last epoch.

That said, how can I print the learning rate at each epoch? I think I can do it through a callback, but it seems you have to recalculate the rate each time, and I'm not sure how to do that with Adam.

I found this in another thread but it only works with SGD:

from keras import backend as K
from keras.callbacks import Callback

class SGDLearningRateTracker(Callback):
    def on_epoch_end(self, epoch, logs={}):
        optimizer = self.model.optimizer
        # Recompute the decayed learning rate from the optimizer's decay and iteration count
        lr = K.eval(optimizer.lr * (1. / (1. + optimizer.decay * optimizer.iterations)))
        print('\nLR: {:.6f}\n'.format(lr))
Asked Nov 25 '17 by Zach
1 Answer

I am using the following approach, which is based on @jorijnsmit's answer:

import tensorflow as tf
from tensorflow import keras

def get_lr_metric(optimizer):
    def lr(y_true, y_pred):
        # _decayed_lr returns the current (decayed) learning rate as a tensor;
        # I use it instead of .lr so any decay/schedule is taken into account
        return optimizer._decayed_lr(tf.float32)
    return lr

optimizer = keras.optimizers.Adam()
lr_metric = get_lr_metric(optimizer)

model.compile(
    optimizer=optimizer,
    metrics=['accuracy', lr_metric],
    loss='mean_absolute_error',
)

It works with Adam.
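
For reference, here is a minimal, self-contained usage sketch. The toy Dense model, the random x_train/y_train arrays, and the epoch count are placeholders made up for illustration; only get_lr_metric and the Adam optimizer come from the answer above. Note that _decayed_lr is a private method of the TF 2.x optimizers, so it may change between versions:

import numpy as np
import tensorflow as tf
from tensorflow import keras

# Toy regression model, only to demonstrate the metric output
model = keras.Sequential([keras.layers.Dense(1, input_shape=(10,))])

optimizer = keras.optimizers.Adam()
lr_metric = get_lr_metric(optimizer)  # defined above

model.compile(
    optimizer=optimizer,
    metrics=[lr_metric],
    loss='mean_absolute_error',
)

# Random placeholder data
x_train = np.random.rand(64, 10).astype('float32')
y_train = np.random.rand(64, 1).astype('float32')

# The per-epoch progress output of fit() now includes an "lr" entry
# alongside the loss, showing the current learning rate
model.fit(x_train, y_train, epochs=3, verbose=1)

If you prefer to print the value explicitly rather than log it as a metric, the same optimizer._decayed_lr(tf.float32) call can be evaluated inside an on_epoch_end callback, analogous to the SGDLearningRateTracker in the question.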

Answered Oct 31 '22 by Andrey