Because online learning does not work well with Keras when you are using an adaptive optimizer (the learning-rate schedule resets when calling .fit()), I want to see if I can just set it manually. To do that, though, I need to find out what the learning rate was at the last epoch.
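(For reference, overriding the rate itself looks simple once the value is known; this is just a sketch, assuming model is already compiled and new_lr holds the value recovered from the previous run:)

from tensorflow.keras import backend as K

# assumption: model is already compiled and new_lr was recovered from the previous .fit() run
K.set_value(model.optimizer.learning_rate, new_lr)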
That said, how can I print the learning rate at each epoch? I think I can do it through a callback, but it seems you have to recalculate it each time, and I'm not sure how to do that with Adam.
I found this in another thread, but it only works with SGD:
from tensorflow.keras import backend as K
from tensorflow.keras.callbacks import Callback

class SGDLearningRateTracker(Callback):
    def on_epoch_end(self, epoch, logs={}):
        optimizer = self.model.optimizer
        # reproduce SGD's time-based decay: lr / (1 + decay * iterations)
        iterations = K.cast(optimizer.iterations, K.dtype(optimizer.decay))
        lr = K.eval(optimizer.lr * (1. / (1. + optimizer.decay * iterations)))
        print('\nLR: {:.6f}\n'.format(lr))
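As a starting point, here is my rough attempt at adapting that tracker for Adam on tf.keras. It leans on the private _decayed_lr() method, so it may break between TF versions, and I'm not sure it is the right approach:

import tensorflow as tf
from tensorflow import keras

class AdamLearningRateTracker(keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        optimizer = self.model.optimizer
        # _decayed_lr() applies any decay/schedule to the base learning rate
        lr = optimizer._decayed_lr(tf.float32).numpy()
        print('\nLR: {:.6f}\n'.format(lr))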
I am using the following approach, which is based on @jorijnsmit's answer:
import tensorflow as tf
from tensorflow import keras

def get_lr_metric(optimizer):
    def lr(y_true, y_pred):
        # use the private ._decayed_lr() method instead of .lr so that any
        # decay or schedule applied by the optimizer is reflected
        return optimizer._decayed_lr(tf.float32)
    return lr

optimizer = keras.optimizers.Adam()
lr_metric = get_lr_metric(optimizer)
model.compile(
    optimizer=optimizer,
    metrics=['accuracy', lr_metric],
    loss='mean_absolute_error',
)
It works with Adam.
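A minimal usage sketch (x_train and y_train are placeholders): the rate then appears alongside the other metrics in the per-epoch logs, and in the returned history under the metric function's name, lr.

history = model.fit(x_train, y_train, epochs=5)
print(history.history['lr'])  # decayed learning rate recorded each epoch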