 

How to change the learning rate based on the previous epoch's accuracy in Keras? I am using an SGD optimizer.

Tags:

python

keras

Below is the code I am trying to implement:

def scheduler(epoch):
  init_lr = 0.1
  # Every third epoch, decay the learning rate exponentially
  if (epoch + 1) % 3 == 0:
    changed_lr = init_lr * (1 - 0.05) ** epoch

  # I tried this branch to change the learning rate based on the previous
  # epoch's accuracy, i.e. when the current epoch's accuracy is lower than
  # the previous epoch's
  else:
    changed_lr = init_lr - 0.1 * init_lr
  return changed_lr
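For reference, the decay rule in the snippet can be checked in plain Python, independent of Keras. This is just a trace of the schedule values the function produces (`rates` is a name introduced here for illustration):

```python
def scheduler(epoch):
  init_lr = 0.1
  # Every third epoch: exponential decay; otherwise: a fixed 10% cut
  if (epoch + 1) % 3 == 0:
    return init_lr * (1 - 0.05) ** epoch
  return init_lr - 0.1 * init_lr

# The schedule alternates between the fixed fallback value (0.09)
# and an exponentially decayed value on every third epoch
rates = [scheduler(e) for e in range(6)]
print(rates)
```

Note that the `else` branch always returns the same value (0.09) regardless of the epoch, because `scheduler(epoch)` never sees the accuracy, which is why a callback that reads the training logs is needed for metric-based scheduling.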
asked Dec 01 '25 by Uttam Dey
1 Answer

If you want to change the learning rate as a function of the epoch number, use LearningRateScheduler:

import tensorflow as tf

def scheduler(epoch, lr):
  # Keep the initial learning rate for the first 10 epochs,
  # then decay it exponentially
  if epoch < 10:
    return lr
  else:
    return lr * tf.math.exp(-0.1)

model = <YOUR_MODEL>
model.compile(tf.keras.optimizers.SGD(), loss=<YOUR_LOSS>)

callback = tf.keras.callbacks.LearningRateScheduler(scheduler)
history = model.fit(X, y, epochs=15, callbacks=[callback])

If you want to change the learning rate based on a monitored metric, use ReduceLROnPlateau:

callback = tf.keras.callbacks.ReduceLROnPlateau(
  monitor='acc',   # must match the metric name your model logs (e.g. 'accuracy' in TF2)
  factor=0.6,      # multiply the learning rate by 0.6 when the metric plateaus
  patience=5,      # number of epochs with no improvement before reducing
  min_lr=3e-6,     # lower bound on the learning rate
)
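The core behavior of ReduceLROnPlateau (cut the rate by `factor` once the metric has stagnated for `patience` epochs, never going below `min_lr`) can be sketched in plain Python. This is a hedged approximation of the callback's logic, not Keras's actual implementation; the function `reduce_on_plateau` is a name introduced here for illustration:

```python
def reduce_on_plateau(accuracies, lr=0.1, factor=0.6, patience=5, min_lr=3e-6):
    """Return the learning rate after each epoch, approximating ReduceLROnPlateau."""
    best = float('-inf')  # best metric value seen so far
    wait = 0              # epochs since the last improvement
    history = []
    for acc in accuracies:
        if acc > best:            # metric improved: reset the patience counter
            best, wait = acc, 0
        else:
            wait += 1
            if wait >= patience:  # patience exhausted: cut the rate, clamp at min_lr
                lr = max(lr * factor, min_lr)
                wait = 0
        history.append(lr)
    return history

# Accuracy stalls after epoch 1, so the rate drops once patience runs out
print(reduce_on_plateau([0.5, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6, 0.6]))
```

Remember to pass the callback to `model.fit(..., callbacks=[callback])`, just as with LearningRateScheduler above.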
answered Dec 04 '25 by Yoskutik


