 

Keras: change learning rate

I'm trying to change the learning rate of my model after it has been trained with a different learning rate.

I read here, here, here, and in some other places I can't even find anymore.

I tried:

model.optimizer.learning_rate.set_value(0.1)
model.optimizer.lr = 0.1
model.optimizer.learning_rate = 0.1
K.set_value(model.optimizer.learning_rate, 0.1)
K.set_value(model.optimizer.lr, 0.1)
model.optimizer.lr.assign(0.1)

... but none of them worked! I don't understand how there could be such confusion around such a simple thing. Am I missing something?

EDIT: Working example

Here is a working example of what I'd like to do:

from keras.models import Sequential
from keras.layers import Dense
import keras
import numpy as np

model = Sequential()
model.add(Dense(1, input_shape=(10,)))

optimizer = keras.optimizers.Adam(lr=0.01)
model.compile(loss='mse',
              optimizer=optimizer)

model.fit(np.random.randn(50, 10), np.random.randn(50), epochs=50)

# Change learning rate to 0.001 and train for 50 more epochs

model.fit(np.random.randn(50, 10), np.random.randn(50), initial_epoch=50, epochs=50)
Luca asked Jan 14 '20


People also ask

How do I change the learning rate in Keras?

The constant learning rate is the default schedule in all Keras Optimizers. For example, in the SGD optimizer, the learning rate defaults to 0.01 . To use a custom learning rate, simply instantiate an SGD optimizer and pass the argument learning_rate=0.01 .
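For example, a minimal sketch of overriding the default (note that older Keras versions spell the argument lr instead of learning_rate):

from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import SGD

model = Sequential([Dense(1, input_shape=(10,))])
optimizer = SGD(learning_rate=0.001)  # overrides the 0.01 default
model.compile(loss='mse', optimizer=optimizer)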

What is the default learning rate in Keras?

The learning rate defaults to 0.001. The learning_rate argument also accepts a LearningRateSchedule, or a callable that takes no arguments and returns the actual value to use.
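As a sketch of the non-constant option (assuming tf.keras, where ExponentialDecay is one of the built-in LearningRateSchedule classes):

import tensorflow as tf

# A LearningRateSchedule can be passed anywhere a float learning rate is accepted
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.001,  # the usual default
    decay_steps=1000,
    decay_rate=0.9)
optimizer = tf.keras.optimizers.Adam(learning_rate=schedule)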

Does Adam Optimizer have a learning rate?

The Adam optimizer is an adaptive learning rate optimizer that is very popular for deep learning, especially in computer vision. Some papers decrease its learning rate after a specific number of epochs, for example dividing it by 10 every 50 epochs.
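A sketch of that kind of step decay using the LearningRateScheduler callback (the boundary and factor are illustrative; recent Keras versions also pass the current learning rate to the schedule function):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.callbacks import LearningRateScheduler

def step_decay(epoch, lr):
    # Divide the learning rate by 10 every 50 epochs
    return lr / 10.0 if epoch > 0 and epoch % 50 == 0 else lr

model = Sequential([Dense(1, input_shape=(10,))])
model.compile(loss='mse', optimizer='adam')
model.fit(np.random.randn(50, 10), np.random.randn(50),
          epochs=150, callbacks=[LearningRateScheduler(step_decay)])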


1 Answer

You can change the learning rate as follows:

from keras import backend as K

K.set_value(model.optimizer.learning_rate, 0.001)

Incorporated into your complete example, it looks as follows:

from keras.models import Sequential
from keras.layers import Dense
from keras import backend as K
import keras
import numpy as np

model = Sequential()
model.add(Dense(1, input_shape=(10,)))

optimizer = keras.optimizers.Adam(lr=0.01)
model.compile(loss='mse', optimizer=optimizer)

print("Learning rate before first fit:", model.optimizer.learning_rate.numpy())

model.fit(np.random.randn(50, 10), np.random.randn(50), epochs=50, verbose=0)

# Change learning rate to 0.001 and train for 50 more epochs
K.set_value(model.optimizer.learning_rate, 0.001)
print("Learning rate before second fit:", model.optimizer.learning_rate.numpy())

model.fit(np.random.randn(50, 10),
          np.random.randn(50),
          initial_epoch=50,
          epochs=50,
          verbose=0)

I've just tested this with Keras 2.3.1. Not sure why the approach didn't seem to work for you.
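For what it's worth, under tf.keras in TensorFlow 2.x the optimizer's learning rate is a tf.Variable, so direct assignment is another option (untested against your exact setup):

# tf.keras (TensorFlow 2.x): assign to the underlying variable
model.optimizer.learning_rate.assign(0.001)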

Timo.S answered Sep 20 '22