 

ReduceLROnPlateau gives error with ADAM optimizer

Is it because the Adam optimizer changes the learning rate by itself? I get an error saying 'Attempting to use uninitialized value Adam_1/lr'. I guess there is no point in using ReduceLROnPlateau, as Adam will automatically change the learning rate anyway. I have updated the code. Update: Code:

from keras.optimizers import Adam
from keras.callbacks import ReduceLROnPlateau

model.compile(optimizer='adam', loss='mse')

callback_reduce_lr = ReduceLROnPlateau(monitor='val_loss',
                                       factor=0.1,
                                       min_lr=1e-4,
                                       patience=0,
                                       verbose=1)
model.fit(xTrain, yTrain, epochs=100, batch_size=10,
          validation_data=(xTest, yTest), verbose=2,
          callbacks=[callback_reduce_lr])

Error: Attempting to use uninitialized value Adam_1/lr

I read somewhere that initializing Adam doesn't work while using ReduceLROnPlateau. I have tried to initialize the weights too, but I got the same error.

kerastf asked Sep 02 '18 07:09

1 Answer

As discussed in the question's comments, Keras' ReduceLROnPlateau does appear to work with its default parameters:

# keras' ReduceLROnPlateau callback default parameters:
from keras.callbacks import ReduceLROnPlateau
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.1, patience=10,
                              verbose=0, mode='auto', min_delta=0.0001,
                              cooldown=0, min_lr=0)

I tried to recreate the error to identify which parameter causes it, but I couldn't. Because of this, I believe the error doesn't appear for all input shapes or models.

However, I can say for sure that, with the correct parameters, ReduceLROnPlateau does work with Adam.
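To make it concrete what the callback is doing alongside Adam: ReduceLROnPlateau watches a monitored metric and shrinks the learning rate when it stops improving. Below is a minimal, hypothetical sketch of that plateau logic in plain Python (the class name and structure are my own, not Keras' internal implementation), just to illustrate how `factor`, `patience`, `min_delta`, and `min_lr` interact:

```python
# Illustrative sketch (NOT Keras' actual code) of the plateau logic
# behind ReduceLROnPlateau: if the monitored value fails to improve
# by at least `min_delta` for more than `patience` epochs, multiply
# the learning rate by `factor`, never dropping below `min_lr`.
class PlateauScheduler:
    def __init__(self, factor=0.1, patience=10, min_delta=1e-4, min_lr=0.0):
        self.factor = factor
        self.patience = patience
        self.min_delta = min_delta
        self.min_lr = min_lr
        self.best = float('inf')   # best monitored value seen so far
        self.wait = 0              # epochs since the last improvement

    def step(self, val_loss, lr):
        """Return the (possibly reduced) learning rate after one epoch."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss   # improvement: reset the counter
            self.wait = 0
        else:
            self.wait += 1
            if self.wait > self.patience:
                lr = max(lr * self.factor, self.min_lr)  # cut the LR
                self.wait = 0
        return lr


sched = PlateauScheduler(factor=0.1, patience=0, min_delta=1e-4, min_lr=1e-4)
lr = 0.001
lr = sched.step(0.5, lr)   # val_loss improved: lr stays 0.001
lr = sched.step(0.6, lr)   # no improvement and patience=0: lr -> 1e-4
```

With `patience=0`, as in the question's configuration, a single epoch without improvement immediately triggers a reduction, which is why the learning rate drops so aggressively there. This scheduling is layered on top of Adam's per-parameter adaptation, not in conflict with it: Adam scales each parameter's update, while the callback scales the global base rate.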

Djib2011 answered Oct 20 '22 14:10