I've been trying out the TensorFlow v2 beta, experimenting with tf.keras models.
When I compile a model with the optimizer given as the string 'adam', it trains properly:
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.fit(x, y)
But when I pass the optimizer object tf.keras.optimizers.Adam() with its default arguments, training fails and the loss is nan at every iteration:
adam = tf.keras.optimizers.Adam()
model.compile(optimizer=adam, loss='categorical_crossentropy')
model.fit(x, y)
Isn't the string 'adam' supposed to resolve to the default Adam optimizer, or am I missing something? I've tried several hyperparameters (learning_rate, beta_1, beta_2, etc.), but none of them help. This matters because I won't always want to use the default hyperparameters. Can anyone explain this behaviour?
After a bit of digging, it seems that the string 'adam' resolves to a different Adam implementation, referred to internally as adam_v2. Using that class directly trains correctly:
from tensorflow.python.keras.optimizer_v2.adam import Adam
adam = Adam()
model.compile(optimizer=adam, loss='categorical_crossentropy')
model.fit(x, y)
From what I can gather, there are now two different implementations of the optimizers. Using optimizer='adam' seems to use this one:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/optimizer_v2/adam.py
You can also import the module explicitly as:
from tensorflow.python.keras.optimizer_v2 import adam as adam_v2
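To make the resolution step concrete, here is a minimal, hypothetical sketch of how a string identifier like 'adam' can be mapped to an optimizer class. This mirrors the idea behind Keras's string lookup (e.g. tf.keras.optimizers.get), but the registry and class names below are stand-ins, not the actual Keras internals:

```python
# Illustrative sketch only: mimics string-to-optimizer resolution via a
# registry dict. AdamV2 and _OPTIMIZER_REGISTRY are hypothetical stand-ins.

class AdamV2:
    """Stand-in for the optimizer_v2 Adam implementation."""
    def __init__(self, learning_rate=0.001, beta_1=0.9, beta_2=0.999):
        self.learning_rate = learning_rate
        self.beta_1 = beta_1
        self.beta_2 = beta_2

# Hypothetical registry mapping identifier strings to optimizer classes.
_OPTIMIZER_REGISTRY = {"adam": AdamV2}

def get_optimizer(identifier):
    """Resolve a string to a default-configured optimizer instance,
    or pass an existing optimizer object through unchanged."""
    if isinstance(identifier, str):
        return _OPTIMIZER_REGISTRY[identifier.lower()]()
    return identifier

opt = get_optimizer("adam")
print(type(opt).__name__)  # AdamV2
print(opt.learning_rate)   # 0.001
```

The point is that compile(optimizer='adam') goes through a lookup like this and instantiates whichever class the registry points at, whereas passing an object uses exactly the class you constructed, so the two paths can diverge if two implementations coexist.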