TensorFlow Adam optimizer in Keras

I have a network in TensorFlow that I am trying to reimplement in Keras. Compared to the TensorFlow model, the Keras model completely underperforms: the loss is much higher and decreases more slowly. My best guess is that I am using the wrong optimizer. In the TensorFlow code the optimizer looks like this:

global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(0.0001,
                                           global_step,
                                           decay_steps=10000,
                                           decay_rate=0.33,   
                                           staircase=True)
optimizer = tf.train.AdamOptimizer(learning_rate, epsilon=1e-8)
train_op = optimizer.minimize(total_loss, global_step)

In Keras it looks like this:

adam = keras.optimizers.Adam(lr=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-8)
model.compile(loss=get_loss_funcs(), optimizer=adam)

Is there a way to implement the TensorFlow optimizer in Keras?

asked Sep 04 '18 by SimpleNotGood

People also ask

What is the Adam optimizer in Keras?

Adam optimization is a stochastic gradient descent method that is based on adaptive estimation of first-order and second-order moments.
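
To make that concrete, here is a minimal sketch of the Adam update rule for a single parameter array (plain NumPy; the function name and signature are illustrative, not a library API):

import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001,
              beta_1=0.9, beta_2=0.999, epsilon=1e-8):
    """One illustrative Adam update; t is the step count starting at 1."""
    m = beta_1 * m + (1 - beta_1) * grad        # first-moment (mean) estimate
    v = beta_2 * v + (1 - beta_2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta_1 ** t)               # bias-corrected first moment
    v_hat = v / (1 - beta_2 ** t)               # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + epsilon)
    return param, m, v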

What is an optimizer in TensorFlow?

Optimizers in TensorFlow are classes that extend a common base class and carry the extra state needed to train a specific model. An optimizer is initialized with its hyperparameters; no tensors are needed at construction time. Optimizers are used to improve the speed and quality of training.
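
As a rough sketch of the TF 1.x pattern (the toy loss is hypothetical): the optimizer is built from hyperparameters alone, and tensors only come in when minimize() constructs the training op.

import tensorflow as tf

# Toy example: fit a single weight w so that w * x approximates y.
x = tf.constant(3.0)
y = tf.constant(6.0)
w = tf.Variable(1.0)
loss = tf.square(w * x - y)

# Constructed with hyperparameters only; no tensors required here.
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train_op = optimizer.minimize(loss)  # op that applies one gradient step

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(train_op)
    print(sess.run(w))  # should approach 2.0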

What is the Adam optimizer?

The Adam optimizer is an extension of stochastic gradient descent that is widely used in deep learning applications such as computer vision and natural language processing. Adam was first introduced in 2014.


1 Answer

Yes there is! - TFOptimizer

class TFOptimizer(Optimizer):
    """Wrapper class for native TensorFlow optimizers."""

It's called like this:

keras.optimizers.TFOptimizer(optimizer)

The wrapper will help you see whether the issue is due to the optimizer.
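
Putting the pieces together, here is a sketch of how the question's schedule and optimizer could be wrapped (model and get_loss_funcs() are assumed from the question; this is illustrative, not a verified drop-in):

import tensorflow as tf
import keras

# Rebuild the schedule and optimizer exactly as in the question.
global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(0.0001,
                                           global_step,
                                           decay_steps=10000,
                                           decay_rate=0.33,
                                           staircase=True)
tf_adam = tf.train.AdamOptimizer(learning_rate, epsilon=1e-8)

# Hand the native TF optimizer to Keras via the wrapper.
model.compile(loss=get_loss_funcs(),
              optimizer=keras.optimizers.TFOptimizer(tf_adam))

One caveat: depending on the Keras version, the wrapper applies gradients against its own internal iterations counter, so the global_step feeding exponential_decay may never advance and the learning rate may stay constant - verify that the decay actually happens. If it does not, the same staircase schedule can be reproduced in pure Keras with a LearningRateScheduler callback, for example (steps_per_epoch, x_train and y_train are hypothetical placeholders):

from keras.callbacks import LearningRateScheduler

steps_per_epoch = 1000  # hypothetical; use your actual batches per epoch

def staircase_lr(epoch):
    # Same schedule: lr = 0.0001 * 0.33 ** floor(step / 10000)
    step = epoch * steps_per_epoch
    return 0.0001 * 0.33 ** (step // 10000)

model.fit(x_train, y_train, callbacks=[LearningRateScheduler(staircase_lr)])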

answered Oct 14 '22 by Frayal