Does it make sense to use a dynamic learning rate with AdamOptimizer?

I'm developing a convolutional neural network for image recognition with three classes of my own. I built an AlexNet-based model to train. I'd like to know two things:

  1. Does AdamOptimizer perform learning rate decay internally (from the fixed value you give it), or not?
  2. If not, can I use tf.train.exponential_decay to perform the decay?

Small examples are appreciated. Thanks.

Kyrol asked Jan 03 '17



1 Answer

As you can see in adam.py, AdamOptimizer already adjusts its learning rate internally.

The learning rate you pass to the constructor just gives the initial value to start with.
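For instance, a minimal sketch with the TF 1.x API (the toy loss and the 0.001 value are just placeholders, not anything specific to your model):

    import tensorflow as tf

    # Toy loss so the snippet is self-contained; in practice this is
    # your network's loss tensor.
    w = tf.Variable(5.0)
    loss = tf.square(w)

    # The 0.001 passed here is only the starting step size; Adam then
    # adapts the effective per-parameter step sizes from its moment
    # estimates as training progresses.
    optimizer = tf.train.AdamOptimizer(learning_rate=0.001)
    train_op = optimizer.minimize(loss)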

So it does not make much sense to add exponential decay on top of AdamOptimizer; it is more useful with plain gradient descent or a momentum optimizer. See here for an example.
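A minimal sketch of that decay approach with plain gradient descent (again TF 1.x API; the toy loss and the decay hyperparameters are placeholders to illustrate the mechanics):

    import tensorflow as tf

    # Toy loss, standing in for your network's loss tensor.
    w = tf.Variable(5.0)
    loss = tf.square(w)

    # Step counter that minimize() will increment on every training step.
    global_step = tf.Variable(0, trainable=False)

    # Decay the base rate of 0.1 by a factor of 0.96 every 10,000 steps
    # (tune these numbers for your own problem).
    learning_rate = tf.train.exponential_decay(
        learning_rate=0.1,
        global_step=global_step,
        decay_steps=10000,
        decay_rate=0.96,
        staircase=True)

    optimizer = tf.train.GradientDescentOptimizer(learning_rate)
    # Passing global_step makes minimize() advance the counter, which in
    # turn drives the decay schedule.
    train_op = optimizer.minimize(loss, global_step=global_step)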

bodokaiser answered Oct 04 '22
