I'm developing a convolutional neural network for image recognition with three custom classes, and I built an AlexNet-based model to train it. I'd like to know two things: does AdamOptimizer adjust its learning rate on its own, or do I need to use tf.train.exponential_decay to perform the decay? Small examples are appreciated. Thanks.
As you can see in adam.py, AdamOptimizer adjusts its effective learning rate on its own; the learning rate you pass to the constructor just gives the initial value to start with. So yes, it does not make much sense to use exponential decay with AdamOptimizer, but it does with plain gradient descent or a momentum optimizer. See here for an example.
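To make this concrete, here is a minimal sketch using the TensorFlow 1.x API. The toy loss, variable names, and hyperparameter values are placeholders I chose for illustration, not from the original answer; in practice the loss would come from your AlexNet model.

```python
import tensorflow as tf  # TensorFlow 1.x, where tf.train.exponential_decay lives

# Hypothetical toy problem: minimize (w - 3)^2.
w = tf.Variable(5.0)
loss = tf.square(w - 3.0)

# global_step counts training steps and drives the decay schedule.
global_step = tf.Variable(0, trainable=False)

# Exponentially decayed rate: lr = 0.1 * 0.96^(global_step / 1000)
learning_rate = tf.train.exponential_decay(
    learning_rate=0.1,       # initial learning rate
    global_step=global_step,
    decay_steps=1000,        # decay interval (steps)
    decay_rate=0.96,
    staircase=True)          # decay in discrete steps rather than continuously

# The decayed rate pairs naturally with plain gradient descent (or momentum).
# Passing global_step makes minimize() increment it on every training step.
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(
    loss, global_step=global_step)

# With Adam, a fixed initial rate is usually enough, since Adam adapts
# per-parameter step sizes internally:
# train_op = tf.train.AdamOptimizer(learning_rate=0.001).minimize(loss)
```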