For my reinforcement learning application, I need to apply custom gradients / minimize a changing loss function. According to the documentation, this should be possible with the Optimizer.minimize() function. However, my pip-installed version appears not to have this method at all.
My code:
import tensorflow as tf
from tensorflow.python.keras.optimizers import Adam, SGD
print(tf.version.VERSION)
optim = Adam()
optim.minimize(loss, var_list=network.weights)
output:
2.0.0-alpha0
Traceback (most recent call last):
File "/Users/ikkamens/Library/Preferences/PyCharmCE2018.3/scratches/testo.py", line 18, in <module>
optim.minimize(loss, var_list=network.weights)
AttributeError: 'Adam' object has no attribute 'minimize'
Calling minimize() takes care of both computing the gradients and applying them to the variables. If you want to process the gradients before applying them, you can instead use the optimizer in three steps: compute the gradients with tf.GradientTape, process the gradients as you wish, and apply the processed gradients with apply_gradients().
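As a minimal sketch of those three steps (the variable, loss, and gradient clipping below are made-up stand-ins, not the question's actual network):

import tensorflow as tf

# Hypothetical variable and loss, purely for illustration.
w = tf.Variable([1.0, 2.0])
optimizer = tf.optimizers.Adam(learning_rate=0.01)

# Step 1: compute the gradients with tf.GradientTape.
with tf.GradientTape() as tape:
    loss = tf.reduce_sum(tf.square(w))
grads = tape.gradient(loss, [w])

# Step 2: process the gradients as you wish (here: clip their norm).
grads = [tf.clip_by_norm(g, 1.0) for g in grads]

# Step 3: apply the processed gradients to the variables.
optimizer.apply_gradients(zip(grads, [w]))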
To use a custom learning rate, simply instantiate an SGD optimizer and pass the argument learning_rate=0.01 .
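For example (assuming the v2 tf.keras optimizer, whose keyword is learning_rate rather than the older lr):

import tensorflow as tf

# v2 SGD optimizer with a custom learning rate.
sgd = tf.keras.optimizers.SGD(learning_rate=0.01)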
Adam optimization is a stochastic gradient descent method that is based on adaptive estimation of first-order and second-order moments.
Actually there is a difference. If you print both classes, you'll see:
import tensorflow as tf
from tensorflow.python.keras.optimizers import Adam
print(Adam)
print(tf.optimizers.Adam)
<class 'tensorflow.python.keras.optimizers.Adam'>
<class 'tensorflow.python.keras.optimizer_v2.adam.Adam'>
So in the first case, Adam inherits from a different class. It's meant to be used inside a Keras training loop, and therefore it doesn't have a minimize method. To make sure, let's list all the class methods:
import inspect
from tensorflow.python.keras.optimizers import Adam
print(inspect.getmembers(Adam(), predicate=inspect.ismethod))
The output shows that this class doesn't even have a minimize method.
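The v2 implementation exposed as tf.optimizers.Adam (an alias of tf.keras.optimizers.Adam) does have minimize. A minimal sketch, with a hypothetical variable and a callable loss standing in for the question's network.weights and loss:

import tensorflow as tf

# Hypothetical stand-ins for the question's network weights and loss.
w = tf.Variable([1.0, 2.0, 3.0])

def loss():
    # In TF 2.x eager mode, minimize() expects the loss as a zero-argument callable.
    return tf.reduce_sum(tf.square(w))

optim = tf.optimizers.Adam()
optim.minimize(loss, var_list=[w])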