I'm training a VAE with the TensorFlow Keras backend and I'm using Adam as the optimizer. The code I'm using is attached below.
def compile(self, learning_rate=0.0001):
    optimizer = tf.keras.optimizers.Adam(learning_rate=learning_rate)
    self.model.compile(optimizer=optimizer,
                       loss=self._calculate_combined_loss,
                       metrics=[_calculate_reconstruction_loss,
                                calculate_kl_loss(self)])
The TensorFlow version I'm using is 2.11.0. The error I'm getting is
AttributeError: 'Adam' object has no attribute 'get_updates'
I suspect the issue arises because of a version mismatch. Can someone please help me sort out the issue? Thanks in advance.
Try replacing your 2nd line, "optimizer = tf.keras.optimizers.Adam(learning_rate=learning_rate)", with "optimizer = tf.keras.optimizers.legacy.Adam(learning_rate=learning_rate)".
For further information, check the TF 2.11.0 release notes (11/28/2022) at https://github.com/tensorflow/tensorflow/releases. They state in particular: "The tf.keras.optimizers.Optimizer base class now points to the new Keras optimizer, while the old optimizers have been moved to the tf.keras.optimizers.legacy namespace."
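Applied to the compile method from the question, the change would look roughly like this (a minimal sketch; self.model and the loss/metric helpers are assumed to be defined on your VAE class exactly as in your snippet):

    import tensorflow as tf

    def compile(self, learning_rate=0.0001):
        # The legacy namespace keeps the pre-2.11 optimizer implementation,
        # which still exposes the old API (e.g. get_updates) that the
        # Keras training code in older examples relies on.
        optimizer = tf.keras.optimizers.legacy.Adam(learning_rate=learning_rate)
        self.model.compile(optimizer=optimizer,
                           loss=self._calculate_combined_loss,
                           metrics=[_calculate_reconstruction_loss,
                                    calculate_kl_loss(self)])

Everything else stays the same; only the optimizer class is swapped for its legacy counterpart.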