I have a model that, based on certain conditions, has some unconnected gradients, and this is exactly what I want. But Tensorflow is printing out a Warning every time it encounters the unconnected gradient.
WARNING:tensorflow:Gradients do not exist for variables
Is there any way to only suppress this specific warning? I don't want to blindly suppress all warnings since there might be unexpected (and potentially useful) warnings in the future as I'm still working on my model.
So to knock out these warnings in a single blow: do import warnings, then warnings.filterwarnings('ignore'), then run your TensorFlow imports and the code that triggers the warning, then turn warnings back on via warnings.resetwarnings().
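As a minimal, pure-Python sketch of that suppress-then-restore pattern (the `run_quietly` helper and the `noisy` function are made up for illustration; note that TensorFlow routes some of its warnings through its own logger rather than the warnings module, so this may not catch every message):

```python
import warnings

def run_quietly(fn):
    """Run fn with all Python warnings suppressed, then restore the filters."""
    warnings.filterwarnings('ignore')   # suppress everything
    try:
        return fn()
    finally:
        warnings.resetwarnings()        # turn warnings back on

def noisy():
    # Stand-in for code that emits a warning you want to silence.
    warnings.warn("gradients do not exist for variables")
    return 42

result = run_quietly(noisy)             # the warning is swallowed
```

If warnings escape this net, they are likely going through TensorFlow's logger instead, which the warnings module does not control.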
A somewhat hacky way: filter out the unconnected (None) gradients before applying them, so apply_gradients never sees them and never warns:
gradients = tape.gradient(loss, model.trainable_variables)
# Keep only the (grad, var) pairs whose gradient actually exists;
# variables not connected to the loss come back with grad == None.
optimizer.apply_gradients([
    (grad, var)
    for (grad, var) in zip(gradients, model.trainable_variables)
    if grad is not None
])
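The filtering step itself is plain Python; with hypothetical stand-in values for the gradients and variables, it behaves like this sketch:

```python
# Stand-ins for illustration: in the real code, `gradients` comes from
# tape.gradient and contains None for variables unconnected to the loss.
gradients = [0.5, None, -1.25]
variables = ["w1", "w2", "w3"]

# Same list comprehension as in the answer above: pair each gradient with
# its variable and drop the pairs whose gradient is None.
grads_and_vars = [
    (grad, var)
    for (grad, var) in zip(gradients, variables)
    if grad is not None
]
# grads_and_vars now holds only the connected pairs: w1 and w3.
```

Alternatively, tf.GradientTape.gradient accepts unconnected_gradients=tf.UnconnectedGradients.ZERO, which returns zero tensors instead of None; that silences the warning too, but applies (no-op) zero updates rather than skipping the variables entirely.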