Is there a master list of Tensorflow ops that are differentiable (i.e., will auto-differentiate)?
Two other ways to phrase this:

- Which ops are registered with ops.NoGradient (i.e., have no gradient implementation)?
- Which ops will cause tf.gradients to raise a LookupError?

For example, I'd assume that all the control-flow ops are not differentiable (e.g., tf.where). How would I find this out, other than by manually running every op through tf.gradients and seeing which ones throw a LookupError?
"Commonsense" is not a valid answer.
Thanks.
EDIT: tf.where is differentiable, so my intuitions were wrong. Perhaps the correct question here is: which ops in TensorFlow are not differentiable?
I have compiled a full list of differentiable and non-differentiable ops using Python code. You will find the compact list, along with the code that generated it, here:
https://github.com/Mainak431/List-of-Differentiable--OPs-and-Non-differentiable-OPs--in-Tensorflow
No, there is no such list (you could be the first one to create it). Also, as far as I am aware, the documentation of each function does not mention it either (tf.size is non-differentiable, but its documentation does not say so).

Apart from the way you suggested, you can also extract this data from the source code: every op that has a gradient implemented has @ops.RegisterGradient in front of the method declaration, while ops without a gradient are registered with a call like ops.NotDifferentiable("Size").
Not directly related, but probably helpful: for TensorFlow 2, such a list is available in the documentation for the tf.raw_ops module.