
How to compute gradient of output wrt input in Tensorflow 2.0

I have a trained Tensorflow 2.0 model (from tf.keras.Sequential()) that takes an input layer with 26 columns (X) and produces an output layer with 1 column (Y).

In TF 1.x I was able to calculate the gradient of the output with respect to the input with the following:

model = load_model('mymodel.h5')
sess = K.get_session()
grad_func = tf.gradients(model.output, model.input)
gradients = sess.run(grad_func, feed_dict={model.input: X})[0]

In TF2 when I try to run tf.gradients(), I get the error:

RuntimeError: tf.gradients is not supported when eager execution is enabled. Use tf.GradientTape instead.

In the question In TensorFlow 2.0 with eager-execution, how to compute the gradients of a network output wrt a specific layer?, we see an answer on how to calculate gradients with respect to intermediate layers, but I don't see how to apply this to gradients with respect to the inputs. On the Tensorflow help for tf.GradientTape, there are examples with calculating gradients for simple functions, but not neural networks.

How can tf.GradientTape be used to calculate the gradient of the output with respect to the input?

asked Dec 02 '19 by maurera

People also ask

How does gradient work in TensorFlow?

TensorFlow "records" relevant operations executed inside the context of a tf.GradientTape onto a "tape". TensorFlow then uses that tape to compute the gradients of the "recorded" computation using reverse-mode differentiation.
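A minimal sketch of that recording mechanism, using a simple scalar function rather than a network:

```python
import tensorflow as tf

x = tf.Variable(3.0)

with tf.GradientTape() as tape:
    y = x ** 2  # this operation is recorded on the tape

# reverse-mode differentiation over the recorded ops: dy/dx = 2*x
dy_dx = tape.gradient(y, x)
print(dy_dx)  # tf.Tensor(6.0, ...)
```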

What is TF gradient?

TensorFlow is an open-source Python library designed by Google to develop machine learning models and deep learning neural networks. tf.gradients() is used to get symbolic derivatives of the sum of ys w.r.t. each x in xs. It does not work when eager execution is enabled.

Does TensorFlow use Autograd?

Behind the scenes, TensorFlow is a tensor library with automatic differentiation capability. Hence you can easily use it to solve a numerical optimization problem with gradient descent. In this post, you will learn how TensorFlow's automatic differentiation engine, autograd, works.

How does TensorFlow compute derivatives?

Tensorflow calculates derivatives using automatic differentiation. This is different from symbolic differentiation and numeric differentiation (aka finite differences). More than a smart math approach, it is a smart programming approach.
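To see the difference in practice, here is a hedged sketch comparing the automatic-differentiation result from a GradientTape against a central finite-difference approximation of the same function (the function f is chosen arbitrarily for illustration):

```python
import tensorflow as tf

def f(x):
    return tf.sin(x) * x

x = tf.Variable(1.5)
with tf.GradientTape() as tape:
    y = f(x)
auto_grad = tape.gradient(y, x)  # exact derivative via reverse mode

# numeric differentiation (finite differences) for comparison
eps = 1e-4
num_grad = (f(tf.constant(1.5 + eps)) - f(tf.constant(1.5 - eps))) / (2 * eps)
```

The two values agree to several decimal places, but the tape result is exact up to floating-point error, with no step-size tuning.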


1 Answer

This should work in TF2:

import numpy as np
import tensorflow as tf

# batch of 25 samples with 26 features, matching the model's input layer
inp = tf.Variable(np.random.normal(size=(25, 26)), dtype=tf.float32)

with tf.GradientTape() as tape:
    preds = model(inp)

grads = tape.gradient(preds, inp)

Basically you do it the same way as in TF1, but with a GradientTape instead of a session. Note that the tape only tracks tf.Variables automatically; if your input is a plain tensor, call tape.watch(inp) inside the tape context so the gradient can be computed.
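Putting it together for the question's setup, here is a self-contained sketch. The Sequential model below is a hypothetical stand-in for the trained mymodel.h5 (26 input columns, 1 output), and the input is fed as a watched tensor rather than a Variable:

```python
import numpy as np
import tensorflow as tf

# hypothetical stand-in for the trained model: 26 inputs -> 1 output
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(26,)),
    tf.keras.layers.Dense(1),
])

X = np.random.normal(size=(5, 26)).astype(np.float32)

inp = tf.convert_to_tensor(X)
with tf.GradientTape() as tape:
    tape.watch(inp)          # plain tensors must be watched explicitly
    preds = model(inp)

# one gradient row per sample: shape (5, 26), same as the input batch
grads = tape.gradient(preds, inp)
```

The resulting grads array plays the role of `sess.run(grad_func, ...)` in the TF1 snippet from the question.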

answered Sep 30 '22 by Dr. Snoopy