
TensorFlow: How can I inspect gradients and weights in eager execution?

I am using TensorFlow 1.12 in eager execution, and I want to inspect the values of my gradients and my weights at different points during training for debugging purposes. This answer uses TensorBoard to get nice graphs of weight and gradient distribution over epochs, which is what I would like. However, when I use Keras' TensorBoard callback, I get this:

WARNING:tensorflow:Weight and gradient histograms not supported for eager execution, setting `histogram_freq` to `0`.

In other words, this is not compatible with eager execution. Is there any other way to print gradients and/or weights? Most non-TensorBoard answers seem to rely on graph-based execution.

EmielBoss asked Sep 08 '19

People also ask

How do you find the gradient in TensorFlow?

If you want to access the gradients that are computed by the optimizer, you can call optimizer.compute_gradients() and optimizer.apply_gradients() manually, instead of calling optimizer.minimize().
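
A minimal sketch of that pattern under TF 1.x eager execution (the toy layer, data and optimizer below are assumptions for illustration, not taken from the question); note that in eager mode compute_gradients() expects the loss as a zero-argument callable:

import tensorflow as tf

tf.enable_eager_execution()

x = tf.ones(shape=(4, 3))
y = tf.ones(shape=(4, 1))
dense = tf.layers.Dense(1)
dense(x)  # call once so the layer builds its variables

optimizer = tf.train.GradientDescentOptimizer(0.1)

def loss_fn():
    # Zero-argument callable, as required by compute_gradients in eager mode
    return tf.losses.mean_squared_error(y, dense(x))

# Returns a list of (gradient, variable) pairs
grads_and_vars = optimizer.compute_gradients(loss_fn, var_list=dense.variables)
for grad, var in grads_and_vars:
    print(var.name, grad.numpy())

optimizer.apply_gradients(grads_and_vars)  # applies one update step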

Is TensorFlow eager execution slower?

Eager execution is slower than graph execution! Since eager execution runs all operations one by one in Python, it cannot take advantage of potential acceleration opportunities.
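
A rough way to see the difference (a hedged sketch, not a rigorous benchmark; tf.contrib.eager.defun is used here to trace the same Python function into a graph):

import time
import tensorflow as tf

tf.enable_eager_execution()

def matmul_chain(x):
    for _ in range(100):
        x = tf.matmul(x, x)
    return x

x = tf.random_normal(shape=(100, 100))

start = time.time()
matmul_chain(x)                                  # every op dispatched from Python
print('eager: %.4fs' % (time.time() - start))

compiled = tf.contrib.eager.defun(matmul_chain)  # trace the function into a graph
compiled(x)                                      # first call pays the tracing cost
start = time.time()
compiled(x)
print('graph: %.4fs' % (time.time() - start))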

What is eager execution mode in TensorFlow?

With eager execution enabled, TensorFlow functions execute operations immediately (as opposed to adding to a graph to be executed later in a tf.compat.v1.Session) and return concrete values (as opposed to symbolic references to a node in a computational graph).
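
A small illustration of that behaviour (the values are arbitrary):

import tensorflow as tf

tf.enable_eager_execution()

a = tf.constant([[1.0, 2.0]])
b = tf.constant([[3.0], [4.0]])

# The matmul runs immediately and returns a concrete eager tensor, not a graph node
c = tf.matmul(a, b)
print(c)          # tf.Tensor([[11.]], shape=(1, 1), dtype=float32)
print(c.numpy())  # [[11.]]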

What does with TF GradientTape () as tape do?

Gradient tapes: TensorFlow "records" relevant operations executed inside the context of a tf.GradientTape onto a "tape". TensorFlow then uses that tape to compute the gradients of a "recorded" computation using reverse-mode differentiation.
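
For example, a minimal tape that computes dy/dx for y = x * x (constants have to be watched explicitly; tf.Variable objects are watched automatically):

import tensorflow as tf

tf.enable_eager_execution()

x = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(x)  # constants are not tracked unless explicitly watched
    y = x * x      # recorded on the tape
dy_dx = tape.gradient(y, x)
print(dy_dx)       # tf.Tensor(6.0, shape=(), dtype=float32)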


1 Answer

In eager execution, you can directly print the weights. As for the gradients, you can use tf.GradientTape to get the gradients of the loss function with respect to some weights. Here is an example showing how to print gradients and weights:

import tensorflow as tf

tf.enable_eager_execution()

x = tf.ones(shape=(4, 3))
y = tf.ones(shape=(4, 1))
dense = tf.layers.Dense(1)

# Print gradients (here only w.r.t. the kernel; pass e.g. dense.variables to get all of them)
with tf.GradientTape() as t:
    h = dense(x)
    loss = tf.losses.mean_squared_error(y, h)
gradients = t.gradient(loss, dense.kernel)
print('Gradients: ', gradients)

# Print weights (get_weights() returns the kernel and bias as NumPy arrays)
weights = dense.get_weights()
print('Weights: ', weights)
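
To watch the values evolve during training, as the question asks, the same pattern can be wrapped in a loop. A rough sketch continuing the snippet above, assuming a plain gradient-descent optimizer and differentiating with respect to all of the layer's variables:

optimizer = tf.train.GradientDescentOptimizer(0.1)

for step in range(5):
    with tf.GradientTape() as t:
        h = dense(x)
        loss = tf.losses.mean_squared_error(y, h)
    # Differentiate w.r.t. every variable of the layer (kernel and bias)
    grads = t.gradient(loss, dense.variables)
    optimizer.apply_gradients(zip(grads, dense.variables))

    print('Step', step, 'loss:', loss.numpy())
    for var, grad in zip(dense.variables, grads):
        print(var.name, 'gradient:', grad.numpy())
    print('Weights:', dense.get_weights())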
rvinas answered Sep 22 '22