I am using TensorFlow 1.12 in eager execution, and I want to inspect the values of my gradients and my weights at different points during training for debugging purposes. This answer uses TensorBoard to get nice graphs of weight and gradient distribution over epochs, which is what I would like. However, when I use Keras' TensorBoard callback, I get this:
WARNING:tensorflow:Weight and gradient histograms not supported for eager execution, setting `histogram_freq` to `0`.
In other words, this is not compatible with eager execution. Is there any other way to print gradients and/or weights? Most non-TensorBoard answers seem to rely on graph-based execution.
If you want to access the gradients that are computed for the optimizer, you can call optimizer.compute_gradients() and optimizer.apply_gradients() manually, instead of calling optimizer.minimize().
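For instance, here is a minimal sketch of that approach in TF 1.x eager mode (the GradientDescentOptimizer, learning rate, and toy dense layer below are only illustrative choices, not something prescribed by the question). Note that with eager execution enabled, compute_gradients() expects the loss as a zero-argument callable:

import tensorflow as tf

tf.enable_eager_execution()

x = tf.ones(shape=(4, 3))
y = tf.ones(shape=(4, 1))
dense = tf.layers.Dense(1)
dense(x)  # call the layer once so its variables are created

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)

def loss_fn():
    # With eager execution enabled, the loss must be a zero-argument callable
    return tf.losses.mean_squared_error(y, dense(x))

grads_and_vars = optimizer.compute_gradients(loss_fn, var_list=dense.trainable_variables)
for grad, var in grads_and_vars:
    print(var.name, grad)  # inspect each gradient before it is applied

optimizer.apply_gradients(grads_and_vars)  # then update the weights

Because the gradients come back as plain (gradient, variable) pairs, you can print or log them before apply_gradients() changes the variables.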
Note that eager execution is slower than graph execution: since it runs all operations one by one from Python, it cannot take advantage of the whole-graph optimizations that graph mode can apply.
With eager execution enabled, TensorFlow functions execute operations immediately (as opposed to adding them to a graph to be executed later in a tf.compat.v1.Session) and return concrete values (as opposed to symbolic references to a node in a computational graph).
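As a quick illustration of that behavior (a tiny snippet of my own, assuming TF 1.x with eager execution turned on):

import tensorflow as tf

tf.enable_eager_execution()

a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.matmul(a, a)

# The op has already run: b holds a concrete result, not a symbolic graph node
print(b)          # tf.Tensor with shape=(2, 2), dtype=float32 and the actual values
print(b.numpy())  # the underlying NumPy array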
Gradient tapes: TensorFlow "records" relevant operations executed inside the context of a tf.GradientTape onto a "tape". TensorFlow then uses that tape to compute the gradients of the "recorded" computation using reverse-mode differentiation.
In eager execution, you can directly print the weights. As for the gradients, you can use tf.GradientTape to get the gradients of the loss function with respect to some weights. Here is an example showing how to print gradients and weights:
import tensorflow as tf

tf.enable_eager_execution()

# Dummy data and a single dense layer
x = tf.ones(shape=(4, 3))
y = tf.ones(shape=(4, 1))
dense = tf.layers.Dense(1)

# Print gradients: record the forward pass and the loss on a tape, then differentiate
with tf.GradientTape() as t:
    h = dense(x)
    loss = tf.losses.mean_squared_error(y, h)
gradients = t.gradient(loss, dense.kernel)
print('Gradients: ', gradients)

# Print weights: the layer is built after its first call, so its weights now exist
weights = dense.get_weights()
print('Weights: ', weights)
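If you would still like the histograms in TensorBoard rather than printed to the console, one possible workaround in TF 1.12 is the eager-compatible tf.contrib.summary API. This is only a rough sketch under that assumption; the log directory, number of steps, and tag names are placeholders:

import tensorflow as tf

tf.enable_eager_execution()

x = tf.ones(shape=(4, 3))
y = tf.ones(shape=(4, 1))
dense = tf.layers.Dense(1)
optimizer = tf.train.GradientDescentOptimizer(0.1)

writer = tf.contrib.summary.create_file_writer('./logs')  # placeholder log dir
global_step = tf.train.get_or_create_global_step()         # used as the summary step

with writer.as_default(), tf.contrib.summary.always_record_summaries():
    for step in range(5):  # placeholder training loop
        with tf.GradientTape() as t:
            loss = tf.losses.mean_squared_error(y, dense(x))
        grads = t.gradient(loss, dense.trainable_variables)
        global_step.assign_add(1)
        # Write one histogram per variable for the weight and its gradient
        for i, (var, grad) in enumerate(zip(dense.trainable_variables, grads)):
            tf.contrib.summary.histogram('weight_%d' % i, var)
            tf.contrib.summary.histogram('gradient_%d' % i, grad)
        optimizer.apply_gradients(zip(grads, dense.trainable_variables))

You can then point TensorBoard at ./logs and look at the Histograms tab.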