Why is TensorFlow giving me the runtime error "Trying to call tape.gradient on a non-persistent tape while it is still active"?
I'm using WinPython3.5.4.2 and have installed TensorFlow 1.8.0. I've been following the tutorial at https://www.tensorflow.org/get_started/eager up to the section titled "Training loop".
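The loss and grad helpers in my notebook are essentially the ones from that tutorial, retyped here from memory so the details may differ slightly:

def loss(model, x, y):
    y_ = model(x)
    return tf.losses.sparse_softmax_cross_entropy(labels=y, logits=y_)

def grad(model, inputs, targets):
    with tf.GradientTape() as tape:
        loss_value = loss(model, inputs, targets)
        return tape.gradient(loss_value, model.variables)

The error is raised as soon as the training loop calls grad(model, x, y). Here is the full traceback: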
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-9-e08164fd8374> in <module>()
14 for x, y in train_dataset:
15 # Optimize the model
---> 16 grads = grad(model, x, y)
17 optimizer.apply_gradients(zip(grads, model.variables),
18 global_step=tf.train.get_or_create_global_step())
<ipython-input-7-08164b502799> in grad(model, inputs, targets)
6 with tf.GradientTape() as tape:
7 loss_value = loss(model, inputs, targets)
----> 8 return tape.gradient(loss_value, model.variables)
C:\[deleted]\WinPython3.5.4.2\python-3.5.4.amd64\lib\site-packages\tensorflow\python\eager\backprop.py in gradient(self, target, sources, output_gradients)
765 flat_grad = imperative_grad.imperative_grad(
766 _default_vspace, self._tape, [target], flat_sources,
--> 767 output_gradients=output_gradients)
768
769 if not self._persistent:
C:\[deleted]\WinPython3.5.4.2\python-3.5.4.amd64\lib\site-packages\tensorflow\python\eager\imperative_grad.py in imperative_grad(vspace, tape, target, sources, output_gradients)
61 """
62 return pywrap_tensorflow.TFE_Py_TapeGradient(
---> 63 tape._tape, vspace, target, sources, output_gradients) # pylint: disable=protected-access
RuntimeError: Trying to call tape.gradient on a non-persistent tape while it is still active.
I suspect that in your sample you're invoking tape.gradient() within the with tf.GradientTape() context, as opposed to outside of it. Changing from:
with tf.GradientTape() as tape:
    loss_value = loss(model, inputs, targets)
    return tape.gradient(loss_value, model.variables)
to
with tf.GradientTape() as tape:
    loss_value = loss(model, inputs, targets)
# Notice the change in indentation of the line below
return tape.gradient(loss_value, model.variables)
should cause the error to go away.
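For completeness, the whole grad helper should then look roughly like this (names as in the tutorial; the rest of the training loop stays unchanged):

def grad(model, inputs, targets):
    # Record only the forward pass / loss computation on the tape...
    with tf.GradientTape() as tape:
        loss_value = loss(model, inputs, targets)
    # ...and compute the gradients after the tape context has exited.
    return tape.gradient(loss_value, model.variables)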
TensorFlow operations executed within the context of a GradientTape are "recorded" so that the recorded computation can later be differentiated. This recording costs memory (since tensors materialized by intermediate operations have to be kept alive). Invoking tape.gradient() within the GradientTape context manager would mean that the gradient computation should be recorded as well, and tensors created during the gradient computation would need to be kept alive. Often this isn't what the user intended: the tape.gradient() call is only accidentally inside the context manager, leading to a larger memory footprint than necessary. Hence the error. Though, arguably, the error message isn't particularly well phrased (and I believe it will be improved in releases after TensorFlow 1.8).
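To make the intended pattern concrete, here is a minimal standalone sketch (TF 1.8 eager style; tfe stands for tf.contrib.eager, as in the tutorial). The multiplication is recorded while the tape is active, and the gradient is computed only after the with block has exited:

import tensorflow as tf
import tensorflow.contrib.eager as tfe

tf.enable_eager_execution()

x = tfe.Variable(4.0)
with tf.GradientTape() as tape:
    y = x * x          # recorded: the tape is active here
# The tape is no longer active here, so calling gradient() is fine,
# and the tape's resources are released right after the call.
dy_dx = tape.gradient(y, x)
print(dy_dx)           # ~8.0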
Quoting from the documentation:

By default, the resources held by a GradientTape are released as soon as the GradientTape.gradient() method is called. To compute multiple gradients over the same computation, create a persistent gradient tape. This allows multiple calls to the gradient() method as resources are released when the tape object is garbage collected.
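For example, with a default (non-persistent) tape a second gradient() call fails; if I remember correctly, with a RuntimeError along the lines of "GradientTape.gradient can only be called once on non-persistent tapes":

x = tfe.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x
dy = tape.gradient(y, x)    # fine: the first (and only allowed) call
# dz = tape.gradient(y, x)  # RuntimeError: the non-persistent tape's
#                           # resources were already released above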
So, if you really do want to record the gradient computation (for example, to compute second-order derivatives), then you could create a persistent tape and keep the .gradient() call inside the context manager. For example:
x = tfe.Variable(3.0)
with tf.GradientTape(persistent=True) as g:
    y = x * x
    # gradient() is called inside the context, so this gradient
    # computation is itself recorded on the tape...
    dy = g.gradient(y, x)
# ...which makes it possible to differentiate it again afterwards.
d2y = g.gradient(dy, x)
print(dy)
print(d2y)
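With x = 3.0 this should print a first derivative of 6.0 and a second derivative of 2.0. Since the tape is persistent, its resources are only released once the tape object is garbage collected, so it's reasonable to drop the reference when you're done with it:

del g  # allow the persistent tape's resources to be released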
Eager execution is a relatively new feature in TensorFlow, and feedback on it is more than welcome. If you think that the error message could be better (it could be!) and/or that the default should be changed (for example, persistent by default, so that users particularly concerned about memory overheads could explicitly choose a non-persistent tape), don't hesitate to chime in by providing feedback on GitHub.
Hope that helps!