I am trying to adapt the tf DeepDream tutorial code to work with another model. Right now when I call tf.gradients():
t_grad = tf.gradients(t_score, t_input)[0]
g = sess.run(t_grad, {t_input:img0})
I am getting a type error:
TypeError: Fetch argument None of None has invalid type <type 'NoneType'>,
must be a string or Tensor. (Can not convert a NoneType into a Tensor or
Operation.)
Where should I even start to look for fixing this error?
Is it possible to use tf.gradients() with a model that has an Optimizer in it?
I'm guessing your t_grad has some Nones in it. None is mathematically equivalent to a 0 gradient, but it is returned in the special case when the cost doesn't depend on the argument it is differentiated against. There are various reasons why we don't just return 0 instead of None, which you can see in the discussion here.
Because None can be annoying in cases like the one above, or when computing second derivatives, I use the helper function below:
def replace_none_with_zero(l):
    return [0 if i is None else i for i in l]
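For illustration, here is how the helper behaves on a plain Python list standing in for a gradients result (the values are made up):

```python
def replace_none_with_zero(l):
    # Substitute 0 for every None entry, leaving real gradients untouched.
    return [0 if i is None else i for i in l]

# A gradients list where some entries came back as None:
grads = [None, 1.5, None, -0.25]
print(replace_none_with_zero(grads))  # [0, 1.5, 0, -0.25]
```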
The following is a helpful tip for debugging tf.gradients() on an invalid pair of tensors:
grads = tf.gradients(<a tensor>, <another tensor that doesn't depend on the first>)
Even before you try to run tf.gradients in a session, you can see that it is invalid by printing it:
print(grads)
It will return [None], a list with a single None in it.
If you try to run it in a session anyway:
results = sess.run(grads)
you will not get None back; instead, you get the error message described in the question.
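To see the invalid case concretely, here is a minimal sketch (not from the original answer) using the graph-mode API via tf.compat.v1 so it also runs under TensorFlow 2; the placeholder names are illustrative:

```python
import tensorflow as tf

# Use TF1-style graph mode, where tf.gradients builds symbolic gradients.
tf.compat.v1.disable_eager_execution()

x = tf.compat.v1.placeholder(tf.float32, name="x")
y = tf.compat.v1.placeholder(tf.float32, name="y")
cost = y * y  # cost depends only on y, not on x

grads = tf.gradients(cost, x)  # x is unrelated to cost
print(grads)  # [None] -- visible before any session.run
```

Passing grads to sess.run at this point would raise the TypeError from the question, since None is not a fetchable tensor.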
For a valid pair of tensors:
grads = tf.gradients(<a tensor>, <a related tensor>)
print(grads)
You will get something like:
Tensor("gradients_1/sub_grad/Reshape:0", dtype=float32)
In a valid situation:
results = sess.run(grads, {<appropriate feeds>})
print(results)
You will get something like:
[array([[ 4.97156498e-06, 7.87349381e-06, 9.25197037e-06, ...,
8.72526925e-06, 6.78442757e-06, 3.85240173e-06],
[ 7.72772819e-06, 9.26370740e-06, 1.19129227e-05, ...,
1.27088233e-05, 8.76379818e-06, 6.00637532e-06],
[ 9.46506498e-06, 1.10620931e-05, 1.43903117e-05, ...,
1.40718612e-05, 1.08670165e-05, 7.12365863e-06],
...,
[ 1.03536004e-05, 1.03090524e-05, 1.32107480e-05, ...,
1.40605653e-05, 1.25974075e-05, 8.90011415e-06],
[ 9.69486427e-06, 8.18045282e-06, 1.12702282e-05, ...,
1.32554378e-05, 1.13317501e-05, 7.74569162e-06],
[ 5.61043908e-06, 4.93397192e-06, 6.33513537e-06, ...,
6.26539259e-06, 4.52598442e-06, 4.10689108e-06]], dtype=float32)]
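Here is an end-to-end sketch of the valid case (not from the original answer; again written against tf.compat.v1 graph mode, with a small vector so the gradient is easy to check by hand):

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

x = tf.compat.v1.placeholder(tf.float32, shape=[3], name="x")
cost = tf.reduce_sum(x * x)    # cost depends on x
grads = tf.gradients(cost, x)  # d(cost)/dx = 2*x, a real Tensor, not None

with tf.compat.v1.Session() as sess:
    result = sess.run(grads, {x: np.array([1.0, 2.0, 3.0], np.float32)})

print(result)  # [array([2., 4., 6.], dtype=float32)]
```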