 

Is the gradient in TensorFlow's graph calculated incorrectly?

A very simple example in TensorFlow: minimize (x + 1)^2 where x is a scalar. The code is:

import tensorflow as tf

x = tf.Variable(initial_value=3.0)
add = tf.add(x, 1)
y = tf.square(add)
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(y)

Then write the graph to disk:

graph = tf.get_default_graph()
writer = tf.summary.FileWriter("some/dir/to/write/events")
writer.add_graph(graph=graph)
writer.close()  # flush the event file to disk
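(The events directory can then be opened with tensorboard --logdir some/dir/to/write/events.)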

Finally, visualized in TensorBoard, the graph looks like this:

[TensorBoard screenshot: the computation graph, with the gradients subgraph connected to the Add node]

The question is, why is the "Add" node connected to the gradients? Since I am trying to minimize y, I would expect the "Square" node to be connected instead. Is this a bug? Can anyone explain it?

asked Jun 03 '17 by Jie.Zhou

1 Answer

There is no bug involved. You just need to understand what a gradient is and how to compute one yourself. The derivative of (x+1)^2 is 2*(x+1), which means that you do not need the value of (x+1)^2 to compute the gradient, only the value of (x+1). If you zoom into the gradients part of the graph, you will see that TensorFlow computed the gradient of your Square op and figured out which parts of the graph it needs for that, namely the output of Add:

[TensorBoard screenshot: zoomed-in gradients subgraph, consuming the output of the Add node]
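To check this numerically, here is a minimal sketch using tf.gradients (the same machinery the optimizer uses to build the gradients subgraph); with x = 3 the gradient evaluates to 2 * (3 + 1) = 8:

import tensorflow as tf

x = tf.Variable(initial_value=3.0)
add = tf.add(x, 1)
y = tf.square(add)

# tf.gradients builds the gradient subgraph; it only consumes
# the output of "add", never the output of "square"
grad = tf.gradients(y, x)[0]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grad))  # 8.0, i.e. 2 * (3 + 1)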

Here is a more interesting and more intuitive example:

import tensorflow as tf

x = tf.Variable(initial_value=3.0)
y = tf.cos(x)

train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(y)

with tf.Session() as sess:
    writer = tf.summary.FileWriter('logs', sess.graph)
    writer.close()

You should know that the derivative of cos(x) is -sin(x), which means that only x itself is needed to compute the gradient. And this is exactly what you see in the graph:

[TensorBoard screenshot: the gradients subgraph connected only to the variable x]
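To verify numerically, here is a minimal sketch along the same lines; with x = 3 the gradient evaluates to -sin(3.0), about -0.1411:

import math

import tensorflow as tf

x = tf.Variable(initial_value=3.0)
y = tf.cos(x)

# the gradient subgraph reads only x, since d(cos x)/dx = -sin(x)
grad = tf.gradients(y, x)[0]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grad))   # approx -0.14112
    print(-math.sin(3.0))   # matches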

answered Oct 09 '22 by Salvador Dali