I ran the following code to build a simple graph:
import tensorflow as tf

W = tf.Variable(tf.zeros([1, 3]), dtype=tf.float32, name="W")
B = tf.constant([[1, 2, 3]], dtype=tf.float32, name="B")
act = tf.add(W, B)
init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    sess.run(act)
    writer = tf.summary.FileWriter("./graphs", sess.graph)
    writer.close()
And verified it with TensorBoard:

What confuses me is the read operation and the operation prior to it, which is denoted as (W). Constant B is directed straight to the Add operation, while the tf.Variable has all these operation nodes inside. Here are my questions:
What is the (W) operation? Constant B is a regular circle, which denotes a constant, and oval-shaped nodes denote operation nodes. (W) doesn't seem like any operation, yet it is drawn with the same oval-shaped node. What is that node's job?
The Add node explicitly reads the (W) node with a read operation, as opposed to the constant node B. Why is read necessary for variable nodes?
A variable can be assigned to; its value can be changed. A constant is constant. More subtly: a constant's value is stored in the graph definition itself and is replicated wherever the graph is loaded, whereas a variable is stored separately and may live on a parameter server.
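A minimal sketch of that difference, using the same TF 1.x API as the question (the names v, c and update are just illustrative): assigning to a variable changes what later reads return, while a constant's value is fixed when the graph is built.

import tensorflow as tf

v = tf.Variable(0.0, name="v")      # mutable state, stored outside the graph definition
c = tf.constant(1.0, name="c")      # value embedded in the graph definition itself
update = tf.assign(v, v + c)        # op that overwrites the variable's stored tensor

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(v))    # 0.0 -- initial value
    sess.run(update)
    print(sess.run(v))    # 1.0 -- the variable's stored value has changed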
tf.constant is useful when you want the value embedded directly in the graph. If the dtype argument is not specified, the type is inferred from the type of value. tf.placeholder is used for input data, and tf.Variable is used to store the state of data. A tf.Variable represents a tensor whose value can be changed by running ops on it; specific ops allow you to read and modify the values of this tensor. Higher-level libraries like tf.keras use tf.Variable to store model parameters.
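A short sketch of those points with the TF 1.x API (variable names here are only illustrative):

import tensorflow as tf

a = tf.constant([1, 2, 3])                   # constant 1-D tensor from a Python list; dtype inferred as int32
b = tf.constant([1.0, 2.0, 3.0])             # dtype inferred as float32
x = tf.placeholder(tf.float32, [None, 3])    # input data, fed via feed_dict at run time
s = tf.Variable(tf.zeros([1, 3]), name="s")  # mutable state that persists across session runs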
For lack of a link to any intermediate-level documentation, this is my pragmatic, conceptual model of TensorFlow variables.
The following, from https://www.tensorflow.org/programmers_guide/graphs#visualizing_your_graph seems to at least imply an answer to your question.
"Executing v = tf.Variable(0) adds to the graph a tf.Operation that will store a writeable tensor value that persists between tf.Session.run calls. The tf.Variable object wraps this operation, and can be used like a tensor, which will read the current value of the stored value. The tf.Variable object also has methods such as assign and assign_add that create tf.Operation objects that, when executed, update the stored value."
And this from https://www.tensorflow.org/programmers_guide/variables
"Internally, a tf.Variable stores a persistent tensor. Specific ops allow you to read and modify the values of this tensor. "
And this from http://www.goldsborough.me/tensorflow/ml/ai/python/2017/06/28/20-21-45-a_sweeping_tour_of_tensorflow/
"variables are in-memory buffers containing tensors."
Note that the lines between nodes of a graph MUST be tensors. tf.constant(...) returns an instance of class Tensor. However, tf.Variable(...) returns not a Tensor instance, but an instance of class Variable:
x = tf.Variable(...)
print(type(x)) # <class 'tensorflow.python.ops.variables.Variable'>
y = tf.constant(...)
print(type(y)) # <class 'tensorflow.python.framework.ops.Tensor'>
To use a tf.Variable in an operation (whose arguments must be tensors), its value must first be "transformed" into a tensor; the read operation returns the "hidden" tensor that the variable represents as an ordinary Tensor.
Note the capital V in tf.Variable(...) and the small c in tf.constant(...). tf.Variable(...) instantiates an instance of the tf.Variable class, and the read node in the graph corresponds to a method on this class that returns the current value as a tensor. When a value is assigned to a variable, it modifies this "hidden" tensor.
On the other hand, at least conceptually, tf.constant(...) is a factory function which returns an instance of class Tensor.
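As a rough illustration of that read step (TF 1.x API), read_value() is the explicit form of the read node seen in TensorBoard; it returns a plain Tensor holding the variable's current value:

import tensorflow as tf

W = tf.Variable(tf.zeros([1, 3]), name="W")
r = W.read_value()        # the explicit "read": returns a Tensor with W's current value

print(type(W))            # a Variable object, not a Tensor
print(type(r))            # a Tensor, so it can be fed into ops such as tf.add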
It would be nice to have a link to some intermediate-level documentation about this.