 

Difference between nodes and operations

Tags:

tensorflow

I cannot understand the difference between nodes and operations in TensorFlow.

For example:

with open('myfile_1', 'w') as myfile:
    for n in tf.get_default_graph().as_graph_def().node:
        myfile.write(n.name + '\n')

with open('myfile_2', 'w') as myfile:
    for op in tf.get_default_graph().get_operations():
        myfile.write(op.name + '\n')

What goes into myfile_1 and what goes into myfile_2? And to which class/file do Variables belong?

Can we call all of them "Tensors"? I am a bit confused about the nomenclature here...

Following the suggestion in the comments, I add here the result for a simple graph:

tf.reset_default_graph()
x = tf.placeholder(tf.float32, [1])
y = 2 * x
z = tf.constant(3.0, dtype=tf.float32)
w = tf.get_variable('w', [2, 3], initializer=tf.zeros_initializer())

with open('myfile_1', 'w') as myfile:
    for n in tf.get_default_graph().as_graph_def().node:
        myfile.write(n.name + '\n')

with open('myfile_2', 'w') as myfile:
    for op in tf.get_default_graph().get_operations():
        myfile.write(op.name + '\n')

with tf.Session() as sess:
    print(sess.run(y, feed_dict={x: [3]}))

In this case, myfile_1 and myfile_2 both contain:

Placeholder
mul/x
mul
Const
w/Initializer/zeros
w
w/Assign
w/read
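
As a further experiment, the small sketch below (assuming the graph above is still the default graph) prints, for each of these names, the op and the names of the tensors it outputs:

for op in tf.get_default_graph().get_operations():
    # Each name listed above is an op; its output tensors carry an
    # output index, e.g. the 'Placeholder' op outputs 'Placeholder:0'.
    print(op.name, [t.name for t in op.outputs])
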
asked Nov 27 '25 by Thomas


2 Answers

A TensorFlow graph is a directed graph in which:

  • Nodes are operations (ops).
  • Directed edges are tensors.

For example, when you define:

x = tf.placeholder(tf.float32, shape=(None, 2))

x is a Tensor and it is the output of the Placeholder op:

print(x.op.type) # Placeholder
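
To keep the naming straight, the tensor and the op that produces it can both be looked up in the graph, under related but distinct names. A small sketch, assuming the placeholder x defined above and TensorFlow 1.x:

g = tf.get_default_graph()
print(x.name)      # e.g. 'Placeholder:0' -- the tensor (an edge)
print(x.op.name)   # e.g. 'Placeholder'   -- the op (a node)

# The graph can be queried for either object by its name:
print(g.get_operation_by_name(x.op.name) is x.op)  # True
print(g.get_tensor_by_name(x.name) is x)           # True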

as_graph_def() returns a SERIALIZED version (think of it as a text version) of the graph. get_operations() returns the actual operations, not their serialized representation. When you print these operations (or write them to a file) you get the same values, because the __str__() method of an operation returns its serialized form.
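
A quick way to see that the two calls return different kinds of objects (a sketch, assuming a default graph that already contains at least one op, such as the placeholder above):

g = tf.get_default_graph()
node = g.as_graph_def().node[0]   # a NodeDef protocol buffer (serialized node)
op = g.get_operations()[0]        # a tf.Operation (the live node in the graph)

print(type(node).__name__)        # NodeDef
print(type(op).__name__)          # Operation
print(str(node) == str(op))       # expected True: both print the same serialized NodeDef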

You're not always going to get the same values, though. For example:

import tensorflow as tf
import numpy as np
tf.reset_default_graph()

v = tf.Variable(np.random.normal([1]))
res1, res2 = [], []

# Serialized nodes without inferred output shapes ...
for n in v.graph.as_graph_def(add_shapes=False).node:
    res1.append(n.__str__())

# ... match the serialized form of the live operations exactly.
for op in tf.get_default_graph().get_operations():
    res2.append(op.__str__())
print(set(res1) == set(res2)) # True <-- exact same representation
res1, res2 = [], []

# With add_shapes=True, extra shape information is serialized into each node,
# so the two representations no longer match.
for n in v.graph.as_graph_def(add_shapes=True).node:
    res1.append(n.__str__())

for op in tf.get_default_graph().get_operations():
    res2.append(op.__str__())
print(set(res1) == set(set(res2)) if False else set(res1) == set(res2)) # False <-- not the same in this case!
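
The mismatch in the second case comes from the "_output_shapes" attribute that as_graph_def(add_shapes=True) adds to each serialized node in TF 1.x; the live operations do not carry it. A minimal check (a sketch, reusing v from above):

node = v.graph.as_graph_def(add_shapes=True).node[0]
op = v.graph.get_operations()[0]

print('_output_shapes' in node.attr)         # True: added during serialization
print('_output_shapes' in op.node_def.attr)  # False: not stored on the op itself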

For more details, you can refer to the original TensorFlow paper.

answered Nov 30 '25 by Vlad


I am going to answer the question directly:

Operations are nodes; they are the things that do the computations.

From the TensorFlow documentation: "An Operation is a node in a TensorFlow Graph that takes zero or more Tensor objects as input, and produces zero or more Tensor objects as output."

You can see this directly by inspecting an operation's inputs and outputs.
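
For instance (a small sketch, assuming TensorFlow 1.x; the tf.add example here is just for illustration):

import tensorflow as tf

a = tf.constant(1.0)
b = tf.constant(2.0)
c = tf.add(a, b)          # creates an 'Add' op (a node) in the default graph

print(c.op.type)          # Add
print(list(c.op.inputs))  # the two input tensors (a and b)
print(c.op.outputs)       # the output tensors, including c itself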

answered Nov 30 '25 by deep_geek