I have a variable that changes with train iterations. The variable is not computed as a part of the computational graph.
Is it possible to add it to the tensorflow summary in order to visualize it together with the loss function?
In TensorFlow, machine learning algorithms are represented as computational graphs. A computational graph is a directed graph whose nodes describe operations and whose edges represent the data (tensors) flowing between those operations.
Graphs are data structures that contain a set of tf.Operation objects, which represent units of computation, and tf.Tensor objects, which represent the units of data that flow between operations. They are defined in a tf.Graph context.
Why does TensorFlow use computational graphs? Because once the dependencies between operations are explicit, independent parts of the graph can be executed in parallel.
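To make the parallelism point concrete, here is a minimal sketch in plain Python (no TensorFlow): a toy graph where nodes are operations and edges are data dependencies. The graph, node names, and values are made up for illustration; the point is that "c" and "d" depend on different inputs and could run concurrently.

import concurrent.futures

# Toy computational graph: node -> (list of dependencies, operation).
# All names and values here are hypothetical.
graph = {
    "a": ([], lambda: 2.0),            # constant
    "b": ([], lambda: 3.0),            # constant
    "c": (["a"], lambda a: a * a),     # depends only on a
    "d": (["b"], lambda b: b + 1.0),   # depends only on b; independent of c
    "e": (["c", "d"], lambda c, d: c + d),
}

def evaluate(graph, node, cache=None):
    """Evaluate a node, evaluating its dependencies first."""
    if cache is None:
        cache = {}
    if node in cache:
        return cache[node]
    deps, fn = graph[node]
    # Independent dependencies are dispatched to a thread pool,
    # mimicking how a graph executor can run them in parallel.
    with concurrent.futures.ThreadPoolExecutor() as pool:
        args = list(pool.map(lambda d: evaluate(graph, d, cache), deps))
    cache[node] = fn(*args)
    return cache[node]

print(evaluate(graph, "e"))  # (2*2) + (3+1) = 8.0

A real executor schedules this much more cleverly, but the principle is the same: the graph structure is what tells the runtime which computations are safe to run at the same time.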
Yes, you can create summaries outside the graph.
Here is an example where the summary is created outside the graph (not as a TF op):
import tensorflow as tf

output_path = "/tmp/myTest"
summary_writer = tf.summary.FileWriter(output_path)

for x in range(100):
    myVar = 2 * x  # any Python value computed outside the graph
    summary = tf.Summary()
    summary.value.add(tag='myVar', simple_value=myVar)
    summary_writer.add_summary(summary, x)
    summary_writer.flush()
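Note that tf.summary.FileWriter and tf.Summary belong to the TensorFlow 1.x API. If you are on TensorFlow 2.x, the same idea is expressed with tf.summary.create_file_writer and tf.summary.scalar; here is a sketch, with the log directory /tmp/myTest_tf2 chosen arbitrarily:

import tensorflow as tf  # assumes TensorFlow 2.x

logdir = "/tmp/myTest_tf2"  # hypothetical log directory for TensorBoard
writer = tf.summary.create_file_writer(logdir)

with writer.as_default():
    for x in range(100):
        myVar = 2 * x  # any Python value computed outside any graph
        tf.summary.scalar('myVar', myVar, step=x)
    writer.flush()

Point TensorBoard at the log directory (tensorboard --logdir /tmp/myTest_tf2) to see the curve alongside any other scalars you log there.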
If you already have other summaries, you can instead add a placeholder for the variable that is not computed as part of the computational graph, feed its value at run time, and merge it with the rest:
...
myVar_tf = tf.placeholder(dtype=tf.float32)
tf.summary.scalar('myVar', myVar_tf)
merged_summary = tf.summary.merge_all()
...
...
myVar = 0.1
feed_dict = {myVar_tf: myVar}
summary, step = sess.run([merged_summary, global_step], feed_dict=feed_dict)
summary_writer.add_summary(summary, step)