The following code uses TensorFlow v2.0:
import tensorflow as tf
a = tf.constant(6.0, name="constant_a")
b = tf.constant(3.0, name="constant_b")
c = tf.constant(10.0, name="constant_c")
d = tf.constant(5.0, name="constant_d")
mul = tf.multiply(a, b, name="mul")
div = tf.divide(c, d, name="div")
addn = tf.add_n([mul, div], name="addn")
writer = tf.summary.create_file_writer("./tensorboard")
with writer.as_default():
    tf.summary.scalar("addn_harsha", addn, step=1)
I am new to Python and TensorFlow. With the code above, I was able to create a scalar in TensorBoard, but I could not generate a graph for it.
In TensorFlow v1.0, we would write: writer = tf.summary.FileWriter("./tensorboard", sess.graph)
But in TensorFlow v2.0, Session is no longer used. So what can we write to create a graph in TensorBoard using TensorFlow v2.0?
TensorBoard is a tool for providing the measurements and visualizations needed during the machine learning workflow. It enables tracking experiment metrics like loss and accuracy, visualizing the model graph, projecting embeddings to a lower-dimensional space, and much more.
It is possible to do this in a couple of ways, but there are two problems. The main one is that TensorFlow 2.0 generally works in eager mode, so there is no graph to log at all. The other issue I have found, at least in my installation, is that with 2.0 TensorBoard crashes when I try to load a log directory containing a graph. I imagine that will get fixed, but for now I could only check the resulting graphs written in 2.0 with TensorBoard 1.15.
There are, as far as I know, at least two ways to write graphs in TensorFlow 2.0. The most straightforward way right now is to use a Keras model and a TensorBoard callback with write_graph=True during training. That may look like this:
import tensorflow as tf
import numpy as np
# Make Keras model
model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(1, input_shape=(10,)))
model.compile(optimizer=tf.keras.optimizers.SGD(), loss='MSE')
# Make callback
log_dir = '...'
tb_cbk = tf.keras.callbacks.TensorBoard(log_dir, write_graph=True)
# Fit to some data using the callback
x, y = np.ones((100, 10)), np.ones((100, 1))
model.fit(x, y, batch_size=5, epochs=2, callbacks=[tb_cbk])
If you want to convert some arbitrary piece of TensorFlow code into a graph, you can use tf.function. This converts a regular Python function into a graph, or rather into a callable that generates graphs on demand, which you can then save. To log that graph, though, you would need something like a tf.summary.graph function. Such a function exists, it is just not exposed in the main API (not sure if they will incorporate it in the future), but you can access it through the summary_ops_v2 module. You can use it like this:
import tensorflow as tf
from tensorflow.python.ops import summary_ops_v2
# Some function to convert into a graph
@tf.function
def my_fun(x):
    return 2 * x
# Test
a = tf.constant(10, tf.float32)
b = my_fun(a)
tf.print(b)
# 20
# Log the function graph
log_dir = '...'
writer = tf.summary.create_file_writer(log_dir)
with writer.as_default():
    # Get a concrete graph for the given input
    func_graph = my_fun.get_concrete_function(a).graph
    # Write the graph
    summary_ops_v2.graph(func_graph.as_graph_def(), step=0)
writer.close()
Again, in both cases I could only visualize the results in a 1.x version of TensorBoard, but the log files produced are correct.
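As an aside, TensorFlow 2.x also ships a public tracing API, tf.summary.trace_on / tf.summary.trace_export, that can log the graph of a tf.function without reaching into the private summary_ops_v2 module. A minimal sketch (the temporary log directory here is just for illustration; point --logdir at it to inspect the result):

```python
import glob
import tempfile

import tensorflow as tf

# Some function to convert into a graph
@tf.function
def my_fun(x):
    return 2 * x

log_dir = tempfile.mkdtemp()
writer = tf.summary.create_file_writer(log_dir)

# Start recording the graph trace, then run the function once
# so a concrete graph actually gets built
tf.summary.trace_on(graph=True)
b = my_fun(tf.constant(10, tf.float32))

# Export the recorded trace (including the graph) to the log directory
with writer.as_default():
    tf.summary.trace_export(name="my_fun_trace", step=0)
writer.close()

# An events file containing the graph should now exist
event_files = glob.glob(log_dir + "/events.out.tfevents.*")
print(len(event_files) > 0)
```

The same caveat applies: depending on your TensorBoard version you may need to open the resulting logs in a 1.x TensorBoard to see the graph tab render correctly.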