I have a rather complicated Tensorflow graph that I'd like to visualize for optimization purposes. Is there a function that I can call that will simply save the graph for viewing in Tensorboard without needing to annotate variables?
I tried this:
merged = tf.merge_all_summaries()
writer = tf.train.SummaryWriter("/Users/Name/Desktop/tf_logs", session.graph_def)
But no output was produced. This is using the 0.6 wheel.
This appears to be related: Graph visualisation is not showing in tensorboard for seq2seq model
Select the Graphs dashboard by tapping “Graphs” at the top. You can also optionally use TensorBoard.dev to create a hosted, shareable experiment. By default, TensorBoard displays the op-level graph.
TensorFlow uses graphs as the format for saved models when it exports them from Python. Graphs are also easily optimized, allowing the compiler to do transformations such as statically inferring the value of tensors by folding constant nodes in your computation ("constant folding").
For efficiency, the tf.train.SummaryWriter logs asynchronously to disk. To ensure that the graph appears in the log, you must call close() or flush() on the writer before the program exits.
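For example, here is a minimal sketch of that pattern using the 0.6-era API from the question; the constants are just stand-ins for whatever graph the real program builds, and the log directory is the one from the question:

    import tensorflow as tf

    # Placeholder graph; no summary ops are needed just to record the structure.
    a = tf.constant(1.0, name="a")
    b = tf.constant(2.0, name="b")
    c = tf.add(a, b, name="c")

    with tf.Session() as session:
        # Passing graph_def here is what writes the graph to the event file.
        writer = tf.train.SummaryWriter("/Users/Name/Desktop/tf_logs",
                                        session.graph_def)
        session.run(c)
        # close() (or flush()) forces the asynchronously buffered events,
        # including the graph, to be written to disk before the program exits.
        writer.close()

    # Then point TensorBoard at the same directory, e.g.:
    #   tensorboard --logdir=/Users/Name/Desktop/tf_logs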