I have a big graph with two parts that I run in turn. Both have summaries.
I was collecting the summaries with the node
merged_summary = tf.summary.merge_all()
but noticed that this causes tensors in the second half of the graph to be evaluated before it makes sense.
So, how can I merge only the summaries of one half of my graph?
You can use tf.summary.merge, passing a list of the summaries that you want to merge. For example, if you have the summaries:
cost_summary = tf.summary.scalar('cost_sum', cost) # for some 'cost' tensor
grad_summary = tf.summary.scalar('grad_sum', grad) # for some 'grad' tensor
you can merge them by name with:
merged = tf.summary.merge([cost_summary, grad_summary])
So just make merged summary operators for each part of your graph and call them when it makes sense to do so.
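For instance, here is a minimal, self-contained sketch of that pattern (the cost_a/cost_b tensors and the /tmp/logs directory are placeholders, not taken from your actual graph):

import tensorflow as tf

# Placeholder tensors standing in for the two halves of the graph.
cost_a = tf.reduce_mean(tf.random_normal([10]), name='cost_a')
cost_b = tf.reduce_mean(tf.random_normal([10]), name='cost_b')

# One merged summary op per part, instead of a single tf.summary.merge_all().
summary_op_a = tf.summary.merge([tf.summary.scalar('cost_a_sum', cost_a)])
summary_op_b = tf.summary.merge([tf.summary.scalar('cost_b_sum', cost_b)])

writer = tf.summary.FileWriter('/tmp/logs', tf.get_default_graph())

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Run the first part and write only its summaries ...
    writer.add_summary(sess.run(summary_op_a), global_step=0)
    # ... and later run the second part with only its summaries.
    writer.add_summary(sess.run(summary_op_b), global_step=0)

Each merged op only pulls in the tensors its own summaries depend on, so evaluating summary_op_a never forces the second half of the graph to run.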
Assuming you have two lists of summaries, one for the first graph and one for the second, e.g.:
summaries_first = [tf.summary.image("my_first_graph_input", image), ...]
summary_second = [tf.summary.scalar("my_second_graph_loss", loss), ...] # for some 'loss' tensor
merge each list into a single summary op:
first_graph_summary_op = tf.summary.merge(summaries_first)
second_graph_summary_op = tf.summary.merge(summary_second)
Now, whenever you execute a sess.run() on one of the graphs, also evaluate its corresponding summary op and write the result.
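A rough sketch of that step (here writer, step, first_feed, and second_feed are placeholders for whatever FileWriter, global step, and feed dicts you already use):

# Evaluate and write only the first graph's summaries with its run ...
summ = sess.run(first_graph_summary_op, feed_dict=first_feed)
writer.add_summary(summ, global_step=step)

# ... and only the second graph's summaries with its run.
summ = sess.run(second_graph_summary_op, feed_dict=second_feed)
writer.add_summary(summ, global_step=step)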