I just played around with this a little bit, and it seems that if one combines `tf.control_dependencies` with `tf.record_summaries_every_n_global_steps`, it behaves as expected and the summary only gets recorded every nth step. But if the train op and the summary ops are run together within a session, e.g. `session.run([train, summs])`, the summaries are stored every once in a while, but not exactly every nth step. I tested this with n=2: with the second approach the summary was often written at odd steps, while with the control-dependency approach it was always written on an even step.