Using tf.name_scope in Tensorboard with Tensorflow Estimator

I have some code that calculates performance metrics within my Estimator model_fn. It is written in a function that returns a dictionary of metrics:

def __model_eval_metrics(self, classes, labels, mode):
    if mode == tf.estimator.ModeKeys.TRAIN or mode == tf.estimator.ModeKeys.EVAL:
        # Convert one-hot labels to class indices once and reuse them.
        label_classes = tf.argmax(input=labels, axis=1)
        return {
                'accuracy': tf.metrics.accuracy(labels=label_classes, predictions=classes),
                'precision': tf.metrics.precision(labels=label_classes, predictions=classes),
                'recall': tf.metrics.recall(labels=label_classes, predictions=classes)
                }
    else:
        return None
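
For context, the returned dictionary is what the snippets below refer to as eval_metrics. The wiring inside model_fn is not shown in the question, so this call is an assumption about how the pieces connect:

# Assumed call site inside model_fn (not shown in the question).
eval_metrics = self.__model_eval_metrics(classes, labels, mode)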

During Estimator training, these are logged as scalars within model_fn, grouped under the name scope "train_metrics":

# Group the training summaries under a common name scope for TensorBoard.
if mode == tf.estimator.ModeKeys.TRAIN:
    with tf.name_scope('train_metrics') as scope:
        tf.summary.scalar('model_accuracy', eval_metrics['accuracy'][1])
        tf.summary.scalar('model_precision', eval_metrics['precision'][1])
        tf.summary.scalar('model_recall', eval_metrics['recall'][1])
        tf.summary.scalar('model_loss', loss)

This produces the desired grouping in TensorBoard. [Screenshot: training scalars grouped under train_metrics]

For Estimator evaluation, the result of __model_eval_metrics() is passed as a dictionary to the eval_metric_ops argument of EstimatorSpec:

return tf.estimator.EstimatorSpec(
    mode=mode,
    predictions={"predictions": predictions, "classes": classes},
    loss=loss,
    train_op=train_op,
    eval_metric_ops=eval_metrics,
)

The problem is that in TensorBoard these metrics are no longer grouped by name scope, and I cannot figure out where to add a name scope to make that happen. As the screenshot below shows, the evaluation metrics are ungrouped.

[Screenshot: evaluation metrics appearing ungrouped in TensorBoard]

Questions

  1. Is there an approach to leveraging name_scope for evaluation metrics with an Estimator?
  2. Should I ignore name_scope altogether and simply toggle the runs in the bottom-left of the TensorBoard screen?
asked Aug 25 '17 by ElPresidente

1 Answer

I solved this by giving the eval metric names the same "folder" prefix that the training summaries get from their name scope. The Estimator uses each eval_metric_ops dictionary key directly as the summary tag, so putting the prefix in the key reproduces the scoped grouping:

r2 = metrics_r2(labels, predictions)  # custom helper returning a (value, update_op) metric tuple
# The 'metrics/' prefix in the dict key matches the name scope used for the
# training summary below, so train and eval land in the same TensorBoard group.
metrics = {'metrics/r2': r2}
with tf.name_scope('metrics'):
    tf.summary.scalar('r2', r2[1])

if mode == tf.estimator.ModeKeys.EVAL:
    return tf.estimator.EstimatorSpec(mode, loss=m_loss, eval_metric_ops=metrics)

[Screenshot: train and eval scalars grouped together under 'metrics' in TensorBoard]
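
Applied to the code from the question, the same trick would look roughly like this. This is a minimal, untested sketch: it reuses the question's __model_eval_metrics signature and assumes the training summaries keep their 'train_metrics' name scope, so the prefixed keys produce matching tags.

def __model_eval_metrics(self, classes, labels, mode):
    if mode == tf.estimator.ModeKeys.TRAIN or mode == tf.estimator.ModeKeys.EVAL:
        # One-hot labels to class indices, computed once.
        label_classes = tf.argmax(input=labels, axis=1)
        # Assumption: the 'train_metrics/' prefix matches the training name
        # scope, so the Estimator's eval summaries share tags with the
        # training summaries and TensorBoard groups them together.
        return {
                'train_metrics/model_accuracy': tf.metrics.accuracy(labels=label_classes, predictions=classes),
                'train_metrics/model_precision': tf.metrics.precision(labels=label_classes, predictions=classes),
                'train_metrics/model_recall': tf.metrics.recall(labels=label_classes, predictions=classes)
                }
    else:
        return None

With matching tags, the eval run (written to the eval subfolder of model_dir) shows up as a second run inside the same train_metrics group, so no extra name scope is needed on the eval side.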

answered Sep 28 '22 by Andy Bosyi