
Printing extra training metrics with Tensorflow Estimator

Is there a way to let Tensorflow print extra training metrics (e.g. batch accuracy) when using the Estimator API?

One can add summaries and view the result in Tensorboard (see another post), but I was wondering if there is an elegant way to get the scalar summary values printed while training. This already happens for training loss, e.g.:

loss = 0.672677, step = 2901 (52.995 sec) 

but it would be nice to have e.g.

loss = 0.672677, accuracy = 0.54678, step = 2901 (52.995 sec) 

without too much trouble. I am aware that most of the time it is more useful to plot test set accuracy (I am already doing this with a validation monitor), but in this case I am also interested in training batch accuracy.

dumkar asked Jul 27 '17


2 Answers

From what I've read, it is not possible to change this by passing a parameter. You can do it by creating a logging hook and passing it to the estimator via the EstimatorSpec.

In the body of model_fn function for your estimator:

logging_hook = tf.train.LoggingTensorHook(
    {"loss": loss, "accuracy": accuracy}, every_n_iter=10)

# Rest of the function

return tf.estimator.EstimatorSpec(
    ...params...
    training_hooks=[logging_hook])

EDIT:

To see the output you must also set the logging verbosity high enough (unless it's already your default): tf.logging.set_verbosity(tf.logging.INFO)

Xyz answered Oct 16 '22


You can also use the TensorBoard to see some graphics of the desired metrics. To do that, add the metric to a TensorFlow summary like this:

accuracy = tf.metrics.accuracy(labels=labels, predictions=predictions["classes"])
tf.summary.scalar('accuracy', accuracy[1])

The cool thing when you use tf.estimator.Estimator is that you don't need to add the summaries to a FileWriter yourself, since it's done automatically (they are merged and saved periodically, by default every 100 steps).

Don't forget to change this line as well, based on the accuracy parameter you just added:

eval_metric_ops = {"accuracy": accuracy}
return tf.estimator.EstimatorSpec(
    mode=mode, loss=loss, eval_metric_ops=eval_metric_ops)
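The two snippets above can be combined into one helper, sketched here in TF 1.x style. The function name add_accuracy_metric and its parameters are hypothetical, introduced only to show how the summary and the evaluation metric fit together inside a model_fn:

```python
import tensorflow as tf

def add_accuracy_metric(labels, predictions, mode, loss, train_op):
    # tf.metrics.accuracy returns a (value_tensor, update_op) pair;
    # the update op (accuracy[1]) is what gets plotted during training.
    accuracy = tf.metrics.accuracy(labels=labels,
                                   predictions=predictions["classes"])
    # The Estimator collects this summary automatically; no FileWriter needed.
    tf.summary.scalar('accuracy', accuracy[1])

    # The full pair goes into eval_metric_ops so evaluation reports it too.
    eval_metric_ops = {"accuracy": accuracy}
    return tf.estimator.EstimatorSpec(
        mode=mode, loss=loss, train_op=train_op,
        eval_metric_ops=eval_metric_ops)
```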

To view TensorBoard, open a new terminal and run:

tensorboard --logdir={$MODEL_DIR} 

After that you will be able to see the graphs in your browser at localhost:6006.

tsveti_iko answered Oct 16 '22