How to have predictions AND labels returned with tf.estimator (either with predict or eval method)?

I am working with TensorFlow 1.4.

I created a custom tf.estimator in order to do classification, like this:

def model_fn(features, labels, mode, params):
    # Some operations here
    [...]

    return tf.estimator.EstimatorSpec(mode=mode,
                                      predictions={"Preds": predictions},
                                      loss=cost,
                                      train_op=train_op,
                                      eval_metric_ops=eval_metric_ops,
                                      training_hooks=[summary_hook])

my_estimator = tf.estimator.Estimator(model_fn=model_fn, 
                       params=model_params,
                       model_dir='/my/directory')

I can train it easily:

input_fn = create_train_input_fn(path=train_files)
my_estimator.train(input_fn=input_fn)

where input_fn is a function that reads data from tfrecords files, with the tf.data.Dataset API.

As I am reading from tfrecords files, I don't have labels in memory when I am making predictions.

My question is, how can I have predictions AND labels returned, either by the predict() method or the evaluate() method?

It seems there is no way to get both: predict() does not have access to the labels, and evaluate() does not expose the predictions dictionary.
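One common workaround (a sketch, not part of the Estimator API) is to run predict() over the evaluation files and, in a second pass over the same tfrecords, read back only the labels, then pair the two streams. This assumes both passes traverse the files in the same deterministic order (no shuffling). The pairing step itself is plain Python:

```python
def pair_predictions_with_labels(predictions, labels):
    """Zip two equal-length streams into (prediction, label) tuples.

    `predictions` would come from estimator.predict(), `labels` from a
    second, unshuffled pass over the same tfrecords (hypothetical setup).
    """
    pairs = list(zip(predictions, labels))
    assert len(pairs) == len(labels), "streams must be the same length"
    return pairs

# Toy stand-ins for the two streams:
preds = [0, 1, 1]
lbls = [0, 1, 0]
print(pair_predictions_with_labels(preds, lbls))  # [(0, 0), (1, 1), (1, 0)]
```

The fragile part is the ordering guarantee, which is why the accepted answer below rebuilds the graph and fetches predictions and labels in a single session run instead.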

Benjamin Larrousse asked Nov 17 '17




1 Answer

After training finishes, '/my/directory' contains a set of checkpoint files.

To get predictions and labels together, set up the input pipeline again, manually restore one of those checkpoints, and loop through the batches, storing the predictions and the labels:

# Rebuild the input pipeline
input_fn = create_eval_input_fn(path=eval_files)
features, labels = input_fn()

# Rebuild the model; .predictions is the {"Preds": ...} dict from the EstimatorSpec
predictions = model_fn(features, labels, tf.estimator.ModeKeys.EVAL, model_params).predictions

# Manually load the latest checkpoint
saver = tf.train.Saver()
with tf.Session() as sess:
    ckpt = tf.train.get_checkpoint_state('/my/directory')
    saver.restore(sess, ckpt.model_checkpoint_path)

    # Loop through the batches and store predictions and labels
    prediction_values = []
    label_values = []
    while True:
        try:
            preds, lbls = sess.run([predictions, labels])
            prediction_values.extend(preds["Preds"])  # extend keeps the list flat
            label_values.extend(lbls)
        except tf.errors.OutOfRangeError:  # raised when the dataset is exhausted
            break
    # store prediction_values and label_values somewhere

Update: changed to directly use the model_fn function you already have.
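The session loop above is just the usual "drain a batched iterator" pattern. Stripped of the TensorFlow session, with toy lists standing in for the sess.run results, it looks like this:

```python
def drain_batches(batches):
    """Flatten an iterable of (preds_batch, labels_batch) pairs into two
    flat lists, mirroring the checkpoint-restore loop in the answer."""
    prediction_values, label_values = [], []
    for preds, lbls in batches:  # iterator exhaustion plays the role of OutOfRangeError
        prediction_values.extend(preds)  # extend, not +=, so elements stay flat
        label_values.extend(lbls)
    return prediction_values, label_values

# Two toy batches of sizes 2 and 1:
batches = [([0, 1], [0, 0]), ([1], [1])]
print(drain_batches(batches))  # ([0, 1, 1], [0, 0, 1])
```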

GPhilo answered Oct 26 '22