How to Do a Simple CLI Query for a Saved Estimator Model?

I have successfully trained a DNNClassifier to classify texts (posts from an online discussion board). I've saved the model and I now want to classify texts using the TensorFlow CLI.

When I run saved_model_cli show for my saved model, I get this output:

saved_model_cli show --dir /my/model --tag_set serve --signature_def predict
The given SavedModel SignatureDef contains the following input(s):
  inputs['examples'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['class_ids'] tensor_info:
      dtype: DT_INT64
      shape: (-1, 1)
      name: dnn/head/predictions/ExpandDims:0
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 1)
      name: dnn/head/predictions/str_classes:0
  outputs['logistic'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: dnn/head/predictions/logistic:0
  outputs['logits'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: dnn/logits/BiasAdd:0
  outputs['probabilities'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 2)
      name: dnn/head/predictions/probabilities:0
Method name is: tensorflow/serving/predict

I cannot figure out the correct parameters for saved_model_cli run to get a prediction.

I have tried several approaches, for example:

saved_model_cli run --dir /my/model --tag_set serve --signature_def predict --input_exprs='examples=["klassifiziere mich bitte"]'

Which gives me this error message:

InvalidArgumentError (see above for traceback): Could not parse example input, value: 'klassifiziere mich bitte'
 [[Node: ParseExample/ParseExample = ParseExample[Ndense=1, Nsparse=0, Tdense=[DT_STRING], dense_shapes=[[1]], sparse_types=[], _device="/job:localhost/replica:0/task:0/device:CPU:0"](_arg_input_example_tensor_0_0, ParseExample/ParseExample/names, ParseExample/ParseExample/dense_keys_0, ParseExample/ParseExample/names)]]

What is the correct way to pass my input string to the CLI to get a classification?

You can find the code of my project, including the training data, on GitHub: https://github.com/pahund/beitragstuev

I'm building and saving my model like this (simplified, see GitHub for original code):

# Embed the raw "sentence" strings with a pre-trained German text
# embedding module from TensorFlow Hub
embedded_text_feature_column = hub.text_embedding_column(
    key="sentence",
    module_spec="https://tfhub.dev/google/nnlm-de-dim128/1")
feature_columns = [embedded_text_feature_column]
estimator = tf.estimator.DNNClassifier(
    hidden_units=[500, 100],
    feature_columns=feature_columns,
    n_classes=2,
    optimizer=tf.train.AdagradOptimizer(learning_rate=0.003))
# Export a SavedModel whose serving signature parses serialized
# tf.Example protos according to the feature columns' parse spec
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
estimator.export_savedmodel(
    export_dir_base="/my/dir/base",
    serving_input_receiver_fn=serving_input_receiver_fn)
Asked Jul 06 '18 by Patrick Hund


2 Answers

The ServingInputReceiver you're creating for the model export tells the saved model to expect serialized tf.Example protos instead of the raw strings you want to classify.

From the Save and Restore documentation:

A typical pattern is that inference requests arrive in the form of serialized tf.Examples, so the serving_input_receiver_fn() creates a single string placeholder to receive them. The serving_input_receiver_fn() is then also responsible for parsing the tf.Examples by adding a tf.parse_example op to the graph.

[…]

The tf.estimator.export.build_parsing_serving_input_receiver_fn utility function provides that input receiver for the common case.

So your exported model contains a tf.parse_example op that expects to receive serialized tf.Example protos satisfying the feature specification you passed to build_parsing_serving_input_receiver_fn, i.e. in your case it expects serialized examples that have the sentence feature. To predict with the model, you have to provide those serialized protos.

Fortunately, Tensorflow makes it fairly easy to construct these. Here's one possible function to return an expression mapping the examples input key to a batch of strings, which you can then pass to the CLI:

import tensorflow as tf

def serialize_example_string(strings):
  """Serialize each string into a tf.Example proto and return a
  --input_exprs expression mapping the 'examples' key to the batch."""
  serialized_examples = []
  for s in strings:
    try:
      value = [bytes(s, "utf-8")]
    except TypeError:  # Python 2, where str is already bytes
      value = [bytes(s)]

    # Wrap the string in a tf.Example with the "sentence" feature that
    # the exported model's parsing spec expects
    example = tf.train.Example(
                features=tf.train.Features(
                  feature={
                    "sentence": tf.train.Feature(bytes_list=tf.train.BytesList(value=value))
                  }
                )
              )
    serialized_examples.append(example.SerializeToString())

  # repr() yields a valid Python literal; swap the quotes so the whole
  # expression can be wrapped in single quotes on the shell command line
  return "examples=" + repr(serialized_examples).replace("'", "\"")
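Incidentally, the `repr(...)` trick in the last line works because saved_model_cli evaluates `--input_exprs` values as Python expressions. A minimal sketch, using made-up stand-in payloads rather than real serialized tf.Example protos, showing that the generated string is a valid Python literal:

```python
import ast

# Hypothetical byte strings standing in for real serialized tf.Example protos
serialized = [b"\n*\n(\x08sentence", b"another payload"]

# repr() of a list of bytes is itself a valid Python literal, which is
# exactly what saved_model_cli's --input_exprs parser evaluates
expr = "examples=" + repr(serialized)
key, literal = expr.split("=", 1)

assert key == "examples"
assert ast.literal_eval(literal) == serialized
```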

So using some strings pulled from your examples:

strings = ["klassifiziere mich bitte",
           "Das Paket „S Line Competition“ umfasst unter anderem optische Details, eine neue Farbe (Turboblau), 19-Zöller und LED-Lampen.",
           "(pro Stimme geht 1 Euro Spende von Pfuscher ans Forum) ah du sack, also so gehts ja net :D:D:D"]

print(serialize_example_string(strings))

the CLI command would be:

saved_model_cli run --dir /path/to/model --tag_set serve --signature_def predict --input_exprs='examples=[b"\n*\n(\n\x08sentence\x12\x1c\n\x1a\n\x18klassifiziere mich bitte", b"\n\x98\x01\n\x95\x01\n\x08sentence\x12\x88\x01\n\x85\x01\n\x82\x01Das Paket \xe2\x80\x9eS Line Competition\xe2\x80\x9c umfasst unter anderem optische Details, eine neue Farbe (Turboblau), 19-Z\xc3\xb6ller und LED-Lampen.", b"\np\nn\n\x08sentence\x12b\n`\n^(pro Stimme geht 1 Euro Spende von Pfuscher ans Forum) ah du sack, also so gehts ja net :D:D:D"]'
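Those opaque byte strings are just the tf.Example protobuf wire format. As a sanity check, the first one can be reconstructed by hand in pure Python, no TensorFlow required. This sketch assumes field numbers below 16 and payloads shorter than 128 bytes, so each tag and length fits in a single varint byte:

```python
def length_delimited(field_number, payload):
    """Encode a protobuf length-delimited field (wire type 2).
    Assumes field_number < 16 and len(payload) < 128."""
    tag = bytes([(field_number << 3) | 2])
    return tag + bytes([len(payload)]) + payload

sentence = "klassifiziere mich bitte".encode("utf-8")
bytes_list = length_delimited(1, sentence)     # BytesList.value
feature = length_delimited(1, bytes_list)      # Feature.bytes_list
entry = (length_delimited(1, b"sentence")      # map entry: key ...
         + length_delimited(2, feature))       # ... and value
features = length_delimited(1, entry)          # Features.feature
example = length_delimited(1, features)        # Example.features

# Matches the first payload in the CLI command above
assert example == b"\n*\n(\n\x08sentence\x12\x1c\n\x1a\n\x18klassifiziere mich bitte"
```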

which should give you the desired results:

Result for output key class_ids:
[[0]
 [1]
 [0]]
Result for output key classes:
[[b'0']
 [b'1']
 [b'0']]
Result for output key logistic:
[[0.05852016]
 [0.88453305]
 [0.04373989]]
Result for output key logits:
[[-2.7780817]
 [ 2.0360758]
 [-3.0847695]]
Result for output key probabilities:
[[0.94147986 0.05852016]
 [0.11546692 0.88453305]
 [0.9562601  0.04373989]]
Answered Sep 19 '22 by 0xsx

Alternatively, saved_model_cli provides the --input_examples option as an alternative to --input_exprs, letting you pass the tf.Example data directly on the command line without the manual serialization.

For example:

saved_model_cli run --dir /path/to/model --tag_set serve --signature_def predict --input_examples 'examples=[{"sentence":["this is a sentence"]}]'

See https://www.tensorflow.org/guide/saved_model#--input_examples for details.

Answered Sep 20 '22 by Happy Gene