Correct payload for TensorFlow Serving REST API

I have converted a Keras model to a TensorFlow Estimator, added TensorFlow Transform to the graph, and then exported the model for serving.

When I check the model signature, I can see the following info:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['examples'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: input_example_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['specialities'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 154)
        name: specialities/Softmax:0
  Method name is: tensorflow/serving/predict

I converted the feature specifications with tf.estimator.export.build_parsing_serving_input_receiver_fn, therefore the name of the input node in the signature is examples. The name of the input feature in my model is procedures (the export step is sketched below).
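For reference, a minimal sketch of the export step, assuming procedures is a variable-length string feature (the exact feature spec and export directory are assumptions, not the original code):

import tensorflow as tf

# Hypothetical feature spec; 'procedures' as a variable-length string
# feature is an assumption based on the model described above.
feature_spec = {'procedures': tf.VarLenFeature(tf.string)}

serving_input_receiver_fn = (
    tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec))

# 'estimator' is the Estimator converted from the Keras model above;
# '/model_dir' matches the export directory used later in the question.
estimator.export_savedmodel('/model_dir', serving_input_receiver_fn)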

I then use saved_model_cli to manually test the exported model, and everything looks good (I get back a list of probabilities):

!saved_model_cli run --dir=/model_dir/1533849825 \
                     --tag_set serve \
                     --signature_def serving_default \
                     --input_examples 'examples=[{"procedures": ["99214,17000,17000,13121,99203"]}]'

Now, when I load this model into TF Serving, the model server starts up fine.

When I request a model prediction with the JSON payload below (application/json), I get the following error:

{
  "signature_name":"serving_default",
  "instances":[
    {
       "examples":["99214,17000,17000,13121,99203"]
    }
  ]
}

Error:

"error": "Expected serialized to be a vector, got shape: [1,1]

A different payload structure leads to this error:

{
 "signature_name":"serving_default",
 "examples":[
    {
      "procedure":["99214,17000,17000,13121,99203"]
    }
  ]
}

Error:

"error": "JSON Value: {\n    \"signature_name\": \"serving_default\",\n    
\"examples\": [\n        {\n            \"procedures\": 
["99214,17000,17000,13121,99203"]]\n        }\n    ]\n} not formatted 
correctly. Expecting object with \'instances\' key and a list/array as the value." 

What is the correct payload format for the TensorFlow Serving request in this prediction case?

Does the payload need to be formatted as a tf.Example structure?

Asked Aug 09 '18 by neurix
1 Answer

I'll give an example here with the Estimator API; hopefully it can help someone who runs into similar problems.

To export a SavedModel with an Estimator, you need an input_receiver_fn to accept inputs when serving. The input_receiver_fn in my application is as follows:

import tensorflow as tf

def _serving_input_receiver_fn():
  # Placeholder for a batch of serialized tf.Example protos.
  serialized_tf_sample = tf.placeholder(dtype=tf.string,
                                        shape=None, name='input_example_tensor')
  receiver_tensors = {'example': serialized_tf_sample}
  # Feature spec describing how to parse each tf.Example,
  # e.g. {'feature_name': tf.VarLenFeature(tf.int64)}
  example_proto = {'feature_name': tf.VarLenFeature(tf.int64)}
  features = tf.parse_example(serialized_tf_sample, example_proto)
  return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

You can put the following line after train_and_evaluate to export a SavedModel:

estimator.export_savedmodel(model_export_path, _serving_input_receiver_fn)
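A minimal sketch of how this fits together (train_input_fn, eval_input_fn, max_steps, and model_export_path here are placeholders, not the original values):

# Sketch of the train/evaluate/export flow; the input functions and
# export path are placeholders for your own code.
train_spec = tf.estimator.TrainSpec(input_fn=train_input_fn, max_steps=1000)
eval_spec = tf.estimator.EvalSpec(input_fn=eval_input_fn)
tf.estimator.train_and_evaluate(estimator, train_spec, eval_spec)
estimator.export_savedmodel(model_export_path, _serving_input_receiver_fn)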

To serve the model, you can pull the tensorflow/serving Docker image; see https://www.tensorflow.org/serving/docker for help. (I suggest pulling the image with the devel tag, since it is easier to debug.)

Simply run the command below to start serving:

/usr/local/bin/tensorflow_model_server --port=8500 --rest_api_port=8501 --model_name=my_model --model_base_path=my_model_path

The client code is simple, but needs some care: the serialized tf.Example has to be base64-encoded and wrapped in a b64 key.

import base64
import requests

# single_example is a tf.train.Example holding the request features.
resp = requests.post('http://host:8501/v1/models/my_model:predict', json={
        'instances': [
            {'example': {'b64': base64.b64encode(
                single_example.SerializeToString()).decode('utf-8')}}
        ]})
resp.json()
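For completeness, single_example above is a tf.train.Example proto. A minimal sketch of how it could be built for the procedures feature from the question (the feature name and the comma-separated string value are assumptions):

import tensorflow as tf

# Hypothetical construction of the tf.train.Example sent to the server;
# the 'procedures' feature name and its string value are taken from the
# question above and may differ in your model.
single_example = tf.train.Example(features=tf.train.Features(feature={
    'procedures': tf.train.Feature(bytes_list=tf.train.BytesList(
        value=[b'99214,17000,17000,13121,99203']))
}))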

If you have any questions, just comment below.

Answered Sep 26 '22 by Wenmin Wu