 

How to send a tf.example into a TensorFlow Serving gRPC predict request

I have data in tf.train.Example form and am attempting to make Predict requests (over gRPC) to a saved model. I am unable to identify the method call needed to do this.

I am starting with the well-known Automobile pricing DNN regression model (https://github.com/tensorflow/models/blob/master/samples/cookbook/regression/dnn_regression.py), which I have already exported and served via the TF Serving Docker container:

import grpc
import numpy as np
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

stub = prediction_service_pb2_grpc.PredictionServiceStub(grpc.insecure_channel("localhost:8500"))

tf_ex = tf.train.Example(
    features=tf.train.Features(
        feature={
            'curb-weight': tf.train.Feature(float_list=tf.train.FloatList(value=[5.1])),
            'highway-mpg': tf.train.Feature(float_list=tf.train.FloatList(value=[3.3])),
            'body-style': tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"wagon"])),
            'make': tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"Honda"])),
        }
    )
)

request = predict_pb2.PredictRequest()
request.model_spec.name = "regressor_test"

# Tried this:
request.inputs['inputs'].CopyFrom(tf_ex)

# Also tried this:
request.inputs['inputs'].CopyFrom(tf.contrib.util.make_tensor_proto(tf_ex))

# This doesn't work either:
request.input.example_list.examples.extend(tf_ex)

# If it did work, I would then run inference like this:
result = stub.Predict(request, 10.0)

Thanks for any advice

asked Oct 28 '22 by jpjenk


1 Answer

I assume your SavedModel has a serving_input_receiver_fn that takes a string as input and parses it into a tf.Example (see Using SavedModel with Estimators):

def serving_example_input_receiver_fn():
    serialized_tf_example = tf.placeholder(dtype=tf.string)
    receiver_tensors = {'inputs': serialized_tf_example}   
    features = tf.parse_example(serialized_tf_example, YOUR_EXAMPLE_SCHEMA)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

So serving_input_receiver_fn accepts a serialized string, which means you have to call SerializeToString() on your tf.train.Example(). Also, serving_input_receiver_fn works like the input_fn used during training: data is fed into the model in batches.
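As a sanity check, serializing a tf.Example and parsing it back (a minimal sketch, using only TensorFlow's protobuf bindings and reusing two features from the question) shows exactly what bytes the string placeholder in the receiver fn will see:

```python
import tensorflow as tf

# Same construction as in the question (two features shown for brevity).
tf_ex = tf.train.Example(
    features=tf.train.Features(
        feature={
            'curb-weight': tf.train.Feature(
                float_list=tf.train.FloatList(value=[5.1])),
            'make': tf.train.Feature(
                bytes_list=tf.train.BytesList(value=[b"Honda"])),
        }
    )
)

# SerializeToString() produces the bytes the serving input receiver expects;
# FromString() is the protobuf inverse, so the round trip recovers the features.
serialized = tf_ex.SerializeToString()
parsed = tf.train.Example.FromString(serialized)
```

Note that FloatList stores float32, so parsed values may differ from the Python floats by rounding.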

The request code would change to:

from tensorflow.core.framework import types_pb2

request = predict_pb2.PredictRequest()
request.model_spec.name = "regressor_test"
# Use your model's actual signature name; check it with saved_model_cli.
request.model_spec.signature_name = 'your_signature_name'
request.inputs['inputs'].CopyFrom(
    tf.make_tensor_proto([tf_ex.SerializeToString()],
                         dtype=types_pb2.DT_STRING))
answered Oct 31 '22 by hakunami