I have exported a DNNClassifier model and am running it on a TensorFlow Serving server using Docker. I have then written a Python client to get new predictions from that server.
I have written the following code to get the response from the TensorFlow Serving server:
host, port = FLAGS.server.split(':')
channel = implementations.insecure_channel(host, int(port))
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)
request = predict_pb2.PredictRequest()
request.model_spec.name = FLAGS.model
request.model_spec.signature_name = 'serving_default'
feature_dict = {'a': _float_feature(value=400),
                'b': _float_feature(value=5),
                'c': _float_feature(value=200),
                'd': _float_feature(value=30),
                'e': _float_feature(value=60),
                'f': _float_feature(value=5),
                'g': _float_feature(value=7500),
                'h': _int_feature(value=1),
                'i': _int_feature(value=1234),
                'j': _int_feature(value=1),
                'k': _int_feature(value=4),
                'l': _int_feature(value=1),
                'm': _int_feature(value=0)}
example = tf.train.Example(features=tf.train.Features(feature=feature_dict))
serialized = example.SerializeToString()
request.inputs['inputs'].CopyFrom(
    tf.contrib.util.make_tensor_proto(serialized, shape=[1]))
result_future = stub.Predict.future(request, 5.0)
print(result_future.result())
I am not able to figure out how to parse the float_val field in the response, because that holds my output. Please help.
You can do the following:
result = stub.Predict(request, 5.0)
float_val = result.outputs['outputs'].float_val
Note that this calls stub.Predict instead of stub.Predict.future, so it blocks and returns the response directly rather than a future.
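Once you have float_val, it behaves like a flat sequence of the tensor's values. Here is a minimal sketch in plain Python (no TensorFlow or live server needed; the values [0.1, 0.9] are made-up class probabilities) of turning it into a prediction:

```python
# float_val is a protobuf repeated field; it supports list() conversion
# and indexing like any sequence. A plain list stands in for it here.
float_val = [0.1, 0.9]            # made-up class probabilities
probabilities = list(float_val)   # materialize as a regular Python list

# Pick the index of the highest-probability class.
predicted_class = max(range(len(probabilities)), key=probabilities.__getitem__)
print(predicted_class)
```

For a classifier, that index is typically the predicted class label.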
If you have more than one output, you can do something like the following, which builds a dictionary whose keys correspond to the output names and whose values are lists of whatever the model returns:
results = dict()
for output in output_names:
    results[output] = response.outputs[output].float_val
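To illustrate that pattern without a live server, here is a sketch with stand-in objects (the real response is a PredictResponse protobuf; the output names 'scores' and 'probabilities' are made up):

```python
# Stand-ins for the protobuf types: outputs maps output names to
# objects carrying a float_val sequence, like the real PredictResponse.
class FakeTensorProto:
    def __init__(self, float_val):
        self.float_val = float_val

class FakeResponse:
    def __init__(self, outputs):
        self.outputs = outputs

response = FakeResponse({'scores': FakeTensorProto([0.2, 0.8]),
                         'probabilities': FakeTensorProto([0.5, 0.5])})

# Same loop as above: one dictionary entry per named output.
results = {name: list(tensor.float_val)
           for name, tensor in response.outputs.items()}
print(results)  # {'scores': [0.2, 0.8], 'probabilities': [0.5, 0.5]}
```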
What you are looking for is probably tf.make_ndarray, which creates a NumPy array from a TensorProto (i.e. it is the inverse of tf.make_tensor_proto). This way your output recovers the shape it is supposed to have, so building on Jasmine's answer you can store multiple outputs in a dictionary with:
response = stub.Predict(request, 5.0)
results = {}
for output in response.outputs.keys():
    results[output] = tf.make_ndarray(response.outputs[output])
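For intuition, tf.make_ndarray essentially takes the flat values (e.g. float_val) plus the dims stored in the TensorProto's tensor_shape and restores the nesting. A pure-Python sketch of that reshaping (the values and dims here are made up):

```python
def reshape(flat, dims):
    # Recursively nest a flat list according to dims; this is what
    # tf.make_ndarray achieves via numpy reshape on the proto's data.
    if len(dims) == 1:
        return list(flat)
    step = len(flat) // dims[0]
    return [reshape(flat[i * step:(i + 1) * step], dims[1:])
            for i in range(dims[0])]

flat_values = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]  # as carried in float_val
nested = reshape(flat_values, [2, 3])
print(nested)  # [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
```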