 

How to parse the output received by gRPC stub client from tensorflow serving server?

I have exported a DNNClassifier model and am running it on a TensorFlow Serving server using Docker. I have then written a Python client to interact with TensorFlow Serving for new predictions.

I have written the following code to get the response from the TensorFlow Serving server:

# Imports needed by this snippet (the beta gRPC API used at the time):
from grpc.beta import implementations
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2

host, port = FLAGS.server.split(':')
channel = implementations.insecure_channel(host, int(port))
stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = FLAGS.model
request.model_spec.signature_name = 'serving_default'

feature_dict = {'a': _float_feature(value=400),
                'b': _float_feature(value=5),
                'c': _float_feature(value=200),
                'd': _float_feature(value=30),
                'e': _float_feature(value=60),
                'f': _float_feature(value=5),
                'g': _float_feature(value=7500),
                'h': _int_feature(value=1),
                'i': _int_feature(value=1234),
                'j': _int_feature(value=1),
                'k': _int_feature(value=4),
                'l': _int_feature(value=1),
                'm': _int_feature(value=0)}
example = tf.train.Example(features=tf.train.Features(feature=feature_dict))
serialized = example.SerializeToString()

request.inputs['inputs'].CopyFrom(
    tf.contrib.util.make_tensor_proto(serialized, shape=[1]))

result_future = stub.Predict.future(request, 5.0)  # 5.0 second timeout
print(result_future.result())
The output I am getting is:

[Screenshot of the PredictResponse output] I am not able to figure out how to parse the float_val number, because that is my prediction output. Please help.

asked Sep 15 '17 by user3457384


3 Answers

You can do the following:

result = stub.Predict(request, 5.0)
float_val = result.outputs['outputs'].float_val

Note that this code calls stub.Predict instead of stub.Predict.future, so it blocks until the response is available.
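For context, float_val on a PredictResponse is a repeated scalar protobuf field, so it behaves like a flat list of floats. A minimal, server-free sketch of turning it into a plain Python list and picking the top class (the values below are made up for illustration):

```python
# Stand-in for result.outputs['outputs'].float_val; real values come
# from the server, these are made up.
float_val = [0.1, 0.7, 0.2]

# Convert the repeated field to a plain Python list.
probabilities = list(float_val)

# For a classifier, the prediction is usually the index of the largest score.
predicted_class = max(range(len(probabilities)), key=probabilities.__getitem__)
print(predicted_class)  # prints 1 for these made-up scores
```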

answered Nov 14 '22 by Maxime De Bruyn


If you have more than one output, you can do something like the following, which builds a dictionary whose keys are the output names and whose values are lists of whatever the model returns for each output.

results = dict()
for output in output_names:
    results[output] = response.outputs[output].float_val
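Note that each `.float_val` value above is still a repeated protobuf field rather than a plain list; if you need ordinary Python lists (e.g. for JSON serialization), wrap each value in `list()`. A small sketch with stand-in data (the output names here are hypothetical):

```python
# Stand-in for response.outputs: maps output names to repeated float fields.
raw_outputs = {'scores': [0.2, 0.8], 'classes': [1.0]}

# Build a dict of plain Python lists, one entry per output name.
results = {name: list(values) for name, values in raw_outputs.items()}
print(results)  # {'scores': [0.2, 0.8], 'classes': [1.0]}
```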
answered Nov 14 '22 by Mewtwo


What you are looking for is probably tf.make_ndarray, which creates a NumPy array from a TensorProto (i.e., it is the inverse of tf.make_tensor_proto). This way your output recovers the shape it is supposed to have, so building on the previous answer you can store multiple outputs in a dictionary with:

response = stub.Predict(request, 5.0)

results = {}
for output in response.outputs.keys():
    results[output] = tf.make_ndarray(response.outputs[output])
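For intuition, what tf.make_ndarray does for a float tensor is essentially reshape the flat float_val list according to the dims recorded in the TensorProto's tensor_shape. A pure-Python sketch of that reshaping (no TensorFlow required; the dims and values below are made up):

```python
def reshape(flat, dims):
    """Nest a flat list of values into the shape given by dims."""
    if not dims:            # scalar tensor: no dims at all
        return flat[0]
    if len(dims) == 1:      # innermost dimension: take a plain slice
        return list(flat[:dims[0]])
    step = len(flat) // dims[0]  # elements per slice along the first axis
    return [reshape(flat[i * step:(i + 1) * step], dims[1:])
            for i in range(dims[0])]

# e.g. a batch of 2 predictions with 3 class scores each:
flat = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
print(reshape(flat, [2, 3]))  # [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]
```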
answered Nov 14 '22 by Fidel I. Schaposnik