 

Get info of exposed models in Tensorflow Serving

Once I have a TF server serving multiple models, is there a way to query that server to know which models are being served?

Would it then be possible to get information about each of these models, such as name, interface and, most importantly, which versions of a model are present on the server and could potentially be served?

asked Jan 05 '18 by 5agado


2 Answers

It is really hard to find information about this, but it is possible to get some model metadata.

import grpc
from tensorflow_serving.apis import get_model_metadata_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

channel = grpc.insecure_channel('localhost:8500')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = get_model_metadata_pb2.GetModelMetadataRequest()
request.model_spec.name = 'your_model_name'
request.metadata_field.append("signature_def")
response = stub.GetModelMetadata(request, 10)  # 10 secs timeout

print(response.model_spec.version.value)
print(response.metadata['signature_def'])

Hope it helps.

Update

It is also possible to get this information from the REST API. Just send a GET request to

http://{serving_url}:8501/v1/models/{your_model_name}/metadata

The result is JSON, in which you can easily find the model specification and the signature definition.
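For example, a minimal sketch of querying that endpoint from Python with the requests library (the localhost address, the default REST port 8501 and the model name your_model_name are assumptions here):

import requests

# Query the metadata endpoint of TensorFlow Serving's REST API
url = "http://localhost:8501/v1/models/your_model_name/metadata"
metadata = requests.get(url).json()

# The JSON contains the model spec (name, version) and the signature_def
print(metadata["model_spec"])
print(metadata["metadata"]["signature_def"])

The model status, including which versions are currently available, can be fetched the same way from http://{serving_url}:8501/v1/models/{your_model_name}.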

answered Nov 04 '22 by Thran


It is possible to get the model status as well as the model metadata. In the other answer only the metadata is requested, and the response, response.metadata['signature_def'], still needs to be decoded.

I found the solution is to use the built-in protobuf method MessageToJson() to convert the response to a JSON string, which can then be converted to a Python dictionary with json.loads().

import grpc
import json
from tensorflow_serving.apis import prediction_service_pb2_grpc
from tensorflow_serving.apis import model_service_pb2_grpc
from tensorflow_serving.apis import get_model_status_pb2
from tensorflow_serving.apis import get_model_metadata_pb2
from google.protobuf.json_format import MessageToJson

PORT = 8500
model = "your_model_name"

channel = grpc.insecure_channel('localhost:{}'.format(PORT))

# Model status is served by the ModelService
status_stub = model_service_pb2_grpc.ModelServiceStub(channel)
request = get_model_status_pb2.GetModelStatusRequest()
request.model_spec.name = model
status = status_stub.GetModelStatus(request, 5)  # 5 secs timeout
print("Model status:")
print(status)

# Model metadata is served by the PredictionService
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)
request = get_model_metadata_pb2.GetModelMetadataRequest()
request.model_spec.name = model
request.metadata_field.append("signature_def")
result = stub.GetModelMetadata(request, 5)  # 5 secs timeout
result = json.loads(MessageToJson(result))
print("Model metadata:")
print(result)
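
The status response can be converted the same way. A short sketch (reusing the status variable from above; preserving_proto_field_name keeps the snake_case field names from the proto definition) to list which versions of the model are present on the server:

# Convert the status response as well and list the available version numbers
status_dict = json.loads(MessageToJson(status, preserving_proto_field_name=True))
versions = [v["version"] for v in status_dict.get("model_version_status", [])]
print("Model versions:", versions)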

answered Nov 04 '22 by Tyler