
How to serve PyTorch or sklearn models using TensorFlow Serving

I have found tutorials and posts which only say how to serve TensorFlow models using TensorFlow Serving. In the model.conf file there is a parameter model_platform, in which tensorflow or any other platform can be mentioned. But how do we export models from other platforms in the TensorFlow way, so that they can be loaded by TensorFlow Serving?
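For reference, the model_platform field lives in TensorFlow Serving's model config file, which is written in protobuf text format; a minimal example (the model name and path here are placeholders) looks like this:

    model_config_list {
      config {
        name: "my_model"
        base_path: "/models/my_model"
        model_platform: "tensorflow"
      }
    }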

asked Apr 03 '18 by user3742631

2 Answers

I'm not sure if you can. The TensorFlow Serving platform mechanism is designed to be flexible, but if you really want to use it, you'd probably need to implement a C++ library that loads your saved model (in protobuf) and exposes it as a servable to TensorFlow Serving. Here's a similar question.

I haven't seen such an implementation, and the efforts I've seen usually go in two other directions:

  1. Serve the model with pure Python code over HTTP or gRPC, for instance, such as what's being developed in Pipeline.AI (a minimal sketch follows this list).
  2. Dump the model in PMML format, and serve it with Java code (also sketched below).
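As a rough sketch of the first direction, here is a minimal HTTP server for a scikit-learn model; Flask, the /predict route, and the model path are illustrative assumptions, not anything prescribed by TensorFlow Serving:

    # Minimal sketch: serving a pickled scikit-learn model over HTTP with Flask.
    import joblib
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    model = joblib.load("model.joblib")  # hypothetical path to a trained model

    @app.route("/predict", methods=["POST"])
    def predict():
        # Expects a JSON body like {"instances": [[5.1, 3.5, 1.4, 0.2], ...]}
        instances = request.get_json()["instances"]
        return jsonify({"predictions": model.predict(instances).tolist()})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8501)

For the second direction, one possible tool (the answer doesn't name one) is the sklearn2pmml package, which exports a fitted pipeline to a PMML file that JPMML-style Java servers can then load:

    # Minimal sketch: exporting a scikit-learn pipeline to PMML with sklearn2pmml.
    # Note: sklearn2pmml requires a Java runtime to be installed.
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier
    from sklearn2pmml import sklearn2pmml
    from sklearn2pmml.pipeline import PMMLPipeline

    X, y = load_iris(return_X_y=True)
    pipeline = PMMLPipeline([("classifier", DecisionTreeClassifier())])
    pipeline.fit(X, y)
    sklearn2pmml(pipeline, "model.pmml")  # serve the .pmml file from Java code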
answered Oct 01 '22 by adrin


This doesn't answer the question, but since no better answers exist yet: as an addition to the alternative directions from adrin, these might be helpful:

  • Clipper (Apache License 2.0) is able to serve PyTorch and scikit-learn models, among others (a minimal deployment sketch follows this list)
  • Further reading:
    • https://www.andrey-melentyev.com/model-interoperability.html
    • https://medium.com/@vikati/the-rise-of-the-model-servers-9395522b6c58
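A minimal sketch of deploying a scikit-learn model with Clipper's clipper_admin Python API; the application name, model name, input type, and model path here are illustrative assumptions:

    # Minimal sketch: deploying a pickled scikit-learn model with Clipper.
    import joblib
    from clipper_admin import ClipperConnection, DockerContainerManager
    from clipper_admin.deployers import python as python_deployer

    model = joblib.load("model.joblib")  # hypothetical pre-trained model

    def predict(inputs):
        # Clipper passes a batch of inputs; return one string per input.
        return [str(p) for p in model.predict(inputs)]

    clipper_conn = ClipperConnection(DockerContainerManager())
    clipper_conn.start_clipper()
    clipper_conn.register_application(
        name="sklearn-app", input_type="doubles",
        default_output="-1.0", slo_micros=100000)
    python_deployer.deploy_python_closure(
        clipper_conn, name="sklearn-model", version=1,
        input_type="doubles", func=predict)
    clipper_conn.link_model_to_app(
        app_name="sklearn-app", model_name="sklearn-model")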
answered Oct 01 '22 by hyttysmyrkky