
How can I use tensorflow serving for multiple models

How can I serve multiple TensorFlow models from a single TensorFlow Serving instance? I am using the Docker container. My model config file looks like this:

model_config_list: {
  config: {
    name: "model1",
    base_path: "/tmp/model",
    model_platform: "tensorflow"
  },
  config: {
    name: "model2",
    base_path: "/tmp/model2",
    model_platform: "tensorflow"
  }
}
asked Aug 18 '17 by onuryartasi

People also ask

How does TensorFlow serving work?

TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. It makes it easy to deploy new algorithms and experiments while keeping the same server architecture and APIs.
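As an illustration (not part of the original Q&A), each model served under a config like the one above is addressed by its `name` field through TensorFlow Serving's REST API at `/v1/models/<name>:predict`. A minimal sketch of building such a request, assuming the default REST port 8501 and illustrative host and model names:

```python
# Sketch: constructing a TensorFlow Serving REST predict request.
# Host, port, and model names below are illustrative assumptions.
import json

def predict_url(host: str, port: int, model_name: str) -> str:
    """Return the REST endpoint for a named model's predict method."""
    return f"http://{host}:{port}/v1/models/{model_name}:predict"

def predict_body(instances) -> str:
    """Serialize input instances into the JSON body TF Serving expects."""
    return json.dumps({"instances": instances})

# Each model from the config file is selected purely by its name:
url = predict_url("localhost", 8501, "model1")
body = predict_body([[1.0, 2.0, 3.0]])
```

Sending `body` to `url` (e.g. with `requests.post`) would reach `model1`; swapping the name to `model2` targets the other model without any server change.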

How do I start a TensorFlow server?

The command to start the TensorFlow Model Server is tensorflow_model_server. Include the -h parameter to get the full usage notes. By default, TensorFlow Model Server listens on port 8500 using the gRPC API. To use a different port, specify --port=<port number> on the command line.
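Putting those flags together with a multi-model config file (paths here are assumptions for illustration), an invocation might look like:

```shell
# gRPC on a non-default port, plus the model config from the question;
# /serving/models.conf is an assumed path.
tensorflow_model_server --port=9000 \
    --model_config_file=/serving/models.conf
```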


1 Answer

Build a Docker image from the official TensorFlow Serving Dockerfile.

Then, inside the container, start the model server with a config file:

/usr/local/bin/tensorflow_model_server --port=9000 --model_config_file=/serving/models.conf

Here /serving/models.conf is a file like the one in your question.
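With the prebuilt tensorflow/serving image (an assumption; the original answer built the image from the Dockerfile), the same setup can be sketched by mounting the model directories and the config file into the container:

```shell
# Sketch assuming the tensorflow/serving image and the host paths from
# the question (/tmp/model, /tmp/model2, /serving/models.conf).
docker run -p 9000:9000 \
    -v /tmp/model:/tmp/model \
    -v /tmp/model2:/tmp/model2 \
    -v /serving/models.conf:/serving/models.conf \
    tensorflow/serving \
    --port=9000 --model_config_file=/serving/models.conf
```

The paths inside the container mirror the host paths so the base_path entries in the config resolve unchanged.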

answered Sep 17 '22 by Sumsuddin Shojib