How can I use multiple TensorFlow models?
I am using a Docker container. My model config file looks like this:
model_config_list: {
  config: {
    name: "model1",
    base_path: "/tmp/model",
    model_platform: "tensorflow"
  },
  config: {
    name: "model2",
    base_path: "/tmp/model2",
    model_platform: "tensorflow"
  }
}
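As a rough sketch, assuming the file above is saved on the host as /tmp/models.config, it can be handed to the stock tensorflow/serving image by mounting the model directories plus the config file and passing --model_config_file; the paths and port mappings here are placeholders.

# mount both model directories and the config file; the tensorflow/serving
# image forwards any extra arguments to tensorflow_model_server
docker run -p 8500:8500 -p 8501:8501 \
  --mount type=bind,source=/tmp/model,target=/tmp/model \
  --mount type=bind,source=/tmp/model2,target=/tmp/model2 \
  --mount type=bind,source=/tmp/models.config,target=/tmp/models.config \
  -t tensorflow/serving \
  --model_config_file=/tmp/models.config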
TensorFlow Serving is a flexible, high-performance serving system for machine learning models, designed for production environments. TensorFlow Serving makes it easy to deploy new algorithms and experiments, while keeping the same server architecture and APIs.
The command to start the TensorFlow Model Server is tensorflow_model_server. Include the -h parameter to get the full usage notes. By default, TensorFlow Model Server listens on port 8500 using the gRPC API. To use a different port, specify --port=<port number> on the command line.
I built a Docker image from the official TensorFlow Serving Dockerfile. Then, inside the Docker container, I run:
/usr/local/bin/tensorflow_model_server --port=9000 --model_config_file=/serving/models.conf
Here /serving/models.conf is a file similar to yours.
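Each model is then addressed by the name given in the config file: over gRPC the client sets model_spec.name in the request, and if the REST API is also enabled (newer TensorFlow Serving builds accept --rest_api_port=8501 alongside --port), each model gets its own URL. The example input below is a placeholder and depends on your model's signature:

# status of "model1"
curl http://localhost:8501/v1/models/model1
# prediction against "model2" (adjust the payload to your model)
curl -d '{"instances": [[1.0, 2.0, 3.0]]}' \
  http://localhost:8501/v1/models/model2:predict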