
How do I specify the "model_config_file" variable to tensorflow-serving in docker-compose?

I will preface this by saying that I am inexperienced with docker and docker-compose. I am trying to convert my docker run ... command to a docker-compose.yml file, however, I cannot get the models.config file to be found.

I am able to correctly run a tensorflow-serving docker container using the following docker run ... command:

docker run -t --rm \
  -p 8501:8501 \
  -v "$(pwd)/models/:/models/" \
  tensorflow/serving \
  --model_config_file=/models/models.config \
  --model_config_file_poll_wait_seconds=60

This works as expected: the models.config file is found in the container at /models/models.config.
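For reference, models.config is a ModelServerConfig message in protobuf text format. A minimal sketch, assuming a single hypothetical model named my_model exported under ./models/my_model (the name and path are placeholders, not taken from the question), would look roughly like:

# models.config -- ModelServerConfig in protobuf text format
# "my_model" and its base_path are placeholder values
model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
  }
}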

The TensorFlow Serving documentation does not mention anything about docker-compose; however, I would much rather use that than a docker run ... command. My attempt at a docker-compose file is:

version: '3.3'
services:
  server:
    image: tensorflow/serving
    ports:
      - '8501:8501'
    volumes:
      - './models:/models'
    environment:
      - 'model_config_file=/models/models.config'
      - 'model_config_file_poll_wait_seconds=60'

Using this docker-compose file, the container runs; however, the environment variables seem to be completely ignored, so I'm not sure this is the right way to set them. The container looks for models.config in its default location, doesn't find it there, and therefore never loads the configuration defined in models.config.

So, how can I define these values, or run a tensorflow-serving container, using docker-compose?

I appreciate any help.

Thanks

asked Feb 25 '20 by waddington

1 Answer

I came across a solution elsewhere that I haven't seen in any of the threads/posts discussing tensorflow/serving, so I will post it here.

Passing those options under a command section, as follows, works. In docker-compose, the command entries are handed to the image's entrypoint as arguments, which is exactly what the trailing --model_config_file=... arguments to docker run do.

version: '3.3'
services:
  server:
    image: tensorflow/serving
    ports:
      - '8501:8501'
    volumes:
      - './models:/models'
    command:
      - '--model_config_file=/models/models.config'
      - '--model_config_file_poll_wait_seconds=60'
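As a quick sanity check that the config is actually being picked up (a sketch; my_model is a placeholder for whatever model name you declared in models.config):

# start the service defined in docker-compose.yml
docker-compose up -d

# query TensorFlow Serving's model status REST endpoint
curl http://localhost:8501/v1/models/my_model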

I don't know a lot about Docker, so this may be an obvious answer, but I didn't find a solution even after a lot of Googling.

answered Sep 20 '22 by waddington