I have a Flask app inside a Docker container, and I would like to use the Python package Zappa to deploy that app to Amazon Web Services.
Unfortunately, Zappa requires that it and all of my app's dependencies be installed in a Python virtual environment.
So I have rebuilt my Docker image and moved everything into a virtual environment inside it.
The problem is that now I can't run commands like:
docker exec <container> flask <sub command>
because flask is installed in a virtual environment which has not been activated.
I can still do this:
host$ docker exec -it <container> bash
container$ source venv/bin/activate
container$ flask <sub command>
Also, I can no longer run my default Dockerfile CMD (gunicorn) because that is also in my virtual environment.
Does this make any more sense?
While Docker already provides an isolated environment for your Python application, you're still better off using virtualenv (or your tool of choice). It helps you keep control over your Python environment and dependencies.
To install virtualenv, just use pip install virtualenv. To create a virtual environment directory with it, run virtualenv /path/to/directory. Activating and deactivating the environment works the same way as it does for environments created with Python 3's built-in venv module.
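For reference, the basic workflow on the command line looks like this (the path is just a placeholder):
pip install virtualenv
virtualenv /path/to/directory
source /path/to/directory/bin/activate
# ... work inside the environment ...
deactivate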
Differences between CMD and ENTRYPOINT: CMD is ignored by the daemon when parameters are passed to docker run, whereas ENTRYPOINT is not ignored; the docker run parameters are instead appended as arguments to the ENTRYPOINT command.
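A minimal illustration of that difference (the image name demo and the echoed strings are just placeholders):
FROM alpine
ENTRYPOINT ["echo", "entry"]
CMD ["default"]
docker build -t demo .
docker run demo
> entry default
docker run demo hello
> entry hello
The CMD part is replaced by the docker run argument, while the ENTRYPOINT is kept and the argument is appended to it.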
As an alternative to just sourcing the script inline with the command, you could make a script that acts as an ENTRYPOINT. An example entrypoint.sh would look something like:
#!/bin/bash
source venv/bin/activate
exec "$@"
Then in your Dockerfile you would copy this file and set it as the ENTRYPOINT:
FROM myimage
COPY entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
Now you can run it like docker run mynewimage flask <sub command> or docker run mynewimage gunicorn.
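The default CMD benefits from the same entrypoint, so gunicorn also starts inside the virtualenv; a sketch, where the module path app:app and the bind address are assumptions about the project:
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "app:app"]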
You don't need to activate the env. Prepend /path/to/virtualenv/bin to $PATH, and then python, pip, etc. automatically point to the commands in the virtualenv.
FROM python:3.4-alpine
WORKDIR /deps
# /virtualenv/bin is first on PATH; it does not exist yet, so the base image's pip is used below
ENV PATH=/virtualenv/bin:$PATH
RUN pip install virtualenv && \
    virtualenv /virtualenv
COPY . /deps
Working example:
# Build the image
docker build -t venv_example .
# Run all python commands in the virtualenv with no hassle
docker run --rm venv_example which python
> /virtualenv/bin/python
docker run --rm venv_example which pip
> /virtualenv/bin/pip
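Applied to the Flask/gunicorn setup from the question, a sketch of a fuller Dockerfile using the same PATH trick could look like this (requirements.txt and the app:app module path are assumptions about the project layout):
FROM python:3.4-alpine
WORKDIR /app
# /virtualenv/bin does not exist yet, so the base image's pip installs virtualenv below
ENV PATH=/virtualenv/bin:$PATH
RUN pip install virtualenv && virtualenv /virtualenv
# from here on, pip and python resolve to /virtualenv/bin, so dependencies land in the venv
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "app:app"]
With that in place, docker exec <container> flask <sub command> finds flask in /virtualenv/bin without any explicit activation.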
Try:
docker exec <container> sh -c '. venv/bin/activate; flask <sub command>'
Your default command can be:
CMD sh -c '. venv/bin/activate; gunicorn ...'
(. is the POSIX spelling of bash's source, so this also works when /bin/sh is not bash.)