I have a Dockerfile where I try to activate a Python virtualenv, after which all dependencies should be installed within this env. However, everything still gets installed globally. I used several approaches and none of them worked. I also do not get any errors. Where is the problem?
1. ENV PATH $PATH:env/bin
2. ENV PATH $PATH:env/bin/activate
3. RUN . env/bin/activate
I also followed an example of a Dockerfile config for the python-runtime image on Google Cloud, which is essentially the same as the approaches above.
Setting these environment variables is the same as running source /env/bin/activate.
ENV VIRTUAL_ENV /env
ENV PATH /env/bin:$PATH
Additionally, what does ENV VIRTUAL_ENV /env mean, and how is it used?
While Docker provides an isolated environment for your Python application, you're better off using virtualenv (or your tool of choice) nevertheless. It helps you maintain control over your Python environment and dependencies.
There are perfectly valid reasons for using a virtualenv within a container.
You don't necessarily need to activate the virtualenv to install software or use it. Try invoking the executables directly from the virtualenv's bin directory instead; the scripts there carry shebang lines pointing at the virtualenv's own interpreter, so no activation step is required:
FROM python:2.7
RUN virtualenv /ve
RUN /ve/bin/pip install somepackage
CMD ["/ve/bin/python", "yourcode.py"]
You may also just set the PATH environment variable so that all subsequent Python commands use the binaries within the virtualenv, as described in https://pythonspeed.com/articles/activate-virtualenv-dockerfile/
FROM python:2.7
RUN virtualenv /ve
ENV PATH="/ve/bin:$PATH"
RUN pip install somepackage
CMD ["python", "yourcode.py"]
You don't need to use virtualenv inside a Docker Container.
virtualenv is used for dependency isolation. You want to prevent any installed dependencies or packages from leaking between applications. Docker achieves the same thing: it isolates your dependencies within your container and prevents leaks between containers and between applications.
Therefore, there is no point in using virtualenv inside a Docker container unless you are running multiple apps in the same container. If that's the case, I'd say you're doing something wrong, and the solution would be to architect your app in a better way and split it up into multiple containers.
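For the common single-application case, that means installing straight into the image's system interpreter. Here's a minimal sketch of that approach, assuming a hypothetical requirements.txt and app.py entrypoint:
FROM python:3.9-slim
WORKDIR /app
# No virtualenv: the container itself is the isolation boundary,
# so pip installs directly into the image's system Python.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]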
EDIT 2022: Given that this answer gets a lot of views, I thought it might make sense to add that, four years later, I realized there actually are valid uses of virtual environments in Docker images, especially when doing multi-stage builds:
FROM python:3.9-slim as compiler
ENV PYTHONUNBUFFERED 1
WORKDIR /app/
RUN python -m venv /opt/venv
# Enable venv
ENV PATH="/opt/venv/bin:$PATH"
COPY ./requirements.txt /app/requirements.txt
RUN pip install -Ur requirements.txt
FROM python:3.9-slim as runner
WORKDIR /app/
COPY --from=compiler /opt/venv /opt/venv
# Enable venv
ENV PATH="/opt/venv/bin:$PATH"
COPY . /app/
CMD ["python", "app.py", ]
In the Dockerfile example above, we are creating a virtualenv at /opt/venv and activating it using an ENV statement. We then install all dependencies into /opt/venv and can simply copy this folder into the runner stage of our build, at the same absolute path, since the scripts inside the venv hard-code that location. This can help with minimizing Docker image size.