I have an environment.yml in my application's folder, and I have this in my Dockerfile:
RUN conda env create
RUN source activate myenvfromymlfile
When I run the container, though, the environment is not activated. If I do conda env list, I see that /opt/conda is the one activated:
root@9c7181cf86aa:/app# conda env list
# conda environments:
#
myenvfromymlfile /opt/conda/envs/myenvfromymlfile
root * /opt/conda
If I attach to the container, I can manually run source activate myenvfromymlfile and it works, so why doesn't it work in the RUN directive?
In examples, I see this often in dockerfiles that require conda:
CMD [ "source activate your-environment && exec python application.py" ]
Can someone explain why it is necessary to use && to make it a single command? And why running "source activate" in a RUN directive does not work? I want to have my dockerfile look like this:
RUN conda env create
RUN source activate myenvfromymlfile
ENTRYPOINT ["python"]
CMD ["application.py"]
Before getting to why the activation fails, two practical notes. To build a Docker image with its corresponding dependencies (both conda and pip), start from an environment.yml file that lists them. If you are not sure exactly what you have installed, conda env export --from-history exports an environment file containing only the package specifications you explicitly requested on the command line, which keeps the file portable across machines.
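For example, a minimal environment.yml for this setup might look like the following (the environment name matches the question; the listed packages are only placeholders for your real dependencies):

name: myenvfromymlfile
channels:
  - defaults
dependencies:
  - python=3.10        # placeholder: pin whatever interpreter you need
  - pip
  - pip:
      - requests       # placeholder pip dependency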
You also do not strictly need activate at all: we can replace it by setting the appropriate environment variables ourselves, since Docker's ENV instruction applies both to subsequent RUNs and to the CMD.
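Here is a sketch of that approach, keeping the ENTRYPOINT/CMD shape you want. The continuumio/miniconda3 base image and the /app layout are assumptions; /opt/conda matches the path in your conda env list output:

FROM continuumio/miniconda3
WORKDIR /app
COPY environment.yml .
RUN conda env create -f environment.yml
# Prepend the environment's bin directory to PATH; adjusting PATH is the main
# thing "activate" does, and ENV applies to later RUNs and to the CMD alike.
ENV PATH=/opt/conda/envs/myenvfromymlfile/bin:$PATH
COPY . .
ENTRYPOINT ["python"]
CMD ["application.py"]

With PATH set this way, the python that Docker finds for the ENTRYPOINT is the one inside myenvfromymlfile, so no explicit activation is needed.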
Usually, though, to activate a Conda environment you would run conda activate (or the older source activate), so let's look at why that first attempt fails inside a Dockerfile.
Consider the below Dockerfile
RUN conda env create
RUN source activate myenvfromymlfile
ENTRYPOINT ["python"]
CMD ["application.py"]
Statement #1, conda env create, creates the environment and changes files on disk.
Statement #2, source activate myenvfromymlfile, only loads some settings into the current bash session; no changes are made on disk.
Statements #3 and #4 specify what happens when you run the container:
ENTRYPOINT ["python"]
CMD ["application.py"]
So when you run the container, nothing you did in statement #2 is still there. A shell was launched to run statement #2, and when that statement finished, the shell exited. When you later run the image, a brand-new shell is started, and it has no knowledge that, at some point during the build, you ran source activate myenvfromymlfile.
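You can see the same effect without conda at all. In this made-up two-line illustration (not from the question), each RUN gets its own shell, so the variable set in the first one is gone in the second:

# Hypothetical illustration: every RUN is an independent shell
RUN export GREETING=hello      # shell #1: the variable exists only in this shell
RUN echo "GREETING=$GREETING"  # shell #2: prints "GREETING=" because the variable is gone

source activate behaves the same way: it only modifies the shell it runs in (PATH, CONDA_DEFAULT_ENV, and so on), and that shell is gone by the next instruction.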
Now you want to run application.py in the environment you created. The default shell Docker uses for the shell form of an instruction is /bin/sh -c. So when you set a CMD such as
CMD [ "source activate your-environment && exec python application.py" ]
the command executed when the container starts is effectively
sh -c "source activate your-environment && exec python application.py"
which activates the environment in the current shell and then replaces that shell with your program. That also answers why the && is needed: activation only affects the shell it runs in, so activating and launching python must be chained into a single command that runs in one and the same shell. One caveat: the JSON-array form of CMD shown above is not wrapped in a shell by itself; in the Dockerfiles where you have seen it, it is normally paired with an ENTRYPOINT such as ["/bin/bash", "-c"], which supplies the shell the string is handed to. Writing the CMD in shell form (without the brackets) achieves the same thing.
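Putting it together, here is a sketch of a Dockerfile using that pattern. The base image, file layout, and environment name are assumptions taken from the question, and the legacy source activate script is assumed to be available in your base image (with newer conda you may prefer conda activate or conda run):

FROM continuumio/miniconda3
WORKDIR /app
COPY environment.yml .
RUN conda env create -f environment.yml
COPY . .
# bash -c provides the single shell in which both the activation and the
# program run; that is why the two are chained with &&.
ENTRYPOINT ["/bin/bash", "-c"]
CMD ["source activate myenvfromymlfile && exec python application.py"]

The exec makes python replace the bash process, so signals such as the SIGTERM sent by docker stop reach your application directly.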