I want to create a container that runs a dummy command inside an image, following this simple documentation/tutorial: https://docker-py.readthedocs.io/en/stable/containers.html#container-objects
import docker
client = docker.from_env()
client.containers.run(shm_size='1g', ulimits=[docker.types.Ulimit(name='memlock', hard=-1), docker.types.Ulimit(name='stack', hard=67108864)], image='ubuntu:16.04', auto_remove=True, command='date')
Here is the result:
---------------------------------------------------------------------------
ContainerError                            Traceback (most recent call last)
in ()
----> 1 client.containers.run(shm_size='1g', ulimits=[docker.types.Ulimit(name='memlock', hard=-1), docker.types.Ulimit(name='stack', hard=67108864)], image='ubuntu:16.04', auto_remove=True, command='date')

~/anaconda3/lib/python3.7/site-packages/docker/models/containers.py in run(self, image, command, stdout, stderr, remove, **kwargs)
    812         if exit_status != 0:
    813             raise ContainerError(
--> 814                 container, exit_status, command, image, out
    815             )
    816
However, the following shell command works perfectly:
docker run --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 --rm -t ubuntu:16.04 "date"
What is the problem with the combination of options I used?
Your Python and shell commands are not identical: in the shell command you are specifying the soft limits, while in the Python code you are specifying the hard limits. The syntax for the argument to the --ulimit flag is:
<type>=<soft limit>[:<hard limit>]
And the documentation explains:
Note: If you do not provide a hard limit, the soft limit will be used for both values. If no ulimits are set, they will be inherited from the default ulimits set on the daemon.
To get identical behavior, I would try changing your Python ulimit declarations to:
ulimits=[docker.types.Ulimit(name='memlock', soft=-1, hard=-1),
         docker.types.Ulimit(name='stack', soft=67108864, hard=67108864)]
This sounds like a shortcoming of the docker-py documentation, which says only that both soft and hard are optional arguments.
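Putting it together, a minimal sketch of the full run() call, with both soft and hard limits set explicitly so that it mirrors the original docker run command (same image and options as in your question), might look like this:

import docker

client = docker.from_env()

# Mirror --ulimit memlock=-1 --ulimit stack=67108864 by setting
# both the soft and hard limit explicitly on each Ulimit.
output = client.containers.run(
    image='ubuntu:16.04',
    command='date',
    auto_remove=True,
    shm_size='1g',
    ulimits=[
        docker.types.Ulimit(name='memlock', soft=-1, hard=-1),
        docker.types.Ulimit(name='stack', soft=67108864, hard=67108864),
    ],
)
print(output)  # run() returns the container's output when not detached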