
docker stop spark container from exiting

I know Docker only watches PID 1, and if that process exits (or turns into a daemon) Docker assumes the program has finished and shuts the container down.

When apache-spark is started with the ./start-master.sh script, how can I keep the container running?

I do not think that while true; do sleep 1000; done is an appropriate solution.

For example, I used the command sbin/start-master.sh to start the master, but the container keeps shutting down.

How to keep it running when started with docker-compose?

asked Apr 15 '26 11:04 by Georg Heiler

2 Answers

As mentioned in "Use of Supervisor in docker", you could use phusion/baseimage-docker as a base image in which you can register scripts as "services".

The my_init script included in that image will take care of exit-signal management.

The processes launched by start-master.sh would keep running.
Again, this assumes you are building your apache-spark image on top of phusion/baseimage-docker.
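As a rough sketch of that approach: a Dockerfile based on phusion/baseimage-docker can register the Spark master as a runit service, so my_init stays at PID 1 and supervises it. The base-image tag and the /spark install path below are assumptions about your setup; note that the service script runs spark-class in the foreground rather than start-master.sh, because runit services must not daemonize:

```dockerfile
# Sketch only: base-image tag and /spark paths are assumptions.
FROM phusion/baseimage:focal-1.2.0

# Register the Spark master as a runit "service". my_init (PID 1)
# starts it, supervises it, and forwards exit signals correctly.
RUN mkdir -p /etc/service/spark-master \
 && printf '#!/bin/sh\nexec /spark/bin/spark-class org.apache.spark.deploy.master.Master\n' \
      > /etc/service/spark-master/run \
 && chmod +x /etc/service/spark-master/run

# Use the baseimage init system as the container entry point.
CMD ["/sbin/my_init"]
```

With this layout the container keeps running as long as the master process does, and docker stop is handled cleanly because my_init relays SIGTERM to the service.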

As commented by thaJeztah, using an existing image works too: gettyimages/spark/~/dockerfile/. Its default CMD will keep the container running.

Both options are cleaner than relying on a tail -f trick, which won't handle the kill/exit signals gracefully.

answered Apr 18 '26 09:04 by VonC


Here is another solution. Create a file spark-env.sh with the following contents and copy it into the Spark conf directory:

SPARK_NO_DAEMONIZE=true

If your CMD in the Dockerfile looks like this:

CMD ["/spark/sbin/start-master.sh"]

the container will not exit.
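Since the question mentions docker-compose: with SPARK_NO_DAEMONIZE set, start-master.sh stays in the foreground, so a compose service can run it directly. A minimal sketch, in which the image name and port mappings are assumptions about your setup (Spark should also pick the variable up from the environment, not only from spark-env.sh, though I have not verified that on every version):

```yaml
version: "3"
services:
  spark-master:
    image: my-spark              # hypothetical image name
    command: /spark/sbin/start-master.sh
    environment:
      SPARK_NO_DAEMONIZE: "true" # keep the master in the foreground as PID 1
    ports:
      - "7077:7077"              # Spark master port
      - "8080:8080"              # master web UI
```

Because the master is now PID 1 and does not fork into the background, the container stays up until the master itself exits.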

answered Apr 18 '26 09:04 by canadadry