
Docker config: Celery + RabbitMQ

How do I run Celery and RabbitMQ in Docker containers? Can you point me to a sample Dockerfile or docker-compose file?

This is what I have:

Dockerfile:

FROM python:3.4
ENV PYTHONUNBUFFERED 1
WORKDIR /tasker
ADD requirements.txt /tasker/
RUN pip install -r requirements.txt
ADD . /tasker/

docker-compose.yml

rabbitmq:
  image: tutum/rabbitmq
  environment:
    - RABBITMQ_PASS=mypass
  ports:
    - "5672:5672"
    - "15672:15672"
celery:
  build: .
  command: celery worker --app=tasker.tasks
  volumes:
    - .:/tasker
  links:
    - rabbitmq:rabbit

The issue I'm having is that I can't get Celery to stay running; the container keeps exiting.


asked Jun 16 '16 by blue_zinc


2 Answers

I had a similar problem with Celery exiting while dockerizing my application. You should use the rabbit service name (in your case, rabbitmq) as the host name in your Celery configuration. That is,
use broker_url = 'amqp://guest:guest@rabbitmq:5672//' instead of broker_url = 'amqp://guest:guest@localhost:5672//'.

In my case, the major components were Flask, Celery and Redis.
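To make the difference concrete, here is a sketch of the relevant setting (the guest credentials are illustrative; 'rabbitmq' is the service name from the compose file in the question):

```python
# celeryconfig.py -- sketch: point the broker at the compose service name.
# 'rabbitmq' resolves to the broker container from linked services;
# 'localhost' would be the celery container itself, where nothing listens.
broker_url = 'amqp://guest:guest@rabbitmq:5672//'
```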


answered Sep 20 '22 by JPG



Update 2018: as commented below by Floran Gmehlin, the celery image is now officially deprecated in favor of the official python image.

As commented in celery/issue 1:

Using this image seems ridiculous. If you have an application container, as you usually have with Django, you need all dependencies (things you import in tasks.py) installed in this container again.

That's why other projects (e.g. cookiecutter-django) reuse the application container for Celery, and only run a different command (command: celery ... worker) against it with docker-compose.
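The pattern described above can be sketched in compose (version 1) syntax like this; the service and module names are illustrative, not from cookiecutter-django itself:

```yaml
# One image, two services: the worker reuses the app's build and only
# overrides the command.
django:
  build: .
  command: python manage.py runserver 0.0.0.0:8000
  links:
    - rabbitmq
celeryworker:
  build: .                                # same image as the app container
  command: celery worker --app=tasker.tasks
  links:
    - rabbitmq
```

Because both services share one build, every dependency imported by tasks.py is installed exactly once.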

Note: the docker-compose.yml is now called local.yml and uses start.sh.


Original answer:

You can try and emulate the official celery Dockerfile, which does a bit more setup before the CMD ["celery", "worker"].

See the usage of that image to run it properly.

start a celery worker (RabbitMQ Broker)

$ docker run --link some-rabbit:rabbit --name some-celery -d celery

check the status of the cluster

$ docker run --link some-rabbit:rabbit --rm celery celery status

If you can use that image in your docker-compose, then you can try building your own starting FROM celery instead of FROM python.
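A hedged sketch of such a Dockerfile, adapting the one from the question (paths and the bare celery tag are assumptions; pin a version in practice, and note the base image is now deprecated):

```dockerfile
# Sketch: start from the celery image, which already sets up a worker
# entrypoint, instead of a bare python image.
FROM celery
COPY requirements.txt /tasker/
RUN pip install -r /tasker/requirements.txt
COPY . /tasker/
WORKDIR /tasker
```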


answered Sep 17 '22 by VonC
