
Decouple and Dockerize Django and Celery

I am wondering what the best way is to decouple Celery from Django, so that the two parts can be dockerized and run as Docker Swarm services. Typically one starts the Celery workers and Celery beat with commands that reference their Django application:

celery worker -A my_app
celery beat -A my_app

From this I believe Celery picks up its configuration from the Django settings file and a celery.py module, which would be easy to move into a microservice. What I don't totally understand is how the tasks would leverage the Django ORM. Or is that not really the microservices mantra, and should Celery instead be designed to make GET/POST calls to a Django REST Framework API for the data it needs to complete the task?

asked Jan 17 '17 by moku

1 Answer

I use a setup where the code for the Django app and its Celery workers lives in a single repository.

When deploying, I make sure the same code release is running everywhere, to avoid any surprises with the ORM, etc.
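In Docker Swarm terms, "same code release everywhere" can be achieved by building one image and running it as several services that differ only in their command. A sketch of such a stack file (the image name, tag, and broker choice are assumptions, not from the answer):

```yaml
# docker-stack.yml -- illustrative sketch; image/registry names are assumptions.
# One image, pinned to one tag, reused by every service.
version: "3.8"
services:
  web:
    image: registry.example.com/my_app:1.0.0
    command: gunicorn my_app.wsgi
  worker:
    image: registry.example.com/my_app:1.0.0
    command: celery worker -A my_app
  beat:
    image: registry.example.com/my_app:1.0.0
    command: celery beat -A my_app
  redis:
    image: redis:6
```

Deployed with `docker stack deploy -c docker-stack.yml my_app`, all three app services run the exact same code, so the worker's view of the models always matches the web app's.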

Celery starts with a reference to the Django app, so it has access to the models, etc.

Communication between the workers and the main app happens either through the message queue (RabbitMQ, Redis, ...) or via the database (the Celery worker operates directly on the database, since it knows the models).

I'm not sure if that follows the microservices mantra, but it does work :)

answered Sep 29 '22 by Laurent S