I have 3 separate Django projects sharing the same database, running on the same machine. I need to configure Celery for them. My questions are:
1.) Should I run separate Celery daemons for the separate projects, with different vhosts and users in RabbitMQ, which I would rather not do as it seems a waste of resources, or
2.) Is there a way I can target all the tasks from the different projects at a single Celery server?
Also, how handy would supervisord be in this solution?
Yes, you can use the same Celery server to receive tasks from separate projects.
Have a separate Celery app (or just a single file), say foo,
which contains all the tasks used across the different projects.
# foo.py
from celery import Celery

app = Celery(broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

@app.task
def sub(x, y):
    return x - y
Start a worker to run the tasks:
celery worker -l info -A foo
Now from Project A, you can call add
import celery
celery.current_app.send_task('foo.add', args=(1, 2))
And from Project B, you can call sub
import celery
celery.current_app.send_task('foo.sub', args=(1, 2))
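For this to work, the calling projects' Celery apps must point at the same broker that foo's worker consumes from. A minimal sketch of the calling side (the module path and app name are assumptions, not part of the original answer):

# projecta/celery_app.py (hypothetical module in Project A)
from celery import Celery

# Same broker URL as foo, so dispatched tasks reach foo's worker.
app = Celery('projecta', broker='amqp://guest@localhost//')

# Anywhere in Project A, dispatch by task name; foo's code never needs
# to be importable from this project.
app.send_task('foo.add', args=(1, 2))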
You can use supervisord to manage the Celery worker.
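A minimal supervisord program entry might look like the following sketch (paths and the program name are placeholders you would adjust to your virtualenv and project layout):

[program:celery_foo]
command=/path/to/venv/bin/celery worker -A foo -l info
directory=/path/to/shared/tasks
autostart=true
autorestart=true
stopasgroup=true
killasgroup=true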
This approach might be slightly harder to test, as send_task won't respect CELERY_ALWAYS_EAGER. However, you can use this snippet so that CELERY_ALWAYS_EAGER will be honored by send_task.
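The linked snippet is not reproduced here, but the usual idea is to wrap send_task so it runs the task in-process when eager mode is on. A rough sketch, assuming Celery 4+ (where the setting is spelled task_always_eager) and that foo's task module has been imported in the test process so the tasks are registered:

# test_helpers.py (hypothetical) - make send_task honor eager mode in tests.
from celery import current_app

_original_send_task = current_app.send_task

def eager_send_task(name, args=None, kwargs=None, **options):
    if current_app.conf.task_always_eager:
        # Look up the registered task and run it in-process, much like
        # apply_async does when task_always_eager is enabled.
        return current_app.tasks[name].apply(args=args, kwargs=kwargs, **options)
    return _original_send_task(name, args=args, kwargs=kwargs, **options)

current_app.send_task = eager_send_task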