We have two servers, Server A and Server B. Server A is dedicated to running the Django web app. Because of the large volume of data, we decided to run the Celery tasks on Server B. Servers A and B share a common database. Tasks are initiated after a `post_save` in the models of Server A's web app. How can I implement this using RabbitMQ in my Django project?
You have two servers, one project, and two settings files (one per server):

server A (web server + RabbitMQ)
server B (Celery workers only)
Then you set the broker URL in both settings files, something like this:
BROKER_URL = 'amqp://user:password@IP_SERVER_A:5672//'
where IP_SERVER_A in Server B's settings is the IP address of Server A.
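As a sketch of how that URL is put together (the helper function, credentials, and host names here are placeholders, not part of the original answer): the trailing `//` means the connection targets RabbitMQ's default virtual host `/`.

```python
# Hypothetical helper illustrating the pieces of an AMQP broker URL.
# user, password, and host are placeholders for your real credentials.
def amqp_broker_url(user, password, host, port=5672, vhost="/"):
    # The final "/{vhost}" with vhost="/" produces the trailing "//",
    # i.e. the default RabbitMQ virtual host.
    return f"amqp://{user}:{password}@{host}:{port}/{vhost}"

# In Server B's settings, point at Server A's IP:
BROKER_URL = amqp_broker_url("user", "password", "IP_SERVER_A")
print(BROKER_URL)  # amqp://user:password@IP_SERVER_A:5672//
```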
For now, any task will be sent to RabbitMQ on Server A, to the virtual host /.
On Server B, you just start a Celery worker, something like this:
python manage.py celery worker -Q queue_name -l info
and that's it.
Explanation: Django sends messages to RabbitMQ to queue a task; the Celery workers then consume those messages and execute the tasks.
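The flow above can be sketched with a toy producer/consumer simulation (the deque, function names, and the `process_record` task are all hypothetical stand-ins; in the real project RabbitMQ is the queue and you would call `your_task.delay(...)` from a `post_save` signal handler instead):

```python
from collections import deque

queue = deque()  # stands in for the RabbitMQ queue "queue_name"

def send_task(name, args):
    """Roughly what .delay() does: enqueue a message, don't execute it."""
    queue.append({"task": name, "args": args})

def worker_step(registry):
    """One iteration of a worker on Server B: pop a message, run the task."""
    msg = queue.popleft()
    return registry[msg["task"]](*msg["args"])

# Hypothetical task registry mapping task names to callables.
registry = {"process_record": lambda pk: f"processed {pk}"}

send_task("process_record", (42,))  # Server A: triggered from post_save
result = worker_step(registry)      # Server B: the celery worker
print(result)                       # processed 42
```

The point of the indirection is that Server A never runs the task itself; it only publishes a message, and any worker connected to the same broker can pick it up.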
Note: RabbitMQ does not have to be installed on Server A. You can install it on a separate Server C and reference it in the BROKER_URL of both settings files (A and B), like this: BROKER_URL = 'amqp://user:password@IP_SERVER_C:5672//'.
Greetings.