Django + Celery tasks on multiple worker nodes

I've deployed Django (1.10) + Celery (4.x) on a single VM, with RabbitMQ as the broker on the same machine. I now want to run the same application on a multi-node architecture, where I can replicate a number of worker nodes to scale out and run tasks faster. Specifically:

  1. How to configure celery with rabbitmq for this architecture?
  2. On the other worker nodes, what should be the setup?
Vatsal Parekh asked Feb 08 '17

1 Answer

You should have the broker on one node and configure it so that workers from other nodes can access it.

For that, you can create a new user and vhost on RabbitMQ:

# add new user
sudo rabbitmqctl add_user <user> <password>

# add new virtual host
sudo rabbitmqctl add_vhost <vhost_name>

# set permissions for user on vhost
sudo rabbitmqctl set_permissions -p <vhost_name> <user> ".*" ".*" ".*"

# restart the RabbitMQ server (rabbitmqctl itself has no restart command)
sudo systemctl restart rabbitmq-server
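If you want to double-check the setup, rabbitmqctl can list what you just created (the vhost and user names here are the same placeholders as above):

# verify the user, vhost and permissions
sudo rabbitmqctl list_users
sudo rabbitmqctl list_vhosts
sudo rabbitmqctl list_permissions -p <vhost_name>

Also make sure the broker port (5672 by default) is reachable from the worker nodes.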

From other nodes, you can queue up tasks or you can just run workers to consume tasks.

from celery import Celery

app = Celery('tasks', backend='amqp',
             broker='amqp://<user>:<password>@<ip>/<vhost>')

# register the function as a Celery task so it can be queued with .delay()
@app.task
def add(x, y):
    return x + y

If you have a file (say task.py) like this, you can queue up tasks using add.delay(), as sketched below.
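A minimal producer sketch, assuming the task.py above is importable on the calling node:

from task import add

# push the task onto the broker; a worker on any node can pick it up
result = add.delay(4, 4)

# optionally block for the result (requires the result backend configured above)
print(result.get(timeout=10))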

You can also start a worker with:

celery worker -A task -l info
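To scale out, run one worker like this on each node, all pointing at the same broker. Giving each worker a unique node name makes them easier to tell apart in monitoring (the name below is hypothetical; %h expands to the hostname):

# run a named worker with 4 worker processes on this node
celery worker -A task -l info -n worker1@%h -c 4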

You can see my answer here to get a brief idea of how to run tasks on remote machines. For a step-by-step process, you can check out a post I have written on scaling Celery.

Pandikunta Anand Reddy answered Oct 10 '22