I am using django-celery to queue tasks on my site's backend. I am trying to create a setup with two queues named "low" and "high" and two workers, W1 and W2. I want them to consume tasks from the queues in the following way:
W1 <-- low, high
W2 <-- high
Normally this can be done like so.
Open terminal 1 and enter:
$ celery worker -n W1 -Q low,high
Open terminal 2 and enter:
$ celery worker -n W2 -Q high
However, I am trying to do the same via the celeryd daemon.
I am following the steps given in this link: http://celery.readthedocs.org/en/latest/tutorials/daemonizing.html#example-configuration but the available options don't seem enough to fit the requirement.
Please help me with some configs I am unaware of that could make this possible. I would prefer not to run multiple daemons or use additional tools like supervisord unless really necessary (maybe you could advise me on this as well).
Note: The goal of this article is to show how to work with 3 different types of Celery tasks in multiple queues: small-in-number but high-priority tasks (default queue), long-running tasks (long queue), and a huge number of small tasks (numerous queue). Other parts of the app are for illustration purposes and may be far from optimal.
Open docker-compose.yml and add 2 blocks under services: celery-long and celery-numerous. Take a closer look at the commands for celery-long and celery-numerous; the -Q parameter says which queue the worker will consume tasks from.
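A minimal sketch of the two added service blocks might look like this (the image name and the Celery app module "app" are placeholders, not taken from the original file, so adapt them to your project):

```yaml
# Hypothetical docker-compose service blocks for the two extra workers.
# Only the -Q values ("long" and "numerous") come from the text above.
services:
  celery-long:
    image: myproject:latest
    command: celery -A app worker -Q long -l INFO
  celery-numerous:
    image: myproject:latest
    command: celery -A app worker -Q numerous -l INFO
```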
Our hypothetical app will have 3 categories of Celery tasks: basic tasks that power the interface of the app, long-running tasks that process uploaded files, and tasks that involve interaction with a 3rd-party API and are large in number.
The latest log records will start from celery-numerous_1, which means that process_contact_mx_records went into the numerous queue. If you scroll up and search for process_uploaded_file, you will see that it went to the long queue and that the container running it is called celery-long_1.
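One common way to get tasks into those queues is a routing table in the Django settings. A minimal sketch, assuming an app.tasks module path (the module prefix is a guess; only the task names appear in the text above):

```python
# Hypothetical routing table mapping task names to queues.
# The "app.tasks" prefix is an assumed module path for illustration.
CELERY_TASK_ROUTES = {
    "app.tasks.process_uploaded_file": {"queue": "long"},
    "app.tasks.process_contact_mx_records": {"queue": "numerous"},
    # anything not listed here falls through to the default queue
}

def queue_for(task_name):
    """Look up which queue a given task name is routed to."""
    return CELERY_TASK_ROUTES.get(task_name, {"queue": "default"})["queue"]
```

Tasks not listed in the table stay on the default queue, which matches the three-category split described above.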
You can specify multiple nodes in CELERYD_NODES and pass per-node arguments in CELERYD_OPTS, for example:
CELERYD_NODES="W1 W2"
CELERYD_OPTS="-Q:W1 low,high -Q:W2 high"
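Putting that together, a minimal /etc/default/celeryd sketch for the W1/W2 setup from the question might look like this (the app name "proj" and the log/pid paths are placeholders, not from the question):

```shell
# Hypothetical /etc/default/celeryd fragment; adapt "proj" and the
# paths to your project.
CELERYD_NODES="W1 W2"
CELERY_APP="proj"
# W1 consumes both queues, W2 only the high queue:
CELERYD_OPTS="-Q:W1 low,high -Q:W2 high"
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
```

The generic celeryd init script starts its nodes through celery multi, which is what makes the node-qualified -Q:W1 / -Q:W2 syntax work.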
You can use the CELERYD_OPTS option, passing the -Q parameter as in these examples from the Celery reference:
# Advanced example starting 10 workers in the background:
# * Three of the workers process the images and video queue
# * Two of the workers process the data queue with loglevel DEBUG
# * The rest process the 'default' queue.
$ celery multi start 10 -l INFO -Q:1-3 images,video -Q:4,5 data \
    -Q default -L:4,5 DEBUG

# You can show the commands necessary to start the workers with
# the 'show' command:
$ celery multi show 10 -l INFO -Q:1-3 images,video -Q:4,5 data \
    -Q default -L:4,5 DEBUG
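Applied to the setup from the question, the equivalent celery multi invocation would be (a sketch, not tested against your project):

```shell
# W1 consumes both queues; W2 only the high queue.
$ celery multi start W1 W2 -l INFO -Q:W1 low,high -Q:W2 high
```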