Celery - Can a message in RabbitMQ be consumed by two or more workers at the same time?

Perhaps I'm being silly asking the question but I need to wrap my head around the basic concepts before I do further work.

I am processing a few thousand RSS feeds, using multiple Celery worker nodes and a RabbitMQ node as the broker. The URL of each feed is being written as a message in the queue. A worker just reads the URL from the queue and starts processing it. I have to ensure that a single RSS feed does not get processed by two workers at the same time.

The article Ensuring a task is only executed one at a time suggests a Memcached-based solution for locking the feed while it's being processed.

But what I'm trying to understand is why I need to use Memcached (or something else) to ensure that a message on a RabbitMQ queue is not consumed by multiple workers at the same time. Is there some configuration change in RabbitMQ (or Celery) that I can make to achieve this goal?

asked Aug 28 '12 by rubayeet

3 Answers

A single MQ message will certainly not be seen by multiple consumers in a normal working setup. You'll have to do some work for the cases involving failing/crashing workers, read up on auto-acks and message rejections, but the basic case is sound.

I don't see a synchronized queue (read: MQ) in the article you've linked, so (as far as I can tell) they're using the lock mechanism (read: Memcached) to synchronize, as an alternative. And I can think of a few problems with that which wouldn't exist in a proper MQ setup.
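For the failing/crashing-worker cases mentioned above, Celery exposes settings that control when a message is acknowledged. A minimal sketch, assuming Celery 4+ setting names (older versions use the uppercase CELERY_-prefixed forms):

```python
# celeryconfig.py -- sketch of acknowledgement-related settings

task_acks_late = True               # ack only after the task finishes, so a
                                    # message held by a crashed worker is redelivered
task_reject_on_worker_lost = True   # requeue tasks whose worker process died mid-run
worker_prefetch_multiplier = 1      # don't let one worker hoard unacked messages
```

With the default early acking, a message is acknowledged as soon as a worker receives it, so a crash mid-task loses the message instead of redelivering it.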

answered Oct 19 '22 by aib


As noted by others, you are mixing apples and oranges: a Celery task and an MQ message are not the same thing.

You can ensure that a message will be processed by only one worker at a time.

e.g.

@task
def my_task(feed_url):
    # fetch and process one feed
    ...

my_task.delay('http://example.com/feed.xml')

The .delay (or .apply_async) call publishes a message to the message broker you are using (RabbitMQ, Redis, ...). The message then gets routed to a queue and consumed by one worker at a time. You don't need locking for this; you get it for free :) (Note: plain .apply() runs the task locally in the calling process and does not go through the broker.)

The example in the Celery cookbook shows how to prevent two such tasks (two invocations of my_task with the same argument) from running at the same time; that is something you need to ensure within the task itself.

You need something that all workers can access, of course (Memcached, Redis, ...), as they might be running on different machines.
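The cookbook recipe boils down to an atomic "add this key only if it is absent" call against a shared store. A minimal sketch of that lock, assuming Memcached-style add() semantics; InMemoryCache below is a hypothetical stand-in for a real shared cache reachable by all workers:

```python
import time

class InMemoryCache:
    """Stand-in for a shared cache (Memcached/Redis); add() is atomic there."""
    def __init__(self):
        self._data = {}

    def add(self, key, value, ttl):
        # Succeed only if the key is absent or its previous lock has expired.
        now = time.monotonic()
        expires = self._data.get(key)
        if expires is not None and expires > now:
            return False
        self._data[key] = now + ttl
        return True

    def delete(self, key):
        self._data.pop(key, None)

def process_feed(cache, url, fetch, lock_ttl=300):
    """Process a feed only if no other worker currently holds its lock."""
    lock_key = "lock:" + url
    if not cache.add(lock_key, "locked", lock_ttl):
        return False  # another worker is already on this feed
    try:
        fetch(url)
        return True
    finally:
        cache.delete(lock_key)
```

With a real Memcached client the add() call has the same only-if-absent semantics, which is what makes the lock atomic across machines; the TTL bounds how long a crashed worker can hold the lock.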

answered Oct 19 '22 by Tommaso Barbugli


The example mentioned is typically used for a different goal: it prevents you from working with different messages that have the same meaning (not the same message). E.g., I have two processes: the first puts URLs into a queue, and the second takes URLs from the queue and fetches them. What happens if the first process puts the same URL into the queue twice (or even more times)?

P.S. For this purpose I use Redis and its SETNX operation (which sets a key only if it does not already exist).
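That dedup step can be sketched in a few lines, assuming redis-py's setnx() signature; FakeRedis here is a hypothetical in-memory stand-in for a real server:

```python
class FakeRedis:
    """Stand-in for redis.Redis; setnx() sets a key only if it is absent."""
    def __init__(self):
        self._data = {}

    def setnx(self, name, value):
        if name in self._data:
            return False
        self._data[name] = value
        return True

def enqueue_url(client, queue, url):
    """Publish a URL only the first time it is seen."""
    if client.setnx("seen:" + url, 1):
        queue.append(url)  # in real code: my_task.delay(url)
        return True
    return False
```

Because SETNX is atomic on the server, two producers racing to enqueue the same URL cannot both win, even across machines.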

answered Oct 19 '22 by Alexey Kachayev