Celery Group task for use in a map/reduce workflow

Tags:

celery

Can I use a Celery Group primitive as the umbrella task in a map/reduce workflow?

Or more specifically: can the subtasks in a Group be run on multiple workers on multiple servers?

From the docs:

However, if you call apply_async on the group it will send a special grouping task, so that the action of calling the tasks happens in a worker instead of the current process

That seems to imply the tasks are all sent to one worker...

Before 3.0 (and still) one could fire off the subtasks in a TaskSet, which would run on multiple servers. The problem is determining whether all tasks have finished executing; that is normally done by polling all subtasks, which is not very elegant (see the sketch below). I am wondering whether the Group primitive can be used to mitigate this problem.
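To make the polling concrete, here is a rough sketch of the pattern I would like to avoid; process is just a placeholder task standing in for the real subtask, not code from my project:

import time

from celery import group

from tasks import process  # placeholder: any task that handles one chunk of the problem

# fire off the subtasks as a group; with several workers consuming the
# queue they can run on multiple servers
result = group(process.s(chunk) for chunk in range(10)).apply_async()

# the inelegant part: poll until every subtask has finished
while not result.ready():
    time.sleep(1)

print(result.get())  # list with the return value of every subtask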

Asked Oct 10 '12 by RickyA

1 Answer

I found out that it is possible to use Chords for such a map/reduce-like problem.

import time

import celery


@celery.task(name='ic.mapper')
def mapper():
    # split your problem into embarrassingly parallel map subtasks (eight here)
    maps = [map.s() for _ in range(8)]
    # put them in a chord that executes them in parallel and, once they have
    # all finished, calls 'reduce' with the collected results
    celery.chord(maps)(reduce.s())
    return "{0} mapper ran on {1}".format(celery.current_task.request.id,
                                          celery.current_task.request.hostname)


@celery.task(name='ic.map')
def map():
    # do something useful here; the sleep just stands in for real work
    time.sleep(10.0)
    return "{0} map ran on {1}".format(celery.current_task.request.id,
                                       celery.current_task.request.hostname)


@celery.task(name='ic.reduce')
def reduce(results):
    # 'results' is the list of return values of all map subtasks;
    # put them together and do something with them here
    return "{0} reduce ran on {1}".format(celery.current_task.request.id,
                                          celery.current_task.request.hostname)

When mapper is executed on a cluster of three workers/servers, it first splits the problem and then creates new map subtasks that are submitted to the broker. These run in parallel because the queue is consumed by all workers. A chord task is also created that polls the map tasks to see whether they have all finished; once they have, the reduce task is executed, and there you can glue your results back together.
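For completeness, kicking the whole thing off is just a normal task call. A rough sketch (not part of the code above):

# start the whole map/reduce workflow by calling the mapper task
async_result = mapper.delay()

# this returns the mapper's own return value; the reduce result lives in
# the chord that the mapper created
print(async_result.get())

If the caller needs the reduce result itself, one option is to have mapper return the id of the AsyncResult that the chord call produces, so the caller can look it up later.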

All in all: yes, it is possible. Thanks for the vegetable, guys!

Answered by RickyA