Share a common utility function between Celery tasks

I've got a bunch of tasks in Celery, all connected in a canvas chain.

@shared_task(bind=True)
def my_task_A(self):
    try:
        logger.debug('running task A')
        ...  # do something
    except Exception:
        ...  # run common cleanup function

@shared_task(bind=True)
def my_task_B(self):
    try:
        logger.debug('running task B')
        ...  # do something else
    except Exception:
        ...  # run common cleanup function

...

So far so good. The problem is that I'm looking for the best practice when it comes to using a common utility function like this:

def cleanup_and_notify_user(task_data):
    logger.debug('task failed')
    ...  # send email, delete folders, ...

What's the best way to do that without the tasks blocking? For example, can I just replace run common cleanup function with a call to cleanup_and_notify_user(task_data)? And what would happen if multiple tasks from multiple workers attempted to call that function at the same time?
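To make the question concrete, here is roughly what I have in mind, stripped of the Celery decorators so it stands alone (the RuntimeError and the cleanup_log list are just stand-ins for the real work and side effects):

```python
import logging

logger = logging.getLogger(__name__)
cleanup_log = []  # stand-in for the real side effects (email, deleted folders)

def cleanup_and_notify_user(task_data):
    logger.debug('task failed')
    cleanup_log.append(task_data)  # placeholder for: send email, delete folders, ...

def my_task_A(task_data):
    try:
        logger.debug('running task A')
        raise RuntimeError('simulated failure')  # stands in for "do something"
    except Exception:
        cleanup_and_notify_user(task_data)  # plain synchronous call
        raise  # re-raise so Celery would mark the task failed and break the chain
```

Is a plain synchronous call like this acceptable, or should the cleanup itself be a task?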

Does each worker get its own copy? I'm apparently a bit confused about a few of the concepts here. Any help is much appreciated.

Thank you all in advance.

stratis asked Nov 01 '22

1 Answer

Inside a Celery task you are just writing ordinary Python. Each task executes in its own worker process, and the function is simply called within that process, like any other Python function; workers do not share in-memory state, so each call gets its own local variables. Of course, if this cleanup function touches shared external resources, such as system folders or database rows, you do have a concurrent-access problem, but that has to be solved at the resource level, not in Celery. In the filesystem case, for example, you could create a sandbox directory for each task. Hope this helps.
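To illustrate the sandbox idea (the helper name run_in_sandbox is my own, not a Celery API): give each task invocation a private temporary directory, so that deleting it during cleanup can never race with another worker.

```python
import shutil
import tempfile
from pathlib import Path

def run_in_sandbox(task_fn, task_data):
    # Each invocation gets its own scratch directory; removing it cannot
    # collide with other tasks, because no other task knows the path.
    sandbox = Path(tempfile.mkdtemp(prefix='celery-task-'))
    try:
        return task_fn(task_data, sandbox)
    finally:
        shutil.rmtree(sandbox, ignore_errors=True)  # per-task cleanup
```

Inside a Celery task body you would call run_in_sandbox(do_work, task_data) and have do_work write only under the sandbox path it receives.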

Mauro Rocco answered Nov 08 '22