 

How to use the @shared_task decorator for class based tasks

Tags:

python

celery

As seen in the documentation, the @shared_task decorator lets you create tasks without having any concrete app instance. The given examples show how to decorate a function-based task.

How do you decorate a class-based task?

Juan Riaza asked Jan 20 '14


People also ask

How do I register a task on celery?

Registering a task adds it to the task registry. The task will be automatically instantiated if it is not already an instance, and its name must be configured prior to registration. Tasks can also be unregistered by name.

What is a celery shared task?

The "shared_task" decorator allows creation of Celery tasks for reusable apps as it doesn't need the instance of the Celery app. It is also easier way to define a task as you don't need to import the Celery app instance.

What is task decorator in Python?

In Python, decorators are functions that take another function as an argument and extend the behavior of that function. In the context of Airflow, decorators provide a simpler, cleaner way to define your tasks and DAG.
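The general decorator pattern described above can be shown with a minimal plain-Python example (the `log_calls` decorator here is illustrative, not part of Celery or Airflow):

```python
# A minimal decorator: log_calls takes a function and returns a
# wrapper that extends its behavior (logging the call) without
# modifying the function itself -- the same pattern that task
# decorators build on.
def log_calls(fn):
    def wrapper(*args, **kwargs):
        print(f"calling {fn.__name__}")
        return fn(*args, **kwargs)
    return wrapper

@log_calls
def add(a, b):
    return a + b

result = add(1, 2)  # prints "calling add", returns 3
```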

How does celery retry work?

For example, a task can retry after a 5 second delay (via countdown) and allow a maximum of 7 retry attempts (via max_retries). Celery will stop retrying after 7 failed attempts and raise the exception.
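The countdown/max_retries behaviour can be sketched in plain Python. This is only an illustration of the semantics, not Celery's actual API; in a real Celery task you would use a bound task and call self.retry with these options:

```python
import time

# Sketch of Celery-style retry semantics: wait `countdown` seconds
# between attempts; after `max_retries` failed retries, re-raise the
# last exception. `sleep` is injectable so the behaviour is testable.
def run_with_retries(func, max_retries=7, countdown=5, sleep=time.sleep):
    retries = 0
    while True:
        try:
            return func()
        except Exception:
            if retries >= max_retries:
                raise
            retries += 1
            sleep(countdown)
```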


1 Answer

Quoting Ask Solem from the celery-users thread where he explained the difference between @task and @shared_task. Here is a link to the thread.

TL;DR: @shared_task creates an independent instance of the task for each app, making the task reusable.

There is a difference between @task(shared=True) and @shared_task

The task decorator will share tasks between apps by default so that if you do:

app1 = Celery()

@app1.task
def test():
    pass

app2 = Celery()

the test task will be registered in both apps:

assert app1.tasks[test.name]
assert app2.tasks[test.name]

However, the name ‘test’ will always refer to the instance bound to the ‘app1’ app, so it will be configured using app1’s configuration:

assert test.app is app1  

The @shared_task decorator returns a proxy that always uses the task instance in the current_app:

app1 = Celery()

@shared_task
def test():
    pass

assert test.app is app1

app2 = Celery()
assert test.app is app2

This makes the @shared_task decorator useful for libraries and reusable apps, since they will not have access to the app of the user.
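The proxy behaviour can be mimicked in a few lines of plain Python. This is a toy sketch, not Celery's internals: the App and TaskProxy classes here are hypothetical stand-ins that only illustrate how an attribute can resolve against whichever app is "current" at access time:

```python
# Toy sketch of a shared_task-style proxy: its .app attribute always
# resolves to whichever app is "current", so the same task object
# works with any app instead of being bound to one at definition time.
current = {"app": None}

class App:
    def __init__(self, name):
        self.name = name
        current["app"] = self  # creating an app makes it current

class TaskProxy:
    @property
    def app(self):
        return current["app"]  # resolved on every access, not once

test = TaskProxy()

app1 = App("app1")
assert test.app is app1

app2 = App("app2")
assert test.app is app2  # same proxy, now resolves to app2
```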

In addition, the default Django example project defines the app instance as part of the Django project:

from proj.celery import app

and it makes no sense for a Django reusable app to depend on the project module, as then it would not be reusable anymore.

Saurabh answered Sep 23 '22