I am developing a Flask application that uses blueprints, and I plan to use Celery task queues. I am trying to understand the benefit of (or reason for) using something like
from celery import Celery

def make_celery(app):
    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            # run every task inside the Flask application context
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery
and then doing
celery = make_celery(app)
and importing it into my tasks.py, versus just importing Celery and creating an instance directly in my tasks.py, like
from celery import Celery

app = Celery('hello', broker='amqp://guest@localhost//')

@app.task
def mytask():
    ...
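(For comparison with the first approach, here is a rough sketch of what importing the make_celery instance into tasks.py could look like; the app module name and the task body are assumptions, not something from the question:)

# tasks.py -- assumes the Flask module that calls make_celery(app) is importable as app
from app import celery  # the instance returned by make_celery(app)

@celery.task()
def process_something(record_id):
    # because of ContextTask, this body runs inside app.app_context(),
    # so current_app.config and Flask extensions are available here
    ...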
If you are writing a simple task, it is better to just import Celery and decorate your function.

If you are creating more complex tasks, it is better to subclass Task. That gives you the power of OOP: you can break your code into small blocks, which makes it easier to unit test. Also, if you want some custom configuration applied to all of your tasks, you can define a custom base class and have every task inherit from it.
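As a rough sketch of that last point (the hook methods, broker URL, and names here are illustrative, not required by Celery), a shared base class could look like:

from celery import Celery, Task

app = Celery('hello', broker='amqp://guest@localhost//')

class BaseTask(Task):
    """Shared behaviour inherited by every task that uses this base."""

    def on_failure(self, exc, task_id, args, kwargs, einfo):
        # one central place for error handling / alerting
        print(f'task {task_id} failed: {exc!r}')

    def on_success(self, retval, task_id, args, kwargs):
        print(f'task {task_id} finished with {retval!r}')

@app.task(base=BaseTask, bind=True)
def mytask(self):
    # self is the task instance, so anything defined on BaseTask is available
    return 'hello'

Every task created with base=BaseTask then picks up the same failure/success handling without repeating it in each function.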