I'm using Celery in Python to run background tasks, and I couldn't find a definitive answer to this question: can I split a Celery task's definition from its implementation?
For example, take the really simple task below:
from celery import Celery

celery_app = Celery('myapp')

@celery_app.task
def add_numbers(num1, num2):
    return num1 + num2
The definition and implementation are in the same file, i.e. when the caller imports this module to call add_numbers, both the definition and the implementation are imported.
In this case that's not so bad. But my tasks are more complex, importing multiple modules and packages that the caller certainly doesn't need and that I'd like to keep out of the caller's code.
So, does Celery provide a way to do this? Or am I going against the framework? Is this even a problem?
I have seen the question Celery dynamic tasks / hiding Celery implementation behind an interface, but it is well over two years old - more than enough time for a lot to change.
Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operations but supports scheduling as well. The execution units, called tasks, are executed concurrently on one or more worker servers using multiprocessing, Eventlet, or gevent.
So Celery (and other queue frameworks) has other benefits as well - think of it as a 'task/function manager' rather than just a way of multithreading.
Note that Celery Beat only triggers tasks according to the crontab schedule; it does not run them itself. If you want to run, say, 1000 tasks in parallel, you need enough Celery workers available to execute them.
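For illustration, here is a minimal sketch of a beat schedule, assuming the celery_app instance from the example above (the schedule entry name 'add-every-minute' is a placeholder):

from celery.schedules import crontab

# Beat only enqueues this task once a minute; the separately running
# workers are what actually execute it.
celery_app.conf.beat_schedule = {
    'add-every-minute': {
        'task': 'myapp.add_numbers',  # task referenced by name
        'schedule': crontab(),        # bare crontab() means every minute
        'args': (1, 2),
    },
}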
There's a feature called signatures which allows calling tasks by name, without importing them. You only need the Celery app instance to be available:
# Build a signature by task name; the task module is never imported here
sig = celery_app.signature('myapp.add_numbers', args=(1, 2))
result = sig.delay()  # queue the task; returns an AsyncResult
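Celery also offers send_task, which does the same job in one call; as above, the task is addressed purely by its registered name (reusing the hypothetical myapp.add_numbers from the example):

# Send the task by name; the caller never imports the implementing module.
result = celery_app.send_task('myapp.add_numbers', args=(1, 2))
# Waiting on the result requires a result backend to be configured.
print(result.get(timeout=10))

Either way, the heavy imports live only in the worker's code; the caller just needs an app instance configured with the broker URL.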