Say I have a celery worker that depends on a large module, X.
Since task definitions require a reference to the worker app (e.g., @app.task), this implies that my "client" (the code scheduling the task) also needs to depend on this module.
This doesn't make sense to me -- have I got this wrong?
A). I don't want my task caller to have these dependencies (e.g., they might be in different docker containers).
B). For security reasons I don't want my task caller to have access to this code.
Is there a way around it?
Thanks,
RB
We have built-in support for JSON, YAML, pickle, and msgpack. Every task is associated with a content type, so you can even send one task using pickle and another using JSON. The default serializer used to be pickle, but since Celery 4.0 the default is JSON.
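One practical consequence of the JSON default, shown here as a stdlib-only sketch (no Celery involved): with JSON, task arguments must be JSON-serializable, whereas pickle accepts almost any Python object.

```python
import json
import pickle
from datetime import datetime

# A datetime argument survives a pickle round trip...
args = (datetime(2024, 1, 1), "report")
assert pickle.loads(pickle.dumps(args)) == args

# ...but JSON (Celery's default serializer since 4.0) rejects it.
try:
    json.dumps(args)
    json_ok = True
except TypeError:
    json_ok = False

# With the JSON serializer you would pass an ISO string instead.
iso_args = (datetime(2024, 1, 1).isoformat(), "report")
print(json_ok, json.dumps(iso_args))
# → False ["2024-01-01T00:00:00", "report"]
```

This is also why pickle was demoted as the default: it executes arbitrary code on load, which matters if the broker is not fully trusted.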
Celery workers are worker processes that run tasks independently from one another and outside the context of your main service. Celery beat is a scheduler that orchestrates when to run tasks. You can use it to schedule periodic tasks as well.
Not only CAN Celery run more than one worker, that is in fact the whole point and the reason Celery exists: its job is to manage multiple workers, conceivably distributed across machines.
Celery beat only triggers those 1000 tasks (per the crontab schedule); it does not run them. If you want to run 1000 tasks in parallel, you need enough Celery workers available to run those tasks.
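For illustration, a minimal beat schedule looks like the configuration fragment below (the broker URL and task name are assumptions, not from the question). Beat publishes one message per entry at the scheduled time; whichever workers are running consume and execute them:

```python
from celery import Celery
from celery.schedules import crontab

# Hypothetical app and broker URL for illustration only.
app = Celery("app", broker="redis://localhost:6379/0")

app.conf.beat_schedule = {
    # At minute 0 of every hour, beat publishes this task to the
    # broker; it does not execute anything itself.
    "hourly-report": {
        "task": "app.tasks.generate_report",  # assumed task name
        "schedule": crontab(minute=0),
    },
}
```

Workers are started separately (e.g. celery -A app worker --concurrency=8) and scale independently of the schedule, so parallelism comes from how many worker processes you run, not from beat.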
Your client code can start tasks remotely without having to import the implementation of the tasks. You obviously must configure the client to connect to the same broker as the workers, but once that is done, you can use signatures to invoke the tasks:
import celery
# Invoke the task by name; no import of the task module is needed.
result = celery.signature("app.tasks.foo", args=(1, )).delay().get()
The first parameter to celery.signature is the name of the task. It is typically the absolute name of the module that contains the task (e.g. app.tasks in the code above) plus the task name (foo).
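An alternative with the same effect is app.send_task on a minimal client-side app that knows only the broker, which addresses both concern A (separate containers) and concern B (no access to the task code). This is a sketch, not a runnable snippet: the broker URL and task name are assumptions, and it needs a live broker and a worker to actually complete.

```python
from celery import Celery

# Client-side app: only broker/backend configuration, no task code
# imported. Hypothetical URLs for illustration.
client = Celery(broker="amqp://localhost", backend="rpc://")

# Dispatch by name; whichever worker registered app.tasks.foo runs it.
result = client.send_task("app.tasks.foo", args=(1,))
print(result.get(timeout=10))
```

Either way, the only things the caller and the workers must share are the broker connection and the task names, not the module X.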