I am pretty new to Celery and I thought I had read somewhere that task results only stay around for a limited time. However, my backend (Redis) is getting pretty bloated after running a lot of tasks through it.
Is there a way to set a TTL on task results, or is this something I need to purge manually (and if so, how)?
celery beat only triggers those 1000 tasks (per the crontab schedule); it does not run them. If you want 1000 tasks to run in parallel, you need enough Celery workers available to execute them.
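A minimal sketch of that split, assuming a hypothetical project module `proj` and a made-up task `proj.tasks.process_item` (adjust both to your app):

```python
from celery import Celery
from celery.schedules import crontab

app = Celery("proj", broker="redis://localhost:6379/0")

# beat only *enqueues* this task on the crontab schedule;
# the workers you start are what actually execute it.
app.conf.beat_schedule = {
    "enqueue-every-minute": {
        "task": "proj.tasks.process_item",
        "schedule": crontab(minute="*"),
    },
}
```

You would run `celery -A proj beat` alongside one or more workers (e.g. `celery -A proj worker --concurrency=8`); to actually execute 1000 tasks at once, the combined concurrency across all your workers has to reach 1000.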
The process of task execution by Celery can be broken down into three steps: your application sends the task to the broker, a worker then reserves the task and executes it, and finally the result of the execution is stored in the result backend.
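A minimal sketch of those three steps, assuming Redis serves as both broker and result backend (the URLs and the `add` task are placeholders):

```python
from celery import Celery

app = Celery(
    "proj",
    broker="redis://localhost:6379/0",   # 1. tasks are sent here
    backend="redis://localhost:6379/1",  # 3. results are stored here
)

@app.task
def add(x, y):
    # 2. a worker reserves the message from the broker and runs this
    return x + y

# .delay() returns an AsyncResult handle immediately;
# .get() waits for the worker and reads the value from the result backend.
result = add.delay(2, 3)
print(result.get(timeout=10))  # 5
```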
Celery will stop retrying after 7 failed attempts (the task's max_retries limit; the default is 3) and raise an exception.
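As a hedged illustration (the `fetch` task, the URL handling, and the 5-second delay are made up, not from the original question), a task capped at 7 retries might look like:

```python
import urllib.request

from celery import Celery

app = Celery("proj", broker="redis://localhost:6379/0")

@app.task(bind=True, max_retries=7, default_retry_delay=5)
def fetch(self, url):
    try:
        return urllib.request.urlopen(url).read()
    except OSError as exc:
        # self.retry() re-queues the task; once max_retries is exceeded
        # it re-raises exc (or MaxRetriesExceededError if none was given).
        raise self.retry(exc=exc)
```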
According to the Celery documentation, you can completely ignore all results using CELERY_IGNORE_RESULT.

You can also expire results after a set amount of time using CELERY_RESULT_EXPIRES, which defaults to 1 day. The notes say this should just work with the Redis backend, whereas some of the other backends require celery beat to be running.

There is also the CELERY_MAX_CACHED_RESULTS setting, which caches up to 5,000 results by default.
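Putting those together, a minimal sketch; note this uses the modern lowercase setting names from Celery 4+, which (to the best of my knowledge) correspond to the uppercase names above as task_ignore_result, result_expires, and result_cache_max:

```python
from celery import Celery

app = Celery(
    "proj",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/1",
)

app.conf.update(
    task_ignore_result=False,  # True discards all results outright
    result_expires=3600,       # TTL in seconds for stored results
    result_cache_max=1000,     # client-side cache of ready results
)
```

With the Redis backend the expiry becomes a native key TTL, so stale results disappear on their own; backends without native expiry rely on celery beat running the periodic celery.backend_cleanup task to delete them.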