Hey :)
So I encountered a bug in my code, and while trying to solve it I wrote this little Celery task:
from celery import shared_task

@shared_task(bind=True, name='sometask', autoretry_for=(Exception,), default_retry_delay=1)
def sometask(self, items, *args, **kwargs):
    print(self.max_retries)
    raise Exception
and I tried to override the retries by invoking set() on the signature:
s = sometask.s(items=[]).set(max_retries=200, countdown=1)
s()
When I checked the output, I could see that what was printed was 3 (the default max retries in Celery), not my 200.
Can anyone tell me what it is that I'm doing wrong?
I need to override the max retries, and that seemed to be my only option, but it doesn't work as I expected.
Thanks!
Although the docs say that retry_policy is a valid option to pass to apply_async, it appears to be the retry policy for publishing the task message to the broker, not for the task itself to retry.
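To illustrate the distinction, here is a minimal sketch of what retry_policy actually controls: how hard apply_async tries to deliver the message to the broker if the connection fails (the keys below are Celery's message-sending retry options):

sometask.apply_async(
    kwargs={'items': []},
    retry=True,  # retry publishing if the broker connection drops
    retry_policy={
        'max_retries': 3,      # give up publishing after 3 attempts
        'interval_start': 0,   # first retry immediately
        'interval_step': 0.2,  # add 0.2s per subsequent attempt
        'interval_max': 0.2,   # cap the wait between attempts
    },
)

None of this affects how many times the worker re-runs the task after it raises.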
In addition, max_retries for the task does not appear to be mutable at runtime. However, it can be set in the decorator: @shared_task(bind=True, name='sometask', autoretry_for=(Exception,), default_retry_delay=1, max_retries=200).
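Putting that together, a minimal sketch of the question's task with the limit moved into the decorator (same body as above; it should now print 200 instead of 3):

from celery import shared_task

@shared_task(bind=True, name='sometask', autoretry_for=(Exception,),
             default_retry_delay=1, max_retries=200)
def sometask(self, items, *args, **kwargs):
    print(self.max_retries)  # prints 200, not the default 3
    raise Exception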
I came upon this when trying to change max_retries at run time, which doesn't seem possible, this issue notwithstanding. One workaround:
from celery import shared_task

@shared_task(bind=True, max_retries=200)
def sometask(self, items, *args, **kwargs):
    print(self.max_retries)
    raise Exception

# only want to retry the normal 3 times here
sometask.apply_async(kwargs={'items': []}, retries=sometask.max_retries - 3)
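If I'm reading the workaround right, retries pre-seeds the task's retry counter (what the worker sees as self.request.retries), so with max_retries=200 starting the counter at 197 leaves exactly 3 retries before Celery raises MaxRetriesExceededError. It only applies to that one call, which is the point: the decorator sets the ceiling, and each apply_async can spend less of it.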