
Celery will not flush memory when task has ended

I have Celery running with django-celery, and I cannot figure out why the Celery worker keeps holding on to memory after a task has finished.

Is there a parameter that can release the memory when the particular task has ended?

I am running Celery 3.1.16 with RabbitMQ as the broker.

From ps aux:

1000      6411  0.3  3.8 117288 39564 pts/0    S+   12:44   0:00 /usr/bin/python /usr/local/bin/celery -A Website3 worker -l info
1000      6454  6.7  6.1 143660 62760 pts/0    S+   12:44   0:12 /usr/bin/python /usr/local/bin/celery -A Website3 worker -l info

The memory usage stays like this until I kill the worker and restart it.

Is there a parameter that can be set to release the memory once a task completes?

Asked May 10 '26 06:05 by JavaCake


1 Answer

There isn't a parameter that frees memory after each individual task. However, you can use the max-tasks-per-child setting. This causes each worker pool process to be replaced with a fresh one after it has executed X tasks, which returns its accumulated memory to the operating system.
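In Celery 3.1 this is exposed as the `--maxtasksperchild` worker option (or, with django-celery, the `CELERYD_MAX_TASKS_PER_CHILD` setting). A minimal sketch, reusing the worker invocation from the question; the value 10 is an arbitrary example:

```shell
# Recycle each pool process after it has run 10 tasks,
# releasing the memory it accumulated back to the OS.
celery -A Website3 worker -l info --maxtasksperchild=10
```

Equivalently, with django-celery you could put `CELERYD_MAX_TASKS_PER_CHILD = 10` in your Django settings. Pick a value high enough that the cost of forking replacement processes stays negligible relative to task runtime.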

Answered May 11 '26 22:05 by schillingt


