 

Django application memory usage

I am running a Django application (built on Django REST Framework) on a DigitalOcean server with the following characteristics:

  • 4 GB RAM
  • 2 CPUs
  • 60 GB drive

I am using Gunicorn to run the Django app and Celery to manage the task queue. The database is MySQL.

As far as I can see, CPU usage is really low, but memory usage seems to be quite large.

[screenshots: CPU and memory usage graphs]

After deploying I noticed that the python3 process uses even more memory (around 75%). Whenever I deploy I run an after_deploy script, which contains the following:

# restart the web stack
service nginx restart
service gunicorn restart
chmod +x /mnt/myapplication/current/myapplication/setup/restart.sh
# activate the virtualenv and update dependencies
source /mnt/env/bin/activate
cd /mnt/myapplication/current/
pip3 install -r requirements.txt
python3 manage.py migrate --noinput >> /mnt/migrations/migrations.log
# restart celery beat and the worker
rm -f celerybeat.pid
rm -f celeryd.pid
celery -A myapplication beat -l info -f /var/log/celery/celery.log --detach
celery -A myapplication worker -l info -f /var/log/celery/celery.log --detach

Are these numbers expected? And if not, how can I investigate what is going wrong?

Bob asked Dec 15 '17 14:12


1 Answer

Python processes tend to retain allocated memory, so if one of your Python processes allocates a lot of memory for a given operation (a Django view, a Celery task...), it will indeed keep it for as long as it's running.

As long as memory usage stays mostly stable (I mean: it grows to a certain amount after process startup, then stays at that amount) and your server doesn't swap, there's usually nothing to worry about, as the processes will keep reusing the already allocated memory.
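This retention behaviour is easy to observe with the standard library alone. A minimal sketch (the exact numbers vary by platform; `ru_maxrss` reports the process high-water mark, which never drops even after the objects are freed):

```python
import resource

def peak_rss():
    # ru_maxrss is the peak resident set size of this process
    # (reported in KiB on Linux, bytes on macOS)
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss()
data = [0] * 10_000_000          # allocate a large list (~80 MB of pointers)
after_alloc = peak_rss()
del data                         # free the list...
after_free = peak_rss()

# ...but the high-water mark stays: the memory went back to Python's
# allocator pools, not necessarily back to the OS
print(before, after_alloc, after_free)
```

This is why a one-off expensive view or task can leave a worker's resident memory permanently elevated without being an actual leak.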

Now if you find that memory use keeps growing and growing, you may indeed have a memory leak somewhere.

Beware that running Celery - or Django FWIW - with `settings.DEBUG` will cause memory leaks - but you should never run your production processes with the `settings.DEBUG` flag set anyway, as this is also a security issue.
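One concrete mechanism behind this: with `DEBUG` enabled, Django appends every executed SQL query to `django.db.connection.queries`, which grows without bound in a long-running process such as a Celery worker. A minimal production settings sketch (the host name is a placeholder, not something from the original post):

```python
# settings.py (production) -- minimal sketch
# With DEBUG = True, Django records every SQL query in memory
# (django.db.connection.queries), which leaks in long-running workers.
DEBUG = False
ALLOWED_HOSTS = ["example.com"]  # placeholder; required when DEBUG is False
```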

If that's not your case, then you can start searching here and elsewhere on the net for "debugging python memory leak". You may find a good starting point here:

It’s not so easy for a Python application to leak memory. Usually there are three scenarios:

  • some low level C library is leaking
  • your Python code has global lists or dicts that grow over time, and you forgot to remove the objects after use
  • there are some reference cycles in your app
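The second scenario (an ever-growing global) is the easiest to hunt with the standard library's `tracemalloc` module. A minimal sketch, where `handle_request` and `cache` are made up for illustration:

```python
import tracemalloc

cache = []  # simulates a global that is never cleared

def handle_request(payload):
    cache.append(payload)  # bug: grows forever

tracemalloc.start()
snap1 = tracemalloc.take_snapshot()

for i in range(10_000):
    handle_request(str(i) * 50)  # each call leaks a new string

snap2 = tracemalloc.take_snapshot()
# Compare snapshots: the largest positive size_diff points at the
# source line where the leaked objects are being allocated.
top = snap2.compare_to(snap1, "lineno")[0]
print(top)
```

In a real worker you would take snapshots minutes apart while it serves traffic, then inspect the top few diffs.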

and here:

For celery in particular, you can roll the celery worker processes regularly. This is exactly what the CELERYD_MAX_TASKS_PER_CHILD setting does.
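A sketch of that setting in a Celery config module (note: since Celery 4 the lowercase name `worker_max_tasks_per_child` is the preferred spelling of `CELERYD_MAX_TASKS_PER_CHILD`, and the value 100 below is an arbitrary example, not a recommendation):

```python
# celeryconfig.py -- minimal sketch
# Recycle each worker process after it has executed this many tasks,
# so any memory it accumulated is released to the OS when it exits.
worker_max_tasks_per_child = 100  # arbitrary example; tune for your workload
```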

bruno desthuilliers answered Sep 21 '22 05:09