 

Django Performance / Memory usage [closed]

I am running an alpha version of my app on an EC2 Small instance (1.7 GB RAM) with PostgreSQL and Apache (mod_wsgi running in embedded mode, not as a daemon) on it.

Performance is alright, but it could be better. I am also worried about memory usage if too many test users join.

Is it wise to switch from Apache to nginx? Has any Django developer made that switch and been happier with the results? Any other tips along the way are also welcome.

Thanks

asked Oct 28 '12 by Houman

People also ask

Why is Django leaking memory?

Memory leaks in Python typically happen in module-level variables that grow unbounded. This might be an lru_cache with an infinite maxsize, or a simple list accidentally declared in the wrong scope. Leaks don't need to happen in your own code to affect you either.
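For illustration, a cache declared like the following (the function name is hypothetical) grows for the life of the process, because entries are never evicted:

    from functools import lru_cache

    @lru_cache(maxsize=None)  # unbounded: every distinct key stays cached forever
    def render_fragment(key):
        ...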

What happens when Python runs out of memory?

Crashing is just one symptom of running out of memory. Your process might instead just run very slowly, your computer or VM might freeze, or your process might get silently killed. Sometimes if you're lucky you might even get a nice traceback, but then again, you might not.

How do I increase the memory limit in Python?

Python doesn't limit the memory usage of your program. It will allocate as much memory as your program needs until your computer is out of memory. The most you can do is reduce the limit to a fixed upper cap. That can be done with the resource module, but it isn't what you're looking for.
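As a rough sketch of that cap (Unix only, standard-library resource module; the 1.5 GB figure is just an illustrative number), limiting the address space makes allocations beyond the cap raise MemoryError instead of exhausting the machine:

    import resource

    # Cap this process's virtual address space at roughly 1.5 GB (illustrative value)
    limit_bytes = 1500 * 1024 * 1024
    resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, limit_bytes))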


2 Answers

We are using nginx together with our Django app in a gunicorn server. The performance is quite good so far, but I have not done any direct comparisons with an Apache setup. Memory usage is quite small: nginx takes about 10 MB of memory and gunicorn about 150 MB (but it also serves more than one app). Of course this may vary from app to app.

I would suggest simply giving it a try; it should be quite easy to set up by following some tutorials on the web and/or on the gunicorn website. Also put together a comparable test case and use some kind of monitoring software like Munin to see changes over time.
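If you want to try that route, a minimal setup looks roughly like the sketch below. The port, worker count and project name (myproject) are assumptions for illustration, not values from this question:

    # start gunicorn on a local port (3 workers is just a starting point)
    gunicorn --workers 3 --bind 127.0.0.1:8000 myproject.wsgi:application

    # nginx server block proxying to gunicorn
    server {
        listen 80;
        server_name example.com;

        location / {
            proxy_pass http://127.0.0.1:8000;
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }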

answered Oct 05 '22 by j0nes


Why aren't you using daemon mode of mod_wsgi? If you are using embedded mode you are setting yourself up for memory issues if you aren't careful with how you set up Apache.
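For reference, switching to daemon mode is just a couple of directives in the Apache virtual host; the process/thread counts and paths below are illustrative, not a recommendation for this particular app:

    # Run the Django app in its own pool of mod_wsgi daemon processes
    WSGIDaemonProcess myapp processes=2 threads=15 display-name=%{GROUP}
    WSGIProcessGroup myapp
    WSGIScriptAlias / /srv/myapp/myproject/wsgi.py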

Go have a read of:

http://blog.dscpl.com.au/2012/10/why-are-you-using-embedded-mode-of.html

and also watch my PyCon talk at:

http://lanyrd.com/2012/pycon/spcdg/

Also amend your question and indicate which Apache MPM you are using and what the MPM settings are.
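For example, a stock prefork MPM configuration looks something like the following (illustrative numbers); in embedded mode, every one of those Apache child processes can end up holding a full copy of the Python interpreter and your application, which is what blows out memory on a 1.7 GB instance:

    <IfModule mpm_prefork_module>
        StartServers          5
        MinSpareServers       5
        MaxSpareServers      10
        MaxClients          150
        MaxRequestsPerChild   0
    </IfModule>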

As for alternatives such as gunicorn or uWSGI: for a comparable configuration, the memory requirements aren't going to be much different, because it isn't the underlying server that dictates how much memory is used, it is your specific Python web application running on top of it. It is a common misconception that gunicorn or uWSGI somehow magically solves all the problems and that Apache can't do as well. Set Apache up properly for a Python web application, rather than relying on its defaults, and it is just as capable as the other solutions and can provide a lot more flexibility depending on your requirements.

I very much suggest you put some monitoring in place to work out what the real issues and bottlenecks are.
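Even a one-off snapshot of per-process resident memory is useful before reaching for something like Munin; the command below assumes the Debian/Ubuntu process names apache2 and gunicorn:

    # show resident memory (RSS, in KB) of Apache and gunicorn processes, largest first
    ps -C apache2,gunicorn -o pid,rss,cmd --sort=-rss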

answered Oct 05 '22 by Graham Dumpleton