I wrote a small web app using Ruby on Rails. Its main purpose is to upload, store, and display results from XML files (which can be up to several MB each). After running for about two months I noticed that the mongrel process was using about 4GB of memory. I did some research on debugging Ruby memory leaks and could not find much, so I have two questions.
Some tips to find memory leaks in Rails:
The first is a graphical exploration of memory usage by objects in the ObjectSpace.
The last two will help you identify specific usage patterns that are inflating memory usage, and you can work from there.
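If you want a quick, hand-rolled look before reaching for any tooling, you can poke at ObjectSpace yourself from script/console. The sketch below is my own illustration (plain Ruby, no extra gems): it counts live objects per class, so running it before and after a suspect request shows which classes are growing.

# Rough ObjectSpace inspection (a sketch, not a full profiling tool):
# count live objects per class and print the twenty largest counts.
GC.start # collect garbage first so counts reflect objects still referenced
counts = Hash.new(0)
ObjectSpace.each_object { |obj| counts[obj.class] += 1 }
counts.sort_by { |_klass, n| -n }.first(20).each do |klass, n|
  puts "#{n}\t#{klass}"
end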
As for specific coding patterns, in my experience you have to watch anything that deals with file I/O, image processing, working with massive strings, and the like.
I would check whether you are using the most appropriate XML library; REXML is known to be slow and is believed by some to be leaky (I have no proof of that!). Also check whether you can memoize expensive operations.
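Since the uploaded files can be several MB, one option (my suggestion, not something from the original post) is to stream-parse them with Nokogiri's Reader instead of building a full document tree, so only one record is in memory at a time. A minimal sketch, assuming the file contains repeated <result> elements; the filename and element name are hypothetical:

require 'nokogiri'

# Stream the document instead of loading the whole DOM at once.
File.open('results.xml') do |io|            # hypothetical filename
  Nokogiri::XML::Reader(io).each do |node|
    next unless node.node_type == Nokogiri::XML::Reader::TYPE_ELEMENT &&
                node.name == 'result'       # hypothetical element name
    # Re-parse just this small fragment; the rest of the file is never
    # held in memory at the same time.
    fragment = Nokogiri::XML(node.outer_xml).root
    # ... pull out whatever you need from `fragment` here ...
  end
end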
A super simple way to log memory usage before or after each request (Linux only).
# Put this in application_controller.rb
before_filter :log_ram # or use after_filter

def log_ram
  logger.warn 'RAM USAGE: ' + `pmap #{Process.pid} | tail -1`[10,40].strip
end
You might want to load up script/console and try the statement out first to make sure it works on your box.
puts 'RAM USAGE: ' + `pmap #{Process.pid} | tail -1`[10,40].strip
Then just monitor top; when a request makes your memory usage jump, go check the logs. This will, of course, only help if you have a memory leak that occurs in large jumps, not tiny increments.
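If the leak does grow in small increments, you could extend the same idea and only log when resident memory has grown by more than some threshold since the previous request, so steady growth still shows up without flooding the log. This is just a sketch of that variation (the threshold, class variable, and method name are mine), still Linux-only and still based on pmap:

# Also in application_controller.rb
after_filter :log_ram_growth

RAM_JUMP_KB = 10_000   # hypothetical threshold: 10 MB
@@last_ram = nil

def log_ram_growth
  current = `pmap #{Process.pid} | tail -1`[10,40].strip.to_i  # resident KB
  @@last_ram ||= current
  if current - @@last_ram > RAM_JUMP_KB
    # request.request_uri is the Rails 2.x / mongrel-era API
    logger.warn "RAM JUMP: #{@@last_ram}K -> #{current}K on #{request.request_uri}"
  end
  @@last_ram = current
end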