 

Memory bloat when creating many new objects

When I run this and watch the memory consumption of my Ruby process in the OS X Activity Monitor, memory usage increases at about 3 MB/s.

Removing the transaction roughly halves the memory consumption, but the footprint still keeps growing. This is a problem on my production app, where Heroku kills the process because of its memory consumption.

Is there a way of doing the below that won't keep increasing memory? If I comment out the save! line then it's fine, but of course that isn't a solution.

ActiveRecord::Base.transaction do
  10000000.times do |time|
    puts "---- #{time} ----"
    a = Activity.new(:name => "#{time} Activity")
    a.save!(:validate => false)
    a = nil
  end
end

I am running this using delayed_job.

Morgz asked May 05 '12

1 Answer

The a = nil line is unnecessary; you can remove it.

You're creating a lot of objects on every iteration of the loop - two strings, two hashes, and an Activity object - so I'm not surprised you're seeing high memory usage, especially as you're looping 10 million times. There doesn't appear to be a more memory-efficient way to write this code.

The only way I can think of to reduce memory usage is to manually start the garbage collector every x iterations. Chances are Ruby's GC isn't being aggressive enough. You don't, however, want to invoke it on every iteration, as that will radically slow your code down. Every 100 iterations might be a reasonable starting point; you'll have to profile and test to see what is most effective.
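
For illustration, a minimal sketch of that approach applied to your loop (the 100-iteration interval is just an arbitrary starting point to tune, not a measured value):

ActiveRecord::Base.transaction do
  10000000.times do |time|
    puts "---- #{time} ----"
    a = Activity.new(:name => "#{time} Activity")
    a.save!(:validate => false)

    # Periodically force a GC pass so the throwaway strings, hashes and
    # ActiveRecord objects created on each iteration get reclaimed sooner.
    GC.start if time % 100 == 0
  end
end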

The documentation for the GC is here.

Matty answered Sep 21 '22