
Memory leaks on PostgreSQL server after upgrade to Rails 4

We are experiencing a strange problem on a Rails application on Heroku. Just after migrating from Rails 3.2.17 to Rails 4.0.3, our PostgreSQL server shows an unbounded increase in memory usage, then returns the following error on every request:

ERROR: out of memory
DETAIL: Failed on request of size xxx

Just after releasing the application with Rails 4, PostgreSQL memory starts to increase.

As you can see in the screenshot below, it increases from 500 MB to more than 3.5 GB in 3 hours.

[screenshot: PostgreSQL memory usage rising from 500 MB to more than 3.5 GB over 3 hours]

Simultaneously, commits per second more than doubled, rising from 120 commits per second:

[screenshot: commit rate around 120 commits per second]

to 280 commits per second:

[screenshot: commit rate around 280 commits per second]

It is worth noting that when we restart the application, memory goes back down to a normal value of 600 MB before climbing above 3 GB again a few hours later (at which point every SQL request returns the 'out of memory' error). It is as if killing the ActiveRecord connections releases memory on the PostgreSQL server.

We may well have a memory leak somewhere. However:

  • It was working very well with Rails 3.2. Maybe this problem comes from a combination of the changes we made to adapt our code to Rails 4 and Rails 4 itself.
  • The increase in the number of commits per second just after the Rails 4 upgrade seems very odd.

Our stack is:

  • Heroku, x2 dynos
  • PostgreSQL, Ika plan on Heroku
  • Unicorn, 3 workers per instance
  • Rails 4.0.3
  • Redis cache
  • Noteworthy gems: delayed_job (4.0.0), Active Admin (on master branch), Comfortable Mexican Sofa (1.11.2)

Nothing seems really fancy in our code.

Our PostgreSQL config is (see the sketch after this list for reading the values back):

  • work_mem: 100MB
  • shared_buffers: 1464MB
  • max_connections: 500
  • maintenance_work_mem: 64MB
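
A quick way to confirm these values from a Rails console, using PostgreSQL's standard SHOW command through the ActiveRecord connection (a sketch; nothing here is specific to our app):

%w[work_mem shared_buffers max_connections maintenance_work_mem].each do |name|
  # SHOW reports the server's effective value for a configuration parameter
  puts "#{name} = #{ActiveRecord::Base.connection.select_value("SHOW #{name}")}"
end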

Has anyone ever experienced such behaviour when switching to Rails 4? I am also looking for ideas on how to reproduce it.

All help is very welcome.

Thanks in advance.

asked Mar 26 '14 by Antoine Joulie


1 Answer

I don't know which is better: answering my own question or updating it, so I chose to answer. Please let me know if updating would have been better.

We finally found the problem. Since version 3.1, Rails has used prepared statements for simple queries like User.find(id). Version 4.0 extended prepared statements to queries on associations (has_many, belongs_to, has_one). For example, the following code:

class User < ActiveRecord::Base
  has_many :addresses
end

user.addresses

generates the query:

SELECT "addresses".* FROM "addresses" WHERE "addresses"."user_id" = $1  [["user_id", 1]]

The problem is that Rails only binds prepared-statement variables for foreign keys (here user_id). If you use a custom SQL condition such as

user.addresses.where("moved_at < ?", Time.now - 3.months)

it will not bind a variable to the prepared statement for moved_at; the timestamp is inlined as a literal, so a new prepared statement is generated every time the query runs. Rails keeps prepared statements in a pool with a maximum size of 1000 per connection.
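
To see the mechanism in action, here is a sketch (using the hypothetical User/Address models from the example above): to_sql shows that the foreign key stays a $1 bind while the timestamp is inlined as a literal.

user = User.first
scope = user.addresses.where("moved_at < ?", Time.now - 3.months)
puts scope.to_sql
# Prints roughly:
#   SELECT "addresses".* FROM "addresses"
#   WHERE "addresses"."user_id" = $1 AND (moved_at < '2013-12-26 10:03:00.000000')
# Call it again a second later and the literal timestamp differs, so Rails
# prepares (and the connection caches) a brand-new statement for near-identical SQL.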

However, PostgreSQL prepared statements are not shared across connections, so within one or two hours each connection holds 1000 prepared statements, some of them very big. This leads to very high memory consumption on the PostgreSQL server.
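
To confirm the diagnosis, each PostgreSQL connection exposes its own statement cache through the pg_prepared_statements system view. A minimal check from a Rails console (our illustration, not part of the original investigation):

count, bytes = ActiveRecord::Base.connection.select_rows(
  "SELECT count(*), coalesce(sum(length(statement)), 0) FROM pg_prepared_statements"
).first
# pg_prepared_statements only lists statements cached on the current connection
puts "#{count} prepared statements, ~#{bytes} bytes of statement text"

As a possible mitigation (our assumption, not something the original fix prescribes), the Rails 4 PostgreSQL adapter reads two relevant options from the connection configuration, normally set in config/database.yml: prepared_statements, which disables server-side prepared statements entirely, and statement_limit, which caps the per-connection pool below its default of 1000. Written as an explicit connection so the Ruby sketch is self-contained:

ActiveRecord::Base.establish_connection(
  adapter:             "postgresql",
  database:            "myapp_production",  # hypothetical database name
  prepared_statements: false,               # send plain parameterized queries instead
  statement_limit:     200                  # alternative: keep them but shrink the pool
)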

answered by Antoine Joulie