Ruby On Rails is slow...?

I'm writing a web application to monitor a furniture factory's production flow. It has thousands of records to handle. So far I've been running RoR on Mongrel + MySQL, and it's really, really slow (2-4 minutes for some views). Looking at the RoR logs, the database queries themselves don't seem slow (0-10 ms each).

Is RoR slow when it converts database rows to objects? Is Mongrel slow?

Edit: First of all, I was in the development environment. In the production environment, the slowest view takes 2 minutes (which would probably drop under 1 minute on a good computer; mine is 5 years old). Using ruby-prof and a bit of common sense, I found out which methods were slowing the application down. The problem is that single SQL queries are called in a loop over large datasets:

ofs = Ofkb.find_by_sql ["..some large SQL query..."]

ofs.each do |of|  # about 700-1000 elements
  ops = Operation.find(..the single query..)  # one query per iteration
  # etc.
end

Here are ruby-prof results on those methods:

 %self     total     self     wait    child    calls  name
 32.19     97.91    97.91     0.00     0.00       55  IO#gets (ruby_runtime:0)
 28.31     86.39    86.08     0.00     0.32    32128  Mysql#query (ruby_runtime:0)
  6.14     18.66    18.66     0.00     0.00    12432  IO#write (ruby_runtime:0)
  0.80      2.53     2.42     0.00     0.11    32122  Mysql::Result#each_hash (ruby_runtime:0)

The problem is that I can't really avoid those single queries: I've got thousands of events from which I have to compute complex data. Right now I'm using memcached on those methods, which is fine unless you're the first to request the page.
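One common way out of this pattern (a sketch only, not tested against the real schema; the `ofkb_id` foreign-key name is an assumption) is to fetch all the child rows in a single query and index them in memory, instead of querying inside the loop. With Rails 2.x-era ActiveRecord the batch fetch would look roughly like `Operation.find(:all, :conditions => ["ofkb_id IN (?)", ofs.map(&:id)])`; the in-memory grouping step is shown below with plain hashes so it runs anywhere:

```ruby
# Simulated result rows standing in for Ofkb and Operation records.
ofs = [{ :id => 1 }, { :id => 2 }]
ops = [
  { :id => 10, :ofkb_id => 1 },
  { :id => 11, :ofkb_id => 1 },
  { :id => 12, :ofkb_id => 2 }
]

# One pass over the single batched result set, no per-row queries.
ops_by_of = ops.group_by { |op| op[:ofkb_id] }

ofs.each do |of|
  (ops_by_of[of[:id]] || []).each do |op|
    # ...compute with op, exactly as in the original loop...
  end
end
```

This turns 700-1000 `Operation.find` round-trips into one query plus a hash lookup per iteration.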

Giann, asked Feb 19 '09


5 Answers

I'll agree with everyone else. You have to profile. There is no point in changing your code until you know what specifically is causing the slowness. Trying to fix a problem without understanding the cause is like feeling ill and having surgery after surgery until you feel better. Diagnose your problem first. It might be something small like a network setting, or it could be one bad line in your code.

Some tips for profiling:

How to Profile Your Rails Application

Performance Testing Rails Applications

At the Forge - Profiling Rails Applications

Once you have found the bottleneck you can figure out what to do.

I recommend these videos: Railslab Scaling Rails

Revised based on the profiling results:

OK. Now that you can see that your problem is a calculation that issues one query per row while looping through the results of another ActiveRecord query, I'd advise building a single custom SQL statement that combines your initial selection criteria with the per-row lookup. You can definitely speed this up by optimizing the SQL.
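A hypothetical sketch of what that combined statement could look like; the table and column names (`ofkbs`, `operations`, `ofkb_id`, `status`, `duration`) are pure assumptions about the schema:

```ruby
# Fold the initial selection and the per-row lookup into one joined
# query, so the database does the work in a single round-trip.
sql = <<-SQL
  SELECT ofkbs.*, operations.duration
  FROM ofkbs
  INNER JOIN operations ON operations.ofkb_id = ofkbs.id
  WHERE ofkbs.status = ?
SQL

# In the application this would be executed once, e.g.:
#   rows = Ofkb.find_by_sql [sql, status]
# replacing the 700-1000 Operation.find calls inside the loop.
```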

srboisvert, answered Oct 02 '22


How many of those 0-10ms queries are being executed per view access? What parts of your data model are being referenced? Are you using :include to get eager loading on your associations?

Rails is as slow as you make it. With understanding comes speed (usually!)

Expanding on the above: do you have has_many associations where your view references the "many" side without an :include? Without it, a separate detail query is issued for each master record retrieved by find(:all) - if you have large numbers of detail records and are processing all of them individually, this can get expensive.

Something like this:

Master.find(:all, :include => :details)

...might help. Still guessing from sparse info, though.

There's an old Railscast on the subject here

Mike Woodhouse, answered Oct 02 '22


While Ruby on Rails has a reputation for being slow, this sounds too extreme to be a simple problem with the language.

You should run a profiler to determine exactly which functions are slow and why. The most common thing slowing down a web application is the "n+1 problem": when you have n data items in your database, the app makes n separate queries instead of one query that gets them all. But you can't know until you run the profiler. ruby-prof is one profiler I've used.
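For reference, a minimal sketch of bracketing suspect code for measurement. The ruby-prof calls are shown in comments (they require the gem); the runnable part uses the standard library's Benchmark for a cruder, dependency-free first look. `run_suspect_code` is a hypothetical placeholder:

```ruby
require 'benchmark'

# With the ruby-prof gem, wrapping the suspect code looks roughly like:
#   require 'ruby-prof'
#   RubyProf.start
#   run_suspect_code
#   result = RubyProf.stop
#   RubyProf::FlatPrinter.new(result).print(STDOUT)

# Benchmark.realtime returns wall-clock seconds for the block.
elapsed = Benchmark.realtime do
  100_000.times { |i| i * i }  # stand-in for the suspect code
end
puts "took #{elapsed} seconds"
```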

Edit based on profile results edit:

I firmly believe that you can almost always remove a query loop. As Mike Woodhouse says, the Rails way is to declare the relations between your tables with has_many or another association and let Rails generate the table join: clear, fast, and idiomatic. If you are starting out with bare SQL, or the associations don't fit this case, you can write the appropriate joins yourself. If all else fails, you can create a view or denormalized table that holds the results previously computed in the loop. Indeed, the fact that you have to iterate through generated queries may be a sign that your table design itself has some flaws.

All that said, if caching your query results works well enough for you, then stay with it. Optimize when needed.
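The fetch-or-compute pattern behind that caching (what `Rails.cache.fetch` or a memcached wrapper does) can be shown with a plain Hash so it runs anywhere; the method name `fetch_cached` is illustrative, not a Rails API:

```ruby
# In-process cache: compute on first request, reuse afterwards.
CACHE = {}

def fetch_cached(key)
  # Hash#fetch runs the block only when the key is absent.
  CACHE.fetch(key) { CACHE[key] = yield }
end

calls = 0
2.times do
  fetch_cached("report") { calls += 1; :expensive_result }
end
# The expensive block ran only once; the second request hit the cache.
```

Warming such a cache from a scheduled task, rather than on first request, would also remove the first-visitor penalty the asker mentions.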

Joe Soul-bringer, answered Oct 02 '22


This is not normal. You have some logic that is slowing you down. Try commenting out the bits of your code that you think are taking a long time and see if that helps. If it does, you need to figure out how to optimize that logic.

If you are doing lots of calculations over a loop iterating through a very large number of objects, then of course it will be slow.

These types of issues can come up in any language or framework. While Ruby is not as fast as other languages, it's fast enough most of the time. If you need to constantly do calculations over large data sets, Ruby may not be the right language for you. Look into writing a Ruby C extension to handle your performance-draining code. But first, just try to diagnose and refactor.

Lastly, check out RubyProf to see if it can help you find the bottleneck.

Alex Wayne, answered Oct 02 '22


The previous two answers are helpful, especially using performance monitoring tools. I use New Relic RPM and it's helped me a great deal in the past.

However, these sorts of tools are really best when you're trying to speed up from, say, 3 seconds to under 1 second.

2-4 minutes for a view to render is not normal under any circumstances.

Could you show us some of your development logs to figure out where the bottlenecks are?

Are you including time the browser takes to load images, javascripts, or other files into this total measurement?

btw, answered Oct 02 '22