How to determine the best "framerate" (setInterval delay) to use in a JavaScript animation loop?

When writing a JavaScript animation, you of course make a loop using setInterval (or repeated setTimeout). But what is the best delay to use in the setInterval/setTimeout call(s)?

In the jQuery API page for the .animate() function, the user "fbogner" says:

Just if anyone is interested: Animations are "rendered" using a setInterval with a time out of 13ms. This is quite fast! Chrome's fastest possible interval is about 10ms. All other browsers "sample" at about 20-30ms.

Any idea how jQuery determined to use this specific number?


Started bounty. I'm hoping someone with knowledge of the source code behind Chromium or Firefox can provide some hard facts that might back up the decision of a specific framerate. Or perhaps a list of animations (or frameworks) and their delays. I believe this makes for an interesting opportunity to do a bit of research.


Interesting - I just took the time to look at Google's Pac-Man source to see what they did. They set up an array of possible FPSes (90, 45, 30), start at the first one, and then each frame they check the "slowness" of the frame (amount the frame exceeded its allotted time). If the slowness exceeds 50ms 20 times, the framerate is notched down to the next in the list (90 -> 45, 45 -> 30). It appears that the framerate is never raised back up, presumably because the game is so short-lived that it wouldn't be worth the trouble to code that.

Oh, and the setInterval delay is of course set to 1000 / framerate. They do, in fact, use setInterval and not repeated setTimeouts.

I think this dynamic framerate feature is pretty neat!
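For what it's worth, a minimal sketch of that downgrade logic might look like the following. This is my reconstruction, not Google's actual code; only the FPS ladder (90, 45, 30), the 50ms slowness threshold, and the 20-strike rule come from the description above, and drawFrame is a hypothetical stand-in for the game's rendering:

    // Sketch of a Pac-Man-style adaptive framerate loop.
    // Only the FPS ladder, the 50ms threshold, and the 20-strike
    // rule are from the description above; the rest is assumed.
    var fpsLadder = [90, 45, 30];
    var fpsIndex = 0;
    var slowCount = 0;

    function drawFrame() { /* render one frame here */ }

    function startLoop() {
      var delay = 1000 / fpsLadder[fpsIndex];
      var lastTick = Date.now();
      var id = setInterval(function () {
        var now = Date.now();
        var slowness = (now - lastTick) - delay; // ms past the frame's budget
        lastTick = now;
        if (slowness > 50) {
          slowCount++;
          if (slowCount >= 20 && fpsIndex < fpsLadder.length - 1) {
            // Too slow too often: notch down and restart the interval.
            // (Like Pac-Man, the rate is never raised back up.)
            fpsIndex++;
            slowCount = 0;
            clearInterval(id);
            startLoop();
            return;
          }
        }
        drawFrame();
      }, delay);
    }

    startLoop();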

asked May 30 '10 by Ricket


2 Answers

I would venture to say that a substantial fraction of web users have monitors that refresh at 60Hz, which translates to one frame every 16.66ms. So to make the monitor the bottleneck, you need to produce a new frame in less than 16.66ms.

There are two reasons you would pick a value like 13ms. First, the browser needs a little bit of time to repaint the screen (in my experience, never less than 1ms), which puts you at updating every 15ms or so. Second, 15ms happens to be a very interesting number: the standard timer resolution on Windows is 15ms (see John Resig's blog post). I suspect that a well-written 15ms animation looks very nearly the same across a wide variety of browsers/operating systems.
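To make the arithmetic concrete (this is just the reasoning above in code, not anything from jQuery's source):

    // Back-of-the-envelope frame budget for a 60Hz monitor.
    var refreshHz = 60;
    var frameBudget = 1000 / refreshHz;    // ~16.66ms per frame
    var repaintCost = 1;                   // ~1ms to repaint, per the text
    var delay = frameBudget - repaintCost; // ~15.66ms -> "about 15ms"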

FWIW, fbogner is plain wrong about non-Chrome browsers firing setInterval every 20-30ms. I wrote a test to measure the speed of setInterval firing (a sketch of such a test follows the list), and got these numbers:

  • Chrome - 4ms
  • Firefox 3.5 - 15ms
  • IE6 - 15ms
  • IE8 - 15ms
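Here is a minimal sketch of such a test (a reconstruction, not the answerer's original code): it asks for a 0ms interval, which the browser clamps to its real minimum, and reports the average gap between ticks.

    // Measure the effective setInterval resolution by averaging
    // the time between many ticks of a 0ms interval.
    function measureInterval(ticks, done) {
      var count = 0;
      var start = Date.now();
      var id = setInterval(function () {
        count++;
        if (count >= ticks) {
          clearInterval(id);
          done((Date.now() - start) / ticks); // average ms per tick
        }
      }, 0); // 0ms requested; the browser clamps to its minimum

    }

    measureInterval(100, function (avg) {
      console.log('average setInterval delay: ' + avg.toFixed(2) + 'ms');
    });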
answered by Long Ouyang


The pseudo-code for this is:

    FPS_WANTED = 25
    (just a number; it can be changed while executing, or it can be constant)

    TIME_OF_DRAWING = 1000/FPS_WANTED
    (this is in milliseconds, I believe it is accurate enough)
    (should be updated when FPS_WANTED changes)

    UntilTheUserLeavesTheDrawingApplication()
    {
      time1 = getTime();
      doAnimation();
      time2 = getTime();
      animationTime = time2 - time1;

      if (animationTime > TIME_OF_DRAWING)
      {
        [FPS_WANTED cannot be reached]
        You can:
        1. Decrease the FPS to see if a lower framerate can be achieved
        2. Do nothing because you want to get all you can from the CPU
      }
      else
      {
        [the FPS can be reached - you can decide to]
        1. wait(TIME_OF_DRAWING - animationTime) to keep a constant framerate of FPS_WANTED
        2. Increase the framerate if you want
        3. Do nothing because you want to get all you can from the CPU
      }
    }

Of course there can be variations of this, but it is the basic algorithm behind any animation loop.
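A runnable JavaScript translation might look like this (a sketch under the same assumptions; doAnimation is a placeholder, and the wait is done with setTimeout, since JavaScript cannot block):

    var FPS_WANTED = 25;
    var TIME_OF_DRAWING = 1000 / FPS_WANTED; // ms per frame; recompute if FPS_WANTED changes

    function doAnimation() { /* draw one frame here */ }

    function loop() {
      var time1 = Date.now();
      doAnimation();
      var animationTime = Date.now() - time1;

      if (animationTime > TIME_OF_DRAWING) {
        // FPS_WANTED cannot be reached: lower FPS_WANTED here,
        // or just schedule the next frame immediately.
        setTimeout(loop, 0);
      } else {
        // Frame finished early: wait out the remainder to hold a
        // constant FPS_WANTED (or raise the framerate, or run flat-out).
        setTimeout(loop, TIME_OF_DRAWING - animationTime);
      }
    }

    loop();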

answered by INS