 

What's the best way to determine at runtime if a browser is too slow to gracefully handle complex JavaScript/CSS?

I'm toying with the idea of progressively enabling/disabling JavaScript (and CSS) effects on a page - depending on how fast/slow the browser seems to be.

I'm specifically thinking about low-powered mobile devices and old desktop computers -- not just IE6 :-)

Are there any examples of this sort of thing being done?

What would be the best ways to measure this, accounting for things like temporary slowdowns on busy CPUs?

Notes:

  • I'm not interested in browser/OS detection.
  • At the moment, I'm not interested in bandwidth measurements - only browser/cpu performance.
  • Things that might be interesting to measure:
    • Base JavaScript
    • DOM manipulation
    • DOM/CSS rendering
  • I'd like to do this in a way that affects the page's render-speed as little as possible.
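To make the idea concrete, here is a rough sketch of the kind of non-invasive micro-benchmark I have in mind for the "base JavaScript" bullet. The iteration count and the threshold are made-up numbers for illustration, not calibrated values:

```javascript
// Rough micro-benchmark: time a tight loop of "base JavaScript" work.
// The workload, iteration count, and threshold below are all
// illustrative assumptions that would need tuning per audience.
function benchmarkBaseJs(iterations = 200000) {
  const start = Date.now();
  let acc = 0;
  for (let i = 0; i < iterations; i++) {
    acc += Math.sqrt(i) * Math.sin(i % 360);
  }
  const elapsed = Date.now() - start;
  // Returning acc keeps the loop from being optimized away entirely.
  return { elapsed, acc };
}

// Run a few times and take the minimum, to damp out temporary
// slowdowns caused by other processes competing for the CPU.
function stableBenchmark(runs = 3) {
  let best = Infinity;
  for (let i = 0; i < runs; i++) {
    best = Math.min(best, benchmarkBaseJs().elapsed);
  }
  return best;
}

const SLOW_THRESHOLD_MS = 100; // assumed cut-off
const isSlow = stableBenchmark() > SLOW_THRESHOLD_MS;
```

A page could then consult `isSlow` before attaching expensive effects, though as noted above, keeping the benchmark itself cheap enough not to hurt render speed is part of the problem.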

BTW: In order to not confuse/irritate users with inconsistent behavior - this would, of course, require on-screen notifications to allow users to opt in/out of this whole performance-tuning process.

[Update: there's a related question that I missed: Disable JavaScript function based on user's computer's performance. Thanks Andrioid!]

Már Örlygsson asked Jan 19 '11



2 Answers

Not to be a killjoy here, but this is not a feat that is currently possible in any meaningful way in my opinion.

There are several reasons for this, the main ones being:

  1. Whatever measurement you make, if it is to have any meaning, has to test the maximum potential of the browser/CPU, and you cannot do that while maintaining any kind of reasonable user experience.

  2. Even if you could, it would be a meaningless snapshot: you have no idea what kind of load the CPU is under from applications other than the browser while your test is running, nor whether that situation will continue while the user is visiting your website.

  3. Even if you could do that, every browser has its own strengths and weaknesses. You'd have to test every DOM manipulation function to know how fast the browser would complete it; in my experience there is no "general" or "average" that makes sense here. And even if there were, the speed at which DOM manipulation commands execute depends on the context of what is currently in the DOM, which changes as you manipulate it.

The best you can do is to either

  1. Let your users decide what they want, and enable them to easily change that decision if they regret it

    or better yet

  2. Choose to give them something that you can be reasonably sure that the greater part of your target audience will be able to enjoy.
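A minimal sketch of option 1: a user-controlled effects toggle that is easy to flip back. The `createEffectsPreference` helper and storage key are hypothetical names; the storage object is injected so the sketch also runs outside a browser (in a page you would pass `window.localStorage`):

```javascript
// Hypothetical helper: remembers the user's opt-in/opt-out choice for
// heavy effects, defaulting to enabled until they opt out.
function createEffectsPreference(storage, key = 'fancyEffects') {
  return {
    isEnabled() {
      return storage.getItem(key) !== 'off';
    },
    setEnabled(on) {
      storage.setItem(key, on ? 'on' : 'off');
    },
  };
}
```

The page would check `pref.isEnabled()` before attaching expensive effects, and a visible toggle would call `pref.setEnabled(...)` so users who regret their choice can reverse it in one click.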

Slightly off topic, but following this train of thought: if your users are not tech leaders in their social circles (like most users here are, but most people in the world are not), don't give them too much choice, i.e. any choice that is not absolutely necessary - they don't want it, and they don't understand the technical consequences of their decision until it is too late.

Martin Jespersen answered Oct 04 '22


A different approach, one that does not need an explicit benchmark, would be to progressively enable features.

You could apply features in prioritized order, and after each one, drop the rest if a certain amount of time has passed.

By ensuring that the most expensive features come last, you would present the user with a somewhat appropriate selection of features based on how speedy the browser is.
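A minimal sketch of that time-budget idea. The feature objects, budget value, and injectable clock here are hypothetical; in a page, each `init` would attach one progressively heavier effect:

```javascript
// Enable features in priority order (cheapest/most important first)
// and stop as soon as the elapsed time exceeds the budget - on a slow
// browser, fewer features get switched on. The clock is injectable
// for testing; by default it uses Date.now.
function enableFeaturesWithinBudget(features, budgetMs, now = Date.now) {
  const start = now();
  const enabled = [];
  for (const feature of features) {
    if (now() - start > budgetMs) break; // too slow: drop the rest
    feature.init();
    enabled.push(feature.name);
  }
  return enabled;
}
```

Called with a list like `[{ name: 'tooltips', init: ... }, { name: 'animations', init: ... }]` and a budget of a few tens of milliseconds, it returns the names of the features that were actually enabled.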

Guðmundur H answered Oct 04 '22