
Reduce HTTP requests or not?

A theoretical question:
We all know about the pros of minifying and combining JavaScript files to reduce HTTP requests and speed up a website. But when popular JavaScript libraries are used (jQuery, for instance), it isn't too stupid to assume that these have already been downloaded to the client's computer from another page.

So what should be preferred? How do the big guys in the industry handle it?

A) Combine and minify each script into a massive one and serve it from my own CDN.

B) Combine all "self-written" scripts into one file and utilize available CDNs for libraries where possible (both options are sketched below).
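
For concreteness, the two options might render as script tags like these (the file names and the jQuery version are placeholders, not taken from the question):

```html
<!-- Option A: libraries + own code combined into one big minified file -->
<script src="/js/all.min.js"></script>

<!-- Option B: the library from a public CDN, own code combined separately -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script src="/js/app.min.js"></script>
```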

Thanks!

Industrial asked Jun 24 '10




4 Answers

  • "Combine and minify each script into a massive one and serve it from my own CDN"

    If you have a CDN, what are we talking about? :) You mean server, right?

  • "How does the big guys in the industry handle it?"

    The big guys always use their own servers.

  • "...it isn't too stupid to assume that these already have been downloaded to the clients computer from another page."

    Unfortunately it is. Facts:

    • 40-60% of users arrive with an empty cache
    • browsers' cache limits are small
    • many different versions of each library are in use, and a cache hit only happens when the version (and thus the URL) matches exactly; see the example below
    • a resource from a new domain costs an extra DNS lookup, which is slow
    • plus, you need to manage the dependencies yourself
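
To illustrate the version-matching point (these URLs are hypothetical examples, not from the answer): the browser reuses a cached copy only when the URL is identical, so these two tags never share a cache entry:

```html
<!-- Site A pins one version... -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.1/jquery.min.js"></script>
<!-- ...site B pins another: different URL, separate download and cache entry -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
```
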
25 revs, 4 users 83% answered Sep 30 '22


I think it depends on your site:

  • If your site consists mainly of pages of the same type which need the same scripts, I would go for A).
  • If the scripts differ a lot between the subsections of your site, I would go for B): combine the most-used scripts into one file, and if you have large scripts that are not used on every page, put each in a separate file (see the sketch after this list).
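
A minimal sketch of that split (the file names are made up for illustration):

```html
<!-- On every page: the scripts (almost) every page needs, combined and minified -->
<script src="/js/common.min.js"></script>
<!-- Only on the pages that need it: a large, rarely-used script kept separate -->
<script src="/js/gallery.min.js"></script>
```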

The best way to really know what to do is to test which combination of techniques saves you the most traffic / connections.

P.S.: I personally do not like the idea of letting other people serve files for my webpage, because what happens if the CDN fails but your server stays alive? If this is not a problem for you, try to serve all the libraries you use from a reliable CDN (a common fallback pattern is sketched below).
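
That failure case is usually handled with a local fallback: load the library from the CDN, and if it didn't arrive, write a script tag pointing at your own copy (the paths and version here are placeholders):

```html
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script>
  // If the CDN request failed, window.jQuery is undefined: fall back to our copy.
  // The "<\/script>" escape keeps the parser from closing this script block early.
  window.jQuery || document.write('<script src="/js/jquery-1.4.2.min.js"><\/script>');
</script>
```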

TheHippo answered Sep 30 '22


I think the best approach is to use a minified 'application.js' file (containing all application-specific JavaScript) and then use a service such as the Google AJAX Libraries API (found here) to load jQuery, Prototype, etc.
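
With Google's historical loader, that looked roughly like this (a sketch; the pinned jQuery version is illustrative, and the loader has since been superseded by plain ajax.googleapis.com script tags):

```html
<script src="https://www.google.com/jsapi"></script>
<script>
  // Ask Google's loader for a pinned library version (pinning keeps caching effective)
  google.load("jquery", "1.4.2");
  google.setOnLoadCallback(function () {
    // jQuery is available here; application.js code can run now
  });
</script>
```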

Kevin Sylvestre answered Sep 30 '22


I think it comes down to a couple of things:

  1. How many pages use the code throughout your site
  2. The quality of the CDN
  3. How much code it is

There's also a difference between using popular JavaScript packages, such as jQuery, and using a personal package, which only visitors who have already been to your site may have cached.

The performance enhancement can come from two places: 1) the browser cache, and 2) the DNS cache: even if the file isn't stored locally, a cached lookup for the CDN's domain shortens the request time, and the CDN may serve the file from a node close to the user.

I would advise using a CDN while also hosting the files locally as a fallback. Depending on your resources (hardware/bandwidth), you might need a CDN anyhow. It'd be nice to use server-side schedulers to check on the CDN's status and reroute requests when applicable.

Also, keep in mind that some users choose to turn off their browser cache, so minifying your JS is always a plus. You should separate your JS into two files: 1) needed on load and 2) not needed on load. Basically, get the necessary code out there first to improve the perceived load time, then load all the other extras (e.g. slideshows, color changers, etc.), as sketched below.
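
A minimal sketch of that two-file split (the file names are illustrative):

```html
<!-- Needed on load: ship and run this first for perceived speed -->
<script src="/js/critical.min.js"></script>
<script>
  // Not needed on load: inject the extras once the page has finished loading
  window.addEventListener("load", function () {
    var s = document.createElement("script");
    s.src = "/js/extras.min.js"; // slideshows, color changers, etc.
    document.body.appendChild(s);
  });
</script>
```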

One last point is to make use of Expires headers, since none of this matters if you don't optimize that. That is what really cuts load time for returning visitors with the cache enabled. YSlow is a nice Firefox add-on that will help evaluate your load performance.
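
For example, on Apache (an assumption: the answer doesn't name a server, and mod_expires must be enabled) a far-future Expires header for static assets could be set like this:

```apache
# .htaccess sketch: cache JS/CSS for a year; rename files to bust the cache
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType application/javascript "access plus 1 year"
  ExpiresByType text/css "access plus 1 year"
</IfModule>
```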


To answer your question: reduce HTTP requests, but do your own evaluation of the resulting JS file sizes.

(Being extreme:) you don't want one 10MB JS file, or your site will take too long to load; nor do you want a thousand 10KB files, because of the per-request HTTP overhead. The point is to strike a balance between file size and number of files, and, as I said earlier, to package them by what is needed versus merely wanted.

vol7ron answered Sep 30 '22