
Scripts/Stylesheets: Consolidate locally or use CDN?

I can obviously do some limited testing of my own, but I'm hoping to hear from some people with real-world experience on at least medium-scale web sites.

Two of the items on every "top 10" list for optimizing sites/bandwidth are:

  • Consolidate as much JS and CSS as possible into a single file (to reduce round trips); and
  • Use a Content Delivery Network (to save bandwidth/speed up downloads).

Nowadays you can find almost every important script on either Google's or Microsoft's CDN (or both). You can even find some pretty heavyweight items like jQuery ThemeRoller packages. One of the major advantages of using these CDNs over a private CDN account like Amazon S3 is that many visitors will already have these scripts cached from some other web site. Even if they don't, it's hard to beat the performance of a site like Google.

So let's say I want to use these public CDNs rather than paying extra for a personal CDN account for performance that could be better or worse or the same. And let's also say I'm using a bunch of JS/CSS files that are all on these public CDNs: jQuery, jquery-ui, one theme, jQuery TOOLS, and maybe a few others. In addition to those I may be using several scripts that are not available on a public CDN, like superfish and jquery.approach and hoverIntent. Pretty common stuff really. It seems I have two mutually exclusive options:

  • Refer to the public CDNs for the scripts listed above. The advantage is that I get to use somebody else's bandwidth and take advantage of the popularity of some of these scripts. The disadvantage is that the client must make no fewer than 7 round-trips for all these files (4 individual JS, 1 CSS, and 1 each CSS and JS for the consolidated/minified local stuff). Or...
  • Cram everything into two monolithic minified files (one for JS and one for CSS) and serve everything off my own web server. The advantage is that there are only two round trips (down from seven, a savings of over 70%). The disadvantage, of course, is that I now have to serve a much larger payload myself.
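To make the second option concrete, here is a minimal sketch of a consolidation step. The file names and directory layout are hypothetical stand-ins, and a real build would add a minification pass (e.g. the YUI Compressor) between concatenation and deployment:

```shell
#!/bin/sh
# Sketch of the "consolidate" option: combine the individual plugin files
# into a single file so the browser makes one round trip instead of one
# per script. Paths and contents here are illustrative only.
mkdir -p js build

# Stand-ins for the per-plugin files named in the question.
printf '/* superfish */\nvar superfish = 1;\n' > js/superfish.js
printf '/* hoverIntent */\nvar hoverIntent = 1;\n' > js/hoverIntent.js

# One concatenated file; minification would happen after this step.
cat js/superfish.js js/hoverIntent.js > build/all.js
```

The same pattern applies to the CSS files, yielding the two monolithic files described above.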

This seems like a tough choice, and I think the only way to really understand which will work better ("better" being a function of client load times and overall user experience) is to actually put it out there in the wild and see what happens.

So has anyone had to make this choice before, and if so, what did you choose, and why, and how well did it turn out? Or, if anyone has tried to tackle the task of testing in an isolated environment, how did you go about it, and what were the results?

P.S. This is all describing a "web application" type of site - in other words, not very media-heavy, but one that may contain a ton of styles and scripts that actually account for a significant chunk of the page size. Therefore I do consider this more than just an intellectual exercise.

Aaronaught asked Dec 11 '09 19:12

1 Answer

Check out Focus.com in Firebug

We took a hybrid approach. We pull jQuery off Google, and the rest of our CSS and JS are combined/minified and served off S3. We automated the combination/minification step via our build process, and each version is deployed under its own (numbered) directory to allow browsers to aggressively cache our resources.
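The versioned-directory idea can be sketched as follows. The build number and paths are hypothetical; the point is that each release gets a new URL, so the old files can safely be served with far-future expiry headers:

```shell
#!/bin/sh
# Sketch of versioned deployment: every build lands in its own numbered
# directory. A new release changes the URL, which busts every browser
# cache at once without touching cache headers.
BUILD=42                                    # hypothetical build number
mkdir -p "deploy/$BUILD"
printf 'var app = 1;\n' > all.min.js        # stand-in for the combined/minified bundle
cp all.min.js "deploy/$BUILD/all.min.js"
# Pages would then reference /deploy/42/all.min.js; the next release
# bumps BUILD and references the new path.
```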

We chose to pull jQuery off Google to save the 40K hit, given the frequency of code updates to our site (roughly every two weeks). That having been said, before we launched we did a bunch of testing of different configurations and didn't detect significant differences in performance.

DrewM answered Oct 21 '22 16:10