Serve jQuery UI from Google's CDN or as a local copy?

While it's better to serve jQuery from Google's CDN, jQuery UI is a different beast. My local modified copy weighs 60 KB, while the one on Google's CDN is ~200 KB.

  • Are there any numbers on how many sites use the CDN? (Read: how many users already have it in their cache.) How do I know/calculate whether it's better to serve it locally?
asked Oct 15 '11 by CamelCamelCamel

1 Answer

Coming late to the party here, but allowing for gzip compression, you're basically comparing a download of ~51k from Google's CDN (the 197.14k content becomes 51.30k on-the-wire) vs. ~15.5k from your own servers (assuming your 60k file gzips at the same ratio as the full jQuery UI file does, and that you have gzip compression enabled). This takes us into a complex realm of:

  • pre-existing cache copy
  • latency
  • transfer time
  • number of requests
  • proper cache headers

And the answer to your question is a big: It depends, try each of them and measure the result in a real world scenario.

Pre-Existing Cache Copy

If a first-time visitor to your site has previously been to a site using jQuery UI from Google's CDN and it's still in their cache, that wins hands down. Full stop. No need to think about it any further. Google uses appropriate caching headers and the browser doesn't even have to send the request to the server, provided you link to a fully-specified version of jQuery UI (not one of the "any version of 1.8.x is fine" URLs — if you ask for jQuery UI 1.8.16, Google will return a resource that can be cached for up to a year, but if you ask for jQuery UI 1.8.x [e.g., any dot rev of 1.8], that resource is only good for an hour).
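For illustration, the two styles of reference look roughly like this (the URLs follow the patterns Google's CDN used for jQuery UI 1.8 at the time; treat the exact paths as examples rather than gospel):

    <!-- Fully-specified version: Google serves this with headers that let
         browsers cache it for up to a year -->
    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>

    <!-- "Any 1.8.x" version: this resolves to the latest dot rev, so it can
         only be cached for about an hour -->
    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8/jquery-ui.min.js"></script>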

But let's suppose they haven't...

Latency and Transfer Time

Latency is how long it takes to set up the connection to the server, and transfer time is the time actually spent transferring the resource. Using my DSL connection (I'm not very close to my exchange, so I typically get about 4Mbit throughput on downloads; e.g., it's an okay connection, but nothing like what Londoners get, or those lucky FiOS people in the States), in repeated experiments downloading Google's copy of jQuery UI I typically spend ~50ms waiting for the connection (latency) and then 73ms doing data transfer (SSL would change that profile, but I'm assuming a non-SSL site here). Compare that with downloading Google's copy of jQuery itself (89.52k gzipped to 31.74k), which has the same ~50ms latency followed by ~45ms of downloading. Note how the download time is proportional to the size of the resource (31.74k / 51.30k = 0.61871345, and sure enough, 73ms x 0.61871345 = 45ms), but the latency is constant. So assuming your copy comes in at 15.5k, you could expect (for me) a 50ms latency plus about 22ms of actual downloading. All other things being equal, by hosting your own 60k copy vs. Google's 200k copy, you would save me a whopping 52ms. Let's just say that I wouldn't notice the difference.

All is not equal, however. Google's CDN is highly optimized, location-aware, and very fast indeed. For instance, let's compare downloading jQuery from Heroku.com. I chose them because they're smart people running a significant hosting business (currently using the AWS stack), and so you can expect they've at least spent some time optimizing their delivery of static content — and it happens they use a local copy of jQuery for their website; and they're in the U.S. (you'll see why in a moment). If I download jQuery from them (shockingly, they don't appear to have gzip enabled!), the latency is consistently in the 135ms range (with occasional outliers). That's consistently 2.7 times as much latency as to Google's CDN (and my throughput from them is slower, too, roughly half the speed; perhaps they only use AWS instances in the U.S., and since I'm in the UK I'm further away from them).

The point here being that latency may well wash out any benefit you get from the smaller file size.

Number of Requests

If you have any JavaScript files you're going to host locally, your users are still going to have to get those. Say you have 100k of your own script for your site. If you use Google's CDN, your users have to get 200k of jQuery UI from Google and 100k of your script from you. The browser may put those requests in parallel (barring your using async or defer on your script tags, the browser has to execute the scripts in strict document order, but that doesn't mean it can't download them in parallel). Or it may well not.
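As a rough sketch (the local filename here is made up for illustration), the two-request case looks like this:

    <!-- Two requests: jQuery UI from Google's CDN, your own code from your server.
         Browsers can download these in parallel, but without async/defer they
         must execute them in document order. -->
    <script src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>
    <script src="/js/my-site-script.js"></script>

    <!-- Adding defer keeps the downloads parallel and preserves execution
         order, but doesn't block page parsing while waiting: -->
    <script defer src="https://ajax.googleapis.com/ajax/libs/jqueryui/1.8.16/jquery-ui.min.js"></script>
    <script defer src="/js/my-site-script.js"></script>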

As we've established that for non-mobile users, at these sizes the actual data transfer time doesn't really matter that much, you may find that taking your local jQuery UI file and combining it with your own script, thus requiring only one download rather than two, may be more efficient even despite the Google CDN goodness.
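Concretely, that means going from two local requests to one (the filenames are hypothetical, and the combined file would come from whatever concatenation/minification step you use):

    <!-- Two requests to your own server -->
    <script src="/js/jquery-ui-1.8.16.custom.min.js"></script>
    <script src="/js/site.js"></script>

    <!-- One request: jQuery UI and your own code concatenated into one file -->
    <script src="/js/site-combined-1.min.js"></script>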

This is the old "At most one HTML file, one CSS file, and one JavaScript file" rule. Minimizing HTTP requests is a Good Thing™. Similarly, if you can use sprites rather than individual images for various things, that helps keep image requests down.

Proper Cache Headers

If you're hosting your own script, you'll want to be absolutely sure it's cacheable, which means paying attention to the cache headers. Google's CDN basically doesn't trust HTTP/1.0 caches (it sets the Expires header to the current date/time), but does trust HTTP/1.1 caches — the overwhelming majority — because it sends a max-age header (of a year for fully-specified resources). I'm guessing they have a reason for that, you might consider following suit.

Since you want to change your own scripts sometimes, you'll want to put a version number on them, e.g. "my-nifty-script-1.js" and then "my-nifty-script-2.js", etc. That's so you can set long max-age headers, but know that when you update your script, your users will get the new one. (This goes for CSS files, too.) Do not use the query string for the versioning, put the number actually in the resource name.
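A minimal sketch of that advice (filenames as in the paragraph above; the query-string example is added only for contrast):

    <!-- Version baked into the file name: serve it with a long max-age, and
         release my-nifty-script-2.js when the code changes -->
    <script src="/js/my-nifty-script-1.js"></script>

    <!-- Not this: versioning via the query string -->
    <script src="/js/my-nifty-script.js?v=2"></script>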

Since your HTML presumably changes regularly, you probably want short expirations on that, but of course it totally depends on your content.

Conclusion

It depends. If you don't want to combine your script with your local copy of jQuery UI, you're probably better off using Google for jQuery UI. If you're happy to combine them, you'll want to do real-world experiments either way to make your own decision. It's entirely possible other factors will wash this out and it won't really matter. If you haven't already, it's worth reviewing Yahoo's and Google's website speed advice pages:

  • Yahoo! Best Practices for Speeding Up Your Website
  • Google's Let's make the web faster site
answered Sep 20 '22 by T.J. Crowder