 

PHP: Loading gzipped javascript files

Is it a good idea to concatenate a set of 20-30 .js files into one big file, compress that file with gzip, save it as something like somebigjsfile.js.gz, and then load it with <script type="text/javascript" src="somebigjsfile.js.gz"></script>?

This file would be regenerated whenever at least one of the .js files has been modified (checked with PHP's filemtime()).
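The regeneration idea above can be sketched like this; the function and file names are hypothetical, and gzencode() from the zlib extension is used for compression:

```php
<?php
// Rebuild the bundle only when it is missing or older than any source
// file, as the question describes (filemtime-based staleness check).
function bundle_scripts(array $jsFiles, string $bundlePath): void
{
    $stale = !file_exists($bundlePath);
    if (!$stale) {
        $bundleTime = filemtime($bundlePath);
        foreach ($jsFiles as $file) {
            if (filemtime($file) > $bundleTime) {
                $stale = true;
                break;
            }
        }
    }
    if ($stale) {
        $concatenated = '';
        foreach ($jsFiles as $file) {
            // Trailing ";\n" guards against files that omit a final semicolon.
            $concatenated .= file_get_contents($file) . ";\n";
        }
        // gzencode() produces a gzip stream (requires the zlib extension).
        file_put_contents($bundlePath, gzencode($concatenated, 9));
    }
}
```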

Also if it's relevant, this is for a public app.

Alex asked Mar 23 '11 08:03


3 Answers

I assume you are trying to save the server the overhead of having to gzip the JavaScript package on every request? If that is the intention, this is not the proper way to accomplish it. You need to indicate in the response headers that the file is being transferred with gzip compression, like so:

HTTP/1.1 200 OK
Date: Thu, 04 Dec 2003 16:15:12 GMT
Server: Apache/2.0
Vary: Accept-Encoding
Content-Encoding: gzip
Cache-Control: max-age=300
Expires: Thu, 04 Dec 2003 16:20:12 GMT
X-Guru: basic-knowledge=0, general-knowledge=0.2, complete-omnipotence=0.99
Content-Length: 1533
Content-Type: text/html; charset=ISO-8859-1

Notice the Content-Encoding: gzip header.
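A minimal PHP sketch of serving the pre-compressed file with that header; the function name is made up for illustration, and it returns the headers and body instead of emitting them so the negotiation logic is easy to test:

```php
<?php
// Decide how to serve a pre-gzipped .js.gz file based on the client's
// Accept-Encoding header. A real script would call header() and echo;
// returning [$headers, $body] keeps the logic testable.
function serve_gzipped_js(string $path, string $acceptEncoding): array
{
    $headers = [
        'Content-Type' => 'application/javascript',
        'Vary'         => 'Accept-Encoding',
    ];
    if (strpos($acceptEncoding, 'gzip') !== false) {
        // Client handles gzip: send the pre-compressed bytes as-is.
        $body = file_get_contents($path);
        $headers['Content-Encoding'] = 'gzip';
    } else {
        // Fall back to decompressing for clients without gzip support.
        $body = gzdecode(file_get_contents($path));
    }
    $headers['Content-Length'] = (string) strlen($body);
    return [$headers, $body];
}
```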

In any case, concatenating and compressing your JavaScript is always a good idea, as long as you do it right. I would also recommend running some form of JS minification prior to compression, as it will improve your post-compression size.

jordancpaul answered Sep 28 '22 08:09


I don't think that's such a good idea if you plan on having returning visitors - and I assume that's the case if you are developing a web app. Every time one of your .js files is modified, you will force your users to download the whole huge file again.

When users come to your app for the first time, they are often quite forgiving if the loading time is a bit long. If you show them a nice introduction text, they will have something to read and/or look at while the scripts and various assets are loading. But if they have to wait for your huge script to load every time they come back, because it's all-or-nothing, that will be considered poor UX.

If you don't need all 20-30 files up front when the user loads your app, use a script loader to load them in the background.

If you do need those 20-30 files, aggregate them into 10 or so bundles, grouping together the files that are most likely to be updated at the same time.

As for the gzip compression, your web server should handle that.

Alsciende answered Sep 28 '22 10:09


Yes, it's a good idea.

What I do is use a custom PHP function like load_script($filename) that adds the file to a "load queue". At the end of the page I call another function, generate_js_script(), that parses the queued scripts, generates one big .js file (I suppose I could run a minimizer at this point) and writes it to disk with a unique filename. The function then simply outputs the <script> tag so the browser knows to load this "big" js file. This way I don't have to bother with custom header() calls and other tricks that have caused problems with some browsers and their caching algorithms in the past.
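A rough sketch of that load-queue pattern; the function names come from the answer, but the bodies and the content-hash naming scheme are my own illustration:

```php
<?php
// Queue of script paths collected while the page renders.
$queuedScripts = [];

function load_script(string $filename): void
{
    global $queuedScripts;
    $queuedScripts[] = $filename;
}

// Concatenate the queued scripts, write the bundle under a
// content-derived name, and return the <script> tag for it.
function generate_js_script(string $outputDir): string
{
    global $queuedScripts;
    $combined = '';
    foreach ($queuedScripts as $file) {
        $combined .= file_get_contents($file) . ";\n";
    }
    // Hashing the contents gives a unique filename per version, so the
    // browser can cache each bundle forever without stale-cache tricks.
    $name = 'bundle-' . md5($combined) . '.js';
    file_put_contents($outputDir . '/' . $name, $combined);
    return '<script type="text/javascript" src="' . $name . '"></script>';
}
```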

This way, I can simply use a .htaccess directive to tell the server to serve all .js files compressed (if the browser supports it), so I don't have to do it in PHP. The trick with this kind of caching is not to regenerate too often; I use a hash of the contents to take care of that.
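One way that .htaccess directive could look, assuming Apache with mod_rewrite, mod_headers and mod_mime enabled and a pre-compressed .js.gz copy sitting next to each .js file - a sketch, not a drop-in config:

```apache
# Serve foo.js.gz instead of foo.js when the client accepts gzip
# and a pre-compressed copy exists on disk.
RewriteEngine On
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)\.js$ $1.js.gz [L]

<FilesMatch "\.js\.gz$">
    Header set Content-Encoding gzip
    Header append Vary Accept-Encoding
    ForceType application/javascript
</FilesMatch>
```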

Quamis answered Sep 28 '22 08:09