Background
This has been driving me crazy for weeks. I use MrClay's PHP Minify script to minify and concatenate my JS and CSS. It works great: all my files are merged into common.css and common.js, which are virtual files rewritten to the script like this:
RewriteRule ^common.js$ /tynamic/min/?g=js [L,QSA]
RewriteRule ^common.css$ /tynamic/min/?g=css [L,QSA]
A query string is also appended to denote the version of these files, and they are cached for 3 years, so a person who visited my site will likely never have to download any CSS or JS on a future visit (unless the files change, obviously). So far everything works.
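For illustration, the versioning-plus-long-cache scheme described above could look roughly like this (the helper name and the `v` parameter are assumptions, not the actual Minify code):

```php
<?php
// Hypothetical helper: build a versioned URL so the bundle is
// re-downloaded only when the version string changes.
function asset_url(string $file, string $version): string
{
    return '/' . $file . '?v=' . $version;
}

// In the serving script, a ~3-year client-side cache could be
// requested like this (94608000 seconds = 3 * 365 days):
header('Cache-Control: public, max-age=94608000');
```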
The problem
Often (sadly not always), when my browser asks for these 2 files (which happens at the same time), one of the files takes an extra second to be returned. It's always the one whose request arrives at the server later, so it's usually the one that appears later in the HTML, but that's not a hard rule.
See these screenshots:
taken from my Firefox
test report by Pingdom
I'd be okay with the server putting the other file into a queue and processing it after the first one, but that shouldn't take a whole second. A few more things: no action like concatenation or gzipping is performed in this case; the script only does an fpassthru() of an existing pre-gzipped file. It doesn't happen all the time, however, and there it gets a little odd: if I do a large number of consecutive page loads, 30 or more, it goes back to "normal" and both files are processed in trivial time. Then when I check again after some time, the one-second hang is back. The delay is always a little over a second.
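For context, the "no work" path described above is essentially just streaming an existing pre-gzipped file to the client; a minimal sketch of that idea (function name, path, and content type are examples, not the actual Minify internals):

```php
<?php
// Minimal sketch: stream a pre-gzipped bundle with no runtime
// minification. The file on disk is already gzip-compressed.
function serve_pregzipped(string $path, string $type): void
{
    header('Content-Encoding: gzip');              // body is pre-gzipped
    header('Content-Type: ' . $type);
    header('Content-Length: ' . (string) filesize($path));

    $fh = fopen($path, 'rb');
    fpassthru($fh);                                // copy file to output
    fclose($fh);
}
```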
What I already tried
if($_GET["g"]=="js") exit;
right at the start of the script. That's right: it was of no help. The file was still delayed while outputting nothing. A bare exit; (for both files), however, works... :)
Both runs report minimal execution times (units or tens of milliseconds), so there is no function inside the script that would delay it.
No help; I tried 3 different servers and hosting providers, so it is not hosting related.
I also made a copy of the full script directory to ensure both requests are served by different sets of files - no help.
So far I haven't come up with anything :(
This was interesting: modifying the scripts to do something else, e.g. a scandir() and picking a file, didn't help either. Further analysis showed that the PHP scripts are being assigned to free CPU threads only once per second. So if there are e.g. 5 threads and 6 scripts need to run at once, the first 5 are done in about 10 ms, but the 6th has to wait a whole second to even start being processed... Why is this happening?
Thanks a lot in advance for any effort put into helping me.
CBroe is probably right. If you are using sessions (session_start()), PHP will only serve one request per session (session_id) at a time. While one request is being served, the other is queued until the first one writes the session. The session file is locked to prevent multiple requests from writing into the same session, which could cause unexpected results. The session is written either when the script finishes or when you call session_write_close(), which frees the session for the next request.
However, I feel obliged to tell you that you are doing it wrong: you shouldn't be minifying JS and CSS with PHP at request time. Every request has to spin up PHP, the same work is repeated over and over, and (as you have just seen) the session mechanism can serialize parallel requests - none of which applies to static files served directly by the web server.
I suggest you invest your time not in writing minification scripts, but in learning a build tool (Grunt or Gulp) that will do the job for you - and do much more than you would want, or be able, to write in PHP.
In a nutshell, the process works like this: the build tool reads your source JS and CSS, minifies and concatenates them into static bundle files at build time, and your pages then link to those static files, which the web server serves directly with no PHP involved.
The skill of setting this up will come in very handy for any web developer. It will also free up your time for building the web app itself.