 

Parallel request on the same PHP script causes a second long delay

Background

This is something that has been driving me crazy for weeks. I use MrClay's PHP Minify script to minify and concatenate my JS and CSS. It works great: all my files are merged into common.css and common.js, which are virtual files rewritten to the script like this:

RewriteRule ^common.js$ /tynamic/min/?g=js [L,QSA]
RewriteRule ^common.css$ /tynamic/min/?g=css [L,QSA]

A query string is also appended to denote the version of these files, and they are cached for three years, so a returning visitor will likely never have to download any CSS or JS again (unless the files change, obviously). So far everything works.
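For reference, the far-future caching described above can be set up in .htaccess roughly like this (a sketch, not the asker's exact config; it assumes mod_expires is enabled and that the bundles are served with these content types):

```apache
# Sketch: far-future caching for the minified bundles (assumes mod_expires).
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css "access plus 3 years"
    ExpiresByType application/javascript "access plus 3 years"
</IfModule>
```

Because the version number is carried in the query string, bumping it changes the URL and forces clients past the cached copy.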

The problem

Often (sadly not always), when my browser requests these two files (which it does at the same time), one of them takes a second to be returned. It's always the one whose request reaches the server later, so it's usually the one that appears later in the HTML, but that's not a hard rule.

See these screenshots:

taken from my Firefox

test report by Pingdom

I'd be okay with the server putting the other file into a queue and processing it after the first one, but that shouldn't take a whole second. A few more things: no heavy work like concatenation or gzipping is being performed in this case; the script is only doing an fpassthru() of an existing pre-gzipped file. It doesn't happen all the time, however, and there it gets a little odd: if I do a large number of consecutive page loads, like 30 or more, it goes back to "normal" and both files are processed in trivial time. Then when I check again after some time, the one-second hang is back. The delay is always a little over a second.

What I already tried

  1. Putting if($_GET["g"]=="js") exit; right at the start of the script.

That's right, that was of no help. The file was still delayed while outputting nothing. A bare exit; (for both files), however, works... :)

  2. Timing the scripts

Both runs report minimal execution times (single-digit or tens of milliseconds), so no function inside the script is causing the delay.

  3. Different server/hosting

No help; I tried 3 different servers and hosting providers. It is not hosting-related.

  4. Making a full copy of the script

I made a copy of the full script directory to ensure both requests are served by a different set of files - no help.

  5. Disabling file locking and other tweaks to the script's configuration or the script itself.

So far I haven't come up with anything :(

  6. A different script - doing something else.

This was interesting: modifying the files to do something else, e.g. a scandir() and picking a file, didn't help either. Further analysis showed that the PHP scripts are being assigned to free CPU threads only once per second. So if there are e.g. 5 threads and 6 scripts need to run at once, the first 5 are done in about 10 ms, but the 6th has to wait a whole second to even start being processed... Why is this happening?

Thanks a lot in advance for any effort put into helping me.

Vitexikora asked Jan 16 '16



1 Answer

CBroe is probably right. If you are using sessions (session_start()), PHP will serve only one request per client (session_id) at a time. While one request is being served, the other is queued until the first one writes the session. The session file is locked to prevent multiple requests from writing into the same session, which could cause unexpected results. The session is written either when the script finishes or when you call session_write_close(), which frees the session for the next request.
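A minimal sketch of the fix, assuming the minify script (or its bootstrap) starts a session it only ever reads from - the file path and session key here are hypothetical:

```php
<?php
// Sketch: release the session lock as early as possible so a
// parallel request for the other bundle isn't blocked for a second.
session_start();

// Read whatever you need from $_SESSION first...
$user = isset($_SESSION['user']) ? $_SESSION['user'] : null; // hypothetical key

// ...then close the session. After this, $_SESSION is still readable,
// but further writes are no longer persisted.
session_write_close();

// Now serve the pre-gzipped file (hypothetical path):
header('Content-Type: application/javascript');
header('Content-Encoding: gzip');
$fp = fopen('/var/www/cache/common.js.gz', 'rb');
fpassthru($fp);
fclose($fp);
```

If the script never touches $_SESSION at all, the simpler fix is to not call session_start() in the first place.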

However, I feel obliged to tell you that you are doing it wrong. You shouldn't be minifying JS and CSS with PHP at request time. Here are the reasons:

  1. Using PHP for this causes unnecessary load on the server
  2. The browser is still requesting the files to get a 304 response - again unnecessary load on the server and reduced user experience (these requests still take time)
  3. It's hard to build a good minification tool and there's no need to reinvent the wheel. A better one is readily available.
  4. There are more reasons...

I suggest you invest your time not in writing minification scripts, but in learning build tools (Grunt or Gulp) that will do the job for you, and much more than you would want to (or be able to) write in PHP.

In a nutshell, how this whole process works is

  1. You set up your server to send an Expires header. This will prevent the client from even asking for changes to the files. Google for how this is done using Apache: https://www.google.com/search?q=apache+expires+htaccess&ie=utf-8&oe=utf-8&gws_rd=cr&ei=y12dVrqKG8KvsAHFiKjYDA
  2. Set up the tools above to "build" your minified resources - concatenate multiple files, "compile" stylesheets, minify etc. So you have these built files on disk and served directly by the webserver.
  3. You only set up the website to use minified resources in production. (You want to be able to debug full source in development).
  4. When you deploy any changes to your website, you re-run the build so the minified files are regenerated.
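The build step itself doesn't have to be elaborate. As an illustration only (not the answerer's setup; file names are hypothetical), even a one-off CLI script run at deploy time can concatenate and pre-gzip the bundles, which the webserver then serves as plain static files:

```php
<?php
// Sketch of a deploy-time build step (hypothetical file names).
// Run once at deploy, never per request.
$sources = array('css/reset.css', 'css/layout.css', 'css/theme.css');

$bundle = '';
foreach ($sources as $file) {
    $bundle .= file_get_contents($file) . "\n";
}

// Write the bundle plus a pre-gzipped copy for the webserver to serve.
file_put_contents('common.css', $bundle);
file_put_contents('common.css.gz', gzencode($bundle, 9));
```

Grunt/Gulp do the same thing, plus minification, source maps, watch mode, and so on, which is why they are the better long-term investment.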

The skill of setting this up will come in very handy for any web developer. Also, this will free your time for building your web app itself.

Okneloper answered Oct 20 '22