 

Echoing content sometimes takes a very long time

I have a script that builds my webpage into one string ($content) and then echoes it to the user.

My script looks like this:

$time1 = microtime(true);
$content = create_content();
$content_time = (microtime(true) - $time1);

$time = microtime(true);
echo $content;
$echo_time = (microtime(true)-$time);

Now $content_time is always well under 0.5 s, so that's no problem. However, a few times a day $echo_time is well above one second and can even go up to 15 seconds. The content isn't really big, about 10-20 KB, and the times at which this happens are completely random: it's not at busy times, and it even happens in the middle of the night.

Anybody have any idea what that can be?

EDIT: The site is hosted on a (remote) dedicated server that only hosts this site. There is a database involved, but as I said, $content_time is well under 1 second, so whatever that function does cannot be the delay.

When my page's total time is above a certain value (let's say 5 s) I log it. Even Googlebot seems to have these issues sometimes, so I don't think they use a dial-up connection :)

Nin asked Jan 03 '12 11:01

2 Answers

Let's narrow the issue down and factor out some things...

In the question you indicate you're echoing out 10-20 KB. That's a significant amount no matter how it's buffered to output. Remember, PHP is single-threaded: once you flush your buffer, the script has to wait for all the output to go out via the shell or HTTP before it continues, and echo will eventually have to flush the internal buffer before it can finish. To get an accurate time without the flushing overhead of echo:

Try replacing the

$time = microtime(true);
echo $content;
$echo_time = (microtime(true)-$time);

With

ob_start();
$time = microtime(true);
echo $content;
$echo_time = (microtime(true) - $time);
ob_end_clean(); // discard the buffer and stop buffering; nothing is sent

This will echo into a buffer but not actually send anything out via HTTP or otherwise. That should give you the 'real' time of the echo command without any of the cost of sending what's in the buffer over the wire.
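As a self-contained sketch of the measurement the answer describes (create_content() here is just a dummy stand-in for the OP's page-building function, not their actual code):

```php
<?php
// Measure the echo itself with output buffering, so the network/SAPI
// transport is excluded from the timing. Dummy stand-in for the OP's function:
function create_content() {
    return str_repeat('x', 15 * 1024); // ~15 KB dummy payload
}

$content = create_content();

ob_start();                          // start capturing output in memory
$time = microtime(true);
echo $content;                       // echo goes into the buffer only
$echo_time = microtime(true) - $time;
ob_end_clean();                      // discard the buffer; nothing is sent

// With the transport factored out, echoing 10-20 KB into a memory
// buffer should take a tiny fraction of a second.
```

If $echo_time stays small here while the original script still shows multi-second spikes, the delay is in delivering the bytes, not in echo itself.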

If $echo_time shrinks, you have a transport issue to address as best you can with buffering.

If $echo_time is still too large, you'll need to start digging into the PHP C code.

Either way, you're a lot closer to finding your issue and a solution.

Ray answered Oct 18 '22 14:10


From http://wonko.com/post/seeing_poor_performance_using_phps_echo_statement_heres_why

This old bug report may shed some light. In short, using echo to send large strings to the browser results in horrid performance due to the way Nagle’s Algorithm causes data to be buffered for transmission over TCP/IP.

The solution? A simple three-line function that splits large strings into smaller chunks before echoing them:

function echobig($string, $bufferSize = 8192) {
    $splitString = str_split($string, $bufferSize);

    foreach ($splitString as $chunk) {
        echo $chunk;
    }
}
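A quick sanity check of the chunking approach: echobig() must emit exactly the same bytes as a plain echo, just in smaller writes. This sketch captures its output with an output buffer and compares (the ~30 KB payload is an arbitrary test value):

```php
<?php
// echobig() splits a large string into chunks before echoing, so each
// write to the output buffer stays below $bufferSize bytes.
function echobig($string, $bufferSize = 8192) {
    $splitString = str_split($string, $bufferSize);

    foreach ($splitString as $chunk) {
        echo $chunk;
    }
}

$content = str_repeat('abc', 10000); // ~30 KB test payload

ob_start();
echobig($content);
$captured = ob_get_clean(); // everything echobig() emitted

var_dump($captured === $content); // prints bool(true): chunking is lossless
```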

Play around with the buffer size and see what works best for you. I found that 8192, apart from being a nice round number, seemed to be a good size. Certain other values work too, but I wasn’t able to discern a pattern after several minutes of tinkering and there’s obviously some math at work that I have no desire to try to figure out.

By the way, the performance hit also happens when using PHP's output control functions (ob_start() and friends).

Following the OP's comment that he's tried this, I also found the following on PHP.net, suggesting that str_split() can itself be a waste of resources, and that echobig() can be optimised further with the following code:

function echobig($string, $bufferSize = 8192) {
  // suggest doing a test for Integer & positive bufferSize
  for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
    echo substr($string, $start, $bufferSize); // the original snippet used an undefined $buffer_size here
  }
}
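The substr() variant has an off-by-one trap worth checking: the loop must still emit every byte when the string length is not a multiple of the buffer size. A small verification sketch (the 25,004-byte payload is an arbitrary non-multiple of 8192):

```php
<?php
// substr()-based echobig: avoids materialising an array of chunks the
// way str_split() does, and just walks the string with an offset.
function echobig($string, $bufferSize = 8192) {
    for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
        echo substr($string, $start, $bufferSize);
    }
}

$content = str_repeat('0123456789', 2500) . 'tail'; // 25,004 bytes, not a multiple of 8192

ob_start();
echobig($content);
$captured = ob_get_clean();

// prints int(25004) then bool(true): the trailing partial chunk is included
var_dump(strlen($captured), $captured === $content);
```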

Have you tried running your script using the CLI rather than through Apache?

Ben Swinburne answered Oct 18 '22 14:10