Intensive PHP script failing w/ "The timeout specified has expired" error / ap_content_length_filter

Tags:

php

Running a MySQL intensive PHP script that is failing. Apache log reports this:

[Wed Jan 13 00:20:10 2010] [error] [client xxx.xx.xxx.xxxx] (70007)
The timeout specified has expired:
ap_content_length_filter: apr_bucket_read() failed,
referer: http://domain.com/script.php

Tried putting set_time_limit(0) at the top of the script, but it didn't fix the timeout.
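
Roughly, the top of the script looks like this (a sketch):

<?php
// Attempt to lift PHP's own execution limit for this request.
// Note: set_time_limit() only raises PHP's max_execution_time;
// it does not touch Apache's own Timeout.
set_time_limit(0);

// ... long-running MySQL work follows ...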

Is there some specific timeout limit I can raise in httpd.conf (or elsewhere) to prevent this?

asked Jan 13 '10 by sbuck

2 Answers

I hit a very similar wall as well with Apache 2.4.6 and PHP 5.4.23 FPM/FastCGI.

Symptom:

No matter what I set in PHP or Apache, my script would time out after 30 seconds and I would see the following in my Apache error log:

[timestamp] [proxy_fcgi:error] [pid...] (70007)The timeout specified has expired: [client ...] AH01075: Error dispatching request to :

My VirtualHost:

# Core request timeout and keep-alive timeout (seconds)
TimeOut  300
KeepAliveTimeout 300

# mod_reqtimeout: limits on how long the client may take to send headers/body
<IfModule reqtimeout_module>
  RequestReadTimeout header=120-240,minrate=500
  RequestReadTimeout body=120,minrate=500
</IfModule>

# mod_proxy: timeout for the proxied backend (FPM in this case)
<IfModule mod_proxy.c>
  ProxyTimeout 300
</IfModule>

# mod_fcgid: timeout for connecting to the FastCGI process
<IfModule mod_fcgid.c>
  FcgidConnectTimeout 300
</IfModule>

The pesky php script:

// Raise PHP's own limit for this long-running request ...
ini_set( 'max_execution_time', '120' );
...
// ... and restore the configured value when done.
ini_restore( 'max_execution_time' );

The fix: the timeout is a hard-coded value in Apache's mod_proxy_fcgi.

Take a look at the bug report here

  • A patch is available (link above)
  • The fix doesn't appear to be slated for general release yet (Mar 2014); see the sketch below for how the timeout is typically raised once it becomes configurable.
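
For what it's worth, on Apache builds where that backend timeout is configurable, it is usually raised on the proxy worker itself. A sketch only (the fcgi:// address, paths, and the 300-second value are placeholders, not from the original answer):

# Option 1: set the timeout on the ProxyPass worker
ProxyPass "/script.php" "fcgi://127.0.0.1:9000/var/www/html/script.php" timeout=300

# Option 2: define the worker once and set its timeout with ProxySet
<Proxy "fcgi://127.0.0.1:9000">
  ProxySet timeout=300
</Proxy>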
answered by misterich

First, my solution is only applicable to the Apache Web Server.

I am working on a script meant to act as a CSV download for a report against a very, very large database, and I encountered this problem too. I am NOT using PHP; my script is written in a somewhat obscure language called heitml ;-)

The request timeout problem shows up in my scenario like this:

[Wed Sep 19 20:29:01 2012] [warn] [client ::1] Timeout waiting for output from CGI script /var/www/cgi-bin/heitml
[Wed Sep 19 20:29:01 2012] [error] [client ::1] (70007)The timeout specified has expired: ap_content_length_filter: apr_bucket_read() failed

The only serious solution I can currently adopt is the official timeout module, mod_reqtimeout. It allows adjusting timeout parameters, for example:

Allow 10 seconds to receive the request including the headers and 30 seconds for receiving the request body:

RequestReadTimeout header=10 body=30

Allow at least 10 seconds to receive the request body. If the client sends data, increase the timeout by 1 second for every 1000 bytes received, with no upper limit for the timeout (except for the limit given indirectly by LimitRequestBody):

RequestReadTimeout body=10,MinRate=1000

Allow at least 10 seconds to receive the request including the headers. If the client sends data, increase the timeout by 1 second for every 500 bytes received. But do not allow more than 30 seconds for the request including the headers:

RequestReadTimeout header=10-30,MinRate=500

Usually, a server should have both header and body timeouts configured. If a common configuration is used for http and https virtual hosts, the timeouts should not be set too low:

RequestReadTimeout header=20-40,MinRate=500 body=20,MinRate=500

I have yet to find out whether there's a better solution offered by Apache that doesn't require this module (assuming it's not installed by default, though it has been included in all versions since 2.2.15).
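
If the module does need to be loaded by hand, it is just a LoadModule line plus the directives above. A sketch (the module path is the common default and varies by distribution; on Debian/Ubuntu, a2enmod reqtimeout does the same thing):

# Load mod_reqtimeout if it is not already enabled (path varies by distro)
LoadModule reqtimeout_module modules/mod_reqtimeout.so

<IfModule reqtimeout_module>
  RequestReadTimeout header=20-40,MinRate=500 body=20,MinRate=500
</IfModule>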

answered by JWL