Browser shows a timeout while the server process is still running

I am having the following problem:

  1. I am running a process with BIG memory requirements, but I have divided the memory load into smaller chunks, so there is no CPU timeout issue.
  2. On the server I am creating .xml files of around 100 KB each, and 100+ of them will be created.
  3. The main problem is that the browser shows a response timeout, and IE (just above the status bar) shows a message offering to download the .php file.
  4. During all this, the backend (server-side) process is still running and continuously creating .xml files in incremental order. So there is no issue on that side.

I have the following php.ini configuration:

max_execution_time = 10000     ; Maximum execution time of each script, in seconds
max_input_time = 10000         ; Maximum amount of time each script may spend parsing request data
memory_limit = 2000M           ; Maximum amount of memory a script may consume
; Maximum allowed size for uploaded files.
upload_max_filesize = 2000M
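
As a sanity check, here is a minimal sketch that just echoes the settings PHP actually applies at runtime, in case a different php.ini or a per-directory override is silently in effect:

<?php
// Print the limits PHP actually applies at runtime.
foreach (array('max_execution_time', 'max_input_time', 'memory_limit', 'upload_max_filesize') as $key) {
    echo $key, ' = ', ini_get($key), "\n";
}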

I am viewing the site in IE, and I am using ZSCE (Zend Server Community Edition) with PHP 5.3.

Can anybody point me in the right direction on this issue?

Edit:

Here is an image of the timeout, and of the browser asking to download the .php file as a result:

[screenshot: browser timeout and .php file download prompt]


Edit 2:

Let me briefly explain my execution flow:

  1. I have one PHP file with objects from the class hierarchies, which starts executing Function1() from each class hierarchy.
  2. I have a class file.
  3. First, Function1() is executed, which contains the logic for creating the XML files in chunks.
  4. Second, Function2() is executed, which displays the output generated by Function1().

Everything is done through the class hierarchies, so I cannot terminate the execution of Function1() partway through; only after it has finished does Function2() get called.
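
To make this concrete, here is a minimal sketch of the structure described above (class names and method bodies are simplified placeholders, not the real code):

<?php
// Simplified placeholder for the real class hierarchy.
abstract class Task {
    abstract public function Function1(); // heavy work: create the XML chunks
    abstract public function Function2(); // display Function1()'s output
}

class XmlExportTask extends Task {
    public function Function1() {
        // ...creates the ~100 KB .xml files one by one...
    }
    public function Function2() {
        // ...renders the result; only reachable after Function1() returns...
    }
}

$tasks = array(new XmlExportTask());
foreach ($tasks as $task) {
    $task->Function1(); // must run to completion first
}
foreach ($tasks as $task) {
    $task->Function2(); // output happens only after all processing
}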

Edit 3:

This is especially for @hakre.

You asked some cross-questions, and I agree with some of your points, so let me describe the issue in more detail.

  1. At first I was loading 100+ MB of XML files at a time; that overloaded the memory in my local setup, froze the machine, and kept the CPU at maximum utilization.

  2. I then divided these big XML files into smaller ones, so now I load a single XML file at a time and unload it after use (a rough sketch of this pattern follows after this list). This saved me from the memory overload and CPU issues in my local setup.

  3. Now my backend process runs with no CPU or memory issue, but the problem is the browser timeout. I even tried cURL, but it does not seem to fit my current structure because of my class hierarchy: the classes all execute their Process functions first, and only then execute their Output functions. Until the Process functions have finished, the Output functions never come into the picture, and that is why the browser times out.

  4. I also followed the instructions suggested by @vortex and had a little success, but not what I am looking for. The reason I could not implement cURL is that my Process function creates all the required XML files in one go, so it takes too long before anything can be sent to the browser. Because the Process function takes that long, no output can be delivered to the client until it has completed.
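
For reference, the load-one-file-at-a-time pattern from point 2 looks roughly like this (a sketch only; the file names are placeholders):

<?php
// Process one small XML file at a time and free it before moving on,
// so memory usage stays flat instead of holding 100+ MB at once.
foreach (glob('data/chunk_*.xml') as $file) {
    $xml = simplexml_load_file($file); // load a single ~100 KB file
    // ...use $xml to do the required processing...
    unset($xml);                       // unload it before the next file
}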

cURL Output:

URL....: myurl
Code...: 200 (0 redirect(s) in 0 secs)
Content: text/html Size: -1 (Own: 433) Filetime: -1
Time...: 60.437 Start @ 60.437 (DNS: 0 Connect: 0.016 Request: 0.016)
Speed..: Down: 7 (avg.) Up: 0 (avg.)
Curl...: v7.20.0

Contents of the test.txt file:

* About to connect() to mylocalhost port 80 (#0)
*   Trying 127.0.0.1... * connected
* Connected to mylocalhost (127.0.0.1) port 80 (#0)
> GET myurl HTTP/1.1
Host: mylocalhost
Accept: */*

< HTTP/1.1 200 OK
< Date: Tue, 06 Aug 2013 10:01:36 GMT
< Server: Apache/2.2.21 (Win32) mod_ssl/2.2.21 OpenSSL/0.9.8o
< X-Powered-By: PHP/5.3.9-ZS5.6.0 ZendServer
< Set-Cookie: ZDEDebuggerPresent=php,phtml,php3; path=/
< Cache-Control: private
< Transfer-Encoding: chunked
< Content-Type: text/html
<
* Connection #0 to host mylocalhost left intact
* Closing connection #0

Disclaimer: The accepted answer was chosen based on the first bit of success it gave me. The solution from @hakre is also feasible when this type of problem occurs. Right now no answer has fully fixed my problem, only partially. Hakre's answer also goes into more detail, for anyone looking for more background on this type of issue.

Asked by NullPointer

1 Answer

Assuming you have made all the server-side modifications needed to dodge a server timeout (I saw pretty much everything explained above), in order to dodge the browser timeout it is crucial that you do something like this:

<?php
set_time_limit(0);        // remove PHP's execution time limit for this script
error_reporting(E_ALL);   // surface any errors during the long run
ob_implicit_flush(TRUE);  // flush the output buffer automatically after every output call
ob_end_flush();           // release the existing output buffer so content reaches the browser

I can tell you from experience that Internet Explorer doesn't have any issues as long as you output some content to it every now and then. I run a 30 GB database update every day (it takes around 2-4 hours), and Opera seems to be the only browser that ignores the content output. If you don't set ob_implicit_flush, you need to do an ob_flush() after every piece of content.

References

  1. ob_implicit_flush

  2. ob_flush

If you don't use ob_implicit_flush at the top of your script as I wrote earlier, you need to do something like this:

<?php
echo 'dummy text or execution stats'; // give the browser something to render
ob_flush(); // flush PHP's output buffer...
flush();    // ...and push it through the web server to the client

within your execution loop.
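
Putting it all together, here is a minimal self-contained sketch of a long-running script that keeps the browser alive (the loop body is a stand-in for the real XML-generating work, not the asker's actual code):

<?php
set_time_limit(0);        // no execution time limit
ob_implicit_flush(TRUE);  // flush automatically after each output
while (ob_get_level() > 0) {
    ob_end_flush();       // drop any pre-existing output buffers
}

for ($i = 1; $i <= 100; $i++) {
    // ...create one XML chunk here (stand-in for the real work)...
    sleep(1);
    echo "processed chunk $i of 100<br>\n";
    flush();              // push the output through to the client
}
echo 'done';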

Answered by vortex