Apache2 and CGI - how to keep Apache from buffering the POST data?

I'm trying to provide live parsing of a file upload in CGI and show the data on screen as it's being uploaded.

However, Apache2 seems to want to wait for the full POST to complete before sending the CGI application anything at all.

How can I force Apache2 to stop buffering the POST to my CGI application?

EDIT

It appears that it's actually the output of the CGI that's being buffered. I started streaming the data to a temp file to watch its progress. That, and I have another problem:

1) The output is being buffered. I've tried SetEnvIf (and simply SetEnv) for "!nogzip", "nogzip", and "!gzip" without success (within the CGI Directory definition).

2) Apache2 appears not to read the CGI's output until the CGI process exits. I notice that my CGI app (flushing or not) hangs permanently on an fwrite(..., stdout) line at around 80K (a minimal sketch of the loop follows below).
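For reference, a minimal sketch of the kind of write loop in question (C is assumed from the fwrite call above; the echo-the-upload behaviour is illustrative, not the actual app):

    /* Sketch: disable stdio buffering and flush after every write, so any
     * buffering still observed must be happening in Apache or a filter. */
    #include <stdio.h>

    int main(void)
    {
        setvbuf(stdout, NULL, _IONBF, 0);   /* unbuffer stdout entirely */

        printf("Content-Type: text/plain\r\n\r\n");

        char buf[4096];
        size_t n;
        while ((n = fread(buf, 1, sizeof buf, stdin)) > 0) {
            fwrite(buf, 1, n, stdout);      /* echo the upload back */
            fflush(stdout);                 /* harmless with _IONBF */
        }
        return 0;
    }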

EDIT

Okay, Firefox is messing with me. If I send a 150K file, then there's no CGI lockup around 80K. If the file is 2G, then there's a lockup. So, Firefox is not reading the output from the server while it's trying to send the file... is there any header or alternate content type to change that behavior?

EDIT

Okay, I suppose the CGI output lockup on big files isn't important actually. I don't need to echo the file! I'm debugging a problem caused by debugging aids. :)

I guess this works well enough then. Thanks!

FINAL NOTE

Just as a note... the reason I thought Apache2 was buffering input was that I always got a "Content-Length" environment variable. I guess Firefox is smart enough to precalculate the content length of a multipart form upload, and Apache2 was passing that on. I thought Apache2 was buffering the input and reporting the length itself.

asked Jul 07 '10 by darron


1 Answer

Are you sure it's the input being buffered that's the problem? Output buffering problems are much more common, and may be indistinguishable from input buffering if your method of debugging is something like printing to the response.

(Output buffering is commonly caused either by unflushed stdout in the script or by filters. The usual culprit is the DEFLATE filter, which is often used to compress all text/* responses, whether they come from a static file or a script. In general it's a good idea to compress the output of scripts, but a side-effect is that it causes the response to be fully buffered. If you need an immediate response, you'll have to turn it off for that one script or for all scripts, either by limiting the application of AddOutputFilterByType to particular <Directory> sections, or by using mod_setenvif to set the no-gzip environment variable.)
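As a sketch, the per-directory version might look like this (the path is illustrative, not taken from your setup; no-gzip is the variable mod_deflate honours):

    <Directory "/var/www/cgi-bin">
        # Ask mod_deflate not to compress (and therefore not to buffer)
        # anything generated under this directory.
        SetEnv no-gzip 1
    </Directory>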

Similarly, an input filter (including, again, DEFLATE) could cause CGI input to be buffered, if you're using one, but input filters are much less widely used.

Edit: for now, just comment out any httpd.conf lines enabling the DEFLATE filter. You can put them back selectively once you're happy that your I/O is unbuffered without them.
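Typically that means something like the following (illustrative; the exact lines vary by distribution):

    # Disabled while testing for unbuffered CGI output:
    #AddOutputFilterByType DEFLATE text/html text/plain text/xml
    #SetOutputFilter DEFLATE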

I notice that my CGI app (flushing or not) is hanging up permanently on a "fwrite(..., stdout)" line at around 80K.

Yeah... if you haven't read all your input, you can deadlock when trying to write output, if you write too much: you block on an output call, waiting for the network buffers to unclog so you can send your new data, but they never will, because the browser is trying to finish sending all its data before it starts reading the output.
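A sketch of the safe ordering in C (CONTENT_LENGTH handling simplified, error checks omitted):

    /* Drain the entire request body before producing any output, so the
     * browser is never asked to read while it is still busy sending. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const char *cl = getenv("CONTENT_LENGTH");
        long remaining = cl ? atol(cl) : 0;

        char buf[8192];
        while (remaining > 0) {
            size_t want = remaining < (long)sizeof buf
                        ? (size_t)remaining : sizeof buf;
            size_t got = fread(buf, 1, want, stdin);
            if (got == 0)
                break;                  /* client hung up early */
            remaining -= (long)got;
            /* parse/store the chunk here; do NOT write to stdout yet */
        }

        printf("Content-Type: text/plain\r\n\r\n");   /* only now respond */
        printf("Upload received.\n");
        return 0;
    }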

What are you working on here? In general it doesn't make sense to write progress-info output in response to a direct form POST, because browsers typically won't display it. If you want to provide upload-progress feedback on a plain HTML form submission, it's usually done with hacks like having an AJAX connection check back to see how the upload is going (meaning progress information has to be shared, e.g. in a database), or using a Flash upload component.

answered by bobince