I'm working on a page which accepts file uploads. In theory, I can detect when the file they're sending me is too big (by looking at the Content-Length of their request), and refuse to accept the upload, returning an HTTP 413 "Request Entity Too Large" error.
However, it seems that simply doing that is not enough -- Firefox, at least, will still keep sending the rest of the file (which could take a Long Time), before it shows my error page.
The HTTP spec says that I "MAY close the connection to prevent the client from continuing the request." However, doing a 'close STDIN', a 'shutdown STDIN, 0', or some variant of those does not seem to do the trick -- Firefox still keeps sending the file.
I suspect that, when my mod_perl handler closes the connection, it's just closing the connection between itself and Apache; Apache keeps the connection between it and the client alive. Is there some way to tell Apache to shut down the connection? Otherwise, this seems like a great DoS vector.
Any suggestions would be welcome.
Have you explored Apache's limitation capabilities (as opposed to Perl's)? I don't know in detail how the LimitRequestBody directive deals with requests that are too large, but at least in theory it looks like a setting designed to block off attacks.
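For reference, setting the directive is a one-liner; a minimal sketch (the 10 MB figure is an arbitrary example, not something from the question):

```apache
# Cap request bodies at 10 MB (value is in bytes; 0 means unlimited).
# Valid in server config, virtual host, <Directory>, <Location>, or .htaccess.
LimitRequestBody 10485760
```

Requests whose body exceeds the limit get a 413 response from Apache itself, before the body ever reaches your handler.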
Yes, it doesn't matter what Perl does with STDIN or STDOUT; Apache will still allow the upload to proceed before it even checks what happens with your CGI. You can close STDIN or STDOUT, or even exit() (although you can't do that, since your process is persistent), but none of that will have any effect until after Apache is done accepting the POST request in its entirety. Likewise with any kind of status headers you might generate, such as a 413 for "Request Entity Too Large".
Hence, you need to have Apache refuse POST requests beyond a certain size for this to work, for example by using LimitRequestBody as suggested by Pekka.
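If you do want the check in Perl as well, it has to run in a phase before Apache reads the request body, such as a PerlAccessHandler. A minimal sketch, assuming mod_perl 2 (the package name and the 10 MB cap are made up for illustration; LimitRequestBody remains the more reliable backstop):

```perl
package My::SizeGuard;  # hypothetical package name

use strict;
use warnings;

use Apache2::RequestRec ();
use Apache2::Const -compile => qw(OK HTTP_REQUEST_ENTITY_TOO_LARGE);

use constant MAX_BYTES => 10 * 1024 * 1024;  # arbitrary 10 MB cap

# Runs in the access phase, i.e. before Apache reads the request
# body, so an oversized upload can be rejected up front.
sub handler {
    my $r   = shift;
    my $len = $r->headers_in->{'Content-Length'} || 0;

    return Apache2::Const::HTTP_REQUEST_ENTITY_TOO_LARGE
        if $len > MAX_BYTES;

    return Apache2::Const::OK;
}

1;
```

Wired up with `PerlAccessHandler My::SizeGuard` in the relevant Location or Directory block. Note that even then Apache may still drain the remainder of the body to keep the connection usable, which is exactly why the server-level limit matters.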