I have this Perl/CGI script to upload files and report the uploaded size while the upload is in progress.
The script works fine for files under 500MB, but the output filehandle (OUTFILE) stops writing after around 500MB. Here's the partial code:
$u_size = $ENV{'CONTENT_LENGTH'};
if ($u_size > $max_size) {
    send_error("Upload too big. Maximum size is $max_size bytes and your file is $u_size bytes.");
}
print_progress(0);

# Set up the upload hook
$query = CGI->new(\&hook);

# Called by CGI.pm for each chunk of the upload
sub hook {
    my ($filename, $buffer, $bytes_read, $data) = @_;
    return if $error;
    if (time >= $next_print) {
        $next_print = time + $delay;
        if ($check_mime) {
            $filename =~ m/\.([^\.]+)$/;
            $ext = lc($1);
            print $ext;
            $check_mime = 0;
        }
        $percent = $bytes_read / $u_size;
        $filename =~ m/\\([^\\]+)$/;    # strip a Windows-style path prefix
        $filename = $1;
        print_progress($percent, $u_size, $bytes_read, $filename);
    }
}

# Write the progress as JSON for the client to poll
sub print_progress {
    open(PROG, '>', $uploaded_file_progress) or return;
    print PROG '{"percent" : ' . ($_[0] * 100) . ', "total" : ' . $_[1]
             . ', "uploaded" : ' . $_[2] . ', "filename" : "' . $_[3] . '"}';
    close PROG;
}

#############
$uphandle = $query->upload($query->param());
binmode $uphandle;
if (!$error) {
    open(OUTFILE, '>', $uploaded_file) or send_error("Cannot write $uploaded_file: $!");
    binmode OUTFILE;
    while ($bytesread = read($uphandle, $buffer, 1024)) {
        print OUTFILE $buffer;
    }
    #while (<$uphandle>) {print OUTFILE $_;}
    close OUTFILE;
}
If the script is not the problem, what else should I check? Thanks.
Edit: I have this in the log: "Timeout waiting for output from CGI script." How do I get rid of this? I couldn't find a definitive answer on Google.
I would imagine you need to raise the TimeOut directive in your Apache configuration.
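For example, something like this in httpd.conf (the value is an illustration; size it to your largest expected upload, since TimeOut is the number of seconds Apache will wait for output from the CGI script):

```apache
# httpd.conf — default is 300 seconds; a 500MB+ upload on a slow
# link can easily exceed that, which matches the error in your log.
TimeOut 1200
```

After changing it, restart Apache for the new value to take effect.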
Seeing as you are using Perl, this has been broached a couple of times on perlmonks.org, and this link has popped up in response:
http://www.stonehenge.com/merlyn/LinuxMag/col39.html