My Perl web-app, running under Apache mod_fastcgi, frequently gets errors like the following:
Maximal count of pending signals (120) exceeded at line 119.
I've seen this happen in relation to file uploads but I'm not sure that's the only time it happens. I also get a SIGPIPE right before (or possibly after) I get that error.
Any thoughts?
EDIT: Thanks for the suggestions, everyone. Someone asked what line 119 was; sorry, I should have included that in the first place. It's in a block of code where I run the virus checker on an uploaded file. I don't get the error every time, only occasionally.
if (open VIRUS_CK, '|/usr/local/bin/clamscan - --no-summary >'.$tmp_file) {
    print VIRUS_CK $data;    # THIS IS LINE 119
    close VIRUS_CK;
    if (($? >> 8) == 1) {    # clamscan exits with status 1 when it finds a virus
        open VIRUS_OUTPUT, '<'.$tmp_file;
        my $vout = <VIRUS_OUTPUT>;
        close VIRUS_OUTPUT;
        $vout =~ s/^stdin:\s//;
        $vout =~ s/FOUND$//;
        print STDERR "virus found on upload: $vout\n";
        return undef, 'could not accept attachment, virus found: '.$vout;
    }
    unlink($tmp_file);
}
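For what it's worth, the SIGPIPE is consistent with clamscan exiting (or failing to start) before all of $data has been written down the pipe. Below is a minimal sketch of a more defensive version of that write, assuming that's the cause; it ignores SIGPIPE for the duration, so a dead pipe surfaces as a failed print with $! set to EPIPE rather than as a queued signal. It isn't a drop-in replacement for the block above, just an illustration of the technique.

{
    local $SIG{PIPE} = 'IGNORE';    # a write to a dead pipe now fails with EPIPE instead of raising SIGPIPE
    if (open my $virus_ck, '|-', "/usr/local/bin/clamscan - --no-summary > $tmp_file") {
        print {$virus_ck} $data
            or warn "write to clamscan failed: $!\n";
        close $virus_ck;            # as before, $? now holds clamscan's exit status
    }
    else {
        warn "could not start clamscan: $!\n";
    }
}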
It means the operating system is delivering signals to your Perl process faster than it can handle them, and the saturation point has been reached. Since Perl 5.8, signals are handled "safely": instead of interrupting the interpreter mid-operation, each incoming signal is queued and dispatched between operations. You get this error when the queue fills up (120 pending signals here) before Perl gets a chance to drain it. The error is fatal, so your Perl process terminates.
The solution is to figure out what's generating so many signals and deal with it at the source. The "Deferred Signals (Safe Signals)" section of perlipc covers the mechanism in more detail.
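If it isn't obvious which signal is flooding the process, one diagnostic option is to install counting handlers for the likely suspects and log the tallies at exit. A sketch, assuming SIGPIPE and SIGCHLD are the candidates worth watching given the piped open in the question (the handlers run through the same deferred queue, so the counts may lag under a heavy flood):

my %sig_count;
for my $sig (qw(PIPE CHLD)) {
    $SIG{$sig} = sub { $sig_count{$sig}++ };    # each delivery bumps the tally
}

END {
    # report whatever was counted when the process exits
    print STDERR "signal counts: ",
        join(', ', map { "$_=$sig_count{$_}" } sort keys %sig_count), "\n"
        if %sig_count;
}

If the counts point at SIGPIPE from the scanner pipe, ignoring it around the write (as sketched above) is usually enough. As a blunter workaround, setting PERL_SIGNALS=unsafe in the environment reverts to pre-5.8 immediate signal delivery, which sidesteps the pending-signal queue entirely but reintroduces the reentrancy risks that safe signals were added to fix.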
Update: My original answer was somewhat inaccurate, saying that generating a new Perl process was part of the issue, when in fact it wasn't. I've updated based on @ysth's comment below.