I need to open more than 10,000 files in a Perl script, so I asked the system administrator to change the limit on my account to 14,000. ulimit -a
now shows these settings:
core file size (blocks, -c) unlimited
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
open files (-n) 14000
pipe size (512 bytes, -p) 10
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 29995
virtual memory (kbytes, -v) unlimited
After the change I ran a test Perl program that opens/creates 256 files and closes all of the file handles at the end of the script. After it has created 253 files, the program dies with "Too many open files". I don't understand why I'm getting this error.
I am working on a Solaris 10 platform. This is my code:
use strict;
use warnings;

my @list;

# Open test256 .. test0 and keep every handle in @list.
for my $i ( reverse 0 .. 256 ) {
    print "$i\n";
    my $filename = "test$i";
    if ( open my $in, '>', $filename ) {
        push @list, $in;
        print $in "$filename\n";
    }
    else {
        die "Could not open file '$filename': $!";
    }
}

# Close every handle at the end of the script.
while ( my $fh = pop @list ) {
    print "$fh\n";
    close $fh;
}
According to this article, this is a default limitation of 32-bit Solaris: the stdio implementation only lets a program use the first 256 file descriptors for stdio streams. STDIN, STDOUT and STDERR take descriptors 0, 1 and 2, which leaves you with 253. It's not a simple thing to work around: ulimit won't do it, and I don't know whether Perl will honor a workaround.
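To see the limit directly, here is a small probe (a sketch; the "probeN" file names are placeholders): it opens files until open() fails and reports the highest descriptor in use, which on an affected 32-bit Solaris stdio build tops out at 255.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Keep opening files until open() fails, then report how far we got.
    # Descriptors 0-2 are already taken by STDIN, STDOUT and STDERR.
    my @handles;
    my $i = 0;
    while ( open my $fh, '>', "probe$i" ) {
        push @handles, $fh;
        $i++;
    }
    warn "open failed after $i files: $!\n";
    print "highest descriptor in use: ", fileno( $handles[-1] ), "\n" if @handles;
    close $_ for @handles;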
Here's a discussion about it on Perlmonks with a few suggested workarounds, such as the FileCache module.
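For the FileCache route, here is a minimal sketch based on the module's documented synopsis (the file names and the maxopen value are illustrative): the core FileCache module keeps a bounded number of handles open, closes the least recently used ones behind the scenes, and reopens files in append mode when you write to them again, so you can address far more files than you have descriptors.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use FileCache maxopen => 200;   # stay safely below the 253 usable descriptors

    # cacheout() returns an open handle for the path, juggling the real
    # descriptors for us; files are reopened in append mode as needed.
    for my $i ( 0 .. 1000 ) {
        my $path = "test$i";
        my $fh   = cacheout $path;
        print $fh "line for $path\n";
    }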
While the Solaris limitation is unforgivable, in general having hundreds of open filehandles indicates that your program could be designed better.
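As one possible restructuring (a sketch, not a description of the questioner's actual requirements): if each file only receives a modest amount of data, you can buffer the output in memory keyed by file name and then write the files one at a time, so only a single handle is ever open.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Accumulate the lines destined for each file in a hash of arrays,
    # then write each file with one short-lived handle.
    my %buffer;
    for my $i ( 0 .. 256 ) {
        push @{ $buffer{"test$i"} }, "test$i\n";
    }

    for my $file ( sort keys %buffer ) {
        open my $out, '>', $file or die "Could not open '$file': $!";
        print {$out} @{ $buffer{$file} };
        close $out or warn "Could not close '$file': $!";
    }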