I have a huge file that I am splitting into approximately 450 output files, and I am getting the error "too many open files". I searched the web and found some solutions, but none of them is helping.
import resource
resource.setrlimit(resource.RLIMIT_NOFILE, (1000, -1))

>>> len(pureResponseNames)  # filenames
434
>>> resource.getrlimit(resource.RLIMIT_NOFILE)
(1000, 9223372036854775807)
>>> output_files = [open(os.path.join(outpathDirTest, fname) + ".txt", "w") for fname in pureResponseNames]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
IOError: [Errno 24] Too many open files: 'icd9_737.txt'
>>>
I also changed ulimit from the command line, as shown below:
$ ulimit -n 1200
$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1200
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 709
virtual memory          (kbytes, -v) unlimited
$
I am still getting the same error. PS: I also restarted my system and ran the program again, but with no success.
"Too many open files" errors are always tricky – you not only have to twiddle with ulimit
, but you also have to check system-wide limits and OSX-specifics. This SO post gives more information on open files in OSX. (Spoiler alert: the default is 256).
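If you do need to raise the limit from Python, a safer pattern than hard-coding (1000, -1) is to query the current hard limit first and only raise the soft limit up to it. This is a minimal sketch, assuming your platform allows an unprivileged process to raise the soft limit that far:

import resource

# Query the current (soft, hard) limits for open file descriptors.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

# Raise only the soft limit, capped at the existing hard limit;
# asking for more than the hard limit raises ValueError.
resource.setrlimit(resource.RLIMIT_NOFILE, (min(4096, hard), hard))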
However, it is often easy to limit the number of files that have to be open at the same time. If we look at Stefan Bollman's example, we can easily change that to:
import os

pureResponseNames = ['f' + str(i) for i in range(434)]
outpathDirTest = "testCase/"
os.makedirs(outpathDirTest, exist_ok=True)  # make sure the target directory exists

output_files = [os.path.join(outpathDirTest, fname) + ".txt" for fname in pureResponseNames]

# Each file is opened, written, and closed inside the loop,
# so only one file descriptor is in use at any time.
for i, filename in enumerate(output_files):
    with open(filename, 'w') as f:
        f.write('This is a test of file nr.' + str(i))
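And if the real goal is to split one huge input file across those outputs, you still do not need all 434 handles open at once. The sketch below assumes a hypothetical input file huge_input.txt whose lines are tab-delimited with the target name in the first field; opening each output in append mode keeps at most two files open at a time, at the cost of extra open/close calls:

import os

outpathDirTest = "testCase/"
os.makedirs(outpathDirTest, exist_ok=True)

# Hypothetical input format (an assumption): "<name>\t<payload>" per line.
with open("huge_input.txt") as src:
    for line in src:
        name, _, rest = line.partition("\t")
        out_path = os.path.join(outpathDirTest, name) + ".txt"
        # Append mode, so repeated lines for the same name accumulate.
        with open(out_path, "a") as out:
            out.write(rest)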