I am working with large matrices, so I am using NumPy's memmap. However, I am getting an error because, apparently, the file descriptors used by memmap are not being closed.
import numpy
import os
import tempfile

counter = 0
while True:
    temp_fd, temporary_filename = tempfile.mkstemp(suffix='.memmap')
    map = numpy.memmap(temporary_filename, dtype=float, mode="w+", shape=1000)
    counter += 1
    print counter
    map.close()
    os.remove(temporary_filename)
From what I understand, the memmap's file is closed when its close() method is called. However, the loop above does not run forever; it eventually throws a "[Errno 24] Too many open files" error:
    1016
    1017
    1018
    1019
    Traceback (most recent call last):
      File "./memmap_loop.py", line 11, in <module>
      File "/usr/lib/python2.5/site-packages/numpy/core/memmap.py", line 226, in __new__
    EnvironmentError: [Errno 24] Too many open files
    Error in sys.excepthook:
    Traceback (most recent call last):
      File "/usr/lib/python2.5/site-packages/apport_python_hook.py", line 38, in apport_excepthook
    ImportError: No module named packaging_impl
    Original exception was:
    Traceback (most recent call last):
      File "./memmap_loop.py", line 11, in <module>
      File "/usr/lib/python2.5/site-packages/numpy/core/memmap.py", line 226, in __new__
    EnvironmentError: [Errno 24] Too many open files
Does anybody know what I am overlooking?
Since numpy.memmap is given the file name rather than the open file descriptor, I suspect you are leaking the temp_fd descriptor returned by mkstemp. Does os.close(temp_fd) help?
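For illustration, a sketch of the loop with that single change; everything else is taken verbatim from the question's code (including map.close(), which the memmap class had in the numpy version used here):

import numpy
import os
import tempfile

counter = 0
while True:
    temp_fd, temporary_filename = tempfile.mkstemp(suffix='.memmap')
    map = numpy.memmap(temporary_filename, dtype=float, mode="w+", shape=1000)
    counter += 1
    print counter
    map.close()                    # closes the descriptor memmap opened itself
    os.close(temp_fd)              # closes the descriptor returned by mkstemp -- the one that was leaking
    os.remove(temporary_filename)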
Great that it works.
Since you can pass numpy.memmap a file-like object, you could create one from the file descriptor you already have, temp_fd.
fobj = os.fdopen(temp_fd, "w+")
numpy.memmap(fobj, ...)
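Putting that together, the loop might look roughly like the sketch below; the cleanup order at the end (closing fobj before removing the file) is my assumption, not something spelled out above:

import numpy
import os
import tempfile

counter = 0
while True:
    temp_fd, temporary_filename = tempfile.mkstemp(suffix='.memmap')
    fobj = os.fdopen(temp_fd, "w+")    # wrap the descriptor mkstemp already opened
    map = numpy.memmap(fobj, dtype=float, mode="w+", shape=1000)
    counter += 1
    print counter
    map.close()                        # release the mapping
    fobj.close()                       # closes temp_fd as well, so nothing leaks
    os.remove(temporary_filename)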