OSError: [Errno 24] Too many open files; in python; difficult to debug

I am running code which fails, sometimes after hours and sometimes after minutes, with the error

OSError: [Errno 24] Too many open files

And I have real trouble debugging this. The error itself is always triggered by the marked line in the code snippet below

try:
    with open(filename, 'rb') as f:
        contents = f.read()   # <----- error triggered here
except OSError as e:
    print("e = ", e)
    raise
else:
    # other stuff happens

However, I can't see any problem in this part of the code (right?), so I guess that other parts of the code don't close files properly. I do open files quite a bit, but I always open them with the 'with' statement, and my understanding is that the files will be closed even if an error occurs (right?). So another part of my code looks like this

    try:
        with tarfile.open(filename + '.tar') as tar:
            tar.extractall(path=target_folder)
    except tarfile.ReadError as e:
        print("e = ", e)
    except OSError as e:
        print("e = ", e)
    else:
        # If everything worked, we are done
        return

The code above runs into a ReadError quite frequently, but even when that happens, the file should be closed, right? So I just don't understand how I can run into too many open files. Sorry this is not reproducible for you, since I can't narrow it down enough; I'm just fishing for some tips here, since I am lost. Any help is appreciated...
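One way to hunt for a leak like this (not something from the original post, just a debugging sketch) is to watch the process's descriptor count over time. On macOS and Linux the open descriptors are listed under /dev/fd, so a helper can count them; the leak demo below with /etc/hosts is purely illustrative:

```python
import os

def count_open_fds():
    # /dev/fd lists this process's open file descriptors
    # (works on macOS and Linux; not portable to Windows).
    return len(os.listdir('/dev/fd'))

# Sprinkle calls like this around the suspect code and watch
# whether the number keeps climbing between iterations:
before = count_open_fds()
f = open('/etc/hosts', 'rb')   # simulated leak: opened, never closed
after = count_open_fds()       # should read one higher than 'before'
f.close()
```

If the count grows steadily across loop iterations, some code path is opening files without closing them.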

Edit: I am on a MacBook. Here is the output of ulimit -a:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 256
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 1418
virtual memory          (kbytes, -v) unlimited

Following the suggestion by @sj95126, I changed the code concerning the tar file to something that ensures the file is closed

try:
    tar = tarfile.open(filename + '.tar')
    tar.extractall(path=target_folder)
except tarfile.ReadError as e:
    print("tarfile.ReadError e = ", e)
except OSError as e:
    print("e = ", e)
else:
    # If everything worked, we are done
    return
finally:
    print("close tar file")
    try:
        tar.close()
    except:
        print("file already closed")

but it did not solve the problem.
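As an aside, the bare except in the finally block above is only needed because tar is never assigned when tarfile.open itself fails. Guarding with None avoids that NameError without swallowing everything; a sketch of the same logic (the function name is mine, not from the post):

```python
import tarfile

def extract_archive(filename, target_folder):
    tar = None
    try:
        tar = tarfile.open(filename + '.tar')
        tar.extractall(path=target_folder)
    except tarfile.ReadError as e:
        print("tarfile.ReadError e =", e)
    except OSError as e:
        print("e =", e)
    finally:
        if tar is not None:   # open() may have failed before assignment
            tar.close()
```

This guarantees the archive handle is released on every path, including a ReadError raised mid-extraction.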

asked May 02 '26 by carl

1 Answer

On Unix/Linux systems you can check the per-process limit on open files with ulimit -a. In @carl's situation the output was:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 256
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 1418
virtual memory          (kbytes, -v) unlimited

As you can see, the open files limit is 256:

open files                      (-n) 256

which is a very small value.

@carl's archive contained more than 256 files. Every file a process opens consumes a file descriptor (the kernel's handle to that open file), and once all 256 descriptors are in use, any further open() fails with [Errno 24].
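To see this failure mode concretely, here is a small sketch (my own, not from the post) that lowers the soft limit with the resource module and then leaks descriptors until open() fails with errno 24:

```python
import os
import resource
import tempfile

# Lower the soft limit so the failure reproduces quickly.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (64, hard))

fd, path = tempfile.mkstemp()
os.close(fd)

handles, caught = [], None
try:
    for _ in range(100):
        handles.append(open(path, 'rb'))  # opened but never closed
except OSError as e:
    caught = e   # OSError: [Errno 24] Too many open files
finally:
    for h in handles:
        h.close()
    resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))
```

The loop never reaches 100 iterations: well before that, the process exhausts its 64-descriptor allowance and open() raises exactly the error from the question.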

The solution is to raise the open files value to a much larger number, or to unlimited where the system allows it.

According to this stack answer, this is how you can change the limit.
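Besides the shell-level fix (ulimit -n), the soft limit can also be raised from inside Python, up to the hard limit, via the resource module (Unix only). A sketch; the 4096 cap here is an arbitrary example value, not a recommendation:

```python
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

# Raise the soft limit as far as the hard limit allows,
# capped at 4096 as an arbitrary example target.
target = 4096 if hard == resource.RLIM_INFINITY else min(4096, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
```

Note this only papers over a genuine descriptor leak: if the program keeps leaking, it will eventually hit whatever limit is set.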

answered May 05 '26 by alexzander