How to cause a "too many open files" error in Java?

Tags:

java

ulimit

While reviewing a colleague's code, I found the following:

    BufferedReader br = new BufferedReader(new FileReader(PATH + fileName));
    //...

It just reads a file and concatenates its lines into one, but I could not find any code that closes the reader. I assumed this was a resource leak that would eventually cause a "too many open files" error, so to prove it I wrote a test:

for (int i = 0; i < 7168; i++) { // ulimit -n ==> 7168
    BufferedReader br = new BufferedReader(new FileReader("src/main/resources/privateKey/foo.pem"));
    System.out.println(br.readLine());
}
System.in.read();

Strangely, everything was fine; it did not throw the expected exception.

Checking the actual open files on the command line:

➜  ~ lsof -p 16276 | grep 'foo.pem' | wc -l
    2538

Why is it only 2538, not 7168?

So what is wrong, and how can I actually trigger the "too many open files" error?


Update: as @GhostCat suggested, I changed 7168 to Integer.MAX_VALUE, and this time it threw

java.io.FileNotFoundException: src/main/resources/privateKey/foo.pem (Too many open files in system)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)

when i was 27436. Checking the actual open files on the command line at that point:

➜  ~ lsof | grep foo.pem | wc -l
    7275

But where are the remaining files (27436 - 7275)? And why did the ulimit number not apply?

asked Apr 13 '17 by zhuguowei



2 Answers

I presume that the garbage collector is running, finding lots of unreachable BufferedReader objects and collecting them. That causes the underlying stream objects to be finalized ... which closes them.

To make this code break, add the BufferedReader objects to a list, so that they remain reachable.
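The suggestion above can be sketched as follows; the temp-file setup and the loop bound of 100 are mine (raise the bound past your `ulimit -n` to actually hit the error):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class HoldReaders {
    public static void main(String[] args) throws IOException {
        // A throwaway file standing in for foo.pem
        Path tmp = Files.createTempFile("foo", ".pem");
        Files.write(tmp, List.of("dummy line"));

        // Holding every reader in a list keeps it reachable, so the GC can
        // never finalize (and thereby close) the underlying FileInputStream.
        List<BufferedReader> readers = new ArrayList<>();
        for (int i = 0; i < 100; i++) { // raise past `ulimit -n` to hit the limit
            readers.add(new BufferedReader(new FileReader(tmp.toFile())));
        }
        System.out.println("open readers: " + readers.size());

        for (BufferedReader br : readers) {
            br.close();
        }
        Files.delete(tmp);
    }
}
```

With the bound raised past the file-descriptor limit, one of the opens eventually fails with `FileNotFoundException: ... (Too many open files)`.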


And here's why I think that changing 7168 to MAXINT is working.

When a JVM starts, it will use a relatively small heap. One of the things that happens during GC is that the JVM decides if it needs to resize the heap. So here is what could be happening:

  • The JVM starts with a heap that is too small to hold 7168 open files + BufferedReader objects. (Remember that each of the latter probably has a preallocated buffer!)

  • You start opening files.

  • At about N = 7168 - 2538, the heap fills up with all of the BufferedReader objects + FileInputStream objects + various detritus from JVM startup / warmup.

  • The GC runs, and causes (probably) all of the BufferedReader objects to be collected / finalized / closed.

  • Then the GC decides that it needs to expand the heap. You now have enough heap space for more open BufferedReader objects than your ulimit allows.

  • You resume opening files ... and then hit the open file limit.

That's one possible pattern.
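A rough way to watch the allocation pressure that drives this pattern (a sketch; the class name, file, and loop bound are mine, and `Runtime.totalMemory()` only approximates the JVM's heap-sizing decisions):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class HeapGrowth {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("heap", ".txt");
        Files.write(tmp, List.of("line"));

        Runtime rt = Runtime.getRuntime();
        long before = rt.totalMemory();

        // Each BufferedReader preallocates an 8 KiB char buffer, so the
        // unreachable readers pile up on the heap until a GC runs, at which
        // point the JVM may also decide to resize the heap.
        for (int i = 0; i < 500; i++) { // 500 stays under typical FD limits
            BufferedReader br = new BufferedReader(new FileReader(tmp.toFile()));
            br.readLine(); // br is unreachable after this iteration
        }

        System.out.println("totalMemory before=" + before
                + " after=" + rt.totalMemory());
        Files.delete(tmp);
    }
}
```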


If you really want to investigate this, I advise you turn on GC logging, and see if you can correlate the number of FDs reported by lsof with GC runs.

(You could try adding sleep calls between each open to make it easier to get lsof measurements. But that could change the JVM behavior in other ways ...)

answered Oct 19 '22 by Stephen C

  1. The JVM implicitly raises the ulimit value (the shell spawned from Java inherits the JVM process's raised soft limit):

    String[] cmdArray = {"sh", "-c", "ulimit -n"};
    Process p = Runtime.getRuntime().exec(cmdArray);
    BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
    System.out.println(in.readLine()); // prints 10240, not 7168

  2. @Stephen C is right: the GC is involved.

I created a MyBufferedReader that extends BufferedReader and overrides the finalize method (count is a static AtomicInteger field):

@Override
protected void finalize() throws Throwable {
    System.out.printf("Thread: %s finalize it and total: %d %n",
            Thread.currentThread().getName(), count.getAndAdd(1));
    super.finalize();
}

I got this output:

Thread: Finalizer finalize it and total: 9410 

and on the command line:

➜  ~ lsof -p 5309 | grep 'taicredit_private_key_pkcs8' | wc -l
     830

and 9410 + 830 = 10240
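Putting the experiment together, here is a self-contained sketch (class names, file, and loop bound are mine; finalize is deprecated in modern JDKs and finalization timing is not guaranteed, so the printed count varies from run to run):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

public class FinalizeDemo {
    static final AtomicInteger finalized = new AtomicInteger();

    // Same idea as the MyBufferedReader above: count finalizations.
    static class CountingReader extends BufferedReader {
        CountingReader(Reader in) { super(in); }

        @Override
        protected void finalize() throws Throwable {
            finalized.incrementAndGet();
            super.finalize();
        }
    }

    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("fd", ".txt");
        Files.write(tmp, List.of("x"));

        // Open and discard readers; none of them is explicitly closed.
        for (int i = 0; i < 500; i++) {
            new CountingReader(new FileReader(tmp.toFile())).readLine();
        }

        // Finalization is not guaranteed to run promptly; this only nudges it.
        System.gc();
        System.runFinalization();
        Thread.sleep(200);

        System.out.println("finalized so far: " + finalized.get());
        Files.delete(tmp);
    }
}
```

Finalized readers have had their underlying streams closed, so (finalized readers) + (still-open file descriptors) accounts for every open, which is the 9410 + 830 = 10240 arithmetic above.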

answered Oct 19 '22 by zhuguowei