I'm working on a huge legacy Java application with a lot of handwritten code for things you'd nowadays let a framework handle.
The problem I'm facing right now is that we are running out of file handles on our Solaris server. I'd like to know: what's the best way to track open file handles? Where should I look, and what can cause open file handles to run out?
I cannot debug the application under Solaris, only in my Windows development environment. Is it even reasonable to analyze the open file handles under Windows?
Try to identify the source of the problem:

1. Check the current limits.
2. Check the limits of the running process.
3. Track a possible file descriptor leak (see the sketch below).
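As a starting point for step 3, here is a minimal sketch of how the application can report its own file descriptor usage from inside the JVM. It assumes a Sun/Oracle JVM on a Unix-like system such as Solaris, where the platform OperatingSystemMXBean can be cast to com.sun.management.UnixOperatingSystemMXBean; on Windows that cast fails, so the counts simply aren't available there.

```java
import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

import com.sun.management.UnixOperatingSystemMXBean;

public class FdUsageLogger {

    /**
     * Prints the open vs. maximum file descriptor count of this JVM process.
     * Only works on Unix-like systems (e.g. Solaris, Linux) with a
     * Sun/Oracle JVM; on other platforms it prints a short notice instead.
     */
    public static void logFileDescriptorUsage() {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unixOs = (UnixOperatingSystemMXBean) os;
            long open = unixOs.getOpenFileDescriptorCount();
            long max = unixOs.getMaxFileDescriptorCount();
            System.out.println("File descriptors: " + open + " open of " + max + " allowed");
        } else {
            System.out.println("File descriptor counts not available on this platform");
        }
    }

    public static void main(String[] args) {
        logFileDescriptorUsage();
    }
}
```

Calling this periodically (for example from a scheduled thread) and watching the open count climb steadily over time is a strong hint that something is leaking descriptors.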
"Too many open files " errors happen when a process needs to open more files than it is allowed by the operating system. This number is controlled by the maximum number of file descriptors the process has. 2. Explicitly set the number of file descriptors using the ulimit command.
A simple fix for the "too many open files" limitation on Mac OS is the "ulimit -n" command. Curiously, the value of n appears to be critical to whether or not Mac OS accepts the command: I've found that ulimit -n 10240 (the default is 256) works, but higher values of n do not.
On Windows you can look at open file handles using Process Explorer:
http://technet.microsoft.com/en-us/sysinternals/bb896653.aspx
On Solaris you can use "lsof" to monitor the open file handles.
It's worth bearing in mind that open sockets also consume file handles on Unix systems. So it could very well be something like a database connection pool leak (e.g. open database connections not being closed and returned to the pool) that is leading to this issue; I have certainly seen this error caused by a connection pool leak before.
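To illustrate that point, here is a minimal sketch of the kind of leak and its fix. The CustomerDao class, the table name, and the DataSource wiring are made up for illustration, and the try-with-resources form assumes Java 7 or later; on older JVMs you would close the resources in a finally block instead.

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

import javax.sql.DataSource;

public class CustomerDao { // hypothetical DAO, for illustration only

    private final DataSource dataSource;

    public CustomerDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    // Leaky pattern often found in legacy code: if executeQuery() throws,
    // the connection is never closed/returned to the pool, and each failure
    // silently consumes one more file handle (the underlying socket).
    public int countCustomersLeaky() throws SQLException {
        Connection con = dataSource.getConnection();
        Statement st = con.createStatement();
        ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM customer");
        rs.next();
        int count = rs.getInt(1);
        rs.close();
        st.close();
        con.close();
        return count;
    }

    // Safer pattern: try-with-resources guarantees that the connection,
    // statement and result set are closed even when an exception is thrown,
    // so the handle always goes back to the pool.
    public int countCustomersSafe() throws SQLException {
        try (Connection con = dataSource.getConnection();
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM customer")) {
            rs.next();
            return rs.getInt(1);
        }
    }
}
```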