I have an application running on Tomcat, and sometimes I get the error below:
SEVERE: Socket accept failed
java.net.SocketException: Too many open files
    at java.net.PlainSocketImpl.socketAccept(Native Method)
    at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:398)
    at java.net.ServerSocket.implAccept(ServerSocket.java:522)
    at java.net.ServerSocket.accept(ServerSocket.java:490)
    at org.apache.tomcat.util.net.DefaultServerSocketFactory.acceptSocket(DefaultServerSocketFactory.java:60)
    at org.apache.tomcat.util.net.JIoEndpoint$Acceptor.run(JIoEndpoint.java:216)
    at java.lang.Thread.run(Thread.java:722)
....
SEVERE: Error processed default web.xml named conf/web.xml at /local/myApp/apache-tomcat/conf/web.xml
java.io.FileNotFoundException: /local/myApp/apache-tomcat/conf/web.xml (Too many open files)
    at java.io.FileInputStream.open(Native Method)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
    at org.apache.catalina.startup.ContextConfig.getWebXmlSource(ContextConfig.java:1838)
    at org.apache.catalina.startup.ContextConfig.getGlobalWebXmlSource(ContextConfig.java:1745)
    at org.apache.catalina.startup.ContextConfig.getDefaultWebXmlFragment(ContextConfig.java:1418)
    at org.apache.catalina.startup.ContextConfig.webConfig(ContextConfig.java:1253)
    at org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:878)
    at org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:369)
    at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
    at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
    at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5269)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
    at org.apache.catalina.core.StandardContext.reload(StandardContext.java:3926)
    at org.apache.catalina.loader.WebappLoader.backgroundProcess(WebappLoader.java:426)
    at org.apache.catalina.core.ContainerBase.backgroundProcess(ContainerBase.java:1345)
    at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1530)
    at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1540)
    at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1540)
    at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.run(ContainerBase.java:1519)
    at java.lang.Thread.run(Thread.java:722)
I checked the open files limit and it's 1024, but when I check the number of open files of the application with lsof it's nearly 200. Why does this happen if the limit is not reached? Should I increase the limit? Is there any other reason for getting this error? The only way to get the service running correctly again is to restart Tomcat; is there another way of getting back to normal?
Thanks in advance.
EDIT: Here is the servlet that processes the doPost method. At the beginning I wasn't closing every stream; could that be the cause? I added the finally block to do that:
InputStream is = null;
DataInputStream dis = null;
OutputStream os = null;
DataOutputStream dos = null;
String paramName = "";
try {
    os = response.getOutputStream();
    is = request.getInputStream();
    dis = new DataInputStream(is);
    dos = new DataOutputStream(os);
    .....
} catch (Throwable e) {
    LOGGER.error(e.getMessage());
} finally {
    if (dis != null) {
        dis.close();
    } else if (is != null) {
        is.close();
    }
    if (dos != null) {
        dos.close();
    } else if (os != null) {
        os.close();
    }
}
EDIT2: After making some tests I realized that if I close the DataInputStream first and then the InputStream, the other side of the communication receives a number before the message (I don't know why). I changed the order of closing the streams and now everything seems to be OK, but I still have the problem. Any idea?
finally {
    if (is != null) {
        try {
            is.close();
        } catch (IOException e) {
            LOGGER.error(e.getMessage());
        }
    }
    if (dis != null) {
        try {
            dis.close();
        } catch (IOException e) {
            LOGGER.error(e.getMessage());
        }
    }
    if (os != null) {
        try {
            os.close();
        } catch (IOException e) {
            LOGGER.error(e.getMessage());
        }
    }
    if (dos != null) {
        try {
            dos.close();
        } catch (IOException e) {
            LOGGER.error(e.getMessage());
        }
    }
}
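For what it's worth, since Java 7 the same cleanup can be written with try-with-resources, which closes each stream automatically. This is only a minimal sketch, assuming the same request, response, and LOGGER objects as in the code above; it is not the original servlet, just an illustration of the construct:

// Sketch only: streams declared in the try header are closed automatically
// in reverse order of declaration, which also sidesteps the close-ordering
// question from EDIT2. Assumes request, response, and LOGGER from above.
try (InputStream is = request.getInputStream();
     DataInputStream dis = new DataInputStream(is);
     OutputStream os = response.getOutputStream();
     DataOutputStream dos = new DataOutputStream(os)) {
    // ... read from dis, write to dos ...
} catch (IOException e) {
    LOGGER.error(e.getMessage());
}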
Linux limits the resources a process may use; one of these is the number of files a process can have open at once, and there is also a system-wide limit to the number of open files that Linux can handle. If you’ve ever seen the “Too many files open” error message in a terminal window or found it in your system logs, it means that the upper limit has been hit and the process is not being permitted to open any more files.

Our Tomcat instance was started as a service during boot, and there’s a bug discovered and filed (with a patch) in 2005 that doesn’t seem to have been resolved yet. The bug reveals itself by ignoring the maximum number of open files limit when starting daemons on Ubuntu/Debian. So the work-around suggested by “BOK” was to edit /etc/init.d/tomcat and add:
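The exact lines that were added are not reproduced in this excerpt; a work-around of this kind typically raises the file-descriptor limits in the init script right before the daemon is launched, along these lines (the values below are illustrative only, not taken from the original post):

# Hypothetical example: raise the hard and soft open-file limits
# before the Tomcat daemon is started (values are illustrative).
ulimit -Hn 16384
ulimit -Sn 16384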
Do the following to get the PID of tomcat7 (say it is 1234):
ps aux |grep tomcat7
and then do
cat /proc/1234/limits
to read a line like the following
Max open files 16384 16384 files
This is the maximum number of open files allowed for the Tomcat process. To increase it, follow the instructions below.
"Too many open files" in Tomcat.
Follow the instructions below for a quick analysis of the current configuration on your server and to tune the Tomcat hard and soft limits to fix this issue.
This will list all files the process has open (replace tomcatPID with the actual Tomcat PID):
ls -l /proc/tomcatPID/fd
This will show the count of open files.
ls -l /proc/tomcatPID/fd | wc -l
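If lsof is available, a roughly equivalent check is the one below; the count can differ somewhat from the /proc listing because lsof also reports entries such as memory-mapped files.

lsof -p tomcatPID | wc -l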
To increase the open files limit, update /etc/security/limits.conf.
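For example, assuming Tomcat runs as a user named tomcat (a placeholder; adjust the user name and values to your environment), lines like the following set the soft and hard limits on open files:

tomcat soft nofile 8192
tomcat hard nofile 8192

The new limits typically apply only to sessions started after the change, so Tomcat has to be restarted for them to take effect.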
To check the open-file limits specific to the tomcat user:
Hard limit: su - tomcat -c 'ulimit -Hn' -s '/bin/bash'
Soft limit: su - tomcat -c 'ulimit -Sn' -s '/bin/bash'
You can run the script below from a cron job to record the details of open files.
=============================
#!/bin/bash
# Log the number of file descriptors held by the Tomcat process and,
# above a threshold, dump the list of open files for later analysis.
PID=$(ps -ef | grep tomcat6 | grep -v grep | awk '{print $2}')
value=$(ls -l /proc/"$PID"/fd | wc -l)
echo "$(date)@$PID:$value" >> /usr/local/filecount.txt
if [ "$value" -gt 2000 ]; then
    printf "\n\n\n\n\n" >> /usr/local/files_report.txt
    echo "-------------------------------$(date)--Starting Session----------------------" >> /usr/local/files_report.txt
    # Append a numbered list of the open file descriptor targets
    ls -l /proc/"$PID"/fd | awk '{print NR, $11}' >> /usr/local/files_report.txt
    echo "--------------------$(date)---Ending Session ------------------------------" >> /usr/local/files_report.txt
fi
=============================
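As a usage example, if the script were saved as /usr/local/bin/tomcat_fd_monitor.sh (a hypothetical path) and made executable, a crontab entry like the following would run it every five minutes:

*/5 * * * * /usr/local/bin/tomcat_fd_monitor.sh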