I have a Mule application that connects to an Oracle database. The application is a SOAP API that allows executing SQL stored procedures. The connector is set up to use connection pooling with a maximum pool size of 20, and I've been monitoring the connections themselves. When calls are made to the database, I can see them being opened (netstat -ntl | grep PORTNUMBER):
tcp4 0 0 IP HERE OTHER IP HERE SYN_SENT
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 0 0 IP HERE OTHER IP HERE ESTABLISHED
tcp4 10 0 IP HERE OTHER IP HERE ESTABLISHED
When the calls are done, I expect the connections to be closed after a certain period of time. This does not happen. I've noticed that when the application was running on a server, connections were still open from July (a couple of months back).
The only way I've found so far that actually closes the connections after a couple of seconds is enabling XA transactions and setting the Connection Timeout. However, this completely ruins the performance of the application and adds unnecessary overhead.
How would I go about adding such a timeout without using XA connections? I'd like my database connections to be closed after 20 seconds of inactivity.
Thank you
Edit: The generic database connector is used, on Mule 3.8.0. There is a maximum number of connections allowed to the database, and we have multiple instances of this flow running. Connections are therefore reserved by one of the instances, which leaves the other instances unable to obtain new connections.
The specific issue we had was that one instance still had 120 connections reserved, even though the last time it ran was weeks before. When a second instance requested more connections, it could only get 30, since the maximum on the database side is 150.
If you don't close a connection, it leaks and ties up server resources. @EJP The connection itself might be thread-safe (as required by JDBC), but the application's use of the connection is probably not thread-safe. Think of things like differing transaction isolation levels and transaction boundaries (commit/rollback/autocommit), etc.
Any new connection you establish with exactly the same connection string will be able to reuse a connection from the pool. We strongly recommend that you always close the connection when you are finished using it, so that it is returned to the pool.
An application can call Close more than once without an exception being thrown. If the SqlConnection simply goes out of scope, it won't be closed, so you must explicitly close the connection by calling Close or Dispose.
Database connections shouldn't be left open. Open a connection only when you need to execute a query and close it as early as possible. So my answer is: yes.
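In JDBC terms (which is what a Mule flow talking to Oracle uses under the hood), closing the connection is exactly what hands it back to the pool. A minimal sketch of that pattern with try-with-resources, assuming a DataSource obtained from whatever pool the connector is configured with (the dataSource variable and the SOME_PROCEDURE name are placeholders, not from the original post):

    import javax.sql.DataSource;
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.SQLException;

    public class ProcedureCaller {

        private final DataSource dataSource; // the pooled data source used by the flow

        public ProcedureCaller(DataSource dataSource) {
            this.dataSource = dataSource;
        }

        public void callProcedure(int someId) throws SQLException {
            // try-with-resources guarantees close() even when an exception is thrown;
            // for a pooled connection, close() returns it to the pool rather than
            // tearing down the physical TCP connection.
            try (Connection connection = dataSource.getConnection();
                 CallableStatement statement = connection.prepareCall("{call SOME_PROCEDURE(?)}")) {
                statement.setInt(1, someId);
                statement.execute();
            }
        }
    }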
You should use a connection pool implementation that gives you control over the time-to-live of a connection. Ideally the pool should also support validation queries to detect stale connections.
For example, the c3p0 pool has a configuration property called maxConnectionAge that seems to match your needs; maxIdleTime could also be of interest.
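A minimal sketch of a programmatic c3p0 setup; the driver class, URL, credentials and pool sizes are illustrative placeholders, not taken from your configuration. maxIdleTime is the setting that maps to "close after 20 seconds of inactivity":

    import com.mchange.v2.c3p0.ComboPooledDataSource;
    import java.beans.PropertyVetoException;

    public class PooledDataSourceFactory {

        public static ComboPooledDataSource create() throws PropertyVetoException {
            ComboPooledDataSource dataSource = new ComboPooledDataSource();
            dataSource.setDriverClass("oracle.jdbc.OracleDriver");            // placeholder driver class
            dataSource.setJdbcUrl("jdbc:oracle:thin:@//dbhost:1521/SERVICE"); // placeholder URL
            dataSource.setUser("app_user");                                   // placeholder credentials
            dataSource.setPassword("secret");

            dataSource.setMinPoolSize(0);   // allow the pool to shrink when idle
            dataSource.setMaxPoolSize(20);  // matches the maximum pool size mentioned above

            dataSource.setMaxIdleTime(20);        // seconds a connection may sit idle before being closed
            dataSource.setMaxConnectionAge(3600); // hard cap on a connection's lifetime, in seconds

            dataSource.setIdleConnectionTestPeriod(300);            // periodically validate idle connections
            dataSource.setPreferredTestQuery("SELECT 1 FROM DUAL"); // Oracle-friendly validation query
            return dataSource;
        }
    }

In a Mule 3.8 flow you would then point the generic database connector at this data source (for example via a Spring bean referenced with dataSource-ref), but the exact wiring depends on how your connector is configured.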