I recently found an automatically created connection string specifying "Pooling=False" and wondered why it was set up that way. From my understanding, pooling is almost always beneficial unless it is badly misconfigured.
Are there any reasons for disabling pooling? Does it depend on the OS, the physical connection, or the DBMS in use?
Yes, there is a reason to disable pooling: you need to check how a particular pooling library copes with temporary network disconnects. Some database drivers and/or pool libraries do nothing when the connection is lost but the connection object still looks active; instead of spawning a new connection, the pool hands you stale connections and you get errors. Other pool implementations check whether a connection is alive by issuing a fast command to the server before serving it to the application; if that check fails, they discard the connection and spawn a new one. You should always test your pool library against this scenario before enabling pooling.
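To make the idea concrete, here is a minimal sketch of a pool that validates connections before handing them out. It is not any particular library's implementation; the class name, pool structure, and the 2-second timeout are illustrative assumptions, and it relies on the standard JDBC 4 `Connection.isValid()` check.

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.util.ArrayDeque;
import java.util.Queue;
import javax.sql.DataSource;

// Sketch of a pool that discards stale connections instead of serving them.
// Hypothetical class; real pools (e.g. HikariCP, DBCP) expose similar checks
// via configuration rather than requiring you to write this yourself.
public class ValidatingPool {
    private final DataSource dataSource;               // underlying connection factory
    private final Queue<Connection> idle = new ArrayDeque<>();

    public ValidatingPool(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public synchronized Connection borrow() throws SQLException {
        Connection conn = idle.poll();
        // Validate before serving: isValid() typically pings the server.
        // A connection that died during a network blip fails the check and is dropped.
        while (conn != null && !conn.isValid(2 /* seconds */)) {
            conn.close();                               // discard the stale connection
            conn = idle.poll();
        }
        return (conn != null) ? conn : dataSource.getConnection(); // respawn if needed
    }

    public synchronized void giveBack(Connection conn) {
        idle.offer(conn);                               // return to the idle queue
    }
}
```

Whether a real pool behaves like this depends on its configuration (for example, a validation/test query setting), which is exactly why it is worth testing against a dropped connection before relying on it.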
If it's a single-threaded app, pooling seems unnecessary. Was it on a resource-constrained device? Is startup time important to the application? These are some of the factors that might lead to the decision to turn off pooling.
In general, I think you are right that pooling is beneficial. If it's a typical web app, I would ask why it was disabled.
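For the single-threaded case, a short one-shot task illustrates why a pool adds little: one connection is opened, used, and closed, so there is nothing to reuse. This is only a sketch; the JDBC URL, credentials, and table name are placeholders, and it assumes the appropriate driver is on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// One-shot, single-threaded task: open a connection, run one query, exit.
// URL, credentials, and query are placeholders for illustration.
public class OneShotTask {
    public static void main(String[] args) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:postgresql://localhost:5432/mydb", "user", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT count(*) FROM orders")) {
            if (rs.next()) {
                System.out.println("rows: " + rs.getLong(1));
            }
        } // the single connection is closed here; a pool would add no benefit
    }
}
```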