I'm building a web app in Python (using Flask). I don't intend to use SQLAlchemy or a similar ORM; instead I'm going to use Psycopg2 directly.
Should I open a new database connection (and subsequently close it) for each new request? Or should I use something to pool these connections?
Connection pooling is great for scalability: if you have 100 threads/clients/end-users, each of which needs to talk to the database, you don't want them all to hold a dedicated open connection to the database (connections are expensive resources). Instead, you want them to share connections via pooling.
Connection pooling means keeping a pool of already-open connections on the backend server. Instead of opening, maintaining, and closing a connection for every incoming request, the server hands each request an active connection from the pool and takes it back afterwards, as sketched below.
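A minimal sketch of in-process pooling with psycopg2's own `psycopg2.pool` module (the DSN values here are placeholders, and this is only one of several ways to wire it into a Flask app):

```python
# Sketch: a process-wide pool that requests borrow connections from.
from contextlib import contextmanager

from psycopg2.pool import ThreadedConnectionPool

# Hypothetical DSN; adjust to your own database.
pool = ThreadedConnectionPool(
    minconn=1,
    maxconn=10,
    dsn="dbname=appdb user=app password=secret host=localhost",
)

@contextmanager
def get_conn():
    """Borrow a connection from the pool and always return it."""
    conn = pool.getconn()
    try:
        yield conn
    finally:
        pool.putconn(conn)

# Inside a Flask view you would then do something like:
# with get_conn() as conn:
#     with conn.cursor() as cur:
#         cur.execute("SELECT 1")
```

The key point is that the pool, not the request handler, owns the lifetime of the connections; each request only borrows one for the duration of its work.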
Keep in mind that PostgreSQL's connection limit is finite. For example, at provision, Databases for PostgreSQL sets the maximum number of connections to your PostgreSQL database to 115: 15 connections are reserved for the superuser to maintain the state and integrity of your database, and the other 100 are available for you and your applications.
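You can check what your own server allows by querying the `max_connections` setting (connection parameters below are placeholders):

```python
# Ask the server for its configured connection limit.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=app host=localhost")  # hypothetical DSN
with conn.cursor() as cur:
    cur.execute("SHOW max_connections;")
    print(cur.fetchone()[0])  # e.g. "100"
conn.close()
```

Size your pool's `maxconn` (and the number of worker processes each holding a pool) so the total stays comfortably under that limit.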
One of the most common issues undermining the benefits of connection pooling is that pooled connections can go stale. This most often happens when idle connections are timed out by network devices sitting between the application and the database, leaving dead connections in the pool.
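One common mitigation is to validate a connection before handing it out and discard it if the check fails. A rough sketch on top of the `pool` object from the snippet above:

```python
import psycopg2

def get_healthy_conn(pool):
    """Return a connection that answers a trivial query, discarding stale ones."""
    while True:
        conn = pool.getconn()
        try:
            with conn.cursor() as cur:
                cur.execute("SELECT 1")  # cheap liveness check
            return conn
        except psycopg2.OperationalError:
            # The connection went stale (e.g. an idle timeout on a firewall);
            # close and discard it, then try the next one from the pool.
            pool.putconn(conn, close=True)
```

A dedicated pooler such as PgBouncer handles this kind of housekeeping for you, which is part of its appeal.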
PgBouncer is pretty neat and transparent to the application and server.
We have been using PgBouncer in production for 2 years without a single issue. It's a pretty awesome PostgreSQL connection pooler.
http://wiki.postgresql.org/wiki/PgBouncer
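Because PgBouncer speaks the normal PostgreSQL wire protocol, the application only needs to point at PgBouncer's port instead of the database's; the rest of the code is unchanged. A sketch (6432 is PgBouncer's default listen_port, the other values are placeholders):

```python
# Connect through PgBouncer instead of directly to PostgreSQL.
import psycopg2

conn = psycopg2.connect(
    dbname="appdb",          # hypothetical database
    user="app",
    password="secret",
    host="127.0.0.1",
    port=6432,               # PgBouncer's default port; PostgreSQL itself uses 5432
)
with conn.cursor() as cur:
    cur.execute("SELECT 1")
conn.close()
```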