The connection limit for Google Cloud SQL PostgreSQL databases is relatively low. Depending on the plan it is somewhere between 25 and 500, while the limit for MySQL on Google Cloud SQL is between 250 and 4000, climbing to 4000 quickly as the plan size increases.
We currently have a number of trial instances for different customers running on Kubernetes and backed by the same Google Cloud SQL PostgreSQL server. Each instance uses a separate database, a separate set of roles, and its own connections (one per service). We've already reached the connection limit for our plan (50), and we're not even close to hitting the memory or CPU limits. Connection pooling doesn't seem to be an option, because the connections use different users. I'm wondering why the limit is so low and whether there's a way to increase it without having to upgrade to a more expensive plan.
PostgreSQL Connection Limits: 15 connections are reserved for the superuser to maintain the state and integrity of your database, and 100 connections are available for you and your applications. If the number of connections to the database exceeds the 100-connection limit, new connections fail and return an error.
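To see how close an instance is to that ceiling, you can inspect the configured limit and the current usage directly from PostgreSQL. The snippet below is a generic sketch using standard PostgreSQL tooling (psql and the pg_stat_activity view), not anything Cloud SQL-specific; the connection parameters are placeholders for your own instance.

# Show the configured ceiling and count current connections per database/user
psql "host=<INSTANCE_IP> user=postgres dbname=postgres" \
  -c "SHOW max_connections;" \
  -c "SELECT datname, usename, count(*) AS conns
      FROM pg_stat_activity
      GROUP BY datname, usename
      ORDER BY conns DESC;"

Grouping by database and user makes it easy to spot which trial instance or service is consuming the most connection slots.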
Cloud SQL for PostgreSQL is a fully-managed database service that helps you set up, maintain, manage, and administer your PostgreSQL relational databases on Google Cloud Platform.
Cloud Run limits: Cloud Run services are limited to 100 connections to a Cloud SQL database. This limit applies per service instance.
It looks like Google has just released this as a beta feature. When creating or editing a database instance, you can add a flag called max_connections, where you can enter a new limit between 14 and 262143 connections.
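As a rough sketch of how setting that flag might look with the gcloud CLI (the instance name and the value are placeholders; check the current Cloud SQL documentation for the exact semantics and limits on your tier):

# Set the max_connections database flag on an existing instance.
# Note: --database-flags overwrites any flags already set, so list all of them,
# and changing this flag may restart the instance.
gcloud sql instances patch <INSTANCE_NAME> \
  --database-flags=max_connections=250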