The company I work for uses Google Cloud SQL to manage its SQL databases in production.
We're having performance issues, and I thought it would be a good idea (among other things) to monitor all queries above a specific threshold (e.g. 250 ms).
Looking at the PostgreSQL documentation, log_min_duration_statement seems to be the flag I need:
log_min_duration_statement (integer)
Causes the duration of each completed statement to be logged if the statement ran for at least the specified number of milliseconds. Setting this to zero prints all statement durations.
But judging from the Cloud SQL documentation, only a narrow set of database flags can be set per DB instance, and as you can see from here, log_min_duration_statement is not among those supported flags.
So here comes the question: how do I log/monitor my slow PostgreSQL queries with Google Cloud SQL? If that's not possible, what tools/methodologies would you suggest to achieve a similar result?
Also worth noting: Cloud SQL's Query insights feature helps you detect, diagnose, and prevent query performance problems for Cloud SQL databases. It supports intuitive monitoring and provides diagnostic information that helps you go beyond detection to identify the root cause of performance problems.
April 3, 2019 UPDATE
It is now possible to log slow queries on Google Cloud SQL PostgreSQL instances; see the release notes: https://cloud.google.com/sql/docs/release-notes#april_3_2019. For example, with Terraform you can set the flag like this:
database_flags = [
  {
    name  = "log_min_duration_statement"
    value = "1000"
  },
]
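Note that changing a database flag may restart the instance. Once the flag is applied, a quick sanity check from any client session (standard PostgreSQL, nothing Cloud SQL specific):

-- value is in milliseconds; -1 means disabled
SHOW log_min_duration_statement;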
Once you enable log_min_duration_statement, you can view the logs using Stackdriver Logging. Select Cloud SQL Database -> cloudsql.googleapis.com/postgres.log and you will see log entries like this:
[103402]: [9-1] db=cloudsqladmin,user=cloudsqladmin LOG: duration: 11.211 ms statement: [YOUR SQL HERE]
References:
Supported Cloud SQL flags (log_min_duration_statement): https://cloud.google.com/sql/docs/postgres/flags#postgres-l
The possibility of monitoring slow PostgreSQL queries for Cloud SQL instances is currently not available. As you mention, the log_min_duration_statement flag is currently not supported by Cloud SQL.
Right now, work is being done on adding this feature to Cloud SQL, and you can keep track of the progress through this link. You can click the star icon in the top left corner to get email notifications whenever significant progress has been made.
There is a way to inspect slow queries through the pg_stat_statements extension, which is supported by Cloud SQL.
Since Cloud SQL doesn't grant superuser rights to any user, you need a workaround. First, enable the extension with
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
then you can check slow queries with a query like
-- Top 10 statements by average execution time (total_time is in milliseconds)
SELECT pd.datname,
       us.usename,
       pss.userid,
       pss.query AS SQLQuery,
       pss.rows AS TotalRowCount,
       (pss.total_time / 1000) AS TotalSecond,
       ((pss.total_time / 1000) / pss.calls) AS TotalAverageSecond
FROM pg_stat_statements AS pss
INNER JOIN pg_database AS pd
        ON pss.dbid = pd.oid
INNER JOIN pg_user AS us
        ON pss.userid = us.usesysid
ORDER BY TotalAverageSecond DESC
LIMIT 10;
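One caveat: on PostgreSQL 13 and later, pg_stat_statements no longer has a total_time column; it was split into total_plan_time and total_exec_time. So on newer instances the query above needs a small adjustment, for example:

-- PostgreSQL 13+ variant: use total_exec_time instead of total_time
SELECT pss.query,
       pss.calls,
       (pss.total_exec_time / 1000) AS TotalExecSecond,
       ((pss.total_exec_time / 1000) / pss.calls) AS TotalAverageSecond
FROM pg_stat_statements AS pss
ORDER BY TotalAverageSecond DESC
LIMIT 10;

You can also clear the collected statistics with SELECT pg_stat_statements_reset(); if your user has the required privileges.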
As the postgres user you can have a look at all slow queries, but since that user is not a superuser you will see <insufficient privilege> instead of the query text for all other users' queries.
To get around this limitation you can install the extension in the other databases too (normally only the postgres user has rights to install extensions), and then you can check the query texts as the owner of each database; a sketch follows below.
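A minimal sketch of that workaround from psql (otherdb is a placeholder database name, and \c is a psql meta-command, not SQL):

-- connect to the other database as its owner
\c otherdb
-- install the extension there as well
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
-- the owner can now read the query texts of their own statements
SELECT query, calls, (total_time / 1000) AS TotalSecond
FROM pg_stat_statements
ORDER BY TotalSecond DESC
LIMIT 10;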