Currently my role is owner, but I'm not seeing a way to view all queries across all users. I'm mainly interested in seeing bytes processed for each successful query. Without this ability, it's really hard to determine where the costs are coming from without asking each user individually. Is there a way to do this through the UI or a CLI command?
In the CLI, you can run bq ls -j -a to retrieve jobs for all users in a project. I don't think this command returns the bytes processed for a successful query, but a bq show on the successful queries' job IDs would retrieve the relevant information.
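For reference, a minimal command-line sketch of that two-step flow (assuming the bq CLI from the Cloud SDK is installed and configured with a default project; the job ID is a placeholder):

# List recent jobs for all users in the project
bq ls -j -a --max_results 100
# Full job details as JSON; statistics.totalBytesProcessed holds the bytes
bq show --format=json --job <job_id>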
Using the API directly, you can specify the allUsers parameter to a jobs.list request.
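A rough sketch of that request over REST (assuming gcloud is available to mint an access token; PROJECT_ID is a placeholder, and the URL shown is the standard v2 jobs.list endpoint):

# jobs.list with allUsers=true returns jobs from every user in the project
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://bigquery.googleapis.com/bigquery/v2/projects/PROJECT_ID/jobs?allUsers=true"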
There is no way to view this information in the web UI at present.
Danny Kitt's answer got me part of the way there. Here's one of the Ruby scripts I wrote that ultimately gave me what I wanted. Should be easy enough to port to the language of your choice.
#!/usr/bin/ruby
require "json"
require "time"

MAX_RESULTS = 1000

# List the most recent jobs for every user in the default project.
jobs_json = `bq ls --format json --jobs --all --max_results #{MAX_RESULTS}`
jobs = JSON.parse(jobs_json)

# Running total of bytes processed, keyed by user email.
users = {}

jobs.each do |job|
  if job["Job Type"] == "query" and job["State"] == "SUCCESS"
    # Only `bq show` returns the full statistics for a job, including bytes processed.
    job_data_json = `bq show --format=json --job #{job["jobId"]}`
    job_data = JSON.parse(job_data_json)

    # creationTime is epoch milliseconds as a string; drop the last three digits to get seconds.
    creation_time = Time.at(job_data["statistics"]["creationTime"][0..-4].to_i)
    user = job_data["user_email"]
    bytes_processed = job_data["statistics"]["totalBytesProcessed"].to_i

    if bytes_processed > 0
      users[user] = 0 unless users.key?(user)
      users[user] += bytes_processed
      puts "[#{creation_time}] #{user}: #{bytes_processed} (running total: #{users[user]})"
      # puts job_data["configuration"]["query"]["query"]
    end
  end
end

puts "\nFINAL:\n"
users.each do |user, bytes_processed|
  puts "#{user}: #{bytes_processed}"
end