I would like to pull stats on a per bucket basis. Is this possible?
The default bucket location is within the US. If you do not specify a location constraint, then your bucket and data added to it are stored on servers in the US.
Cloud Monitoring (formerly Stackdriver) measures bucket size once a day. You can use the storage.googleapis.com/storage/total_bytes metric to get bucket size in the Cloud Console.
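To query that metric in Metrics Explorer or via the Monitoring API, a filter along these lines should work (the bucket name is a placeholder you would replace with your own):

```
metric.type="storage.googleapis.com/storage/total_bytes"
resource.type="gcs_bucket"
resource.label.bucket_name="BUCKET_NAME"
```

Note the metric is sampled once per day, so expect a single data point per bucket per day.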
(updated answer 2014/09/23 to reflect changes in the gsutil command)
gsutil du displays the amount of space (in bytes) used by the objects in the hierarchy under a given URL.
-s gives a summary total instead of the size of each object.
-h prints human-readable sizes instead of bytes.
So:
$ gsutil du -sh gs://BUCKET_NAME
261.46 GB    gs://BUCKET_NAME
... gives the total size of objects in the bucket. However, it is calculated on request and can take a long time for buckets with many objects.
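What `du -s` does under the hood is simple: list every object, sum the sizes, and (with `-h`) format the total in human-readable units. Here is a minimal sketch of that calculation; the object sizes below are hypothetical sample data standing in for an actual bucket listing:

```python
# Sketch of the `gsutil du -sh` calculation: sum object sizes, then
# format the total in binary units. In practice the sizes would come
# from listing the bucket via the Cloud Storage API.

def human_readable(num_bytes: float) -> str:
    """Format a byte count using binary units."""
    for unit in ("B", "KiB", "MiB", "GiB", "TiB"):
        if num_bytes < 1024 or unit == "TiB":
            return f"{num_bytes:.2f} {unit}"
        num_bytes /= 1024

# Hypothetical object sizes (bytes) returned by a bucket listing.
object_sizes = [512, 2_048_576, 734_003_200]

total = sum(object_sizes)
print(human_readable(total))
```

Because this walks every object, the cost grows with object count, which is why the command is slow on large buckets.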
For production use, enable access logs and storage logs. The storage logs report a byte-hours figure per bucket for the previous day, from which you can derive the average size in bytes.
The access logs give details about each request to your logged buckets.
There is also information on loading the logs into BigQuery for analysis.
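As a rough sketch of working with a downloaded storage usage log: each daily row reports a storage_byte_hours value for a bucket, and dividing by 24 gives the average bucket size in bytes for that day. The CSV content below is hypothetical sample data standing in for a real log file (real logs contain additional fields):

```python
# Parse a (simplified, hypothetical) GCS storage usage log and derive
# the average bucket size for the day: storage_byte_hours / 24 hours.
import csv
import io

sample_log = '"bucket","storage_byte_hours"\n"my-bucket","5812183372800"\n'

for row in csv.DictReader(io.StringIO(sample_log)):
    avg_bytes = int(row["storage_byte_hours"]) / 24
    print(row["bucket"], avg_bytes)
```

The same byte-hours column is available once the logs are loaded into BigQuery, so the division can equally be done in SQL there.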