Is there any way to set a bucket's default Cache-Control (i.e. override the default public, max-age=3600 at the bucket level every time a new object is created)?
Similar to defacl, but for setting Cache-Control.
Cloud CDN leverages your choice of either the global external HTTP(S) load balancer or the global external HTTP(S) load balancer (classic) to provide routing, health checking, and anycast IP support.
When a Cloud Storage object is cached, copies of the object are stored in a Google or internet cache so your object can be served faster in future requests.
In the Google Cloud Console, go to the Cloud Storage Browser page. In the bucket list, find the bucket you want to modify and click its Bucket overflow menu (⋮). Click Edit default storage class. In the overlay window, select the new default storage class for your bucket, then click Save.
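For reference, the same default storage class change can also be made from the command line with gsutil's defstorageclass command; the bucket name and class below are placeholders:
gsutil defstorageclass set NEARLINE gs://your-bucket-name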
Note: By default, Cache-Control is set to private. This means that if you don't explicitly set Cache-Control to public, only the requesting user's browser is allowed to cache the content. max-age tells the browser and the CDN how many seconds they can cache the content.
Note: Cache-Control is also a header you can specify in your HTTP requests for an object; however, Cloud Storage ignores this header and sets response Cache-Control headers based on the stored metadata values.
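As a quick check, you can inspect the Cache-Control header that Cloud Storage actually serves for an object; a minimal example, assuming the object is publicly readable (bucket and object names are placeholders):
curl -I "https://storage.googleapis.com/your-bucket-name/your-object"
The Cache-Control line in the response reflects the stored metadata, per the note above.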
If someone is still looking for an answer: you need to set the metadata while adding the blob.
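A minimal sketch with the google-cloud-storage Python client (bucket, object, and file names are placeholders); setting cache_control on the blob before the upload creates the object with that metadata:

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("your-bucket-name")   # placeholder
blob = bucket.blob("path/to/object.txt")     # placeholder
# Set the metadata before uploading so the object is created with it
blob.cache_control = "public, max-age=86400"
blob.upload_from_filename("local-file.txt")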
For those who want to update the metadata for all existing objects in the bucket, you can use setmeta from gsutil:
- https://cloud.google.com/storage/docs/gsutil/commands/setmeta
You just need to run the following:
gsutil setmeta -r -h "Cache-control:public, max-age=12345" gs://bucket_name
Using gsutil:
- -h: Allows you to specify certain HTTP headers
- -r: Recursive
- -m: Performs the operations in parallel, which can run significantly faster
gsutil -m setmeta -r -h "Cache-control:public, max-age=259200" gs://bucket-name
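If you want new objects to receive the header at upload time instead of patching them afterwards, the top-level -h flag also works with cp; a sketch with placeholder names:
gsutil -h "Cache-Control:public, max-age=259200" cp local-file gs://bucket-name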
It is possible to write a Google Cloud Storage Trigger.
This function sets the Cache-Control metadata field for every new object in a bucket:
from google.cloud import storage

CACHE_CONTROL = "private"

def set_cache_control_private(data, context):
    """Background Cloud Function to be triggered by Cloud Storage.

    This function changes the Cache-Control metadata.

    Args:
        data (dict): The Cloud Functions event payload.
        context (google.cloud.functions.Context): Metadata of triggering event.
    Returns:
        None; the output is written to Stackdriver Logging
    """
    print('Setting Cache-Control to {} for: gs://{}/{}'.format(
        CACHE_CONTROL, data['bucket'], data['name']))
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(data['bucket'])
    blob = bucket.get_blob(data['name'])
    blob.cache_control = CACHE_CONTROL
    # patch() sends only the modified metadata fields to the API
    blob.patch()
You also need a requirements.txt file in the same directory for the storage import. The requirements file lists the google-cloud-storage package:
google-cloud-storage==1.10.0
You have to deploy the function to a specific bucket:
gcloud beta functions deploy set_cache_control_private \
--runtime python37 \
--trigger-resource gs://<your_bucket_name> \
--trigger-event google.storage.object.finalize
For debugging purposes, you can retrieve logs with the gcloud command as well:
gcloud functions logs read --limit 50
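If the project contains several functions, you can narrow the output by passing the function name (this assumes the deployment name used above):
gcloud functions logs read set_cache_control_private --limit 50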