
Set Google Storage Bucket's default cache control

Is there any way to set a bucket's default Cache-Control? I'm trying to override the default public, max-age=3600 at the bucket level every time a new object is created.

Something similar to defacl, but for Cache-Control.

asked Aug 21 '15 by Reza Shahbazi

People also ask

Does Google Cloud storage have a CDN?

Cloud CDN leverages your choice of either the global external HTTP(S) load balancer or the global external HTTP(S) load balancer (classic) to provide routing, health checking, and anycast IP support.

What is Cloud Storage cache?

Overview. When a Cloud Storage object is cached, copies of the object are stored in a Google or internet cache so your object can be served faster in future requests.

How do I change the size of a Google Cloud storage bucket?

In the Google Cloud Console, go to the Cloud Storage Browser page. In the bucket list, find the bucket you want to modify, and click its Bucket overflow menu.

How do I change the default storage class in Google Cloud Storage?

In the Google Cloud Console, go to the Cloud Storage Browser page. In the bucket list, find the bucket you want to modify, and click its Bucket overflow menu. Click Edit default storage class. In the overlay window, select the new default storage class you would like for your bucket. Click Save.

What is the default setting for cache-control?

Note: By default, Cache-Control is set to private. This means that if you don't explicitly set Cache-Control to public, only the browser of the requesting user is allowed to cache the content. max-age — Tells the browser and the CDN how many seconds they can cache the content.
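To illustrate how a client reads these directives, here is a minimal sketch that pulls the max-age value out of a Cache-Control header string (the helper name is made up for this example):

```python
import re

def max_age_seconds(cache_control):
    """Extract the max-age directive (in seconds) from a Cache-Control
    header string, or return None if the directive is absent."""
    match = re.search(r"max-age=(\d+)", cache_control)
    return int(match.group(1)) if match else None

print(max_age_seconds("public, max-age=3600"))  # prints 3600
print(max_age_seconds("private"))               # prints None
```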

What is cache-control in cloud storage?

Note: Cache-Control is also a header you can specify in your HTTP requests for an object; however, Cloud Storage ignores this header and sets response Cache-Control headers based on the stored metadata values.


3 Answers

If someone is still looking for an answer: you need to set the metadata when adding the blob. For those who want to update the metadata of all existing objects in the bucket, you can use setmeta from gsutil - https://cloud.google.com/storage/docs/gsutil/commands/setmeta

You just need to run the following:

gsutil setmeta -r -h "Cache-control:public, max-age=12345" gs://bucket_name
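setmeta updates objects that already exist; for objects you are about to upload, the same header can be set at copy time by passing -h to gsutil (bucket and file names below are placeholders):

```shell
# Set Cache-Control on a new object at upload time (placeholder names)
gsutil -h "Cache-Control:public, max-age=12345" cp local-file.png gs://bucket_name/
```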
answered Oct 16 '22 by Ajinkya Tupkar


Using gsutil

  • -h: Allows you to specify certain HTTP headers
  • -r: Recursive; applies to all objects under the path
  • -m: Performs the operations in parallel, which may run significantly faster.

gsutil -m setmeta -r -h "Cache-control:public, max-age=259200" gs://bucket-name

answered Oct 16 '22 by sandes


It is also possible to write a Google Cloud Storage trigger (a background Cloud Function).

This function sets the Cache-Control metadata field for every new object in a bucket:

from google.cloud import storage

CACHE_CONTROL = "private"

def set_cache_control_private(data, context):
    """Background Cloud Function to be triggered by Cloud Storage.
       This function changes Cache-Control meta data.

    Args:
        data (dict): The Cloud Functions event payload.
        context (google.cloud.functions.Context): Metadata of triggering event.
    Returns:
        None; the output is written to Stackdriver Logging
    """

    print('Setting Cache-Control to {} for: gs://{}/{}'.format(
            CACHE_CONTROL, data['bucket'], data['name']))
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(data['bucket'])
    blob = bucket.get_blob(data['name'])
    blob.cache_control = CACHE_CONTROL
    blob.patch()
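For a local sanity check of the fields this function reads, the finalize event's data argument is a plain dict; a sketch with placeholder values (real events carry more fields, e.g. contentType, size, timeCreated):

```python
# Sketch of the Cloud Storage "finalize" event payload fields used by the
# function above (placeholder values for illustration only).
data = {
    "bucket": "my-bucket",
    "name": "images/photo.png",
}

# The function builds the object path the same way:
gcs_path = "gs://{}/{}".format(data["bucket"], data["name"])
print(gcs_path)  # prints gs://my-bucket/images/photo.png
```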

You also need a requirements.txt file in the same directory for the storage import. It pins the google-cloud-storage package:

google-cloud-storage==1.10.0

You have to deploy the function to a specific bucket:

gcloud beta functions deploy set_cache_control_private \
    --runtime python37 \
    --trigger-resource gs://<your_bucket_name> \
    --trigger-event google.storage.object.finalize

For debugging purposes you can retrieve the logs with the gcloud command as well:

gcloud functions logs read --limit 50
answered Oct 16 '22 by anon6789