 

Change s3 object CacheControl using boto3

I'm trying to change the CacheControl attribute of a file that is already in S3. I've found that my best option is to copy the object over itself while changing its metadata. The code is pretty simple:

    import boto3

    s3_resource = boto3.resource('s3')
    bucket_name = 'my-bucket'  # placeholder
    file_key = 'index.html'
    s3_object = s3_resource.Object(bucket_name, file_key)
    s3_object.copy_from(CopySource={'Bucket': bucket_name, 'Key': file_key},
        CacheControl='no-cache',
        MetadataDirective='REPLACE')

This code doesn't work without MetadataDirective='REPLACE', but with it the file loses all of its other metadata. I could set all the metadata again manually, but that could cause issues in the future.

Is there a way to change one metadata value and keep all the others?

asked Apr 26 '26 by Leandro Lima

1 Answer

I ran into this as well and was able to piece together a solution from the documentation and other people's answers. The key to doing this without losing existing metadata is to explicitly re-set the metadata from the existing object:

import boto3

bucket_name = "xxxxx"
key = "yyyyy"

s3 = boto3.resource(
    "s3",
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
    region_name=AWS_REGION,
)
s3_object = s3.Object(bucket_name, key)
s3_object.copy_from(
    CopySource={"Bucket": bucket_name, "Key": key},
    CacheControl="max-age=86400",
    Metadata=s3_object.metadata,  # carries over the existing user metadata
    MetadataDirective="REPLACE",
)
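The same pattern works for any of the system headers (ContentType, ContentDisposition, etc.), not just CacheControl. If you do this in more than one place, a small helper (hypothetical, not part of boto3) that builds the copy_from arguments keeps the merge logic in one spot:

```python
def build_copy_args(bucket, key, existing_metadata, **headers):
    """Build kwargs for Object.copy_from that replace system headers
    (CacheControl, ContentType, ...) while preserving the user-defined
    metadata that REPLACE would otherwise wipe out.

    `existing_metadata` is the dict you get from s3_object.metadata.
    """
    args = {
        "CopySource": {"Bucket": bucket, "Key": key},
        "Metadata": dict(existing_metadata),  # copy, don't mutate the original
        "MetadataDirective": "REPLACE",
    }
    args.update(headers)  # e.g. CacheControl="no-cache"
    return args

# Usage (hypothetical values):
# s3_object.copy_from(**build_copy_args(
#     bucket_name, key, s3_object.metadata, CacheControl="no-cache"))
```

Note that `s3_object.metadata` only contains the user-defined `x-amz-meta-*` entries; headers like ContentType are separate attributes on the object and must be passed as their own keyword arguments if you want to keep them too.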
answered Apr 28 '26 by John Debs