Here is some sample code that copies an S3 key onto itself. There are several reasons you might want to do this, one of which is to update the key's metadata, and this does seem to be the widely accepted way to do it. The problem is that when I run the example below I lose the Content-Type, which falls back to 'application/octet-stream' (not very useful if you're trying to serve web images).
from boto.s3.connection import S3Connection
from boto.s3.key import Key

# Get bucket
conn = S3Connection(self._aws_key, self._aws_secret)
bucket = conn.get_bucket(self._aws_bucket)

# Create key
k = Key(bucket)
k.key = key

# Copy the key onto itself with updated metadata
k.metadata.update({meta_key: meta_value})
k2 = k.copy(k.bucket.name, k.name, k.metadata, preserve_acl=True)
k = k2
Any ideas? Thanks.
The following GitHub Gist worked for me:
import boto
s3 = boto.connect_s3()
bucket = s3.lookup('mybucket')
key = bucket.lookup('mykey')
# Copy the key onto itself, preserving the ACL but changing the content-type
key.copy(key.bucket, key.name, preserve_acl=True, metadata={'Content-Type': 'text/plain'})
key = bucket.lookup('mykey')
print(key.content_type)
Took a looong time to run though!
Take a look at this post. You need to do a

key = bucket.get_key(key.name)

first, and then

metadata['Content-Type'] = key.content_type

will work. Otherwise key.content_type will return application/octet-stream.
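Putting the two answers together, here is a minimal sketch of the original snippet with the fix applied. It assumes boto 2.x, and the aws_key, aws_secret, aws_bucket, key, meta_key and meta_value names are just placeholders standing in for the values from the question:

from boto.s3.connection import S3Connection

conn = S3Connection(aws_key, aws_secret)
bucket = conn.get_bucket(aws_bucket)

# get_key() performs a HEAD request, so content_type and any existing
# user metadata come back populated (a bare Key(bucket) has neither)
k = bucket.get_key(key)

# Merge the new metadata with the existing values, and carry the
# original Content-Type across the copy so it is not reset
metadata = k.metadata.copy()
metadata['Content-Type'] = k.content_type
metadata[meta_key] = meta_value

# Copy the key onto itself; passing a metadata dict replaces the old
# metadata, which is why Content-Type has to be included explicitly
k = k.copy(k.bucket.name, k.name, metadata=metadata, preserve_acl=True)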