boto.s3: copy() on a key object loses 'Content-Type' metadata

Here's some sample code for copying an S3 key. There are many reasons you might want to do this, one of which is to update key metadata. And while this seems to be the widely accepted way to do that, there is a big problem: when I run the example below, I actually lose my Content-Type, which falls back to 'application/octet-stream' (not very useful when trying to serve web images).

from boto.s3.connection import S3Connection
from boto.s3.key import Key

# Get bucket
conn = S3Connection(self._aws_key, self._aws_secret)
bucket = conn.get_bucket(self._aws_bucket)

# Create key
k = Key(bucket)
k.key = key

# Copy the key onto itself with updated metadata
k.metadata.update({meta_key: meta_value})
k2 = k.copy(k.bucket.name, k.name, k.metadata, preserve_acl=True)
k = k2

Any ideas? Thanks.

asked Feb 03 '12 by Ryan_IRL


2 Answers

The following GitHub Gist worked for me:

import boto

s3 = boto.connect_s3()
bucket = s3.lookup('mybucket')
key = bucket.lookup('mykey')

# Copy the key onto itself, preserving the ACL but changing the content type
key.copy(key.bucket, key.name, preserve_acl=True, metadata={'Content-Type': 'text/plain'})

# Re-fetch the key so the updated attributes are visible locally
key = bucket.lookup('mykey')
print key.content_type

Took a looong time to run though!
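
Note that passing metadata= makes boto send S3's REPLACE metadata directive, which swaps out the key's metadata wholesale, so any custom metadata not in the dict is dropped. A minimal sketch of carrying the existing metadata over while changing only the content type (the bucket and key names are assumptions):

import boto

s3 = boto.connect_s3()
bucket = s3.get_bucket('mybucket')  # assumed bucket name
key = bucket.get_key('mykey')       # get_key() populates metadata and content_type

# metadata= triggers the REPLACE directive, which drops anything omitted,
# so merge the existing user metadata with the new Content-Type.
merged = key.metadata.copy()
merged['Content-Type'] = 'text/plain'  # the new type you want

key.copy(key.bucket.name, key.name, metadata=merged, preserve_acl=True)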

answered Sep 23 '22 by Justin Watt


Take a look at this post.

You need to do a

key = bucket.get_key(key.name)

first; then

metadata['Content-Type'] = key.content_type

will work. Otherwise, key.content_type will return 'application/octet-stream', because a Key object that was never fetched from S3 still has the default content type.
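
Putting that together, a minimal sketch (the bucket and key names are assumptions):

from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection()                 # credentials from the environment
bucket = conn.get_bucket('mybucket')  # assumed bucket name

fresh = Key(bucket)
fresh.key = 'mykey'
print(fresh.content_type)   # 'application/octet-stream' -- never fetched from S3

fetched = bucket.get_key('mykey')  # HEAD request fills in the real attributes
print(fetched.content_type)        # the actual type, e.g. 'image/png'

metadata = fetched.metadata.copy()
metadata['Content-Type'] = fetched.content_type  # preserve it across the copy
fetched.copy(fetched.bucket.name, fetched.name, metadata=metadata, preserve_acl=True)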

answered Sep 22 '22 by user3236227