 

Write files and objects to Amazon S3

I am using Amazon S3 to distribute dynamically generated files.

On a local server, I can use

destination = open(VIDEO_DIR + newvideo.name, 'wb+')

to store generated videos under the path VIDEO_DIR + newvideo.name.

Is there a feasible way to point VIDEO_DIR at an S3 endpoint, so that the dynamically generated videos are written directly to S3?

Another question: is there a feasible way to write an object to S3 directly? For example, given chunklet = Chunklet(), how can I write this chunklet object straight to S3?

Currently I first create a local file and then upload it with the S3 API. For example:

import mimetypes
from boto.s3.key import Key

mime = mimetypes.guess_type(filename)[0]
k = Key(b)  # b is an existing boto bucket
k.key = filename
k.set_metadata("Content-Type", mime)
k.set_contents_from_filename(filename)  # uploads the local file
k.set_acl('public-read')

But I want to avoid the intermediate local file for efficiency. I am using Python.

asked Sep 25 '12 by susanne


1 Answer

Use the boto library to access your S3 storage. You still have to write your data to a (temporary) file before you can send it, though, as boto's stream-writing methods have not yet been implemented.

I'd use a context manager to work around that limitation:

import tempfile
from contextlib import contextmanager

@contextmanager
def s3upload(key):
    with tempfile.SpooledTemporaryFile(max_size=1024*10) as buffer:  # Size in bytes
        yield buffer  # After this, the file is typically written to
        buffer.seek(0)  # So that reading the file starts from its beginning
        key.set_contents_from_file(buffer)
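
SpooledTemporaryFile keeps the buffered data in memory until it grows past max_size, then transparently rolls over to a real file on disk, so small uploads never touch the filesystem.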

Use it as a context-managed file object:

k = Key(b)
k.key = filename
k.set_metadata("Content-Type", mime)

with s3upload(k) as out:
    out.write(chunklet)
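
Note that set_contents_from_file() sends raw bytes, so if chunklet is an arbitrary Python object rather than a byte string, serialize it first. A minimal sketch using pickle, assuming chunklet is picklable and using a hypothetical key name:

import pickle

k = Key(b)
k.key = 'chunklet.pkl'  # hypothetical key name for the serialized object

with s3upload(k) as out:
    pickle.dump(chunklet, out)  # writes the pickled bytes into the spooled buffer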
answered Oct 11 '22 by Martijn Pieters