 

Writing a bytes stream to S3 using Python

I have a zip file loaded into memory (it is not persisted on disk). The zip file contains jpg images. I am trying to upload each jpg to S3, but I get an error.

# already have an opened zipfile stored in zip_file
# already connected to s3
import io

files = zip_file.namelist()

for f in files:
    im = io.BytesIO(zip_file.read(f))
    s3_key.key = f
    s3_key.set_contents_from_stream(im)   # this is the call that fails

I get the following error:

BotoClientError: BotoClientError: s3 does not support chunked transfer

What am I doing wrong?

asked Feb 11 '14 by IUnknown

2 Answers

Here is the solution. I was overthinking the problem.

files = zip_file.namelist()

for f in files:
    data = zip_file.read(f)          # read the whole member into bytes
    s3_key.key = f
    s3_key.set_contents_from_string(data)

That's all it took.
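For a self-contained picture of the whole flow, here is a sketch using only the standard library: build a small zip in memory, then read each member's bytes exactly as the answer does. The boto2 call is shown as a comment since it needs live credentials; the member names and payloads are made up for illustration.

```python
import io
import zipfile

# Build an in-memory zip with two fake "jpg" members (illustrative data)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("a.jpg", b"\xff\xd8fake-jpeg-a")
    zf.writestr("b.jpg", b"\xff\xd8fake-jpeg-b")
buf.seek(0)

zip_file = zipfile.ZipFile(buf)

for f in zip_file.namelist():
    data = zip_file.read(f)              # full bytes of one member
    # s3_key.key = f                     # boto2: set the object name
    # s3_key.set_contents_from_string(data)
    print(f, len(data))
```

Because `zip_file.read(f)` returns a complete `bytes` object, boto knows the payload size up front and can send a normal Content-Length upload instead of a chunked one.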

answered Sep 28 '22 by IUnknown


Boto supports other storage services besides S3, such as Google Cloud Storage. The set_contents_from_stream method only works with services that support chunked transfer (see https://codereview.appspot.com/4515170); S3 does not (see the AWS Technical FAQs at http://aws.amazon.com/articles/1109).

It's unfortunate, but you can't upload directly from a stream to S3 with this method.
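One caveat worth noting: the limitation is about streams whose size boto cannot learn up front. A seekable file-like object such as `io.BytesIO` can report its own length, which is why uploading the fully-read bytes works. A minimal stdlib sketch of that size check (the boto2 call is commented out, as it needs a live connection, and whether `set_contents_from_file` accepts your particular stream is an assumption to verify against your boto version):

```python
import io

im = io.BytesIO(b"\xff\xd8fake-jpeg-bytes")  # illustrative payload

# A seekable stream can report a Content-Length:
im.seek(0, io.SEEK_END)
size = im.tell()
im.seek(0)

# s3_key.key = "photo.jpg"
# s3_key.set_contents_from_file(im)  # boto2 Key method for file-like objects
print(size)
```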

answered Sep 28 '22 by Charles Engelke