Upload file from memory to S3

I have downloaded a CSV file from S3 into memory and edited it using Boto3 and Python. How can I reupload this file to S3 without ever storing it locally?

Ciaran asked Nov 21 '19

People also ask

How do I transfer files to S3 bucket?

To upload folders and files to an S3 bucket: Sign in to the AWS Management Console and open the Amazon S3 console at https://console.aws.amazon.com/s3/ . In the Buckets list, choose the name of the bucket that you want to upload your folders or files to. Choose Upload.

How do I upload local files to AWS S3?

You have two options for uploading files: AWS Management Console: Use drag-and-drop to upload files and folders to a bucket. AWS CLI: With the version of the tool installed on your local machine, use the command line to upload files and folders to the bucket.

Can we upload CSV file to S3 bucket?

Upload your CSV files to an Amazon Simple Storage Service (Amazon S3) bucket. This is the location that Amazon Personalize imports your data from. For more information, see Uploading Files and Folders by Using Drag and Drop in the Amazon Simple Storage Service User Guide.

What is the best way for the application to upload the large files in S3?

When you upload large files to Amazon S3, it's a best practice to leverage multipart uploads. If you're using the AWS Command Line Interface (AWS CLI), then all high-level aws s3 commands automatically perform a multipart upload when the object is large. These high-level commands include aws s3 cp and aws s3 sync.


2 Answers

As per @JamesMchugh, from put_object():

response = client.put_object(
    Body=b'bytes'|file,  # either raw bytes or a file-like object
    Bucket='string',
    Key='string',
)
John Rotenstein answered Oct 16 '22

In my case, I had a list of dictionaries and needed to create an in-memory file and save it to S3. The following code works for me:

import csv
import boto3
from io import StringIO
# input list
list_of_dicts = [{'name': 'name 1', 'age': 25}, {'name': 'name 2', 'age': 26}, {'name': 'name 3', 'age': 27}]

# convert the list of dicts to a list of rows, header first
header = list(list_of_dicts[0].keys())
file_data = [header] + [[d[key] for key in header] for d in list_of_dicts]

# create in memory file and write data to it.
file_to_save = StringIO()
csv.writer(file_to_save).writerows(file_data)
file_to_save = bytes(file_to_save.getvalue(), encoding='utf-8')
file_name_on_s3 = 'my_data.csv'

# save the in-memory file to S3
# (inline credentials shown for illustration; prefer environment
# variables or an IAM role in real code)
client = boto3.client('s3',
                      aws_access_key_id='your access key',
                      aws_secret_access_key='your secret key')
response = client.put_object(
    Body=file_to_save,
    Bucket='your bucket name',
    Key=file_name_on_s3,
)
Waqas Ali answered Oct 16 '22
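As a variation on the answer above, the in-memory bytes can also be wrapped in `io.BytesIO` and streamed with `upload_fileobj`, which takes a file-like object instead of raw bytes. A sketch; bucket and key names are placeholders:

```python
import io

def upload_bytes(client, bucket, key, data):
    """Stream in-memory bytes to S3 via a BytesIO wrapper,
    without writing a local file. (`client` is a boto3 S3 client.)"""
    client.upload_fileobj(io.BytesIO(data), bucket, key)

# Usage (requires boto3 and valid AWS credentials):
# client = boto3.client("s3")
# upload_bytes(client, "your bucket name", "my_data.csv", file_to_save)
```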