
How to upload a very large file to S3?

I have a Postgres backup of about 100 GB that I want to upload to S3 in EU (Frankfurt) and then restore into a cloud database.

I have no access to the AWS Import/Export service, and I'm working from an Ubuntu laptop.

Strategies I have tried:

1) Management console upload: at least 2 weeks needed
2) Bucket Explorer multipart upload: the task fails with a Java memory error every time
3) SDK multipart upload (boto, boto3, Java SDK): no progress bar, so I can't estimate how long it will take (see the sketch after this list)
4) Other Windows explorer tools: no Linux version available
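(For context on strategy 3: boto3's upload_file accepts a Callback, so progress can be printed even though there is no built-in bar. The following is only a rough sketch; the path, bucket name, and key are placeholders from above, not working values.)

import os
import sys
import threading
import boto3

class ProgressPercentage:
    # Callback object: boto3 calls it with the number of bytes
    # transferred so far, so a running percentage can be printed.
    def __init__(self, filename):
        self._filename = filename
        self._size = float(os.path.getsize(filename))
        self._seen_so_far = 0
        self._lock = threading.Lock()

    def __call__(self, bytes_amount):
        with self._lock:
            self._seen_so_far += bytes_amount
            pct = (self._seen_so_far / self._size) * 100
            sys.stdout.write("\r%s  %.2f%%" % (self._filename, pct))
            sys.stdout.flush()

s3 = boto3.client("s3")
# upload_file switches to multipart automatically for large files
s3.upload_file("/PATH_TO_BACKUP/BACKUP_FILE", "BUCKETNAME", "BACKUP_FILE",
               Callback=ProgressPercentage("/PATH_TO_BACKUP/BACKUP_FILE"))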

What is the fastest way to load this into S3? A code snippet in either Python or Java would also be appreciated. Thanks a lot.

asked Nov 13 '15 by Hello lad

People also ask

How do I transfer large files to aws S3?

Instead of using the Amazon S3 console, try uploading the file using the AWS Command Line Interface (AWS CLI) or an AWS SDK. Note: If you use the Amazon S3 console, the maximum file size for uploads is 160 GB. To upload a file that is larger than 160 GB, use the AWS CLI, AWS SDK, or Amazon S3 REST API.

What is the largest size file you can transfer to S3?

Individual Amazon S3 objects can range in size from a minimum of 0 bytes to a maximum of 5 TB. The largest object that can be uploaded in a single PUT is 5 GB.

Can we upload 6tb file to S3?

The size of an object in S3 can range from a minimum of 0 bytes to a maximum of 5 terabytes. If you are looking to upload an object larger than 5 gigabytes, you need to either use multipart upload or split the file into logical chunks of up to 5 GB and upload them manually as regular uploads.
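As a rough illustration of multipart upload in boto3, the transfer behaviour can be tuned through TransferConfig; the threshold, part size, and concurrency below are arbitrary example values, and the path and bucket name are placeholders:

import boto3
from boto3.s3.transfer import TransferConfig

# Example values only: switch to multipart above 100 MB, use 64 MB parts,
# and upload up to 10 parts in parallel.
config = TransferConfig(multipart_threshold=100 * 1024 * 1024,
                        multipart_chunksize=64 * 1024 * 1024,
                        max_concurrency=10)

s3 = boto3.client("s3")
s3.upload_file("/PATH_TO_BACKUP/BACKUP_FILE", "BUCKETNAME", "BACKUP_FILE",
               Config=config)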


1 Answer

The easiest solution would be to use the AWS CLI (https://aws.amazon.com/de/cli/).

aws s3 cp /PATH_TO_BACKUP/BACKUP_FILE s3://BUCKETNAME
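The CLI performs multipart uploads automatically for large files. If needed, its part size and parallelism can be adjusted through its configuration; the values below are examples, not recommendations:

aws configure set default.s3.multipart_chunksize 64MB
aws configure set default.s3.max_concurrent_requests 20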
answered Sep 28 '22 by Andreas