I have a PostgreSQL backup of about 100 GB that I want to upload to S3 in EU (Frankfurt) and then restore into a cloud database.
I am working from an Ubuntu laptop and have no access to the AWS Import/Export service.
Strategies I have tried:
1) Upload through the Management Console: would take at least two weeks.
2) Bucket Explorer multipart upload: the task fails every time with a Java memory error.
3) SDK multipart upload (boto, boto3, Java SDK): shows no progress bar, so I cannot estimate how long it will take.
4) Other Windows explorer-style tools: no Linux version available.
What is the fastest way to load this file into S3? A code snippet in either Python or Java would be appreciated. Thanks a lot.
Instead of using the Amazon S3 console, try uploading the file using the AWS Command Line Interface (AWS CLI) or an AWS SDK. Note: If you use the Amazon S3 console, the maximum file size for uploads is 160 GB. To upload a file that is larger than 160 GB, use the AWS CLI, AWS SDK, or Amazon S3 REST API.
Individual Amazon S3 objects can range in size from a minimum of 0 bytes to a maximum of 5 TB. The largest object that can be uploaded in a single PUT is 5 GB.
The size of an object in S3 ranges from a minimum of 0 bytes to a maximum of 5 terabytes, but the largest object that can be uploaded in a single PUT is 5 GB. To upload anything larger than 5 GB you therefore need either a multipart upload (see the boto3 sketch below) or to split the file into logical chunks of up to 5 GB and upload them individually as regular uploads.
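If you want to stay with boto3 (strategy 3) but were missing a progress indicator, a minimal sketch along these lines should work. The file path, bucket name, and object key are placeholders reused from the question and the CLI command below, and the 64 MB part size and 10 threads are just reasonable starting values, not required settings.

import os
import sys
import threading

import boto3
from boto3.s3.transfer import TransferConfig

FILE_PATH = "/PATH_TO_BACKUP/BACKUP_FILE"   # placeholder path
BUCKET = "BUCKETNAME"                        # placeholder bucket name
KEY = "BACKUP_FILE"                          # placeholder object key

class ProgressPercentage:
    """Callback that prints how much of the file has been transferred so far."""
    def __init__(self, filename):
        self._size = float(os.path.getsize(filename))
        self._seen = 0
        self._lock = threading.Lock()

    def __call__(self, bytes_amount):
        with self._lock:
            self._seen += bytes_amount
            percent = self._seen / self._size * 100
            sys.stdout.write("\r%s  %.2f%%" % (FILE_PATH, percent))
            sys.stdout.flush()

# Multipart upload: 64 MB parts, 10 parts uploaded in parallel.
config = TransferConfig(multipart_threshold=64 * 1024 * 1024,
                        multipart_chunksize=64 * 1024 * 1024,
                        max_concurrency=10,
                        use_threads=True)

s3 = boto3.client("s3", region_name="eu-central-1")  # EU (Frankfurt)
s3.upload_file(FILE_PATH, BUCKET, KEY,
               Config=config,
               Callback=ProgressPercentage(FILE_PATH))

upload_file handles the multipart splitting and retries for you; the Callback is invoked with the number of bytes transferred for each part, which is what drives the percentage output above.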
The easiest solution would be to use the AWS CLI (https://aws.amazon.com/de/cli/):
aws s3 cp /PATH_TO_BACKUP/BACKUP_FILE s3://BUCKETNAME
For a file of this size, aws s3 cp automatically switches to a multipart upload and prints transfer progress in the terminal, so there is nothing extra to configure.