I need to send ~2 TB of backup files to S3. I figured the most hassle-free option would be the Linux scp command (I've had difficulty with s3cmd and don't want an overkill Java/RoR solution).
However, I'm not sure whether that's possible: how would I use S3's private and public keys with scp, and what would my destination IP/URL/path be?
I appreciate your hints.
You can't SCP to S3 directly. The quickest way, if you don't mind spending money, is probably just to send it to them on a disk and they'll put it up there for you. See their Import/Export service.
The SFTP Gateway is a proxy server that provides a secure and convenient way to upload and download files to and from S3 buckets over the SFTP and SCP protocols. Access is managed through IAM users, and you authenticate with the SFTP Gateway using IAM user credentials.
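For instance, once a gateway is deployed, an upload could look like the following (the hostname, username, and bucket path here are assumptions about a particular deployment, not fixed values):

scp /backups/backup.tar myiamuser@sftp-gateway.example.com:/my-backup-bucket/backups/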
As of 2015, SCP/SSH is not supported (and probably never will be for the reasons mentioned in the other answers).
Use the official AWS CLI command line tool (pip3 install awscli). Note that credentials need to be specified; I prefer to supply them via environment variables rather than a file: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
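A minimal credential setup might look like this (the key values below are AWS's documentation placeholders, not real credentials):

export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

With credentials in place, you can copy files with aws s3 cp: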
aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"
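For the asker's single ~2 TB archive, a plain copy should work as well; the high-level aws s3 commands switch to multipart uploads automatically for large files (the bucket and file names here are assumptions):

aws s3 cp /backups/backup.tar s3://my-backup-bucket/backup.tar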
and an rsync-like command:
aws s3 sync . s3://mybucket
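By default sync only copies new or changed files; the standard --delete flag additionally removes remote files that no longer exist locally, e.g. (bucket name assumed):

aws s3 sync /backups s3://mybucket/backups --delete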
Web interface: https://console.aws.amazon.com/s3/home
Any other solutions depend on third-party executables (e.g. botosync, jungledisk...), which can be great as long as they are supported. But third-party tools come and go as the years go by, and your scripts will have a shorter shelf life.
EDIT: Actually, AWS CLI is based on botocore:
https://github.com/boto/botocore
So botosync deserves a bit more respect as an elder statesman than I perhaps gave it.