S3 moving files between buckets on different accounts?

I'm doing some work for a client that has 2 separate AWS accounts. We need to move all the files in a bucket on one account to a new bucket on the 2nd account.

We thought that s3cmd would allow this, using the format:

s3cmd cp s3://bucket1 s3://bucket2 --recursive 

However, this only allows me to use the keys of one account; I can't specify the credentials of the 2nd account.

Is there a way to do this without downloading the files and uploading them again to the 2nd account?

asked Oct 03 '12 by Geuis

People also ask

Are S3 buckets unique across accounts?

Amazon S3 supports global buckets, which means that each bucket name must be unique across all AWS accounts in all the AWS Regions within a partition.

Why can't I copy an object between two Amazon S3 buckets?

If the object that you can't copy between buckets is owned by another account, the object owner can grant the bucket owner full control of the object. Once the bucket owner owns the object, the bucket policy applies to it.
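For illustration, a minimal sketch of how the object owner could grant that control with the AWS CLI (the bucket and key names here are placeholders):

aws s3api put-object-acl --bucket example-bucket --key example-object --acl bucket-owner-full-control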


2 Answers

You don't have to open permissions to everyone. Use the bucket policies below on the source and destination buckets to copy from a bucket in one account to another with an IAM user:

  • Bucket to Copy from: SourceBucket

  • Bucket to Copy to: DestinationBucket

  • Source AWS Account ID: XXXX-XXXX-XXXX

  • Source IAM User: src-iam-user

The policies below mean that the IAM user XXXX-XXXX-XXXX:src-iam-user has s3:ListBucket and s3:GetObject privileges on SourceBucket/* and s3:ListBucket and s3:PutObject privileges on DestinationBucket/*.

On the SourceBucket, the policy should look like:

{   "Id": "Policy1357935677554",   "Statement": [{     "Sid": "Stmt1357935647218",     "Action": ["s3:ListBucket"],     "Effect": "Allow",     "Resource": "arn:aws:s3:::SourceBucket",     "Principal": {"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/src–iam-user"}   }, {     "Sid": "Stmt1357935676138",     "Action": ["s3:GetObject"],     "Effect": "Allow",     "Resource": "arn:aws:s3:::SourceBucket/*",     "Principal": {"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/src–iam-user"}   }] } 

On the DestinationBucket the policy should be:

{   "Id": "Policy1357935677555",   "Statement": [{     "Sid": "Stmt1357935647218",     "Action": ["s3:ListBucket"],     "Effect": "Allow",     "Resource": "arn:aws:s3:::DestinationBucket",     "Principal": {"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/src–iam-user"}   }, {     "Sid": "Stmt1357935676138",     "Action": ["s3:PutObject"],     "Effect": "Allow",     "Resource": "arn:aws:s3:::DestinationBucket/*",     "Principal": {"AWS": "arn:aws:iam::XXXXXXXXXXXX:user/src–iam-user"}   }] } 

The command to run is:

s3cmd cp s3://SourceBucket/File1 s3://DestinationBucket/File1
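To copy every file, as in the original question, the same policies support a recursive copy. A minimal sketch, assuming s3cmd runs as src-iam-user (the key values shown are placeholders for that user's credentials):

s3cmd --access_key=SRC_IAM_USER_KEY --secret_key=SRC_IAM_USER_SECRET cp --recursive s3://SourceBucket/ s3://DestinationBucket/

Because that one set of credentials is authorized on both buckets by the bucket policies, the whole transfer runs without switching accounts.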

answered Sep 20 '22 by Robs


Bandwidth inside AWS is not billed, so you could save some money and time by doing it all from a box inside AWS, as long as the buckets are in the same region.
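For instance, a minimal sketch of that approach, assuming the AWS CLI is available on an EC2 instance whose credentials can read the source bucket and write to the destination:

aws s3 sync s3://SourceBucket s3://DestinationBucket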

As for doing it without having the files touch down on a computer somewhere: I don't think so.

Except: since AWS does bulk uploads from hard drives you mail to them, they might do the same for you for a bucket-to-bucket transfer.

answered Sep 20 '22 by Tom Andersen