Duplicate file in Amazon S3

Tags:

c#

amazon-s3

I'm trying to duplicate a file from one bucket to another, but I can't seem to see the new file in the destination bucket.

I'm getting no errors at all...

Request:

[screenshot of the copy request, not reproduced]

Response:

<?xml version="1.0" encoding="UTF-8"?>
<CopyObjectResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <LastModified>2012-04-08T11:26:36.000Z</LastModified>
    <ETag>&quot;a5f9084078981b64737b57dbf1735fcf&quot;</ETag>
</CopyObjectResult>

But when I check the Last Modified date on S3, I can't find any trace of this new file, nor can I access it directly at:

http://jk-v20.s3.amazonaws.com/PublicFiles/3ff28e21-4801-47c6-a6d0-e370706d303f_Content_Favicon.ico

What am I doing wrong?


Method:

public void DuplicateFileInCloud(string original, string destination)
{
    try
    {
        CopyObjectRequest request = new CopyObjectRequest();

        if (original.StartsWith("http"))
        {
            // could be from other bucket, URL will show all data
            // example: http://jk-v30.s3.amazonaws.com/PredefinedFiles/Favicons/002.ico
            string bucket = getBucketNameFromUrl(original), // jk-v30
                    key = getKeyFromUrl(original);          // PredefinedFiles/Favicons/002.ico

            request.WithSourceBucket(bucket);
            request.WithSourceKey(key);
        }
        else
        {
            // same bucket: copy/paste operation
            request.WithSourceBucket(this.bucketName);
            request.WithSourceKey(original);
        }

        request.WithDestinationBucket(this.bucketName);
        request.WithDestinationKey(destination);
        request.CannedACL = S3CannedACL.PublicRead;

        using (AmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client(this.accessKey, this.secretAccessKey))
        {
            S3Response response = client.CopyObject(request);
            response.Dispose();
        }
    }
    catch (AmazonS3Exception)
    {
        // rethrow with "throw;" so the original stack trace is preserved
        // ("throw s3Exception;" would have reset it)
        throw;
    }
}
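
The question doesn't show getBucketNameFromUrl and getKeyFromUrl. A plausible sketch of those helpers built on System.Uri (hypothetical implementations, not the asker's actual code) would be:

private string getBucketNameFromUrl(string original)
{
    // Host of "http://jk-v30.s3.amazonaws.com/..." is "jk-v30.s3.amazonaws.com";
    // the bucket name is the first dot-separated label.
    return new Uri(original).Host.Split('.')[0];          // jk-v30
}

private string getKeyFromUrl(string original)
{
    // AbsolutePath is "/PredefinedFiles/Favicons/002.ico"; trim the leading
    // slash so it doesn't become part of the S3 key.
    return new Uri(original).AbsolutePath.TrimStart('/'); // PredefinedFiles/Favicons/002.ico
}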
asked Apr 08 '12 by balexandre


People also ask

Does S3 allow duplicates?

S3 Replication supports two-way replication between two or more buckets in the same or different AWS Regions. While live replication like CRR and SRR automatically replicates newly uploaded objects as they are written to your bucket, S3 Batch Replication allows you to replicate existing objects.

How do I find duplicates in S3 bucket?

There is no "find duplicates" command in Amazon S3. However, you can do the following: retrieve a list of the objects in the bucket, then look for objects that share the same ETag (checksum) and Size.
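
A minimal sketch of that approach, using the same v1-era AWS SDK for .NET as the question (the client and bucket name are assumed; note that an ETag is only a plain MD5 checksum for non-multipart uploads):

using System;
using System.Collections.Generic;
using System.Linq;
using Amazon.S3;
using Amazon.S3.Model;

public static void FindDuplicates(AmazonS3 client, string bucketName)
{
    // Page through every object in the bucket.
    List<S3Object> objects = new List<S3Object>();
    string marker = null;
    bool truncated;
    do
    {
        ListObjectsResponse response = client.ListObjects(
            new ListObjectsRequest()
                .WithBucketName(bucketName)
                .WithMarker(marker));
        objects.AddRange(response.S3Objects);
        truncated = response.IsTruncated;
        if (truncated)
            marker = objects[objects.Count - 1].Key;
    } while (truncated);

    // Objects sharing both ETag and Size are very likely duplicates.
    foreach (var group in objects.GroupBy(o => new { o.ETag, o.Size })
                                 .Where(g => g.Count() > 1))
    {
        Console.WriteLine("Possible duplicates: " +
            string.Join(", ", group.Select(o => o.Key).ToArray()));
    }
}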

What happens if we upload same file to S3?

By default, when you upload a file with the same name, it will overwrite the existing file. If you want to keep the previous file available, you need to enable versioning on the bucket.
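
Versioning is a one-time bucket setting. A sketch using the current AWS SDK for .NET naming (v3-style, newer than the SDK in the question; treat the exact type names as an assumption):

using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

public static async Task EnableVersioningAsync(IAmazonS3 client, string bucketName)
{
    // Once enabled, overwriting a key creates a new version
    // instead of destroying the old object.
    await client.PutBucketVersioningAsync(new PutBucketVersioningRequest
    {
        BucketName = bucketName,
        VersioningConfig = new S3BucketVersioningConfig
        {
            Status = VersionStatus.Enabled
        }
    });
}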


1 Answer

http://jk-v20.s3.amazonaws.com//PublicFiles/3ff28e21-4801-47c6-a6d0-e370706d303f_Content_Favicon.ico

is where the file actually is (note the double slash, //). If you hit this URL you will see the .ico file, so the problem is a leading slash in the destination key, which may be added automatically by your toolset.
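
In other words, the copy did succeed; the object was just stored under the key "/PublicFiles/..." rather than "PublicFiles/...". A one-line guard in the question's DuplicateFileInCloud would avoid that (a sketch of the fix, not the asker's confirmed code):

// strip any leading slash so it doesn't become part of the S3 key
request.WithDestinationKey(destination.TrimStart('/'));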

answered Oct 16 '22 by Tom Andersen