
How do I point a subdomain at an S3 bucket?

Good morning,

I am using an Amazon S3 bucket as an image server, and I want to point a subdomain of my site at this bucket. For example, an image currently lives at https://s3-sa-east-1.amazonaws.com/nomeBucket/pasta/imag.png, and I access it through that same link.

I would like it to be imagens.mydomain.com.br/folder/imag.png instead. Is there any way to do this, i.e. point a subdomain at a bucket? I've tried Amazon Route 53 with a CNAME pointing to https://s3-sa-east-1.amazonaws.com/nomeBucket/

I ran the test yesterday, but apparently it did not work. Has anyone done something similar and/or knows how to help me?

Note: I'm using nginx. Do I also need to configure it for the subdomain?

Thank you

asked Sep 12 '13 by samuel_R

3 Answers

You need to rename your bucket to match the custom domain name (e.g. imagens.mydomain.com.br) and set up that domain as a CNAME to <bucket-name>.s3.amazonaws.com (in your case, imagens.mydomain.com.br.s3.amazonaws.com).

The full instructions are available here:

http://docs.aws.amazon.com/AmazonS3/latest/dev/VirtualHosting.html
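If your DNS is hosted in Route 53, the CNAME can also be created from the command line. This is only a sketch: the hosted zone ID (Z1EXAMPLE) is a placeholder, and the record name and target are the ones from the example above.

# upsert a CNAME pointing the subdomain at the matching S3 bucket
aws route53 change-resource-record-sets \
  --hosted-zone-id Z1EXAMPLE \
  --change-batch '{
    "Changes": [{
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "imagens.mydomain.com.br",
        "Type": "CNAME",
        "TTL": 300,
        "ResourceRecords": [{"Value": "imagens.mydomain.com.br.s3.amazonaws.com"}]
      }
    }]
  }'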

answered Oct 25 '22 by dcro


Update 2019: AWS subdomain hosting in S3

As of today, the following steps worked to get a working subdomain for an AWS S3 hosted static website (a CLI sketch of the same steps follows after this list):

  1. Create a bucket with the subdomain name, in this example www.subtest.mysite.com.

     Note: On the bucket's 'Permissions' tab, make sure that 1. Block public access (bucket settings), 2. Access Control List and 3. Bucket policy are set appropriately so the bucket is public. (Assuming you already did this for your root domain bucket, those settings can be mirrored on this subdomain bucket.)

  2. Upload the index.html file to the bucket.

  3. Create a CNAME record with your domain provider (Namecheap, in this example).
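If you prefer the AWS CLI to the console, here is a minimal sketch of the same steps. The bucket name comes from the example above; the region is an assumption you would replace with your own:

# 1. create the bucket named after the subdomain (region is an assumption)
aws s3 mb s3://www.subtest.mysite.com --region us-east-1

# 2. upload the index page
aws s3 cp index.html s3://www.subtest.mysite.com/

# enable static website hosting, serving index.html as the index document
aws s3 website s3://www.subtest.mysite.com --index-document index.html

The CNAME record (step 3) is still created at your domain provider, as described above.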
answered Oct 25 '22 by zennni


I'm going to build on the other answers here for completeness.

I have moved my bucket to a subdomain so that the contents can be cached by Cloudflare.

  • Old S3 Bucket Name: autoauctions
  • New S3 Bucket Name: img.autoauctions.io
  • CNAME record: img.autoauctions.io.s3.amazonaws.com

Now you'll need to copy all of your objects since you cannot rename a bucket. Here's how to do that with AWS CLI:

pip install awscli
aws configure

  • Go to https://console.aws.amazon.com/iam/home and create a user, or open an existing one
  • Go to the user's Security credentials tab
  • Click Create access key and copy the access key ID and secret; aws configure will prompt you for them (a scripted alternative is sketched after this list)
  • Here's a list of AWS regions.
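If you would rather script the credential setup than answer the interactive prompts, the same values can be set with aws configure set. The key, secret, and region below are placeholders, not real values:

# placeholders only; substitute the access key and secret created above
aws configure set aws_access_key_id AKIAXXXXXXXXXXXXXXXX
aws configure set aws_secret_access_key your-secret-access-key
aws configure set default.region us-east-1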

Now you'll copy your old bucket contents to your new bucket.

aws s3 sync s3://autoauctions s3://img.autoauctions.io

I found this to be too slow for the 1TB of images I needed to copy, so I increased the number of concurrent connections and re-ran from an EC2 instance.

aws configure set default.s3.max_concurrent_requests 400

Sync it up!
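That is, re-run the same command as before; with 400 concurrent requests it gets through the bucket much faster:

aws s3 sync s3://autoauctions s3://img.autoauctions.io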


Want to make folders within your bucket public? Create a bucket policy like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::img.autoauctions.io/copart/*"
        },
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::img.autoauctions.io/iaai/*"
        }
    ]
}
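The answer doesn't show how the policy is attached to the bucket. One way, assuming the JSON above is saved locally as policy.json, is the s3api put-bucket-policy command:

# attach the public-read policy (saved as policy.json) to the new bucket
aws s3api put-bucket-policy --bucket img.autoauctions.io --policy file://policy.json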

And now the image loads from img.autoauctions.io via Cloudflare's cache.

Hope this helps some people!

answered Oct 25 '22 by Nick Woodhams