
How can I reduce my data transfer cost? Amazon S3 --> Cloudflare --> Visitor

I recently started using Amazon S3 to serve images to my visitors, since this reduces the load on my server. Now there is a new problem: today I looked at my AWS billing and noticed that I have a huge bill waiting for me - a total of 4 TB of AWS data transfer in 20 days.

Obviously this is because of the high amount of outgoing Amazon S3 traffic (to Cloudflare, which then serves the images to the visitors). To reduce the number of requested files, I want to set a Cache-Control header (since Cloudflare's crawler will respect that). I have changed my code from this:

$s3->putObjectFile($path, $bucket , 'images/'.$id.'.jpg', S3::ACL_PUBLIC_READ);

to this:

$s3->putObjectFile($path, $bucket , 'images/'.$id.'.jpg', S3::ACL_PUBLIC_READ, array('Cache-Control' => 'public,max-age=31536000'));

Still, it does not work. Cloudflare does not respect the caching, because the Cache-Control does not show up as "Cache-Control" in the response headers but as "x-amz-meta-cachecontrol", which Cloudflare ignores.
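(To see what S3 itself returns, bypassing Cloudflare, you can fetch the headers directly; a minimal sketch - the bucket URL is a placeholder:)

// Fetch the headers straight from the S3 endpoint to see whether
// Cache-Control is really set or has been turned into x-amz-meta-*.
$headers = get_headers('https://mybucket.s3.amazonaws.com/images/123.jpg', 1);
var_dump(isset($headers['Cache-Control']) ? $headers['Cache-Control'] : null);
var_dump(isset($headers['x-amz-meta-cachecontrol']) ? $headers['x-amz-meta-cachecontrol'] : null);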

Does anyone have an easy solution for this?

TL;DR: I have more or less the same problem as this guy: http://support.bucketexplorer.com/topic734.html (that was in 2008)

EDIT: I have stumbled upon this: Amazon S3 not caching images but unfortunately that solution does not work for me.

EDIT 2: It turns out it didn't work because I was using an old version of the "Amazon S3 class". I updated it and the code works now.

Thank you for your time.

asked Dec 14 '12 by Jonas Kaufmann


People also ask

Why is my AWS data transfer so expensive?

AWS data transfer costs fluctuate depending on the AWS region. Each AWS region has a data transfer fee within and outside of it. Data transfers between two regions cost more than within one region. Data transfers from one Availability Zone (AZ) to another are costlier than transfers within an AZ.

What is a cost effective way to minimize network issues caused by S3 multipart uploads?

To avoid storage charges for parts left behind by an incomplete multipart upload, create a lifecycle policy that cleans up incomplete multipart uploads after a certain number of days. You can use Amazon S3 API calls to list your in-progress multipart uploads.
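For example, a sketch of such a lifecycle rule using the AWS SDK for PHP v3 (the bucket name, rule ID, and 7-day window are placeholder choices, not prescribed values):

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

// Abort (and stop billing for) multipart uploads still incomplete 7 days
// after they were initiated, across the whole bucket.
$client->putBucketLifecycleConfiguration([
    'Bucket' => 'mybucket',
    'LifecycleConfiguration' => [
        'Rules' => [[
            'ID'     => 'abort-stale-multipart-uploads',
            'Status' => 'Enabled',
            'Filter' => ['Prefix' => ''], // empty prefix = entire bucket
            'AbortIncompleteMultipartUpload' => ['DaysAfterInitiation' => 7],
        ]],
    ],
]);

// To see what is lingering first:
// $client->listMultipartUploads(['Bucket' => 'mybucket']);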

Which types of data transfer are free for Amazon S3?

Standard data transfers from the Internet into Amazon S3 buckets are free, but data transferred out of Amazon S3 incurs costs.


2 Answers

If you are getting "x-amz-meta-cachecontrol", you are probably not setting the headers correctly - it may just be the exact way you are doing it in your code. This is supposed to work. I am guessing this is PHP using the Amazon S3 PHP class?

Try this:

$s3->putObject(
    file_get_contents($path),                             // object data ($path is the local file)
    $bucket,
    $url,                                                 // the object key, e.g. 'images/'.$id.'.jpg'
    S3::ACL_PUBLIC_READ,
    array(),                                              // meta headers: leave empty
    array('Cache-Control' => 'max-age=31536000, public')  // request headers: sent as real HTTP headers
);

In the S3 PHP docs putObjectFile is listed under Legacy Methods:

putObjectFile (string $file, 
               string $bucket, 
               string $uri, 
               [constant $acl = S3::ACL_PRIVATE], 
               [array $metaHeaders = array()], 
               [string $contentType = null])

Compare to this:

putObject (mixed $input, 
           string $bucket, 
           string $uri, 
           [constant $acl = S3::ACL_PRIVATE], 
           [array $metaHeaders = array()], 
           [array $requestHeaders = array()])

You need to set Cache-Control as a request header, but it appears that there is no way to set request headers with putObjectFile - only meta headers. You have to use putObject, give it an empty array for the meta headers, and then another array with the request headers (including Cache-Control).
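For larger files, the same class can stream from disk instead of loading the whole file into memory with file_get_contents(); a sketch assuming the class's S3::inputFile() helper:

// Same putObject() call, but streaming the file via S3::inputFile()
// (part of the same Amazon S3 PHP class) instead of file_get_contents().
if ($s3->putObject(
        S3::inputFile($path),
        $bucket,
        'images/'.$id.'.jpg',
        S3::ACL_PUBLIC_READ,
        array(),                                              // meta headers: none
        array('Cache-Control' => 'max-age=31536000, public')  // request headers
)) {
    echo "Uploaded with Cache-Control set\n";
}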

You can also try some of the other working examples I have listed below.

See also:

How to set the Expires and Cache-Control headers for all objects in an AWS S3 bucket with a PHP script (php)

Updating caching headers for Amazon S3 and CloudFront (python)

Set cache-control for entire S3 bucket automatically (using bucket policies?)

http://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectGET.html?r=5225

answered Sep 18 '22 by Alex I


You can now do this without code. Go to your S3 bucket in the AWS console, open the file, and set the Cache-Control header under the object's metadata properties.

(Screenshot: AWS console)
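If you have many existing objects, a scripted alternative is to copy each object onto itself with replaced metadata; a sketch using the AWS SDK for PHP v3 (bucket, key, and content type are placeholders):

require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

// S3 has no "edit metadata" call; copying an object onto itself with
// MetadataDirective => 'REPLACE' rewrites its headers in place.
$client->copyObject([
    'Bucket'            => 'mybucket',
    'Key'               => 'images/123.jpg',
    'CopySource'        => 'mybucket/images/123.jpg',
    'MetadataDirective' => 'REPLACE',
    'CacheControl'      => 'max-age=31536000, public',
    'ContentType'       => 'image/jpeg', // re-specify: REPLACE drops old metadata
    'ACL'               => 'public-read',
]);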

answered Sep 17 '22 by aWebDeveloper