
Automate deployment of static website files to S3 after the build

I have a React app (SPA) that is deployed on S3 and publicly accessible. After each build, I have to manually upload index.html and the other static assets to S3. Is there any way to automate this process?

I did an exhaustive search on CD (continuous deployment) to S3; here is an SO question on the same topic.

I have aggregated everything from that research into the answer below, which covers the various methods I found to achieve this.

asked Mar 29 '17 by Lakshman Diwaakar



2 Answers

There are many ways to automate S3 deployment. Here is what I gathered:

1. AWS SNS and Lambda:

This approach is useful if you want to trigger AWS services on a GitHub push. Here is the process:

  1. A GitHub push publishes a message to an SNS topic.
  2. A Lambda function subscribed to the SNS topic is invoked.
  3. Inside the Lambda, clone the GitHub repository.
  4. Use AWS's S3 SDK to upload the build or dist directory to your S3 bucket.

Here is a high-level architecture of the above process:

[Diagram: Automation using SNS & Lambda]

The downside of this approach is that cloning large repos takes time, and Lambda is billed for execution time, so this can become expensive for large repos.
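As an illustrative sketch of step 4 (assuming the repository has already been cloned and built into /tmp/repo/build inside the Lambda; the bucket name and paths are placeholders, not from the original answer), a Python handler using boto3 could look like this:

    import os
    import mimetypes
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "<bucket-name>"        # placeholder bucket name
    BUILD_DIR = "/tmp/repo/build"   # assumed location of the built assets

    def handler(event, context):
        # Walk the build directory and upload every file,
        # using the path relative to BUILD_DIR as the object key.
        for root, _, files in os.walk(BUILD_DIR):
            for name in files:
                path = os.path.join(root, name)
                key = os.path.relpath(path, BUILD_DIR)
                content_type = mimetypes.guess_type(path)[0] or "binary/octet-stream"
                s3.upload_file(path, BUCKET, key,
                               ExtraArgs={"ContentType": content_type})

Setting ContentType explicitly matters for static hosting: upload_file does not guess it, and S3 otherwise serves the files as binary/octet-stream.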

2. Travis:

Travis is known for its CI (continuous integration) service. A .travis.yml file drives the integration process.

If you want to run tests after the build and, on success, upload the files to S3, this approach is the best fit. Travis is free for open-source projects.

The downside is that I could not find a way to isolate a directory from the repo and upload only that specific directory.
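As a sketch of what this could look like (the bucket name is a placeholder; verify the options against the current Travis docs), Travis's S3 deploy provider is configured in .travis.yml, and its documented local_dir option is meant to upload a single directory, which may address the concern above:

    language: node_js
    node_js:
      - "16"
    script:
      - npm run build
    deploy:
      provider: s3
      access_key_id: $AWS_ACCESS_KEY_ID
      secret_access_key: $AWS_SECRET_ACCESS_KEY
      bucket: <bucket-name>    # placeholder bucket name
      local_dir: build         # upload only the build output
      skip_cleanup: true       # keep build artifacts for the deploy step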

3. AWS CLI:

This is the cheapest and best way to upload the files to S3, and it is the approach I used. I got the idea from this Medium post.

In React apps, builds are usually triggered by npm or yarn via scripts defined in package.json. Here is the command for uploading the files to S3:

aws s3 sync build/ s3://<bucket-name>

I added this command to the build scripts in package.json, which was very handy and automated the previously manual process of uploading the files to S3.
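For example, a scripts section along these lines chains the build and the upload (the build command assumes Create React App; adjust to your setup):

    {
      "scripts": {
        "build": "react-scripts build",
        "deploy": "npm run build && aws s3 sync build/ s3://<bucket-name> --delete"
      }
    }

The --delete flag removes objects from the bucket that no longer exist in the build output; omit it if you want to keep old files around.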

This answer is based on my perspective. If anything is incorrect or if I have missed something, please feel free to comment and I will add it to the answer.

answered Oct 05 '22 by Lakshman Diwaakar


I love @lakshman's answer, but it won't work for private/on-prem Bitbucket repos and some other scenarios we run into in the commercial world.

A similar, Bitbucket-friendly idea (and more) is to have the code cloned into CodeCommit and then trigger a CodePipeline that includes a CodeBuild step.

CodeBuild can push to S3. As a bonus, CodeBuild can run tests and include additional build steps as needed (migrations, etc.). Bitbucket has a mirror hook that allows CodeCommit to clone Bitbucket repos. I think this is the mirror hook link, but double-check yourself, as there are multiple Bitbucket plugins/extensions with the string "mirror" in the name.
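As a rough sketch (the runtime version and bucket name are assumptions, not from the original answer), the CodeBuild step's buildspec.yml could build the app and sync the output to S3:

    version: 0.2
    phases:
      install:
        runtime-versions:
          nodejs: 16
        commands:
          - npm ci
      build:
        commands:
          - npm run build
      post_build:
        commands:
          # push the built site to the bucket; --delete removes stale objects
          - aws s3 sync build/ s3://<bucket-name> --delete

The CodeBuild service role needs write access to the bucket for the sync to succeed.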

This SO question discusses a different Bitbucket hook and also covers cloning from GitLab and via JGit. Again, once the code is in CodeCommit, CodePipeline can take it from there.

Instead of using a Bitbucket hook, you can also use a Bitbucket Pipeline.
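A minimal bitbucket-pipelines.yml along these lines could do it (the aws-s3-deploy pipe and its version are assumptions; check Bitbucket's pipe registry for the current version):

    image: node:16

    pipelines:
      default:
        - step:
            script:
              - npm ci
              - npm run build
              # pipe name/version are an assumption; check the pipe registry
              - pipe: atlassian/aws-s3-deploy:1.1.0
                variables:
                  AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                  AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                  AWS_DEFAULT_REGION: us-east-1
                  S3_BUCKET: <bucket-name>
                  LOCAL_PATH: build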

answered Oct 05 '22 by John Vandivier