
Unable to select S3 folder as the source for CodePipeline on AWS

I'm trying to set up a CI/CD pipeline on AWS using CodePipeline with the following setup:

Source: S3

Build: CodeBuild

Deploy: CodeDeploy

Release: EC2 instance

I've managed to finish the configuration by following this link.

However, my pipeline fails with the following error message:

The object with key 'code-sashi/api' does not exist.

I have checked and confirmed that the bucket name is correct and that there is a folder 'api' inside the bucket.

The option's description clearly states that I can enter either an S3 object key or an S3 folder. I would like to use an S3 folder in this case, since my build artifact will only be ready after CodeBuild runs.

However, CodePipeline continues to look for an object key and ignores my folder.

I have also tried setting the S3 folder as 'api', '/api', and 'api/', and none of them work. I tried copying my files to the bucket directly and setting the folder as '/', which fails with a different error message: "object key cannot end with a trailing /".

Based on this link, I should be able to get all the contents of the bucket by setting the S3 folder to '/'.

If it helps, I am uploading files from a private Bitbucket repository to S3 using Bitbucket Pipelines. Unfortunately, CodePipeline cannot connect to Bitbucket directly, hence the S3 workaround. It's weird that CodeBuild has no problem connecting to a Bitbucket repository on its own, but it cannot do so when it is part of CodePipeline.

Question

How do I configure CodePipeline correctly to get my files from the 'code-sashi' bucket and the 'api' folder? There will be other folders containing code in the future, so I would like to keep all of them inside a single bucket.

Sashi asked Dec 06 '18


1 Answer

I figured this out finally.

Even if you want to use S3 as your source rather than just your artifact storage, the files must be zipped, and you must specify the zipped file as your object key.
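For reference, here is a sketch of what the S3 source action ends up looking like in the pipeline's declarative JSON. The bucket name is from this question; the key `api/api.zip` is an assumed name for the zipped artifact, not something from the original post:

```json
{
  "name": "Source",
  "actionTypeId": {
    "category": "Source",
    "owner": "AWS",
    "provider": "S3",
    "version": "1"
  },
  "configuration": {
    "S3Bucket": "code-sashi",
    "S3ObjectKey": "api/api.zip"
  },
  "outputArtifacts": [{ "name": "SourceOutput" }]
}
```

Note that `S3ObjectKey` points at a single zipped object, not a folder prefix.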

So I added a zip stage to my Bitbucket Pipeline and re-configured CodePipeline to use the zipped file as the source. It worked perfectly!
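A minimal sketch of that Bitbucket Pipelines step, assuming the `api/api.zip` key and AWS credentials stored as repository variables (the image and variable names here are illustrative, not from the original post):

```yaml
pipelines:
  default:
    - step:
        name: Zip and upload to S3
        image: atlassian/pipelines-awscli  # assumption: any image with zip + aws cli
        script:
          - zip -r api.zip . -x "*.git*"
          - aws s3 cp api.zip s3://code-sashi/api/api.zip
```

If you prefer to build the archive in code instead of with the `zip` CLI, the same packaging step can be sketched with Python's standard library (`zipfile`); the folder and archive names below are placeholders matching this setup:

```python
import os
import zipfile

def zip_folder(folder, zip_path):
    """Package every file under `folder` into one zip archive.

    CodePipeline's S3 source expects a single zipped object,
    so the whole source tree is bundled into one archive.
    Entries are stored relative to `folder`, so the unzipped
    layout matches the original directory structure.
    """
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(folder):
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, folder))

# Example: zip_folder("api", "api.zip"), then upload api.zip
# to S3 and point the pipeline's S3ObjectKey at it.
```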

Sashi answered Sep 25 '22