I am building a CI/CD pipeline with AWS CodePipeline. The source repository is on Bitbucket, and I used AWS CodeStar Connections to create a connection between the Bitbucket repository and the pipeline.
The pipeline details are below:
{
  "pipeline": {
    "name": "test_pipeline",
    "roleArn": "arn:aws:iam::<AccountId>:role/PipelineServiceRole",
    "artifactStore": {
      "type": "S3",
      "location": "tadadadada-artifact"
    },
    "stages": [
      {
        "name": "Source",
        "actions": [
          {
            "name": "Source",
            "actionTypeId": {
              "category": "Source",
              "owner": "AWS",
              "provider": "CodeStarSourceConnection",
              "version": "1"
            },
            "runOrder": 1,
            "configuration": {
              "BranchName": "dev",
              "ConnectionArn": "arn:aws:codestar-connections:us-east-2:<AccountId>:connection/4ca7b1cf-2917-4fda-b681-c5239944eb33",
              "FullRepositoryId": "<username>/repository_name",
              "OutputArtifactFormat": "CODE_ZIP"
            },
            "outputArtifacts": [
              {
                "name": "SourceArtifact"
              }
            ],
            "inputArtifacts": [],
            "region": "us-east-2",
            "namespace": "SourceVariables"
          }
        ]
      },
      {
        "name": "Build",
        "actions": [
          {
            ....
          }
        ]
      }
    ],
    "version": 1
  },
  "metadata": {
    "pipelineArn": "arn:aws:codepipeline:us-east-2:<AccountId>:test_pipeline",
    "created": 1611669087.267,
    "updated": 1611669087.267
  }
}
The PipelineServiceRole and the policy attached to it are:
Service role (trust relationship)
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "codepipeline.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
Policy
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "IamPassRolePolicy",
      "Effect": "Allow",
      "Action": [
        "iam:PassRole"
      ],
      "Resource": "*",
      "Condition": {
        "StringEqualsIfExists": {
          "iam:PassedToService": [
            "cloudformation.amazonaws.com",
            "ec2.amazonaws.com",
            "ecs-tasks.amazonaws.com"
          ]
        }
      }
    },
    {
      "Sid": "CodeBuildPolicy",
      "Effect": "Allow",
      "Action": [
        "codebuild:BatchGetBuilds",
        "codebuild:StartBuild"
      ],
      "Resource": "*"
    },
    {
      "Sid": "S3AccessPolicy",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:GetObjectVersion",
        "s3:GetBucketAcl",
        "s3:GetBucketLocation"
      ],
      "Resource": "*"
    },
    {
      "Sid": "ECRAccessPolicy",
      "Effect": "Allow",
      "Action": [
        "ecr:DescribeImages"
      ],
      "Resource": "*"
    },
    {
      "Sid": "CodeStarConnectionsAccessPolicy",
      "Effect": "Allow",
      "Action": [
        "codestar-connections:UseConnection"
      ],
      "Resource": "*"
    }
  ]
}
The source stage fails with the following error:
[Bitbucket] Upload to S3 failed with the following error: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: 085999D90C19E650; S3 Extended Request ID: gJ6l08+cX3U6i2Vj0+fW7PiqA/UzM6ZGCfyECmWb+Jit4Knu+gi/L4y3F24uqkFWUfGy9tZo0VE=; Proxy: null) (Service: null; Status Code: 0; Error Code: null; Request ID: null; S3 Extended Request ID: null; Proxy: null) (Service: null; Status Code: 0; Error Code: null; Request ID: null; S3 Extended Request ID: null; Proxy: null)
The error message lacks detail. I am not sure which service is trying to access S3; shouldn't it be CodePipeline, which in this case does have s3:PutObject permission?
Resolved this by changing the OutputArtifactFormat from "CODE_ZIP" to "CODEBUILD_CLONE_REF".
CODEBUILD_CLONE_REF, per the console description, is a full clone: AWS CodePipeline passes metadata about the repository that allows subsequent actions to perform a full git clone. It is only supported for AWS CodeBuild actions. The "CODE_ZIP" option does not include the git metadata about the repository.
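For reference, a sketch of what the source action's configuration block would look like after that change, reusing the placeholder ARN and repository ID from the question:

```json
"configuration": {
  "BranchName": "dev",
  "ConnectionArn": "arn:aws:codestar-connections:us-east-2:<AccountId>:connection/4ca7b1cf-2917-4fda-b681-c5239944eb33",
  "FullRepositoryId": "<username>/repository_name",
  "OutputArtifactFormat": "CODEBUILD_CLONE_REF"
}
```

Note that since full clone is only supported for CodeBuild actions, the downstream build action's service role will also likely need codestar-connections:UseConnection permission to perform the clone.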
This issue appears to be related to a recent change in the CDK's default IAM Role for the BitBucketSourceAction.
I found that by adding the "s3:PutObjectAcl" action to the list, I was able to successfully integrate the BitBucketSourceAction (for a GitHub version 2 connection). Note: this did not require changing "OutputArtifactFormat" from "CODE_ZIP" to "CODEBUILD_CLONE_REF".
As detailed in this CDK issue, I was using the BitBucketSourceAction to integrate with a GitHub repository. I got the following error when the CodePipeline first attempted the GitHub (version 2) action:
[GitHub] Upload to S3 failed with the following error: Access Denied
On a previous pipeline I released with the BitBucketSourceAction, the wildcarded "s3:PutObject*" action was included in the synthesized template. On reviewing the IAM role generated during my latest cdk deployment (using version 1.91.0), the BitBucketSourceAction only had the "s3:PutObject" action (i.e. not wildcarded). This excludes the "s3:PutObjectAcl" action, which seems to be required to upload the source repository from GitHub to S3 and free it up for use further along in the pipeline.
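Applied to the policy shown in the question, that would mean adding one action to the S3 statement. A sketch of just that statement (the rest of the policy is unchanged; broad "Resource": "*" is kept only to mirror the original):

```json
{
  "Sid": "S3AccessPolicy",
  "Effect": "Allow",
  "Action": [
    "s3:GetObject",
    "s3:PutObject",
    "s3:PutObjectAcl",
    "s3:GetObjectVersion",
    "s3:GetBucketAcl",
    "s3:GetBucketLocation"
  ],
  "Resource": "*"
}
```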