Using the AWS CDK I created a simple stack with an auto scaling group, and I also define a launch configuration resource to execute some PowerShell scripts during EC2 instance creation. The scripts live in the same CDK TypeScript project, and I use an aws-s3-assets construct to upload the scripts directory. When the CDK synthesizes the template, it creates 3 CloudFormation parameters with autogenerated names to reference the S3 bucket of the assets. At this point everything works perfectly: I run the cdk deploy StackWithAutoScalingGroup command, and the CDK automatically populates the values of the CloudFormation parameters and deploys the stack.
I then decided to implement a CodePipeline stack (StackWithTheCodePipline) to deploy StackWithAutoScalingGroup: it fetches the code from a CodeCommit repository, then runs CodeBuild to synthesize the template, and the final stage is a CloudFormation deploy action that deploys the stack. This final step is failing because the CloudFormation parameters were not provided by the pipeline.
I'm looking for a way to access the S3 assets bucket created in StackWithAutoScalingGroup from StackWithTheCodePipline in order to provide the required CloudFormation parameters.
Any help will be appreciated.
StackWithAutoScalingGroup.ts
const captivaServer = new AutoScalingGroupStandard(this, 'CaptivaServer', {
  vpc: props.vpc,
  instanceType: ec2.InstanceType.of(ec2.InstanceClass.T2, ec2.InstanceSize.MICRO),
  machineImage: new ec2.WindowsImage(ec2.WindowsVersion.WINDOWS_SERVER_2019_ENGLISH_FULL_BASE),
});

const scripts = new assets.Asset(this, 'Scripts', {
  path: path.join('cfninit', 'scripts'),
  readers: [
    captivaServer.instanceRole,
  ],
});
StackWithAutoScalingGroup.template.json (parameters created after the stack is synthesized)
"Parameters": {
  "SsmParameterValueawsserviceamiwindowslatestWindowsServer2019EnglishFullBaseC96584B6F00A464EAD1953AFF4B05118Parameter": {
    "Type": "AWS::SSM::Parameter::Value<String>",
    "Default": "/aws/service/ami-windows-latest/Windows_Server-2019-English-Full-Base"
  },
  "ScriptsS3Bucket1E273C2D": {
    "Type": "String",
    "Description": "S3 bucket for asset \"CdkCaptivaStack/Scripts\""
  },
  "ScriptsS3VersionKey0B5B668F": {
    "Type": "String",
    "Description": "S3 key for asset version \"CdkCaptivaStack/Scripts\""
  },
  "ScriptsArtifactHashC07F896B": {
    "Type": "String",
    "Description": "Artifact hash for asset \"CdkCaptivaStack/Scripts\""
  }
}
StackWithTheCodePipline.ts
new codepipeline.Pipeline(this, 'Pipeline', {
  stages: [
    {
      stageName: 'Source',
      actions: [
        new codepipeline_actions.CodeCommitSourceAction({
          actionName: 'CodeCommitSource',
          repository: code,
          output: sourceOutput,
        }),
      ],
    },
    {
      stageName: 'Build',
      actions: [
        new codepipeline_actions.CodeBuildAction({
          actionName: 'BuildStack',
          project: buildProject,
          input: sourceOutput,
          outputs: [buildOutput],
        }),
      ],
    },
    {
      stageName: 'DeployToTest',
      actions: [
        new codepipeline_actions.CloudFormationCreateUpdateStackAction({
          actionName: 'DeployStack',
          templatePath: buildOutput.atPath('StackWithAutoScalingGroup.template.json'),
          stackName: 'csg-cdk-captiva',
          //parameterOverrides: props.parameterOverrides,
          adminPermissions: true,
        }),
      ],
    },
  ],
});
The action provides the parameterOverrides property to set the required parameters, but since the names are autogenerated I can't find a way to know which parameters the template expects, nor the values to supply for them.
What I expect is a way to know the generated parameter names and also a way to reference the S3 assets bucket to provide the values for the parameters.
{
  stageName: 'DeployToTest',
  actions: [
    new codepipeline_actions.CloudFormationCreateUpdateStackAction({
      actionName: 'DeployStack',
      templatePath: buildOutput.atPath('StackWithAutoScalingGroup.template.json'),
      stackName: 'csg-cdk-captiva',
      parameterOverrides: {
        'ScriptsS3Bucket1E273C2D': ????, // how can I get the param names and also the values?
        'ScriptsS3VersionKey0B5B668F': ????,
      },
      adminPermissions: true,
    }),
  ],
},
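As an aside, the generated parameter names themselves can be discovered programmatically from the synthesized template, since the CDK gives every asset parameter a Description mentioning the asset. A minimal Node/TypeScript sketch (the demo file name and sample template below are made up for illustration; point it at your real synth output instead):

```typescript
import * as fs from 'fs';

// List the autogenerated asset parameters in a synthesized CDK template.
// Asset parameters carry a Description containing the word "asset".
function listAssetParameters(templateFile: string): string[] {
  const template = JSON.parse(fs.readFileSync(templateFile, 'utf8'));
  const params: Record<string, { Description?: string }> = template.Parameters ?? {};
  return Object.keys(params).filter((name) => /asset/i.test(params[name].Description ?? ''));
}

// Demo against a tiny template shaped like the one shown above.
fs.writeFileSync('demo.template.json', JSON.stringify({
  Parameters: {
    ScriptsS3Bucket1E273C2D: { Type: 'String', Description: 'S3 bucket for asset "CdkCaptivaStack/Scripts"' },
    ScriptsS3VersionKey0B5B668F: { Type: 'String', Description: 'S3 key for asset version "CdkCaptivaStack/Scripts"' },
    SomeOtherParam: { Type: 'String', Description: 'unrelated' },
  },
}));
console.log(listAssetParameters('demo.template.json')); // logs the two Scripts* parameter names
```

This tells you the names the deploy action must override; providing the *values* is the harder part, which the answer below addresses.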
You can do a few things:
Use a set name on your bucket (or a set name with a variable prefix defined at deploy time using context variables) and then use Bucket.fromAttributes to import it into your other stack.
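A sketch of what that import could look like in the pipeline stack, assuming you gave the bucket a fixed name ('my-scripts-bucket' here is a placeholder):

```typescript
import * as s3 from '@aws-cdk/aws-s3';

// Import the bucket by its well-known fixed name instead of a generated one.
// bucketName/bucketArn on the returned IBucket can then feed parameterOverrides
// or grant statements in this stack.
const scriptsBucket = s3.Bucket.fromBucketName(this, 'ScriptsBucket', 'my-scripts-bucket');
```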
Expose the Bucket object as an attribute on your stack. When you instantiate the stack in your app, you can then pass that attribute as a custom parameter into another stack (pseudo code):
inside a given stack...
    self.my_bucket = s3.Bucket(self, 'MyBucket', ...)
in your app:
    my_stack = MyStack(app, 'MyStack', ...)
    second_stack = SecondStack(app, 'SecondStack', special_bucket=my_stack.my_bucket)
in the second stack:
    # anything that needs the bucket
    something(bucket=special_bucket)
The issue with this method is that it links the two stacks, creating a dependency between them. This in turn means changes to one may not be deployable because the other stack would fall out of sync. You can solve this by making both MyStack and SecondStack (from the above example) nested stacks in a common app stack. That gives you the ability to move pieces around between stacks and deploy them all as a single application in terms of CloudFormation stacks, preventing deployment issues.
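The nested-stack arrangement could look roughly like this (MyStack and SecondStack are the hypothetical NestedStack subclasses from the example above):

```typescript
import * as cdk from '@aws-cdk/core';

// A parent stack owning both pieces as nested stacks, so the bucket
// reference between them stays inside a single deployable unit.
class AppStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string) {
    super(scope, id);
    const asgStack = new MyStack(this, 'Asg');
    new SecondStack(this, 'Pipeline', { specialBucket: asgStack.myBucket });
  }
}
```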
And finally... if your deployment does not require a lot of complex steps and things to follow, I have found it FAR easier to simply use CodeBuild, install the CDK during the pre-build phase, and run cdk deploy Stack* from inside the CodeBuild project. The assets and whatnot are much easier to line up that way.
CodeCommit -> CodeBuild (that cdk Deploys)
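The buildspec for that CodeBuild project can be quite small; something like this sketch (adjust the install commands and stack name to your project):

```yaml
version: 0.2
phases:
  install:
    commands:
      - npm install -g aws-cdk
      - npm ci
  build:
    commands:
      # cdk deploy stages the S3 assets and fills in the asset parameters
      # itself, so no parameterOverrides are needed
      - cdk deploy StackWithAutoScalingGroup --require-approval never
```

The CodeBuild service role needs permission to deploy the stack and write to the asset bucket, roughly what adminPermissions: true was granting the CloudFormation action.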