
I'm trying to integrate Bitbucket into AWS CodePipeline. What is the best approach?


I want to integrate my code from Bitbucket into AWS CodePipeline, but I am unable to find proper examples of how to do this. My source code is in .NET. Can someone please guide me? Thanks.

asked Jan 16 '17 by Nigel Fds

People also ask

How does Bitbucket integrate with AWS?

Sign in to the AWS Management Console, and open the AWS Developer Tools console at https://console.aws.amazon.com/codesuite/settings/connections . Choose Settings > Connections, and then choose Create connection. To create a connection to a Bitbucket repository, under Select a provider, choose Bitbucket.
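If you want to script that step instead of using the console, here is a minimal sketch using the boto3 codestar-connections client (the connection name is a placeholder, and the connection still has to be approved in the console before it becomes available):

import boto3

# Assumes default AWS credentials and region are already configured.
client = boto3.client('codestar-connections')

# Creates the connection in a PENDING state; the Bitbucket OAuth handshake
# must still be completed in the console before the connection can be used.
response = client.create_connection(
    ProviderType='Bitbucket',
    ConnectionName='my-bitbucket-connection'  # placeholder name
)
print(response['ConnectionArn'])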

Does AWS CodePipeline work with Bitbucket?

You can now easily connect your Atlassian Bitbucket Cloud source repository to your AWS CodePipeline, allowing for the automation of the build, test, and deploy phases of your release process every time there is a code change.
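For reference, a Bitbucket source action configured through such a connection looks roughly like the sketch below (the connection ARN, repository id, and branch are placeholder values); the same configuration keys apply whether you build the pipeline with boto3 or CloudFormation:

# Sketch of a CodePipeline source action that uses a CodeStar Connections (Bitbucket) connection.
# All values are placeholders to be replaced with your own.
bitbucket_source_action = {
    "name": "BitbucketSource",
    "actionTypeId": {
        "category": "Source",
        "owner": "AWS",
        "provider": "CodeStarSourceConnection",
        "version": "1",
    },
    "configuration": {
        "ConnectionArn": "arn:aws:codestar-connections:us-east-1:111122223333:connection/example",
        "FullRepositoryId": "my-workspace/my-repo",
        "BranchName": "master",
    },
    "outputArtifacts": [{"name": "SourceOutput"}],
}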


2 Answers

You can integrate Bitbucket with AWS CodePipeline by using webhooks that call an Amazon API Gateway endpoint, which invokes a Lambda function (which in turn calls into CodePipeline). There is an AWS blog that walks you through this: Integrating Git with AWS CodePipeline
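The general shape of that Lambda function is sketched below. This is not the blog's exact code; the pipeline name and the handling of the webhook payload are assumptions, and the function simply starts a pipeline execution whenever Bitbucket fires its push webhook through API Gateway:

import json
import boto3

codepipeline = boto3.client('codepipeline')


def handler(event, context):
    """Invoked by API Gateway when the Bitbucket webhook fires."""
    # Bitbucket's push payload arrives in the request body; this sketch ignores its contents.
    payload = json.loads(event.get('body') or '{}')

    # 'my-pipeline' is a placeholder for your pipeline's name.
    response = codepipeline.start_pipeline_execution(name='my-pipeline')

    return {
        'statusCode': 200,
        'body': json.dumps({'pipelineExecutionId': response['pipelineExecutionId']})
    }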

answered Oct 06 '22 by Kirkaiya

Bitbucket has a service called Pipelines which can deploy code to AWS services. Use Pipelines to package and push updates from your master branch to an S3 bucket that is hooked up to CodePipeline.

Note:

  • You must enable Pipelines in your repository settings

  • Pipelines expects a file named bitbucket-pipelines.yml, which must be placed in the root of your repository

  • Ensure you set your account's AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in the Bitbucket Pipelines UI. These can be stored as secured (encrypted) variables, so the credentials never appear in your repository

Here is an example bitbucket-pipelines.yml which zips the repository contents into an archive named DynamoDb.zip and uploads it to an S3 bucket:

pipelines:
  branches:
    master:
      - step:
          script:
            - apt-get update # required to install zip
            - apt-get install -y zip # required if you want to zip repository objects
            - zip -r DynamoDb.zip .
            - apt-get install -y python-pip
            - pip install boto3==1.3.0 # required for s3_upload.py
            # the first argument is the name of the existing S3 bucket to upload the artefact to
            # the second argument is the artefact to be uploaded
            # the third argument is the bucket key
            - python s3_upload.py LandingBucketName DynamoDb.zip DynamoDb.zip # run the deployment script
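Note that the upload script does not need any credential handling of its own: boto3 automatically picks up AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the build environment, which Bitbucket Pipelines populates from the repository variables mentioned above.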

Here is a working example of a Python upload script which should be deployed alongside the bitbucket-pipelines.yml file in your project. Above I have named my Python script s3_upload.py:

from __future__ import print_function
import sys
import argparse
import boto3
from botocore.exceptions import ClientError


def upload_to_s3(bucket, artefact, bucket_key):
    """
    Uploads an artefact to Amazon S3
    """
    try:
        client = boto3.client('s3')
    except ClientError as err:
        print("Failed to create boto3 client.\n" + str(err))
        return False
    try:
        client.put_object(
            Body=open(artefact, 'rb'),
            Bucket=bucket,
            Key=bucket_key
        )
    except ClientError as err:
        print("Failed to upload artefact to S3.\n" + str(err))
        return False
    except IOError as err:
        print("Failed to access artefact in this directory.\n" + str(err))
        return False
    return True


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("bucket", help="Name of the existing S3 bucket")
    parser.add_argument("artefact", help="Name of the artefact to be uploaded to S3")
    parser.add_argument("bucket_key", help="Name of the S3 Bucket key")
    args = parser.parse_args()

    if not upload_to_s3(args.bucket, args.artefact, args.bucket_key):
        sys.exit(1)


if __name__ == "__main__":
    main()
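You can also sanity-check the script outside of Pipelines, assuming local AWS credentials and an existing bucket (the bucket name here is just an example): python s3_upload.py my-landing-bucket DynamoDb.zip DynamoDb.zip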

Here is an example CodePipeline with only one Source stage (you may want to add more):

Pipeline:
  Type: "AWS::CodePipeline::Pipeline"
  Properties:
    ArtifactStore:
      # Where CodePipeline copies and unpacks the uploaded artifact
      # Must be versioned
      Location: !Ref "StagingBucket"
      Type: "S3"
    DisableInboundStageTransitions: []
    RoleArn:
      !GetAtt "CodePipelineRole.Arn"
    Stages:
      - Name: "Source"
        Actions:
          - Name: "SourceTemplate"
            ActionTypeId:
              Category: "Source"
              Owner: "AWS"
              Provider: "S3"
              Version: "1"
            Configuration:
              # Where Pipelines uploads the artifact
              # Must be versioned
              S3Bucket: !Ref "LandingBucket"
              S3ObjectKey: "DynamoDb.zip" # Zip file that is uploaded
            OutputArtifacts:
              - Name: "DynamoDbArtifactSource"
            RunOrder: "1"

LandingBucket:
  Type: "AWS::S3::Bucket"
  Properties:
    AccessControl: "Private"
    VersioningConfiguration:
      Status: "Enabled"
StagingBucket:
  Type: "AWS::S3::Bucket"
  Properties:
    AccessControl: "Private"
    VersioningConfiguration:
      Status: "Enabled"
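These snippets belong under the Resources section of a CloudFormation template; the CodePipelineRole IAM role is referenced but not shown, so you would need to define it (or substitute an existing role ARN) before deploying with something like: aws cloudformation deploy --template-file pipeline.yml --stack-name bitbucket-codepipeline --capabilities CAPABILITY_IAM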

A reference to this Python code, along with other examples, can be found here: https://bitbucket.org/account/user/awslabs/projects/BP

answered Oct 05 '22 by insudo