 

Persistent Bitbucket pipeline build artifacts greater than 14 days

I have a pipeline which loses build artifacts after 14 days. That is, without S3 or Artifactory integration, the pipeline's "Deploy" button becomes greyed out after 14 days because the build artifact has been removed. I understand this is intentional on Bitbucket/Atlassian's part to reduce storage costs (details in the link below).

Please check the last section, "Artifact downloads and Expiry", of this page: https://support.atlassian.com/bitbucket-cloud/docs/use-artifacts-in-steps/

If you need artifact storage for longer than 14 days (or more than 1 GB), we recommend using your own storage solution, like Amazon S3 or a hosted artifact repository like JFrog Artifactory.

Question: Can anyone provide advice or sample code on how to integrate Bitbucket Pipelines with Artifactory (or S3) in order to retain artifacts? Is the Artifactory generic upload/download pipe approach (https://www.jfrog.com/confluence/display/RTF6X/Bitbucket+Pipelines+Artifactory+Pipes) the only way, or is the quote above hinting at a more native Bitbucket "repository setting" that provides integration with S3 or Artifactory?
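For what it's worth, the pipe approach from the JFrog link boils down to a step like the sketch below. This is only an illustration: the pipe name, version, variable names, and repository path here are assumptions and should be taken from the current JFrog pipes documentation rather than from this sketch.

```yaml
pipelines:
  default:
    - step:
        name: Build and upload artefact to Artifactory
        script:
          - mkdir -p artefacts
          - echo "build ${BITBUCKET_BUILD_NUMBER}" > artefacts/buildinfo.txt
          # Generic upload pipe -- name, version, and variables are
          # illustrative; check the JFrog pipes page for the exact contract.
          - pipe: jfrog/artifactory-generic-upload:1.0.0
            variables:
              ARTIFACTORY_URL: $ARTIFACTORY_URL            # repository variable
              ARTIFACTORY_USER: $ARTIFACTORY_USER
              ARTIFACTORY_PASSWORD: $ARTIFACTORY_PASSWORD  # secured variable
              SPEC: '{"files":[{"pattern":"artefacts/*","target":"generic-local/${BITBUCKET_BUILD_NUMBER}/"}]}'
```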


asked Sep 13 '25 by wired00

2 Answers

Bitbucket gives an example of linking to an S3 bucket on their site: https://support.atlassian.com/bitbucket-cloud/docs/publish-and-link-your-build-artifacts/

The key is Step 4, where you link the artefact to the build.

However, the example doesn't actually create an artefact that is linked to S3; rather, it adds a build status whose description links to the uploaded items in S3. To use these in further steps you would then have to download the artefacts.

This can be done using the AWS CLI and an image that has it installed, for example amazon/aws-sam-cli-build-image-nodejs14.x (SAM was required in my case).

The following is an example that:

  1. Creates an artefact (a txt file) and uploads it to an AWS S3 bucket
  2. Creates a "link" as a build status against the commit that triggered the pipeline, as per Atlassian's suggestion (this is just added for reference after the 14 days... meh)
  3. Carries out a "deployment", whereby the artefact is downloaded from AWS S3. In this stage I also set the downloaded S3 artefact as a Bitbucket artifact; I mean, why not... it may expire after 14 days, but if I've just re-deployed then I may want it available for another 14 days.

image: amazon/aws-sam-cli-build-image-nodejs14.x

pipelines:
  branches:
    main:
      - step:
          name: Create artefact
          script:
            - mkdir -p artefacts
            - echo "This is an artefact file..." > artefacts/buildinfo.txt
            # Quoted at the YAML level so the ": " inside echo doesn't start a mapping
            - 'echo "Generating Build Number: ${BITBUCKET_BUILD_NUMBER}" >> artefacts/buildinfo.txt'
            - 'echo "Git Commit Hash: ${BITBUCKET_COMMIT}" >> artefacts/buildinfo.txt'
            - aws s3api put-object --bucket bitbucket-artefact-test --key ${BITBUCKET_BUILD_NUMBER}/buildinfo.txt --body artefacts/buildinfo.txt
      - step:
          name: Link artefact to AWS S3
          script:
            - export S3_URL="https://bitbucket-artefact-test.s3.eu-west-2.amazonaws.com/${BITBUCKET_BUILD_NUMBER}/buildinfo.txt"
            - export BUILD_STATUS="{\"key\":\"doc\", \"state\":\"SUCCESSFUL\", \"name\":\"DeployArtefact\", \"url\":\"${S3_URL}\"}"
            - curl -H "Content-Type:application/json" -X POST --user "${BB_AUTH_STRING}" -d "${BUILD_STATUS}" "https://api.bitbucket.org/2.0/repositories/${BITBUCKET_REPO_OWNER}/${BITBUCKET_REPO_SLUG}/commit/${BITBUCKET_COMMIT}/statuses/build"
      - step:
          name: Test - Deployment
          deployment: Test
          script:
            - mkdir artifacts
            - aws s3api get-object --bucket bitbucket-artefact-test --key ${BITBUCKET_BUILD_NUMBER}/buildinfo.txt artifacts/buildinfo.txt
            - cat artifacts/buildinfo.txt
          artifacts:
            - artifacts/**
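One fragile spot in the pipeline above is the hand-escaped BUILD_STATUS JSON in the second step. If jq happens to be available in the build image (an assumption; it is not guaranteed in the SAM image), the payload can be generated more safely:

```shell
# Build the commit-status payload with jq instead of hand-escaped quotes.
# S3_URL here is a sample value standing in for the real artefact URL.
S3_URL="https://bitbucket-artefact-test.s3.eu-west-2.amazonaws.com/42/buildinfo.txt"
BUILD_STATUS=$(jq -n --arg url "$S3_URL" \
  '{key: "doc", state: "SUCCESSFUL", name: "DeployArtefact", url: $url}')
echo "$BUILD_STATUS"
```

The resulting string can then be passed to the same curl call with -d "${BUILD_STATUS}".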

Note: I've got the following secrets/variables set against the repository (BB_AUTH_STRING is a username:app-password pair, as expected by curl's --user flag):

  • AWS_ACCESS_KEY_ID
  • AWS_SECRET_ACCESS_KEY
  • BB_AUTH_STRING
answered Sep 15 '25 by 110100100

Bitbucket also has a Downloads section with unlimited storage (they ask for fair use), but an individual file is limited to 0.5-2 GiB:

  • https://community.atlassian.com/t5/Bitbucket-questions/Downloads-section-storage-limit/qaq-p/617970
  • https://confluence.atlassian.com/bitbucket/what-kind-of-limits-do-you-have-on-repository-file-upload-size-273877699.html
curl --verbose --location --request POST -F files=@"<path-to-file>" \
   --header 'Authorization: Bearer <token-goes-here>' \
  'https://api.bitbucket.org/2.0/repositories/<workspace-name>/<repo-name>/downloads'
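Files uploaded this way can be pulled back down in a later pipeline (or from anywhere else) via the same API; the endpoint replies with a redirect to the stored file, hence --location. A sketch with the same placeholders, where <file-name> is whatever was uploaded:

```shell
# Placeholders as above; the response is a redirect to the actual file.
curl --location --fail \
  --header 'Authorization: Bearer <token-goes-here>' \
  --output <file-name> \
  'https://api.bitbucket.org/2.0/repositories/<workspace-name>/<repo-name>/downloads/<file-name>'
```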

or see:

  • https://support.atlassian.com/bitbucket-cloud/docs/deploy-build-artifacts-to-bitbucket-downloads/
  • https://bitbucket.org/product/features/pipelines/integrations?p=atlassian/bitbucket-upload-file
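The second link is Atlassian's bitbucket-upload-file pipe, which wraps that curl call. A minimal sketch of its use (the version is pinned arbitrarily here, and the variable names are as documented on the pipe page at the time of writing; double-check both there):

```yaml
    - step:
        name: Publish to Downloads
        script:
          # Credentials stored as secured repository variables; the app
          # password needs write access to the repository.
          - pipe: atlassian/bitbucket-upload-file:0.3.2
            variables:
              BITBUCKET_USERNAME: $BITBUCKET_USERNAME
              BITBUCKET_APP_PASSWORD: $BITBUCKET_APP_PASSWORD
              FILENAME: 'artefacts/buildinfo.txt'
```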
answered Sep 15 '25 by gavenkoa