
Bitbucket Pipelines - How to use the same Docker container for multiple steps?

I have set up Continuous Deployment for my web application using the configuration below (bitbucket-pipelines.yml).

pipelines:
  branches:
    master:
        - step:
            name: Deploy to production
            trigger: manual
            deployment: production
            caches:
              - node
            script:
              # Install dependencies
              - yarn install
              - yarn global add gulp-cli

              # Run tests
              - yarn test:unit
              - yarn test:integration

              # Build app
              - yarn run build

              # Deploy to production
              - yarn run deploy

Although this works, I would like to increase the build speed by running the unit and integration test steps in parallel.

What I've tried

pipelines:
  branches:
    master:
        - step:
            name: Install dependencies
            script:
              - yarn install
              - yarn global add gulp-cli

        - parallel:
            - step:
                name: Run unit tests
                script:
                  - yarn test:unit
            - step:
                name: Run integration tests
                script:
                  - yarn test:integration

        - step:
            name: Build app
            script:
              - yarn run build

        - step:
            name: Deploy to production
            trigger: manual
            deployment: production
            script:
              - yarn run deploy

This also has the advantage that Bitbucket displays each step separately, including the execution time per step.

The problem

This does not work because each step runs in a fresh Docker container, so the dependencies installed in the first step are gone by the time the test steps run.

I know that I can share files between steps using artifacts, but that would still require multiple containers to be created which increases the total execution time.
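For reference, sharing the installed dependencies between steps with artifacts would look roughly like this (a sketch; the node_modules/** glob assumes a standard Yarn project layout, and note that globally installed tools like gulp-cli would still need reinstalling in each later step):

```yaml
- step:
    name: Install dependencies
    script:
      - yarn install
      - yarn global add gulp-cli
    artifacts:
      # Pass the installed packages on to all subsequent steps
      - node_modules/**
```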

So my question is...

How can I share the same Docker container between multiple steps?

Asked Jul 10 '18 by Duncan Luk



1 Answer

I had the same issue a while ago and found a way to do it, which I'm using successfully right now.

You can do this using Docker's save and load commands along with Bitbucket's artifacts. You just need to make sure your image isn't too large, because Bitbucket's artifact limit is 1 GB; you can easily stay under it using multi-stage builds and other tricks.

- step:
    name: Build app
    services:
      # The Docker service is required to run docker commands in a step
      - docker
    script:
      - yarn run build
      - docker save --output <backup-file-name>.tar <images-you-want-to-export>
    artifacts:
      - <backup-file-name>.tar
- step:
    name: Deploy to production
    trigger: manual
    deployment: production
    services:
      - docker
    script:
      - docker load --input <backup-file-name>.tar
      - yarn run deploy

You might also like to use Bitbucket's caches, which can make building Docker images much faster. For example, you can arrange things so that npm packages are reinstalled only when the package.json and yarn.lock files change.
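One common way to get that behavior is through Dockerfile layer ordering: copy only the dependency manifests before running the install, so Docker's layer cache skips it when they haven't changed. A minimal sketch (the node:18-alpine base image and file layout are assumptions about your project):

```dockerfile
FROM node:18-alpine
WORKDIR /app

# Copy only the dependency manifests first...
COPY package.json yarn.lock ./

# ...so this layer is rebuilt only when package.json or yarn.lock changes
RUN yarn install --frozen-lockfile

# Copying the source afterwards does not invalidate the install layer above
COPY . .
RUN yarn run build
```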

Further Reading

  • docker save (Docker 17): https://devdocs.io/docker~17/engine/reference/commandline/save/index
  • docker load (Docker 17): https://devdocs.io/docker~17/engine/reference/commandline/load/index
  • Bitbucket Artifacts: https://confluence.atlassian.com/bitbucket/using-artifacts-in-steps-935389074.html
  • Bitbucket Pipelines Caches: https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html
Answered Oct 07 '22 by Abdullah