I have set up Continuous Deployment for my web application using the configuration below (bitbucket-pipelines.yml).
pipelines:
  branches:
    master:
      - step:
          name: Deploy to production
          trigger: manual
          deployment: production
          caches:
            - node
          script:
            # Install dependencies
            - yarn install
            - yarn global add gulp-cli
            # Run tests
            - yarn test:unit
            - yarn test:integration
            # Build app
            - yarn run build
            # Deploy to production
            - yarn run deploy
Although this works, I would like to increase the build speed by running the unit and integration test steps in parallel.
pipelines:
  branches:
    master:
      - step:
          name: Install dependencies
          script:
            - yarn install
            - yarn global add gulp-cli
      - parallel:
          - step:
              name: Run unit tests
              script:
                - yarn test:unit
          - step:
              name: Run integration tests
              script:
                - yarn test:integration
      - step:
          name: Build app
          script:
            - yarn run build
      - step:
          name: Deploy to production
          trigger: manual
          deployment: production
          script:
            - yarn run deploy
Splitting the pipeline like this also has the advantage that the individual steps show up in Bitbucket, including the execution time per step.
However, this does not work, because a clean Docker container is created for each step, so the dependencies installed in the first step are no longer available in the test steps.
I know that I can share files between steps using artifacts, but that would still require multiple containers to be created, which increases the total execution time.
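For reference, the artifacts approach I am referring to would look roughly like this (just a sketch; passing node_modules as an artifact is my assumption of what would be needed):

pipelines:
  branches:
    master:
      - step:
          name: Install dependencies
          script:
            - yarn install
            - yarn global add gulp-cli
          artifacts:
            - node_modules/**   # downloaded into the containers of all following steps
      - parallel:
          - step:
              name: Run unit tests
              script:
                - yarn test:unit
          - step:
              name: Run integration tests
              script:
                - yarn test:integration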
How can I share the same Docker container between multiple steps?
I've had the same issue a while ago and found a way to do it, and I'm using it successfully right now.
You can do this using Docker's save and load commands along with Bitbucket's artifacts. You just need to make sure that your image isn't too large, because Bitbucket's artifacts limit is 1 GB, and you can easily stay under that using multi-stage builds and other tricks.
- step:
    name: Build app
    services:
      - docker   # the Docker service must be enabled in the step to run docker commands
    script:
      - yarn run build
      - docker save --output <backup-file-name>.tar <images-you-want-to-export>
    artifacts:
      - <backup-file-name>.tar
- step:
    name: Deploy to production
    trigger: manual
    deployment: production
    services:
      - docker
    script:
      - docker load --input <backup-file-name>.tar
      - yarn run deploy
You might also like to use Bitbucket's caches, which can make building Docker images much faster. For example, you can make it so that NPM packages are only installed when the package.json and yarn.lock files change.
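A rough sketch of what that could look like (node and docker below are Bitbucket's predefined caches; the step layout and the my-app image name are assumptions):

- step:
    name: Build app
    services:
      - docker           # needed to run Docker commands in the step
    caches:
      - node             # predefined cache for node_modules
      - docker           # predefined cache for Docker layers
    script:
      - yarn install     # fast when the node cache is warm
      - yarn run build
      # if the Dockerfile copies package.json and yarn.lock before running
      # yarn install, the cached layers mean dependencies are only reinstalled
      # when those two files change
      - docker build -t my-app .

Whether the layer cache actually skips the install depends on how the Dockerfile orders its COPY and RUN instructions.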
docker save (Docker 17): https://devdocs.io/docker~17/engine/reference/commandline/save/index
docker load (Docker 17): https://devdocs.io/docker~17/engine/reference/commandline/load/index