I am using an external Docker image from Docker Hub.
In each step, the Docker image is pulled from Docker Hub again and again. Yes, that is the desired workflow.
My question is: can we cache this image so that it won't be pulled from Docker Hub in each step? This Docker image is not going to change frequently, as it only has Node and Meteor preinstalled.
So, is it possible to cache the Docker image?
Original bitbucket-pipelines.yml:
image: tasktrain/node-meteor-mup

pipelines:
  branches:
    '{develop}':
      - step:
          name: "Client: Install Dependencies"
          caches:
            - node
          script:
            - npm install
            - npm run setup-meteor-client-bundle
          artifacts:
            - node_modules/**
      - step:
          name: "Client: Build for Staging"
          script:
            - npm run build-browser:stag
          artifacts:
            - dist/**
      - step:
          name: "Client: Deploy to Staging"
          deployment: staging
          script:
            - pipe: atlassian/aws-s3-deploy:0.2.2
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                S3_BUCKET: $S3_STAGING_BUCKET_NAME
                LOCAL_PATH: 'dist'
                ACL: "public-read"
                DELETE_FLAG: "true"
                EXTRA_ARGS: "--follow-symlinks --quiet"
      - step:
          name: "Server: Build and Deploy to Staging"
          script:
            - cd server
            - mup setup --config=.deploy/mup-settings.stag.js
            - mup deploy --config=.deploy/mup-settings.stag.js --settings=meteor-settings.stag.json
So the way to go is: create a repository on GitHub or Bitbucket, commit and push your Dockerfile (with config files if necessary), and create an automated build on Docker Hub that uses the GitHub/Bitbucket repo as its source.
Bitbucket Pipelines is able to cache external build dependencies and directories, such as third-party libraries, between builds, providing faster builds and reducing the number of build minutes consumed.
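For directory dependencies, you can also declare your own cache locations under definitions in addition to the pre-defined ones. A minimal sketch, assuming a hypothetical meteor cache name and ~/.meteor path (neither appears in the question):

definitions:
  caches:
    meteor: ~/.meteor   # hypothetical custom cache; name and path are illustrative

pipelines:
  default:
    - step:
        caches:
          - node        # pre-defined cache (node_modules)
          - meteor      # custom cache declared above
        script:
          - npm install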
Docker uses a layer cache to optimize and speed up the process of building Docker images. Docker layer caching mainly applies to the RUN, COPY, and ADD instructions.
In a default install, these layers are located in /var/lib/docker, which is where Docker stores base images. During a new build, all of these file structures have to be created and written to disk. Once created, the container (and subsequent new ones) is stored in the same area.
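In Bitbucket Pipelines, that layer cache can be carried between runs with the pre-defined docker cache, so images built within a step reuse unchanged layers. A minimal sketch, assuming an illustrative my-image name and a Dockerfile in the repository root:

pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker                       # persists Docker layers between pipeline runs
        script:
          - docker build -t my-image .   # unchanged RUN/COPY/ADD layers come from the cache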
As the OP said in comments to the other answer, defining a Docker cache doesn't work for the build image itself
image: tasktrain/node-meteor-mup
which is always downloaded for each step before the step scripts are executed in that image. AFAIK, the Docker cache
services:
  - docker
caches:
  - docker
only works for images pulled or built in a step.
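To make that distinction concrete, here is a sketch contrasting the two cases (the default branch and the repeated pull are illustrative only): the image at the top is the step's build image and is always fetched by Pipelines itself, while an image pulled inside the script does go through the docker cache.

image: tasktrain/node-meteor-mup              # build image: pulled by Pipelines, not covered by the docker cache

pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker
        script:
          - docker pull tasktrain/node-meteor-mup   # pulled inside the step, so this one is cached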
However, Bitbucket Pipelines has recently started caching public build images internally, according to this blog post:
Public image caching – Behind the scenes, Pipelines has recently started caching public Docker images, resulting in a noticeable boost to startup time to all builds running on our infrastructure.
There is also an open feature request to cache private build images.
It is indeed possible to cache dependencies, and docker is one of the pre-defined caches of Bitbucket Pipelines:
pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker
        script:
          - docker pull my-own-repository:5000/my-image