I have a Gitlab CI runner of docker type, and a repo with a Dockerfile that I will use for building my artifacts.
I know that I can do it in two stages, one with building the image using the docker
image, pushing it to the registry, and one stage that uses the image for building the artifacts.
But I don't care about keeping the image in the registry, I'd like to skip the step, and just use the docker image in my pipeline without pushing it.
So I've tried the following in my `.gitlab-ci.yml`, with just `docker build` and `docker run`:
```yaml
build-docs:
  tags:
    - docker
  image: docker:stable
  stage: build-docs
  script:
    - docker build -t $IMAGE_TAG .
    - docker run --rm $IMAGE_TAG source build
  artifacts:
    paths:
      - build
```
My Dockerfile has an entrypoint that takes a source directory and a build directory, and of course my repository has the `source` directory filled with files.
However, the `docker run` step doesn't find the repository files, as if the `source` directory were empty. I guess it's because running a Docker image within a Docker image is a bit strange.
How can I fix my run step so that files are found, or is there a different way to do what I want?
You must run your in-job Docker container with your job's working directory mounted, as explained in the GitLab Docker-in-Docker limitations documentation. For example you can do:
```yaml
script:
  - docker build -t $IMAGE_TAG .
  # assuming a 'source' folder exists at 'path/to/source' in your Git repo
  - docker run -v "/builds/$CI_PROJECT_PATH/path/to/source:/path/to/source" $IMAGE_TAG /path/to/source build
```
Using Docker in Docker, your `$IMAGE_TAG` container should then have your local build files (including the `source` folder) available.
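Putting this together, a complete job could look like the sketch below. It assumes your runner is configured with a Docker-in-Docker service (`docker:dind`); the `/source` and `/build` container paths are illustrative and depend on what your Dockerfile's entrypoint actually expects. Mounting the `build` directory back into the job workspace is what lets the `artifacts` section pick up the output:

```yaml
build-docs:
  tags:
    - docker
  image: docker:stable
  services:
    - docker:dind        # assumption: runner uses DinD rather than a mounted Docker socket
  stage: build-docs
  script:
    - docker build -t $IMAGE_TAG .
    # $CI_PROJECT_DIR is the job's checkout directory (/builds/$CI_PROJECT_PATH by default);
    # mount both directories so the container sees the sources and writes output back out
    - docker run --rm -v "$CI_PROJECT_DIR/source:/source" -v "$CI_PROJECT_DIR/build:/build" $IMAGE_TAG /source /build
  artifacts:
    paths:
      - build
```

Because the container writes into the mounted `build` directory, the files land in the job workspace and are uploaded as artifacts without the image ever being pushed to a registry.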