
Two Docker images in GitLab CI .yaml

I'm trying to extend one of my CI jobs on GitLab:

deploy-stage:
  image: python:3.5
  environment: stage
  script:
  - pip install awscli
  - aws s3 cp dist s3://$S3_BUCKET_STAGE/ --recursive
  only:
    - stage

What I want to achieve is to build my Vue.js code (using npm run build), which requires Node.js. But I also need Python to upload the files to S3. How can I achieve that?

asked Jan 23 '19 by jean d'arme

People also ask

Can you have multiple Docker images in one container?

Later versions of Docker support multi-stage Dockerfiles. With a multi-stage Dockerfile, you can use several base images, as well as previous intermediate stages, to build a new image.
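
For illustration, a minimal multi-stage Dockerfile sketch; the image names and the dist output path are assumptions, not taken from the question:

# Stage 1: build the frontend in a full Node.js image
FROM node:16 AS build
WORKDIR /app
COPY . .
RUN npm install && npm run build

# Stage 2: copy only the built output into a small runtime image
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html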

Can Docker images be combined?

Docker doesn't merge images, but there is nothing stopping you from combining the Dockerfiles, if they are available, and rolling them into one fat image which you'd then need to build.
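
As a hedged sketch of that fat-image idea applied to this question's needs, one Dockerfile could install Node.js on top of a Python base; the exact versions and packages are assumptions:

# Start from the Python base image the deploy job already uses
FROM python:3.5
# Add the Node.js toolchain on top, using the image's apt package manager
RUN apt-get update && \
    apt-get install -y nodejs npm && \
    rm -rf /var/lib/apt/lists/*
# Preinstall the AWS CLI needed for the S3 upload
RUN pip install awscli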

Can GitLab CI Yml have multiple files?

No, you can't have multiple gitlab-ci files per repository.

Can we store Docker images in GitLab?

Yes: you can easily build Docker images with GitLab CI and store them in the GitLab Container Registry. Images can be created per branch, tag, or any other way that suits your workflow and, with little effort, stored on GitLab.


3 Answers

After some help from here, I ended up with the following .gitlab-ci.yml configuration:

build-stage:
  stage: build
  image: node:latest
  environment:
    name: stage
  script:
    - npm install
    - npm run build-stage
  only:
    - stage
  artifacts:
    expire_in: 2 hrs
    paths:
      - dist/

deploy-stage:
  stage: deploy
  image: python:3.5
  environment:
    name: stage
  script:
    - pip install awscli
    - aws s3 cp dist s3://$S3_BUCKET_STAGE/ --recursive
  dependencies:
    - build-stage
  only:
    - stage

It's easy and readable. No need for any custom Docker images: do one job, then pass its results to the next. Works like a charm.

answered by jean d'arme

You could just use two separate build steps and use artifacts to pass the build between the steps.

In your first build step, use an image that has the Vue.js/Node.js tooling, run npm run build, and do any other steps you need.

At the end of the job, you specify artifacts.

artifacts:
  paths:
    - build

This would pass the folder build to the next job.

Then you can run your second job using python to upload the contents to S3.
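
As a sketch, assuming the first job is named build-job and a bucket variable $S3_BUCKET is defined in the CI settings (both names are hypothetical), the second job could look like this:

deploy-job:
  stage: deploy
  image: python:3.5
  script:
    - pip install awscli
    - aws s3 cp build s3://$S3_BUCKET/ --recursive
  dependencies:
    - build-job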

This gives you the freedom to build your program in the way you want without having to limit yourself to a specific environment.

If you can't find an image that does what you need you can either create your own or, if build time isn't important, you can use a base image and install everything you need as part of your job.

image: node:alpine

before_script:
  - apk add --no-cache curl
script:
  - npm run build

The above snippet installs curl on a Node Alpine image using apk, Alpine's package manager (Alpine images ship neither yum nor apt).

answered by Berimbolinho


I would suggest the following approach:

First, keep it simple: use a general-purpose container as the base image, let's say ubuntu, and install both Python and npm there.
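
A minimal sketch of such an image, assuming an Ubuntu base; the exact version and package names are assumptions:

FROM ubuntu:20.04
# Install both toolchains explicitly with apt, the image's package manager
RUN apt-get update && \
    apt-get install -y python3 python3-pip nodejs npm && \
    rm -rf /var/lib/apt/lists/*
# Preinstall the AWS CLI for the S3 upload step
RUN pip3 install awscli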

Why not use a python image and install npm there, or vice versa?

  • your package manager (apt/apk) becomes implicit unless you specify python:3-alpine or the like. I personally prefer an explicit definition, because teammates who are not familiar with the image would be confused by the question "What is the package manager for the npm image?"

  • the set of preinstalled packages is also undefined and can change from version to version. Does bash exist in python:3?

  • changing the version of one tool (say, python2 -> python3) could dramatically change the image if it is used as the base for all the others.

  • tomorrow you will need to add a third tool (gcc?) to your image.

So having a general-purpose base image, with all the needed tools installed explicitly, looks to me like the best idea.

Also note that you would probably want to separate building this image from using it. I prefer to have a first prepare stage in my GitLab CI that builds all the necessary images and pushes them to the private Docker registry GitLab provides.
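
A sketch of such a prepare job, using GitLab's predefined registry variables ($CI_REGISTRY, $CI_REGISTRY_IMAGE, etc.); the stage name, image tag, and Dockerfile location are assumptions:

prepare-images:
  stage: prepare   # 'prepare' must be listed first under the top-level stages: key
  image: docker:latest
  services:
    - docker:dind
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - docker build -t $CI_REGISTRY_IMAGE/ci-base:latest .
    - docker push $CI_REGISTRY_IMAGE/ci-base:latest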

answered by grapes