
Running multiple Docker containers from a single Jenkinsfile

So I spent the whole day trying to figure out how to configure a simple Jenkins Pipeline with multiple Docker images and I am not happy at all.

I need a few stages (prepare, build, test, docs) executed in a couple of different Docker containers (for now I just picked three standard Python containers). It would be nice if those ran in parallel, but I only found this solution, which combines all stages into a single one (and thus creates a not very informative overview in the Blue Ocean UI): Jenkins Pipeline Across Multiple Docker Images

So I ended up with the configuration below, which is ugly as hell (code repetition everywhere), but more or less creates a good-looking overview in the classic UI:

Classic UI stages

A not so informative overview in the Blue Ocean UI

Blue Ocean stages overview

And an acceptable test overview from JUnit, which combines all the tests from each stage; if any test fails, the corresponding "version" is shown:

junit

The most annoying thing, however, is that you cannot see which step has failed. If Python 2.7 fails, everything else is also marked as failed, and you cannot even tell which stage caused it.

I tried so many different approaches and I am wondering how this should be done. This should be such a common thing to do with Jenkins, so I guess I have some general misunderstandings about this (for me absolutely new) pipeline/nodes/labels/stages/steps/declarative/scripted/groovy/blueocean stuff...

It should be possible to define a list of Docker containers, some (maybe customisable) stages/steps for each of them, run them in parallel, and have everything displayed nicely in both Blue Ocean and the Classic UI, shouldn't it?
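Something along those lines could be sketched in declarative syntax (just a sketch, not tested; it assumes the Docker Pipeline plugin and a Declarative Pipeline version that supports nested `parallel` stages, and the `sh` steps are placeholders):

```groovy
// Sketch only: one parallel branch per Docker image, each with its own agent.
// The shell commands are dummies; real prepare/build/test/docs steps go there.
pipeline {
    agent none
    stages {
        stage('Matrix') {
            parallel {
                stage('python:2.7.14') {
                    agent { docker { image 'python:2.7.14' } }
                    steps { sh 'python --version' }
                }
                stage('python:3.6.4') {
                    agent { docker { image 'python:3.6.4' } }
                    steps { sh 'python --version' }
                }
            }
        }
    }
}
```

The drawback, as far as I can tell, is that the per-image stages (prepare, build, ...) would still have to be repeated inside each branch, which is exactly the repetition I am trying to avoid.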

node {
    stage("Python 2.7.14") {
        checkout scm
        docker.image('python:2.7.14').inside {  // just a dummy for now
            stage("Prepare") { sh 'python --version' }
            stage("Build") { sh 'ls -al' }
        }
    }
    stage("Python 3.5.4") {
        checkout scm
        docker.image('python:3.5.4').inside {
            stage("Prepare") { sh 'python -m venv venv' }
            stage("Build") {
                sh """
                    . venv/bin/activate
                    make install-dev
                """
            }
            stage('Test') {
                sh """
                    . venv/bin/activate
                    make test
                """
            }
            stage('Docs') {
                sh """
                    . venv/bin/activate
                    make doc-dependencies
                    cd docs
                    make html
                """
            }
        }
    }
    stage("Python 3.6.4") {
        checkout scm
        docker.image('python:3.6.4').inside {
            stage("Prepare") { sh 'python -m venv venv' }
            stage("Build") {
                sh """
                    . venv/bin/activate
                    make install-dev
                """
            }
            stage('Test') {
                sh """
                    . venv/bin/activate
                    make test
                """
            }
            stage('Docs') {
                sh """
                    . venv/bin/activate
                    make doc-dependencies
                    cd docs
                    make html
                """
            }
        }
    }
}

Update: this is how it looks in the Blue Ocean UI when a step fails; in this case "Test" failed in both Python 3.5.4 and 3.6.4, but it looks like everything has failed. Test step failed

Also, the Python 2.7.14 and 3.5.4 stages are collapsed and cannot be viewed separately. If I click on one of them, all the steps are shown in green, although in this case the . venv/bin/activate and make test step failed:

Failed test step is shown in green

asked Apr 11 '18 by tamasgal




1 Answer

So this is what I ended up with. There are surely better solutions, but I have to move on. I hope to gather some (better) answers in time, I'll not mark this as "the solution" yet ;)

First, some credits to Stephen King's slides (the title says "Declarative", but there are some nice examples regarding the scripted Pipeline): (Declarative) Jenkins Pipelines

Here is my gist on GitHub with the following snippet:

def docker_images = ["python:2.7.14", "python:3.5.4", "python:3.6.2"]

def get_stages(docker_image) {
    def stages = {
        docker.image(docker_image).inside {
            stage("${docker_image}") {
                echo "Running in ${docker_image}"  // double quotes, so Groovy interpolates
            }
            stage("Stage A") {
                switch (docker_image) {
                    case "python:2.7.14":
                        sh 'exit 123'  // for python 2.7.14 we force an error for fun
                        break
                    default:
                        sh 'sleep 10'  // for any other docker image, we sleep 10s
                }
                sh 'echo this is stage A'  // this is executed for all
            }
            stage("Stage B") {
                sh 'sleep 5'
                sh 'echo this is stage B'
            }
            stage("Stage C") {
                sh 'sleep 8'
                sh 'echo this is stage C'
            }

        }
    }
    return stages
}

node('master') {
    def stages = [:]

    for (int i = 0; i < docker_images.size(); i++) {
        def docker_image = docker_images[i]
        stages[docker_image] = get_stages(docker_image)
    }

    parallel stages
}

I tried to make it easy to use:

  • you add your Docker images to the list at the top and then define the stages in the get_stages() function
  • add the common stages and steps there
  • if any Docker image needs special treatment (like python:2.7.14 in my example), you can use a simple switch. This could also be realised with a double map for the special cases ('image'->'stage'->'steps') and a fallback double map for the defaults, but I'll leave that as an exercise for the reader (to be honest, I could not figure out the correct, supported Groovy syntax)
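For what it's worth, that double-map idea might look roughly like this in Groovy (a sketch only, untested in a real Jenkinsfile; the stage names and `sh` steps are placeholders, and `steps_for` is a helper name I made up):

```groovy
// Sketch: per-image step overrides with a fallback map of defaults.
def default_steps = [
    'Stage A': { sh 'sleep 10' },
    'Stage B': { sh 'sleep 5' },
]

// Only images that deviate from the defaults need an entry here.
def special_steps = [
    'python:2.7.14': ['Stage A': { sh 'exit 123' }],
]

// Look up the override for (image, stage); fall back to the default closure.
def steps_for(docker_image, stage_name, special_steps, default_steps) {
    def override = special_steps.get(docker_image, [:])[stage_name]
    return override ?: default_steps[stage_name]
}
```

Inside get_stages() one would then call something like steps_for(docker_image, 'Stage A', special_steps, default_steps)() instead of the switch, but I have not verified this against the Pipeline sandbox, which is notoriously picky about closures.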

This is how it looks when everything is fine, in both the Classic and the Blue Ocean UIs (it's a known issue that the Blue Ocean UI fails to display multiple stages in parallel runs, see JENKINS-38442):

Classic UI Classic UI - Build OK

Blue Ocean UI Blue Ocean UI - Build OK

And this is the output if Stage A in python:2.7.14 fails:

Classic UI Classic UI - Failed Stage A step

Blue Ocean UI Blue Ocean UI - Failed Stage A step

answered Oct 24 '22 by tamasgal