 

How to build on remote Docker server with Jenkins declarative pipeline?

I wrote a declarative pipeline within Jenkins. This pipeline should run in a Docker container on a remote Docker server.

I read Customizing the execution environment:

Customizing the execution environment

Pipeline is designed to easily use Docker images as the execution environment for a single Stage or the entire Pipeline. Meaning that a user can define the tools required for their Pipeline, without having to manually configure agents. Practically any tool which can be packaged in a Docker container can be used with ease by making only minor edits to a Jenkinsfile.

pipeline {
    agent {
        docker { image 'node:7-alpine' }
    }
    stages {
        stage('Test') {
            steps {
                sh 'node --version'
            }
        }
    }
}

and it works, but it uses the Jenkins server as the Docker server.

I read Using a remote Docker server:

Using a remote Docker server

By default, the Docker Pipeline plugin will communicate with a local Docker daemon, typically accessed through /var/run/docker.sock.

To select a non-default Docker server, such as with Docker Swarm, the withServer() method should be used.

By passing a URI, and optionally the Credentials ID of a Docker Server Certificate Authentication pre-configured in Jenkins, to the method with:

node {
    checkout scm

    docker.withServer('tcp://swarm.example.com:2376', 'swarm-certs') {
        docker.image('mysql:5').withRun('-p 3306:3306') {
            /* do things */
        }
    }
}

and this works, but it uses a scripted pipeline.

I also read Specifying a Docker Label:

Specifying a Docker Label

By default, Pipeline assumes that any configured agent is capable of running Docker-based Pipelines. For Jenkins environments which have macOS, Windows, or other agents, which are unable to run the Docker daemon, this default setting may be problematic. Pipeline provides a global option in the Manage Jenkins page, and on the Folder level, for specifying which agents (by Label) to use for running Docker-based Pipelines.

but using a label would require a full Jenkins slave (with sshd and a Jenkins user installed, online all the time), and the Jenkins master would install the slave libraries on it. That sounds a little heavy just to use a declarative pipeline instead of a scripted one.

How could I build on a remote Docker server with a declarative pipeline?

asked Feb 27 '19 by dur


1 Answer

I think you are mixing a few things here. Regarding the documentation's section about the agent: the agent part identifies the node your stage should run on, along with any special configuration and information.

The docker block, in turn, says that the stage should be run from inside the given image.

This doesn't leave you any room for connecting to an external Docker server, unless you make that server a Jenkins node with some special label, for example 'docker-special'. Then you could do something as simple as

agent {
    docker {
        image 'maven:3-alpine'
        label 'docker-special'
    }
}
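
For completeness, here is a minimal sketch of how that agent block could sit in a full declarative pipeline; 'docker-special' is just an example label and has to match whatever label you give the node fronting your remote Docker host:

pipeline {
    agent {
        docker {
            image 'maven:3-alpine'
            // example label; must match the label of the node attached to the remote Docker host
            label 'docker-special'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn --version'
            }
        }
    }
}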

From the documentation on the docker agent:

docker

Execute the Pipeline, or stage, with the given container which will be dynamically provisioned on a node pre-configured to accept Docker-based Pipelines, or on a node matching the optionally defined label parameter.

As for your particular use case:

This part of the Jenkins documentation mentions the sidecar pattern under advanced usage, and it immediately switches to scripted pipeline.

So yes, as I've mentioned, this is not available in declarative. The only way out for declarative would be to mark the external server as a Jenkins node with a specific label.
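
As a rough sketch, assuming the remote Docker host has been attached as a Jenkins node carrying an example label 'docker-special', your original use case could then look something like this; no withServer() is needed, because the docker commands run against the daemon that is local to that node:

pipeline {
    agent {
        // example label; the remote Docker host must be attached as a Jenkins node carrying it
        label 'docker-special'
    }
    stages {
        stage('Test') {
            steps {
                script {
                    // runs against the Docker daemon local to the labelled node
                    docker.image('mysql:5').withRun('-p 3306:3306') {
                        /* do things */
                    }
                }
            }
        }
    }
}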

answered Oct 09 '22 by hakamairi