
Jenkins pipeline using docker on existing slaves

We have the following Jenkins setup:

  • Jenkins master
  • Jenkins Slave1
  • Jenkins Slave2
  • Jenkins Slave3

These are all virtual machines, and the slaves always exist; they are not spun up and down automatically.

Now we have builds which need a lot of tools (Maven, Python, AWS CLI, ...). We could install every tool on every slave and everything would work fine, but we would rather use a Docker-based approach.

Nearly all the tutorials I've seen run the slaves themselves in Docker: they use an orchestration tool like Kubernetes to spin up a slave as a pod, do their work in it, and delete the pod again.

We don't have the option to do this.

Question: Is it a decent approach to keep an 'old' Jenkins setup with real VM slaves and use Docker on those slaves?

What I'm thinking of is writing a pipeline where each stage uses a Docker container:

  • start the build (it will choose a slave, e.g. Slave1)
  • the pipeline starts
  • stage 1: spin up e.g. a Python container: git clone and execute Python commands. Mount a volume to the workspace??
  • stage 2: spin up e.g. an AWS CLI container, mount the content of the workspace, and execute new commands, etc. (rough sketch below)
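Roughly, in scripted pipeline syntax, I imagine something like this (just a sketch: the 'Slave1' label, image names, and commands are placeholders; as far as I understand, docker.image(...).inside from the Docker Pipeline plugin mounts the current workspace into the container):

    // Sketch only: label, images, and commands are placeholders for our real setup.
    node('Slave1') {
        stage('Checkout') {
            checkout scm                                   // assumes a Pipeline-from-SCM job; otherwise use the git step
        }
        stage('Python build') {
            // .inside() mounts the slave's workspace into the container automatically
            docker.image('python:3.11').inside {
                sh 'python -m pip install -r requirements.txt'
                sh 'python -m pytest'
            }
        }
        stage('AWS steps') {
            // same workspace, different toolchain container
            docker.image('my-aws-tools:latest').inside {   // placeholder image with the AWS CLI installed
                sh 'aws s3 cp build/ s3://my-example-bucket/ --recursive'
            }
        }
    }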

Can someone evaluate this approach?

asked Oct 29 '22 by DenCowboy


1 Answer

This is a very good approach. In fact, this way of working is documented in the Jenkins docs, under the "Using multiple containers" section.

In each stage you basically spin up a container with the necessary tools available, and you can use a volume to persist output from the stage into the workspace so that other stages can use it.
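For example, a minimal Declarative Pipeline along the lines of that docs section could look like this (a sketch only: the label, image tags, and bucket are placeholders; reuseNode true makes every stage run on the same node and in the same workspace, so the output of one stage is visible to the next):

    // Sketch only: label, image tags, and commands are placeholders.
    pipeline {
        agent { label 'Slave1' }                      // pin the whole pipeline to one of the existing VM slaves
        stages {
            stage('Build') {
                agent {
                    docker {
                        image 'maven:3.9-eclipse-temurin-17'
                        reuseNode true                // reuse the node and workspace from the top-level agent
                    }
                }
                steps {
                    sh 'mvn -B package'               // output lands in the shared workspace
                }
            }
            stage('Deploy') {
                agent {
                    docker {
                        image 'my-aws-tools:latest'   // placeholder image with the AWS CLI installed
                        reuseNode true
                    }
                }
                steps {
                    sh 'aws s3 cp target/ s3://my-example-bucket/ --recursive'   // reads the Build stage's output
                }
            }
        }
    }

Alternatively, the stash/unstash steps can carry files between stages if you prefer not to share a workspace.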

answered Nov 15 '22 by yamenk