I want to run the same shell command (very simple commands like ls) on all the UNIX slaves that are connected to the master, using the master's script console.
How can I do this with Groovy?
I want to do something like Display Information About Nodes, but instead of only displaying information, I also want to run some simple UNIX commands on each slave and print the results.
import hudson.util.RemotingDiagnostics;

print_ip = 'println InetAddress.localHost.hostAddress';
print_hostname = 'println InetAddress.localHost.canonicalHostName';
// here it is - the shell command, uname as example
uname = 'def proc = "uname -a".execute(); proc.waitFor(); println proc.in.text';

for (slave in hudson.model.Hudson.instance.slaves) {
    println slave.name;
    println RemotingDiagnostics.executeGroovy(print_ip, slave.getChannel());
    println RemotingDiagnostics.executeGroovy(print_hostname, slave.getChannel());
    println RemotingDiagnostics.executeGroovy(uname, slave.getChannel());
}
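If you also want the exit code and stderr of the remote command (handy when it fails on some of the slaves), here is a variant of the same idea. It is only a sketch: the uname -a command and the 30-second timeout are just illustrative choices.

import hudson.util.RemotingDiagnostics;

run_cmd = '''
def proc = ["sh", "-c", "uname -a"].execute()   // any simple shell command
def out = new StringBuffer(), err = new StringBuffer()
proc.consumeProcessOutput(out, err)             // capture stdout and stderr
proc.waitForOrKill(30 * 1000)                   // avoid hanging the script console
println "exit code: ${proc.exitValue()}"
println out
if (err.length() > 0) println "stderr: ${err}"
''';

for (slave in hudson.model.Hudson.instance.slaves) {
    println slave.name;
    println RemotingDiagnostics.executeGroovy(run_cmd, slave.getChannel());
}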
Another option is to generate a dedicated job for each agent with the Job DSL plugin and then run them all in parallel. The pipeline looks something like this:
stages {
    stage('Checkout repo') {
        steps {
            // checkout what I need
        }
    }
    stage('Generate Jobs') {
        steps {
            // run the Job DSL script that generates one job per agent
            jobDsl targets: 'generate_projects.groovy'
        }
    }
    stage('Build Projects') {
        steps {
            // trigger the generated "build-all" job and wait for it to finish
            build job: "build-all",
                  propagate: true,
                  wait: true
        }
    }
}
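A note on the 'Generate Jobs' stage: jobDsl is the pipeline step provided by the Job DSL plugin, and it accepts a few optional parameters that are useful here. For instance (the values below are only a suggestion), you can have generated jobs cleaned up when the agent they were created for disappears:

jobDsl targets: 'generate_projects.groovy',
       removedJobAction: 'DELETE',    // delete generated jobs whose agent is gone
       lookupStrategy: 'JENKINS_ROOT' // resolve job names from the Jenkins root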
and then there is the file generate_projects.groovy, where the actual DSL generation happens:
for (agent in hudson.model.Hudson.instance.slaves) {
    if (!agent.getComputer().isOffline()) { // check that the agent is not offline
        node = jenkins.model.Jenkins.instance.getNode(agent.name) // get the agent's node
        agentIPs = node.computer.getChannel().call(new ListPossibleNames())
        agentIP = agentIPs[0] // get the agent's IP
        // Create a job that will run on that specific agent
        jobName = FOLDER + '/<Job_name>' + agent.name // need to create different names
        job(jobName) {
            label(agent.name) // pin the job to this agent
            steps {
                shell('<shell script or commands that you want to run>')
            }
        }
    }
}
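Note that agentIP is collected above but not used in the generated job. If you need it inside the job (for example in your shell script), one way, sketched here with the core Job DSL stringParam call, is to expose it as a job parameter; the AGENT_IP name is just an example:

job(jobName) {
    label(agent.name)
    parameters {
        // the shell step can then read it as $AGENT_IP
        stringParam('AGENT_IP', agentIP, 'IP of the agent this job is pinned to')
    }
    steps {
        shell('<shell script or commands that you want to run>')
    }
}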
Besides generating the jobs above, you'll also need to keep a list of the generated job names and add all of them to the "build-all" pipeline job, which will look something like this (a sketch for generating it automatically follows the example):
parallel(
    b0: { build '<Job_name>' + '<first_agent_name>' },
    b1: { build '<Job_name>' + '<second_agent_name>' },
    b2: { build '<Job_name>' + '<third_agent_name>' },
    .....
    failFast: false
)
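Maintaining that list by hand gets tedious, so another option is to let generate_projects.groovy produce the "build-all" pipeline as well. This is only a sketch: generatedJobs is a hypothetical list filled inside the loop shown above, and FOLDER and <Job_name> are the same placeholders as before.

def generatedJobs = []

for (agent in hudson.model.Hudson.instance.slaves) {
    if (!agent.getComputer().isOffline()) {
        def jobName = FOLDER + '/<Job_name>' + agent.name
        generatedJobs << jobName   // remember every job we generate
        // ... same job(jobName) { ... } definition as above ...
    }
}

// Turn the collected names into parallel branches and generate "build-all" itself.
def branches = []
generatedJobs.eachWithIndex { name, i -> branches << "b${i}: { build '${name}' }" }

pipelineJob('build-all') {
    definition {
        cps {
            script('parallel(\n    ' + (branches + ['failFast: false']).join(',\n    ') + '\n)')
            sandbox()
        }
    }
}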
So when you run the pipeline, a job is created for each agent, and all of the newly created jobs run in parallel. I use this for updating a setup scenario.