I am building a simple pipeline:
Build -> Staging -> Production
I need different environment variables for staging and production, so I am trying to source a variables file:
sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh'
But it fails with "not found":
[Stack Test] Running shell script
+ source /var/jenkins_home/.envvars/stacktest-staging.sh
/var/jenkins_home/workspace/Stack Test@tmp/durable-bcbe1515/script.sh: 2: /var/jenkins_home/workspace/Stack Test@tmp/durable-bcbe1515/script.sh: source: not found
The path is right, because I run the same command when I log in via SSH, and it works fine.
Here is the pipeline idea:
node {
    stage name: 'Build'
    // git and gradle build OK
    echo 'My build stage'

    stage name: 'Staging'
    sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh' // PROBLEM HERE
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To staging
    input message: "Does Staging server look good?"

    stage name: 'Production'
    sh 'source $JENKINS_HOME/.envvars/stacktest-production.sh'
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To production
    sh './deploy.sh'
}
What should I do?
One way to load environment variables from a file is to use a Groovy file.
For example:
Inside this file, you define the two environment variables you want to load:
env.DB_URL="hello"
env.DB_URL2="hello2"
You can then load this in using:
load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
Then you can use them in subsequent echo/shell steps.
For example, here is a short pipeline script:
node {
    load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
    echo "${env.DB_URL}"
    echo "${env.DB_URL2}"
}
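Applied to the pipeline in the question, a minimal sketch could look like the following (assuming both .sh files are converted to .groovy files that set env.* values; stacktest-production.groovy is a hypothetical counterpart to the staging file):
node {
    stage name: 'Staging'
    // Sets env.DB_URL (and friends) to the staging values
    load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
    echo "${env.DB_URL}"
    sh 'gradle flywayMigrate' // sh steps see env.DB_URL as an environment variable

    stage name: 'Production'
    // Overwrites the same variables with the production values
    load "$JENKINS_HOME/.envvars/stacktest-production.groovy"
    echo "${env.DB_URL}"
    sh 'gradle flywayMigrate'
    sh './deploy.sh'
}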
From the comments to the accepted answer:
Don't use the global 'env' but the 'withEnv' construct; see, for example, issue #9 ("don't set env vars with global env") in the Top 10 Best Practices for the Jenkins Pipeline plugin.
In the following example, VAR1 is a plain Java string (no Groovy variable expansion), while VAR2 is a Groovy string (so the variable 'someGroovyVar' is expanded).
The passed script is a plain Java string, so $VAR1 and $VAR2 are passed literally to the shell, and the echo commands access the environment variables VAR1 and VAR2.
stage('build') {
    def someGroovyVar = 'Hello world'
    withEnv(['VAR1=VALUE ONE', "VAR2=${someGroovyVar}"]) {
        def result = sh(script: 'echo $VAR1; echo $VAR2', returnStdout: true)
        echo result
    }
}
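For contrast, a minimal sketch (not part of the original answer) of the same step with a double-quoted Groovy script string, where the expansion happens in Groovy before the shell runs:
stage('build') {
    def someGroovyVar = 'Hello world'
    withEnv(["VAR2=${someGroovyVar}"]) {
        // Double-quoted Groovy string: ${env.VAR2} is interpolated by Groovy,
        // so the shell receives the literal value, not a $VAR2 reference.
        def result = sh(script: "echo ${env.VAR2}", returnStdout: true)
        echo result
    }
}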
For secrets / passwords you can use the Credentials Binding plugin.
Example:
NOTE: CREDENTIALS_ID1 is a username/password secret registered in the Jenkins settings.
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        echo "User name: $USER"
        echo "Password: $PASSWORD"
    }
}
The Jenkins console log output hides the real values:
[Pipeline] echo
User name: ****
[Pipeline] echo
Password: ****
Jenkins and credentials is a big topic; see, for example, the Credentials plugin.
For completeness: most of the time we need the secrets in environment variables, as we use them from shell scripts, so we combine withCredentials and withEnv as follows:
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        withEnv(["ENV_USERNAME=${USER}", "ENV_PASSWORD=${PASSWORD}"]) {
            def result = sh(script: 'echo $ENV_USERNAME', returnStdout: true)
            echo result
        }
    }
}
Another way to resolve this is to install the 'Pipeline Utility Steps' plugin, which provides the readProperties step (for reference, see https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#pipeline-utility-steps).
In the documentation example, the keys are stored in an array and used to retrieve the values. The problem with that in production is that if we later add a variable to the property file, it also has to be added to the array in the Jenkinsfile. To get rid of this tight coupling, we can write the code so that the build automatically picks up all keys currently present in the property file. Here is an example for reference:
def loadEnvironmentVariables(path) {
    def props = readProperties file: path
    def keys = props.keySet()
    for (key in keys) {
        def value = props["${key}"]
        env."${key}" = "${value}"
    }
}
And the client code looks like this:
path = '\\ABS_Output\\EnvVars\\pic_env_vars.properties'
loadEnvironmentVariables(path)
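For illustration, the properties file is just plain key=value pairs; a hypothetical pic_env_vars.properties could look like:
# hypothetical example values
DB_URL=http://staging_url/my_db
DB_USER=my_user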
With a declarative pipeline, you can do it in one line (replace path with your value):
script {
    readProperties(file: path).each { key, value -> env[key] = value }
}
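For context, a minimal sketch (assuming the Pipeline Utility Steps plugin is installed; the stage name and file path are placeholders) of where that script block could sit in a declarative pipeline:
pipeline {
    agent any
    stages {
        stage('Load env vars') {
            steps {
                script {
                    // Hypothetical path; replace with your properties file
                    def path = "${env.JENKINS_HOME}/.envvars/stacktest-staging.properties"
                    readProperties(file: path).each { key, value -> env[key] = value }
                }
            }
        }
        // Later stages can read the loaded values from env, e.g. in sh steps
    }
}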