I am trying to pass an XML file via stdin. I read the XML file from Subversion, updated a line in it, and now I am trying to create a Jenkins job using subprocess.Popen and stdin:
test = subprocess.Popen('svn cat http://localhost/svn/WernerTest/JenkinsJobTemplates/trunk/smartTemplate.xml --username admin --password admin', stdout=subprocess.PIPE, universal_newlines=True)
job = test.stdout.read().replace("@url@", "http://localhost/svn/WernerTest/TMS/branches/test1")
output = io.StringIO()
output.write(job)
subprocess.Popen('java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7', stdin=output)
and I am getting the following error:
Traceback (most recent call last):
  File "D:\scripts\jenkinsGetJobs.py", line 20, in <module>
    subprocess.Popen('java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7', stdin=output)
  File "D:\applications\Python 3.5\lib\subprocess.py", line 914, in __init__
    (p2cread, p2cwrite, c2pread, c2pwrite, errread, errwrite) = self._get_handles(stdin, stdout, stderr)
  File "D:\applications\Python 3.5\lib\subprocess.py", line 1127, in _get_handles
    p2cread = msvcrt.get_osfhandle(stdin.fileno())
io.UnsupportedOperation: fileno
So how do I pass in the updated file to the next subprocess?
Use a pipe and write the data directly to that pipe:
import subprocess

test = subprocess.Popen(
'svn cat http://localhost/svn/WernerTest/JenkinsJobTemplates/trunk/smartTemplate.xml --username admin --password admin',
stdout=subprocess.PIPE, universal_newlines=True)
job = test.stdout.read().replace("@url@", "http://localhost/svn/WernerTest/TMS/branches/test1")
jenkins = subprocess.Popen(
'java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7',
stdin=subprocess.PIPE, universal_newlines=True)
jenkins.communicate(job)
The Popen.communicate() method takes the first argument and sends it as stdin to the subprocess. Note that I set the universal_newlines argument to True for Jenkins as well; the alternative would be for you to explicitly encode the job string to a suitable codec that Jenkins will accept.
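To make the communicate() pattern concrete without needing svn or Jenkins installed, here is a minimal, runnable sketch that uses a small Python one-liner as a stand-in for the Jenkins CLI (the child command is an assumption for illustration only); the child reads its stdin and reports how many characters it received:

```python
import subprocess
import sys

# Hypothetical stand-in for the Jenkins CLI: a child Python process
# that reads all of its stdin and prints the number of characters.
child = subprocess.Popen(
    [sys.executable, '-c', 'import sys; print(len(sys.stdin.read()))'],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE,
    universal_newlines=True)

# communicate() writes the string to the child's stdin, closes it,
# and collects the child's output.
out, _ = child.communicate('<project>template</project>')
print(out.strip())
```

Replacing the stand-in command with the real jenkins-cli.jar invocation gives exactly the pattern shown above.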
Popen() accepts only real files (anything with a valid .fileno(), at least).
@Martijn Pieters♦'s answer shows how to pass the data if you can load it all into memory at once (note also that the jenkins process is not started until svn has produced all of its output). Here's how to read one line at a time instead (the svn and jenkins processes run in parallel):
#!/usr/bin/env python3
from subprocess import Popen, PIPE

with Popen(svn_cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as svn, \
     Popen(java_cmd, stdin=PIPE, bufsize=1, universal_newlines=True) as java:
    for line in svn.stdout:
        line = line.replace('@url@', 'http://localhost/svn/WernerTest/TMS/branches/test1')
        java.stdin.write(line)
if java.returncode != 0:
    "handle error"
See the svn_cmd, java_cmd definitions below (you don't need shlex.split(cmd) on Windows -- note: no shell=True).
If you didn't need to replace @url@, then it would look like you are trying to emulate the svn_cmd | java_cmd pipeline, where:
svn_cmd = 'svn cat http://localhost/svn/WernerTest/JenkinsJobTemplates/trunk/smartTemplate.xml --username admin --password admin'
java_cmd = 'java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7'
The simplest way is to invoke the shell:
#!/usr/bin/env python
import subprocess
subprocess.check_call(svn_cmd + ' | ' + java_cmd, shell=True)
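A runnable sketch of the same shell-pipeline idea, using two Python one-liners as stand-in commands (assumed here only so the example works without svn or Jenkins); the shell itself wires the pipe between them:

```python
import subprocess
import sys

# Stand-ins for svn_cmd | java_cmd: the first prints a number,
# the second reads stdin and doubles it.
producer = '"%s" -c "print(123)"' % sys.executable
consumer = '"%s" -c "import sys; print(int(sys.stdin.read()) * 2)"' % sys.executable

# shell=True lets the shell interpret the | operator.
out = subprocess.check_output(producer + ' | ' + consumer,
                              shell=True, universal_newlines=True)
print(out.strip())
```

Note that with shell=True the exit status reported is that of the last command in the pipeline, which is one reason to prefer emulating the pipe in Python as shown next.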
You could emulate it in Python:
#!/usr/bin/env python3
from subprocess import Popen, PIPE

# NOTE: use a list for compatibility with POSIX systems
with Popen(java_cmd.split(), stdin=PIPE) as java, \
     Popen(svn_cmd.split(), stdout=java.stdin):
    java.stdin.close()  # close unused pipe in the parent
    # no more code here (the for-loop is inside the OS code that implements pipes)
if java.returncode != 0:
    "handle error here"
See How do I use subprocess.Popen to connect multiple processes by pipes?