My Python script uses subprocess to call another script, which produces its output very slowly (line by line). I would like to write that output to a file line by line, not only when the whole process ends and the entire output arrives as one string. The following code writes the output to "file" only when "script" ends:
args = ("script")
file = open('output.txt', 'w')
subprocess.Popen(args,stdout=file)
Is it even possible? Thanks, Chris
You can poll the process and read its output a line at a time. For example:
import os
import subprocess

process = subprocess.Popen(["ls", "-lart"],
                           bufsize=-1,               # fully buffered (default)
                           stdin=subprocess.PIPE,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           cwd=os.curdir,
                           env=os.environ)

stdout_lines, stderr_lines = [], []
my_stdout_file = open("stdout.txt", "w")
while True:
    process.poll()                          # refresh process.returncode
    line = process.stdout.readline()
    eline = process.stderr.readline()
    if line:
        my_stdout_file.write(line)          # write each stdout line as it arrives
        stdout_lines.append(line)
    if eline:
        stderr_lines.append(eline)
    if (line == "" and eline == "" and
            process.returncode is not None):
        break
my_stdout_file.close()
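If you only care about standard output, a simpler variant is to read the pipe until readline() returns an empty string. This is a minimal sketch, assuming the command name "script" from the question and that you do not need stderr separately:

import subprocess

proc = subprocess.Popen(["script"], stdout=subprocess.PIPE,
                        universal_newlines=True)     # text-mode output
with open("output.txt", "w") as out:
    for line in iter(proc.stdout.readline, ""):      # stops at end of output
        out.write(line)
        out.flush()                                   # push each line to disk immediately
proc.wait()

Flushing after every line is what makes the output show up in the file while the child is still running.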
Yes, it is possible. Here is a function that I wrote for a test harness used for unit testing of shell scripts from Python.
import sys
from subprocess import Popen, PIPE

def testrun(cmdline):
    try:
        cmdout, cmderr = "", ""
        cmdp = Popen(cmdline, shell=True, stdout=PIPE, stderr=PIPE)
        cmdout, cmderr = cmdp.communicate()   # waits for the command and collects all output
        retcode = cmdp.wait()
        if retcode < 0:
            print >>sys.stderr, "Child was terminated by signal", -retcode
        else:
            return (retcode, cmdout, cmderr)
    except OSError, e:
        return (e, cmdout, cmderr)
The function returns a tuple containing the shell return code (the value the child passed to sys.exit()), the standard output text, and the standard error text. The last two are plain strings, so you would need to use splitlines() to break them into lines before processing.
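For example, a quick usage sketch (the command is only an illustration; it relies on testrun() and the sys import from the listing above):

retcode, out, err = testrun("ls -l")
for line in out.splitlines():          # break the captured stdout into lines
    sys.stdout.write(line + "\n")
if err:
    sys.stderr.write(err)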
If you really need to interact with the output line by line, then it is probably better to use pexpect rather than the subprocess module.
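A minimal sketch of that approach, assuming pexpect (version 4 or later, for the encoding argument) is installed and reusing the "script" command name from the question:

import pexpect

child = pexpect.spawn("script", encoding="utf-8")
with open("output.txt", "w") as out:
    while True:
        line = child.readline()        # returns "" once the child's output is exhausted
        if not line:
            break
        out.write(line)
        out.flush()                    # make each line visible in the file right away
child.close()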