Python subprocess get children's output to file and terminal?

I'm running a script that executes a number of executables by using

subprocess.call(cmdArgs, stdout=outf, stderr=errf)

where outf/errf is either None or a file descriptor (different files for stdout and stderr).

Is there any way I can execute each exe so that the stdout and stderr will be written to the files and terminal together?

asked Feb 13 '11 by user515766

People also ask

How do I capture the output of a subprocess run?

To capture the output of the subprocess.run() method, pass the additional argument capture_output=True. You can then access the stdout and stderr values individually via output.stdout and output.stderr.
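For illustration, a minimal sketch of that pattern (capture_output= and text= require Python 3.7+; the ls command is just an example):

import subprocess

# Minimal sketch: run a child process and capture both of its streams.
result = subprocess.run(["ls", "-l"], capture_output=True, text=True)
print(result.returncode)  # exit status of the child
print(result.stdout)      # captured standard output (str because text=True)
print(result.stderr)      # captured standard error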

What is subprocess popen ()?

The subprocess module defines one class, Popen, and a few wrapper functions that use that class. The constructor for Popen takes arguments to set up the new process so the parent can communicate with it via pipes. It provides all of the functionality of the other modules and functions it replaces, and more.
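A minimal sketch of that, using an illustrative tr child process:

from subprocess import Popen, PIPE

# Minimal sketch: the parent talks to the child over pipes.
p = Popen(["tr", "a-z", "A-Z"], stdin=PIPE, stdout=PIPE)
out, _ = p.communicate(b"hello\n")  # send input, read output, wait for exit
print(out)  # b'HELLO\n'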

What does subprocess Check_call return?

subprocess.check_call() waits for the script and returns its final status; 0 generally means "the script completed successfully", while a non-zero exit status raises CalledProcessError instead of being returned.
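A minimal sketch of that behaviour (the true/false commands are just examples):

from subprocess import check_call, CalledProcessError

# Minimal sketch: a zero exit status returns 0, a non-zero one raises.
assert check_call(["true"]) == 0
try:
    check_call(["false"])  # exits with status 1
except CalledProcessError as e:
    print("command failed with exit status", e.returncode)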

What is the difference between subprocess run and call?

Both wait for the command to finish: subprocess.call() returns only the process's exit code, while subprocess.run() (added in Python 3.5) returns a CompletedProcess object carrying the arguments, the return code and, optionally, the captured output, and is the recommended interface for new code.
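A minimal sketch of the difference (run() needs Python 3.5+; echo is just an example):

import subprocess

# Both functions wait for the child to finish before returning.
rc = subprocess.call(["echo", "hi"])  # plain integer exit code
cp = subprocess.run(["echo", "hi"])   # CompletedProcess with args, returncode, ...
print(rc, cp.returncode)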


1 Answer

The call() function is just Popen(*args, **kwargs).wait(). You could call Popen directly and use the stdout=PIPE argument to read from p.stdout:

#!/usr/bin/env python
import sys
from subprocess import Popen, PIPE
from threading import Thread


def tee(infile, *files):
    """Print `infile` to `files` in a separate thread."""

    def fanout(infile, *files):
        with infile:
            for line in iter(infile.readline, b""):
                for f in files:
                    f.write(line)

    t = Thread(target=fanout, args=(infile,) + files)
    t.daemon = True
    t.start()
    return t


def teed_call(cmd_args, **kwargs):
    # Pop the caller's stdout/stderr targets; a pipe is created only for the
    # streams that should be duplicated, otherwise the parent's are inherited.
    stdout, stderr = [kwargs.pop(s, None) for s in ["stdout", "stderr"]]
    p = Popen(
        cmd_args,
        stdout=PIPE if stdout is not None else None,
        stderr=PIPE if stderr is not None else None,
        **kwargs
    )
    threads = []
    if stdout is not None:
        threads.append(
            tee(p.stdout, stdout, getattr(sys.stdout, "buffer", sys.stdout))
        )
    if stderr is not None:
        threads.append(
            tee(p.stderr, stderr, getattr(sys.stderr, "buffer", sys.stderr))
        )
    for t in threads:
        t.join()  # wait for IO completion
    return p.wait()


outf, errf = open("out.txt", "wb"), open("err.txt", "wb")
assert not teed_call(["cat", __file__], stdout=None, stderr=errf)
assert not teed_call(["echo", "abc"], stdout=outf, stderr=errf, bufsize=0)
assert teed_call(["gcc", "a b"], close_fds=True, stdout=outf, stderr=errf)
answered Sep 22 '22 by jfs