How can I watch standard output and standard error of a long-running subprocess simultaneously, processing each line as soon as it is generated by the subprocess?
I don't mind using Python 3.6's asyncio tools to run what I expect to be non-blocking async loops over each of the two streams, but that doesn't seem to solve the problem. The code below:
import asyncio
from asyncio.subprocess import PIPE
from datetime import datetime

async def run(cmd):
    p = await asyncio.create_subprocess_shell(cmd, stdout=PIPE, stderr=PIPE)
    async for f in p.stdout:
        print(datetime.now(), f.decode().strip())
    async for f in p.stderr:
        print(datetime.now(), "E:", f.decode().strip())

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(run('''
        echo "Out 1";
        sleep 1;
        echo "Err 1" >&2;
        sleep 1;
        echo "Out 2"
    '''))
    loop.close()
outputs:
2018-06-18 00:06:35.766948 Out 1
2018-06-18 00:06:37.770187 Out 2
2018-06-18 00:06:37.770882 E: Err 1
While I expect it to output something like:
2018-06-18 00:06:35.766948 Out 1
2018-06-18 00:06:36.770882 E: Err 1
2018-06-18 00:06:37.770187 Out 2
It has occurred to me that there is actually a simpler solution to the problem, at least if the watching code doesn't need to live in a single coroutine invocation. The original code prints stderr late because the first async for loop reads stdout all the way to EOF before the stderr loop ever starts, so stderr lines just sit in their pipe until stdout is exhausted.
What you can do instead is spawn two separate coroutines, one for stdout and one for stderr. Running them in parallel gives you the needed semantics, and you can use asyncio.gather to await their completion:
async def watch(stream, prefix=''):
    async for line in stream:
        print(datetime.now(), prefix, line.decode().strip())

async def run(cmd):
    p = await asyncio.create_subprocess_shell(cmd, stdout=PIPE, stderr=PIPE)
    await asyncio.gather(watch(p.stdout), watch(p.stderr, 'E:'))
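For completeness, here is a minimal runnable sketch of that approach, reusing the imports, shell snippet, and event-loop boilerplate from the question. The final await p.wait() is an addition here (not strictly required) to reap the child process once both streams hit EOF:

import asyncio
from asyncio.subprocess import PIPE
from datetime import datetime

async def watch(stream, prefix=''):
    # Each watcher reads lines from its own stream as soon as they arrive.
    async for line in stream:
        print(datetime.now(), prefix, line.decode().strip())

async def run(cmd):
    p = await asyncio.create_subprocess_shell(cmd, stdout=PIPE, stderr=PIPE)
    # Run both watchers concurrently so neither stream blocks the other.
    await asyncio.gather(watch(p.stdout), watch(p.stderr, 'E:'))
    await p.wait()  # added here: reap the child after both streams close

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(run('''
        echo "Out 1";
        sleep 1;
        echo "Err 1" >&2;
        sleep 1;
        echo "Out 2"
    '''))
    loop.close()

With the watchers running in parallel, the lines should now be printed in the order the subprocess emits them: "Out 1", then "E: Err 1", then "Out 2".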