Python: How to read stdout non-blocking from another process?

While a process is running, I would like to read its stdout and write it to a file. Every attempt I have made has failed, however: no matter what I tried, reading from the process's stdout blocked until the process had finished.

Here is a snippet of what I am trying to do. (The first part is simply a Python script that writes something to stdout.)

import subprocess

p = subprocess.Popen('python -c \'\
from time import sleep\n\
for i in range(3):\n\
    sleep(1)\n\
    print "Hello", i\
\'', shell = True, stdout = subprocess.PIPE)

while p.poll() is None:
    # read the stdout continuously
    pass

print "Done"

I know that there are multiple questions out there that deal with the same subject, but none of the ones I found answered mine.

asked Jun 20 '11 by Woltan

1 Answer

What is happening is buffering on the writer side. Since the little code snippet writes such small chunks, the underlying FILE object buffers the output until the end. The following works as you expect.

#!/usr/bin/python

import sys
import subprocess

p = subprocess.Popen("""python -c '
from time import sleep ; import sys
for i in range(3):
    sleep(1)
    print "Hello", i
    sys.stdout.flush()  # flush so each line reaches the pipe immediately
'""", shell = True, stdout = subprocess.PIPE)

while True:
    # readline() returns an empty string only when the pipe reaches EOF
    inline = p.stdout.readline()
    if not inline:
        break
    sys.stdout.write(inline)
    sys.stdout.flush()

print "Done"

However, you may be expecting the wrong thing. The buffering is there to reduce the number of system calls and make the system more efficient. Does it really matter to you that the whole text is buffered until the end before you write it to a file? Don't you still get all of the output in the file?
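
If you would rather not touch the child to add the sys.stdout.flush() call, here is a minimal variation (a sketch, assuming the child is invoked as a Python interpreter, as above): passing -u makes the child's stdout unbuffered, so each print reaches the pipe immediately.

#!/usr/bin/python

import sys
import subprocess

# -u forces the child interpreter to run with unbuffered stdout, so no
# explicit flush is needed inside the child program.
p = subprocess.Popen("""python -u -c '
from time import sleep
for i in range(3):
    sleep(1)
    print "Hello", i
'""", shell = True, stdout = subprocess.PIPE)

# readline() returns an empty string only at EOF, i.e. once the child exits.
for line in iter(p.stdout.readline, ''):
    sys.stdout.write(line)
    sys.stdout.flush()

p.wait()
print "Done"

For a child that is not a Python script, a line-buffering wrapper such as stdbuf -oL can serve the same purpose; note that the bufsize argument of Popen only affects the reading side and does not change the child's own buffering.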

answered Oct 11 '22 by Keith