I have a script that loads a file, which takes a while because there is quite a lot of data to read. To keep the user from terminating the process, I want to show some kind of loading indicator. This seemed like a good opportunity to learn how to use the multiprocessing module, so I wrote this example to test it:
import time, multiprocessing

def progress():
    # Print an animated "Loading..." indicator until terminated
    delay = 0.5
    while True:
        print "Loading.",
        time.sleep(delay)
        print "\b.",
        time.sleep(delay)
        print "\b.",
        time.sleep(delay)
        print "\r          \r",  # spaces overwrite the previous text
    return

def loader(filename, con):
    # Dummy loader
    time.sleep(5)
    con.send(filename)
    con.close()
    return

if __name__ == "__main__":
    parent_con, child_con = multiprocessing.Pipe()
    filename = "main.key"
    p1 = multiprocessing.Process(target=progress)
    p2 = multiprocessing.Process(target=loader, args=(filename, child_con))
    p1.start()
    p2.start()
    data = parent_con.recv()  # blocks until the loader sends the filename
    p1.terminate()
    print "\n", data
It works as I expect when I run it in the Windows cmd prompt: it prints "Loading" and sequentially adds dots until the loader is complete. But on Unix, where I actually need it to work, I get no output at all from the progress function (process p1).
Just as Mark and Dacav suggested, buffering is the problem. Here are some possible solutions:
Using python -u to run the script

Starting the interpreter with the -u flag (python -u script.py) unbuffers stdout and stderr. This is the easiest solution, if it's acceptable to you.
Using sys.stdout.flush

Calling sys.stdout.flush() after each print forces the buffered text out to the terminal immediately:
import sys, time

def progress():
    delay = 0.5
    while True:
        print "Loading.",
        sys.stdout.flush()
        time.sleep(delay)
        print "\b.",
        sys.stdout.flush()
        time.sleep(delay)
        print "\b.",
        sys.stdout.flush()
        time.sleep(delay)
        print "\r          \r",  # spaces overwrite the previous text
        sys.stdout.flush()
    return
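For what it's worth, in Python 3 the same idea can be written more directly, since print() accepts a flush=True argument and there is no trailing-comma form. Below is a sketch of the same spinner in Python 3; the stream, cycles, and delay parameters are my own additions so the function can run for a bounded number of iterations instead of looping forever:

```python
import sys
import time

def progress(stream=sys.stdout, cycles=3, delay=0.1):
    # Animated "Loading..." indicator. Flushing after every write makes the
    # text appear even when stdout is block-buffered (e.g. piped output).
    for _ in range(cycles):
        for text in ("Loading.", "\b.", "\b."):
            stream.write(text)
            stream.flush()  # push the text out of the buffer immediately
            time.sleep(delay)
        stream.write("\r          \r")  # erase the line before repeating
        stream.flush()

if __name__ == "__main__":
    progress()
```

Passing an io.StringIO as the stream lets you exercise the function without touching the terminal at all.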