 

Python multiprocessing stdin input

All code was written and tested with Python 3.4 on Windows 7.

I am designing a console app and need to use stdin from the command line (Windows) to issue commands and change the program's operating mode. The program relies on multiprocessing to spread CPU-bound loads across multiple processors.

I am using stdout to monitor the status and some basic return information, and stdin to issue commands that load different sub-processes based on the returned console information.

This is where I found a problem: I could not get the multiprocessing module to accept stdin input, but stdout was working just fine. I then found some help on Stack Overflow, tested it, and found that with the threading module this all works great, except that all output to stdout pauses each time stdin blocks, due to the GIL.

I will say I have been successful with a workaround implemented with msvcrt.kbhit(). However, I can't help but wonder whether there is some sort of bug in multiprocessing that makes stdin not read any data. I tried numerous approaches and nothing worked with multiprocessing. I even attempted to use Queues, but I did not try Pools or any other multiprocessing features.

I also did not try this on my Linux machine, since I was focused on getting it to work on Windows.
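For reference, msvcrt.kbhit() is Windows-only. A minimal sketch of such a polling approach (the function name and the POSIX fallback via the selectors module are illustrative, not my exact workaround):

```python
import sys

def poll_stdin_char():
    """Return one pending character from stdin, or None if nothing is waiting.

    Illustrative sketch only: uses msvcrt.kbhit() on Windows and the
    selectors module elsewhere.
    """
    if sys.platform == "win32":
        import msvcrt
        if msvcrt.kbhit():
            return msvcrt.getwch()
        return None
    import selectors
    sel = selectors.DefaultSelector()
    try:
        sel.register(sys.stdin, selectors.EVENT_READ)
    except (ValueError, OSError):
        return None  # stdin closed or not selectable
    try:
        if sel.select(timeout=0):
            return sys.stdin.read(1)  # '' at EOF, otherwise one character
        return None
    finally:
        sel.close()
```

Calling this once per loop iteration keeps the main loop responsive instead of blocking on sys.stdin.read().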

Here is simplified test code that does not function as intended (reminder: this was written for Python 3.4 on Windows 7):

import sys
import time
from multiprocessing import Process

def function1():
    while True:
        print("Function 1")
        time.sleep(1.33)

def function2():
    while True:
        print("Function 2")
        c = sys.stdin.read(1) # Does not appear to be waiting for read before continuing loop.
        sys.stdout.write(c) #nothing  in 'c'
        sys.stdout.write(".") #checking to see if it works at all.
        print(str(c)) #trying something else, still nothing in 'c'
        time.sleep(1.66)

if __name__ == "__main__":
    p1 = Process(target=function1)
    p2 = Process(target=function2)
    p1.start()
    p2.start()

Hopefully someone can shed light on whether this is intended behavior, whether I implemented it incorrectly, or offer some other useful information.

Thanks.

asked May 08 '15 by user2398421

1 Answer

When you take a look at Python's implementation of multiprocessing.Process._bootstrap(), you will see this:

if sys.stdin is not None:
    try:
        sys.stdin.close()
        sys.stdin = open(os.devnull)
    except (OSError, ValueError):
        pass

You can also confirm this by using:

>>> import sys
>>> import multiprocessing
>>> def func():
...     print(sys.stdin)
... 
>>> p = multiprocessing.Process(target=func)
>>> p.start()
>>> <_io.TextIOWrapper name='/dev/null' mode='r' encoding='UTF-8'>

And reading from os.devnull immediately returns an empty result:

>>> import os
>>> f = open(os.devnull)
>>> f.read(1)
''

You can work around this by using open(0). From the documentation of open():

file is either a string or bytes object giving the pathname (absolute or relative to the current working directory) of the file to be opened or an integer file descriptor of the file to be wrapped. (If a file descriptor is given, it is closed when the returned I/O object is closed, unless closefd is set to False.)

And regarding file descriptor 0:

File descriptors are small integers corresponding to a file that has been opened by the current process. For example, standard input is usually file descriptor 0, standard output is 1, and standard error is 2:

>>> def func():
...     sys.stdin = open(0)
...     print(sys.stdin)
...     c = sys.stdin.read(1)
...     print('Got', c)
... 
>>> multiprocessing.Process(target=func).start()
>>> <_io.TextIOWrapper name=0 mode='r' encoding='UTF-8'>
Got a
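If reopening stdin in the child is undesirable (for example on Windows, where the spawned child may not inherit a usable console), another option is to keep all stdin reading in the parent and forward commands to workers over a multiprocessing.Queue. A minimal sketch (the command_worker name and the "handled:" reply format are made up for illustration):

```python
import sys
from multiprocessing import Process, Queue

def command_worker(commands, replies):
    """Consume commands until a None sentinel arrives; reply for each one."""
    while True:
        cmd = commands.get()
        if cmd is None:
            break
        replies.put("handled: " + cmd)  # placeholder for real command dispatch

if __name__ == "__main__" and sys.stdin.isatty():
    # Interactive demo: the parent keeps stdin and forwards lines to the child.
    commands, replies = Queue(), Queue()
    worker = Process(target=command_worker, args=(commands, replies))
    worker.start()
    for line in sys.stdin:  # blocks in the parent, never in the child
        commands.put(line.rstrip("\n"))
        print(replies.get())
    commands.put(None)
    worker.join()
```

This sidesteps the stdin question entirely: only one process ever touches the console, and the workers stay portable across start methods.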
answered Oct 03 '22 by Vyktor