multiprocessing.queue.get() blocks and does not return

I am building a webserver on an embedded Linux device with very little RAM (only 256 MB). The webserver needs to issue shell commands via subprocess.check_output, but since each check_output call seems to require about as much free RAM as the parent process already consumes, I use multiprocessing to create a second process right at startup, while Python still consumes little memory. That worker process receives commands from the main process through a multiprocessing.Queue, executes them, and returns the output through a second multiprocessing.Queue. This used to work, but I now seem to have some kind of race condition that causes the whole thing to get stuck.

This is my minimum test case that replicates the problem:

# shwrapper.py
from multiprocessing import Process, Queue

def f(iq,oq):
    oq.put("Ready")
    while True:
        oq.put(iq.get()+" out")

def init():
    iq = Queue()
    oq = Queue()
    p = Process(target=f, args=(iq,oq,))
    p.start()
    print(oq.get())
    iq.put("test")
    print(oq.get())
init()

If I try to import this I get this result:

>>> import shwrapper
Ready

Here it gets stuck. Now I issue a KeyboardInterrupt:

^CProcess Process-1:
Traceback (most recent call last):
  File "/usr/lib/python27.zip/multiprocessing/process.py", line 258, in _bootstrap
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "shwrapper2.py", line 17, in <module>
    init()
  File "shwrapper2.py", line 15, in init
    print(oq.get())
  File "/usr/lib/python27.zip/multiprocessing/queues.py", line 117, in get
KeyboardInterrupt
    self.run()
  File "/usr/lib/python27.zip/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "shwrapper2.py", line 6, in f
    oq.put(iq.get()+" out")
  File "/usr/lib/python27.zip/multiprocessing/queues.py", line 117, in get
    res = self._recv()
KeyboardInterrupt

So as you can see, it gets stuck at iq.get() in f. The iq queue always appears empty on the worker side, no matter what I put into it. The oq queue, on the other hand, works as expected. Any ideas what I can do here?

My system here is Python 2.7.3 running on Linux 2.6.29.6 on a PowerPC.

If I run it on the Python 2.7.10 on my Windows computer everything goes through fine.

I know Python 2.7.3 and Linux 2.6.29 are ancient, but there are no newer builds from the manufacturer and the system is kinda locked down.

Asked Nov 05 '15 by Dakkaron
1 Answer

I finally fixed it by using multiprocessing.queues.SimpleQueue instead of multiprocessing.Queue. According to the documentation there should be no difference in this respect, but in the source, SimpleQueue synchronizes access with locks, which makes it work reliably across processes on my platform.
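For reference, the original test case with the queues swapped out might look like the sketch below. Note the import path: on the asker's Python 2.7, SimpleQueue lives in multiprocessing.queues; on Python 3 it is exported from multiprocessing directly, which is what this sketch assumes.

```python
# Sketch of the fix: SimpleQueue instead of Queue.
# Assumes Python 3; on Python 2.7 use:
#   from multiprocessing import Process
#   from multiprocessing.queues import SimpleQueue
from multiprocessing import Process, SimpleQueue

def f(iq, oq):
    # Worker: signal readiness, then echo each command back with " out" appended.
    oq.put("Ready")
    while True:
        oq.put(iq.get() + " out")

def init():
    iq = SimpleQueue()  # commands: main -> worker
    oq = SimpleQueue()  # results:  worker -> main
    p = Process(target=f, args=(iq, oq))
    p.daemon = True  # let the worker die with the parent
    p.start()
    print(oq.get())  # "Ready"
    iq.put("test")
    print(oq.get())  # "test out"

if __name__ == "__main__":
    init()
```

SimpleQueue trades features for simplicity: it has no feeder thread and no size limit, just a locked pipe, which is all this command/response pattern needs.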

Answered Oct 01 '22 by Dakkaron