
Python multiprocessing continuously spawns pythonw.exe processes without doing any actual work

I don't understand why this simple code

# file: mp.py
from multiprocessing import Process
import sys

def func(x):
    print 'works ', x + 2
    sys.stdout.flush()

p = Process(target=func, args=(2,))
p.start()
p.join()
p.terminate()
print 'done'
sys.stdout.flush()

spawns "pythonw.exe" processes continuously without printing anything, even though I run it from the command line:

python mp.py

I am running the latest Python 2.6 on Windows 7, both 32-bit and 64-bit.

lj8888 asked Aug 04 '10

People also ask

Does Python multiprocessing work on Windows?

The multiprocessing package offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock by using subprocesses instead of threads. Because of this, the multiprocessing module allows the programmer to fully leverage multiple processors on a given machine. It runs on both Unix and Windows.

How does multiprocessing lock work in Python?

Python provides a mutual exclusion lock for use with processes via the multiprocessing.Lock class. An instance of the lock can be created and then acquired by a process before it enters a critical section, and released after it leaves the critical section. Only one process can hold the lock at any time.

What is a Daemonic process Python?

The multiprocessing module allows us to have daemon processes via the process's daemon flag. Daemon processes run in the background and follow a similar concept to daemon threads: they are terminated automatically when the parent process exits. To run a process in the background this way, set its daemon flag to True before starting it.

How does multiprocessing queue work in Python?

A queue is a data structure to which items can be added by a call to put() and from which items can be retrieved by a call to get(). multiprocessing.Queue provides a first-in, first-out (FIFO) queue, which means that items are retrieved in the order they were added.


1 Answer

You need to protect the entry point of the program with if __name__ == '__main__':.

This is a Windows-specific problem. On Windows, your module has to be imported into a new Python interpreter in order for the child to access your target code. If you don't prevent this new interpreter from re-running the module's start-up code, it will spawn another child, which will then spawn another child, until it's pythonw.exe processes as far as the eye can see.

Other platforms use os.fork() to launch the subprocesses so don't have the problem of reimporting the module.

So your code will need to look like this:

from multiprocessing import Process
import sys

def func(x):
    print 'works ', x + 2
    sys.stdout.flush()

if __name__ == '__main__':
    p = Process(target=func, args=(2,))
    p.start()
    p.join()
    p.terminate()
    print 'done'
    sys.stdout.flush()
Dave Webb answered Oct 18 '22