Python sharing a lock between processes

I am attempting to use a partial function so that pool.map() can target a function that has more than one parameter (in this case a Lock() object).

Here is example code (taken from an answer to a previous question of mine):

```python
import multiprocessing
from functools import partial

def target(lock, iterable_item):
    for item in iterable_item:
        # Do cool stuff
        if (... some condition here ...):
            lock.acquire()
            # Write to stdout or logfile, etc.
            lock.release()

def main():
    iterable = [1, 2, 3, 4, 5]
    pool = multiprocessing.Pool()
    l = multiprocessing.Lock()
    func = partial(target, l)
    pool.map(func, iterable)
    pool.close()
    pool.join()
```

However, when I run this code, I get the error:

RuntimeError: Lock objects should only be shared between processes through inheritance

What am I missing here? How can I share the lock between my subprocesses?

DJMcCarthy12 asked Aug 28 '14

People also ask

How do you lock in multiprocessing in Python?

Python provides a mutual exclusion lock for use with processes via the multiprocessing.Lock class. An instance of the lock can be created and then acquired by processes before accessing a critical section, and released after the critical section. Only one process can hold the lock at any time.
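As a minimal sketch of that idea (the function and parameter names here are illustrative, not from the original post), several processes can safely increment a shared counter when each update is wrapped in the lock:

```python
import multiprocessing

def add_one(lock, counter, repeats):
    """Increment a shared counter, guarding each update with the lock."""
    for _ in range(repeats):
        with lock:  # acquire() on entry, release() on exit
            counter.value += 1

def run_demo(n_procs=4, repeats=1000):
    lock = multiprocessing.Lock()
    counter = multiprocessing.Value("i", 0)  # shared integer, starts at 0
    procs = [multiprocessing.Process(target=add_one, args=(lock, counter, repeats))
             for _ in range(n_procs)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return counter.value

if __name__ == "__main__":
    print(run_demo())  # 4 * 1000 = 4000: no increments lost
```

Note that the lock is passed to Process() at construction time, so the children receive it through inheritance, which is exactly what the RuntimeError above is asking for.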

How do you lock a function in Python?

A lock can be locked using the acquire() method. Once a thread has acquired the lock, all subsequent attempts to acquire the lock are blocked until it is released. The lock can be released using the release() method.
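A short sketch of that acquire/release pattern with threads (names here are illustrative); the try/finally guarantees the lock is released even if the critical section raises, and `with lock:` is the equivalent shorthand:

```python
import threading

def run_threads(n):
    lock = threading.Lock()
    results = []

    def guarded_append(value):
        lock.acquire()            # blocks if another thread holds the lock
        try:
            results.append(value)
        finally:
            lock.release()        # always release, even on error

    threads = [threading.Thread(target=guarded_append, args=(i,))
               for i in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(results)

print(run_threads(5))  # [0, 1, 2, 3, 4]
```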

What is a daemonic process in Python?

Python's multiprocessing module supports daemon processes through the daemon flag. Daemon processes run in the background and follow the same concept as daemon threads: they are terminated automatically when the main process exits. To execute a process in the background, set its daemon flag to true before starting it.
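A quick sketch of the daemon flag in action (function names are illustrative): the child loops forever, but because it is daemonic it would not keep the program alive after the main process finishes.

```python
import multiprocessing
import time

def background_task():
    while True:            # runs forever; reaped when the main process exits
        time.sleep(0.1)

def run_daemon_demo():
    p = multiprocessing.Process(target=background_task)
    p.daemon = True        # must be set before start()
    p.start()
    alive = p.is_alive()   # True: the daemon is running in the background
    p.terminate()          # clean up explicitly for this demo
    p.join()
    return alive

if __name__ == "__main__":
    print(run_daemon_demo())  # True
```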


1 Answer

You can't pass normal multiprocessing.Lock objects to Pool methods, because they can't be pickled. There are two ways to get around this. One is to create a Manager() and pass a Manager.Lock():

```python
def main():
    iterable = [1, 2, 3, 4, 5]
    pool = multiprocessing.Pool()
    m = multiprocessing.Manager()
    l = m.Lock()
    func = partial(target, l)
    pool.map(func, iterable)
    pool.close()
    pool.join()
```

This is a little bit heavyweight, though; using a Manager requires spawning another process to host the Manager server. And all calls to acquire/release the lock have to be sent to that server via IPC.
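The pickling difference between the two lock types can be checked directly; this small sketch (the helper name is illustrative) shows that the Manager's lock proxy pickles fine, while pickling a plain Lock raises the exact error from the question:

```python
import multiprocessing
import pickle

def check_picklable(obj):
    """Return True if obj survives pickling, else the error message."""
    try:
        pickle.dumps(obj)
        return True
    except RuntimeError as exc:
        return str(exc)

if __name__ == "__main__":
    manager = multiprocessing.Manager()
    print(check_picklable(manager.Lock()))          # True: the proxy is picklable
    print(check_picklable(multiprocessing.Lock()))  # the RuntimeError from the question
```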

The other option is to pass the regular multiprocessing.Lock() at Pool creation time, using the initializer kwarg. This will make your lock instance global in all the child workers:

```python
def target(iterable_item):
    for item in iterable_item:
        # Do cool stuff
        if (... some condition here ...):
            lock.acquire()
            # Write to stdout or logfile, etc.
            lock.release()

def init(l):
    global lock
    lock = l

def main():
    iterable = [1, 2, 3, 4, 5]
    l = multiprocessing.Lock()
    pool = multiprocessing.Pool(initializer=init, initargs=(l,))
    pool.map(target, iterable)
    pool.close()
    pool.join()
```

The second solution has the side-effect of no longer requiring partial.
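As a variation on the answer's pattern (function names here are illustrative, and the worker uses the lock as a context manager rather than explicit acquire/release), the initializer approach can be sketched like this:

```python
import multiprocessing

def init(l):
    """Pool initializer: runs once per worker, storing the lock globally."""
    global lock
    lock = l

def target(item):
    with lock:            # the inherited, module-global lock
        return item * item

def run_squares(values):
    l = multiprocessing.Lock()
    with multiprocessing.Pool(2, initializer=init, initargs=(l,)) as pool:
        return pool.map(target, values)

if __name__ == "__main__":
    print(run_squares([1, 2, 3]))  # [1, 4, 9]
```

Because the lock travels via initargs rather than via the mapped iterable, Pool never tries to pickle it alongside the task arguments.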

dano answered Sep 20 '22