python multiprocessing manager list error: [Errno 2] No such file or directory

I wrote a multiprocessing program in Python. I use multiprocessing.Manager().list() to share a list between subprocesses. First, I add some tasks in the main process, then start several subprocesses to work on the tasks in the shared list; the subprocesses also add tasks to the shared list. But I get the following exception:

    Traceback (most recent call last):
      File "/usr/lib64/python2.6/multiprocessing/process.py", line 232, in _bootstrap
        self.run()
      File "/usr/lib64/python2.6/multiprocessing/process.py", line 88, in run
        self._target(*self._args, **self._kwargs)
      File "gen_friendship.py", line 255, in worker
        if tmpu in nodes:
      File "<string>", line 2, in __contains__
      File "/usr/lib64/python2.6/multiprocessing/managers.py", line 722, in _callmethod
        self._connect()
      File "/usr/lib64/python2.6/multiprocessing/managers.py", line 709, in _connect
        conn = self._Client(self._token.address, authkey=self._authkey)
      File "/usr/lib64/python2.6/multiprocessing/connection.py", line 143, in Client
        c = SocketClient(address)
      File "/usr/lib64/python2.6/multiprocessing/connection.py", line 263, in SocketClient
        s.connect(address)
      File "<string>", line 1, in connect
    error: [Errno 2] No such file or directory

I found something about how to use a shared list in Python multiprocessing like this, but I still get the exception. I have no idea what the exception means. Also, what's the difference between a common list and manager.list()?

The code is as follows:

    import multiprocessing

    nodes = multiprocessing.Manager().list()
    lock = multiprocessing.Lock()
    AMOUNT_OF_PROCESS = 10

    def worker():
        lock.acquire()
        nodes.append({"name": "username", "group": 1})
        lock.release()

    if __name__ == "__main__":

        for i in range(AMOUNT_OF_PROCESS):
            nodes.append({"name": "username", "group": 1})

        processes = [None for i in range(AMOUNT_OF_PROCESS)]

        for i in range(AMOUNT_OF_PROCESS):
            processes[i] = multiprocessing.Process(target=worker, args=())
            processes[i].start()
asked Apr 17 '15 by stamaimer
1 Answer

The problem is that your main process is exiting immediately after you start all your worker processes, which shuts down your Manager. When your Manager shuts down, none of the children can use the shared list you passed into them. You can fix it by using join to wait for all the children to finish. Just make sure you actually start all your processes prior to calling join:

    for i in range(AMOUNT_OF_PROCESS):
        processes[i] = multiprocessing.Process(target=worker, args=())
        processes[i].start()
    for process in processes:
        process.join()
answered Oct 17 '22 by dano