I am writing a sample program to test the usage of a multiprocessing pool of workers in Python 2.7.2+.
This is the code I have entered in the Python interpreter on Ubuntu:
>>> from multiprocessing import Pool
>>> def name_append(first_name, last_name):
...     return first_name + " " + last_name
...
>>> from functools import partial
>>> partial_name_append=partial(name_append,'kiran')
>>> partial_name_append('acb')
'kiran acb'
>>> abc='kiran'
>>> pool=Pool(processes=4)
>>> pool.map(partial_name_append,abc)
['kiran k', 'kiran i', 'kiran r', 'kiran a', 'kiran n']
>>> pool.close()
>>> pool.join()
>>> pool.map(partial_name_append,abc)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/multiprocessing/pool.py", line 226, in map
assert self._state == RUN
AssertionError
After I got pickle errors in my pool-of-workers code on large data sets, I am trying small examples to figure out what the error is.
I don't understand why the same 'pool.map' statement doesn't work here when it worked above. I think I have executed 'pool.map' correctly, but I don't understand the reason for the failure.
Is this error related to "PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed"?
Can someone help me out?
Thanks
If an assertion fails, Python passes the assertion's optional argument expression to the AssertionError. AssertionError exceptions can be caught and handled like any other exception using a try-except statement, but if left unhandled they terminate the program and produce a traceback.
To handle the assertion error, place the assert statement in the try block and catch the AssertionError in the except block.
AssertionError inherits from the Exception class; when it is raised, it is handled either by the user's code or by the default exception handler.
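A minimal sketch of the try-except handling described above (the `positive_sqrt` function and its message are made up for illustration):

```python
def positive_sqrt(x):
    # assert raises AssertionError when the condition is false;
    # the message after the comma becomes the exception's argument
    assert x >= 0, "x must be non-negative"
    return x ** 0.5

# Catch the AssertionError instead of letting it terminate the program
try:
    positive_sqrt(-4)
    handled = None
except AssertionError as exc:
    handled = str(exc)

print(handled)  # the assertion's message
```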
Use Pool.starmap(). The multiprocessing pool's starmap() function (available in Python 3.3+) calls the target function with multiple arguments, so it can be used instead of map(). This is probably the preferred approach for executing a target function that takes multiple arguments in the multiprocessing pool.
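A short sketch of starmap() with the question's name_append function. Note that starmap() only exists in Python 3.3+; under Python 2.7 (as in the question) you would use functools.partial or a wrapper that unpacks a tuple instead.

```python
from multiprocessing import Pool

def name_append(first_name, last_name):
    # module-level function so worker processes can pickle it by name
    return first_name + ' ' + last_name

def run():
    pool = Pool(processes=2)
    # starmap unpacks each tuple into the function's positional arguments
    result = pool.starmap(name_append, [('kiran', 'a'), ('kiran', 'b')])
    pool.close()
    pool.join()
    return result

if __name__ == '__main__':
    print(run())  # ['kiran a', 'kiran b']
```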
You typed:
>>> pool.close()
from the docs:
close()
Prevents any more tasks from being submitted to the pool. Once all the tasks have been completed the worker processes will exit.
Of course you can't use the pool any more: you closed it.
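The fix is simply to create a fresh Pool when you need to submit more work after closing one. A minimal sketch (the greet/run_demo names are made up; in Python 3 the failed reuse raises ValueError rather than the AssertionError seen in Python 2.7, so both are caught):

```python
from multiprocessing import Pool

def greet(name):
    # module-level function so worker processes can pickle it by name
    return 'kiran ' + name

def run_demo():
    pool = Pool(processes=2)
    first = pool.map(greet, 'abc')      # works while the pool is running
    pool.close()
    pool.join()
    try:
        pool.map(greet, 'abc')          # fails: the pool has been closed
        reuse_error = None
    except (ValueError, AssertionError) as exc:
        reuse_error = exc
    # A fresh Pool must be created to submit more work
    pool2 = Pool(processes=2)
    second = pool2.map(greet, 'xy')
    pool2.close()
    pool2.join()
    return first, reuse_error, second

if __name__ == '__main__':
    print(run_demo())
```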