I have code that, simplified down, looks like this:
    import functools
    import multiprocessing

    run = functools.partial(run, grep=options.grep, print_only=options.print_only, force=options.force)

    if not options.single and not options.print_only and options.n > 0:
        pool = multiprocessing.Pool(options.n)
        Map = pool.map
    else:
        Map = map

    for f in args:
        with open(f) as fh:
            Map(run, fh)

    try:
        pool.close()
        pool.join()
    except NameError:
        # pool only exists in the multiprocessing branch
        pass
That works fine when I run it in single-process mode, but fails with errors like this:

    TypeError: type 'partial' takes at least one argument

mixed in with long tracebacks through the multiprocessing module. What's going on?

I'm using Python 2.6.1.
Google tells me that this is a bug in Python, apparently fixed in Py3k. It's supposedly due to partial objects not being picklable in 2.6: pool.map has to pickle the callable in order to hand it to the worker processes, while the plain map branch never pickles anything, which is why single-process mode works.

There is a workaround, sketched below.
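One common workaround (a minimal sketch, not necessarily what the poster ended up using; it assumes the module-level run() function plus the options and args objects from the question) is to replace the partial with a small callable class defined at module level. Instances of a top-level class pickle by reference to the class plus their attribute dict, so pool.map can ship the wrapper to the workers:

    import multiprocessing


    def run(line, grep=None, print_only=False, force=False):
        # Stand-in for the poster's real run(); only here so the sketch is
        # self-contained.
        pass


    class Runner(object):
        """Picklable replacement for functools.partial(run, ...)."""

        def __init__(self, grep, print_only, force):
            self.grep = grep
            self.print_only = print_only
            self.force = force

        def __call__(self, line):
            # Delegate to the original module-level run() with the stored options.
            return run(line, grep=self.grep,
                       print_only=self.print_only, force=self.force)


    # Usage, mirroring the original loop (options and args as in the question):
    #
    #     run_one = Runner(options.grep, options.print_only, options.force)
    #     pool = multiprocessing.Pool(options.n)
    #     for f in args:
    #         with open(f) as fh:
    #             pool.map(run_one, fh)
    #     pool.close()
    #     pool.join()

A plain module-level wrapper function works too if the extra options can live in globals, but the callable class keeps the per-call state explicit and picklable.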