I've run into a problem where Python quits unexpectedly when running multiprocessing with NumPy. I've isolated the problem and can confirm that multiprocessing works perfectly when running the code below:
import numpy as np
from multiprocessing import Pool, Process
import time
import cPickle as p

def test(args):
    x, i = args
    if i == 2:
        time.sleep(4)
    arr = np.dot(x.T, x)
    print i

if __name__ == '__main__':
    x = np.random.random(size=(2000, 500))
    evaluations = [(x, i) for i in range(5)]
    p = Pool()
    p.map_async(test, evaluations)
    p.close()
    p.join()
The problem occurs when I run the code below instead; this makes Python quit unexpectedly:
import numpy as np
from multiprocessing import Pool, Process
import time
import cPickle as p

def test(args):
    x, i = args
    if i == 2:
        time.sleep(4)
    arr = np.dot(x.T, x)
    print i

if __name__ == '__main__':
    x = np.random.random(size=(2000, 500))
    test((x, 4))  # Added code
    evaluations = [(x, i) for i in range(5)]
    p = Pool()
    p.map_async(test, evaluations)
    p.close()
    p.join()
Can someone please help? I'm open to any suggestions. Thanks. Note: I have tried two different machines, and the same problem occurs on both.
This is a known issue with multiprocessing and NumPy on Mac OS X, and essentially a duplicate of:
segfault using numpy's lapack_lite with multiprocessing on osx, not linux
http://mail.scipy.org/pipermail/numpy-discussion/2012-August/063589.html
The answer seems to be to link NumPy against a BLAS other than Apple's Accelerate framework... unfortunate :(
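To check whether your NumPy build is affected, you can inspect which BLAS/LAPACK libraries it was linked against. This is a minimal sketch using NumPy's built-in config report; if the output mentions Accelerate or vecLib, you are likely hitting this bug:

import numpy as np

# Print the build configuration NumPy was compiled with. Look for
# "accelerate" or "veclib" in the BLAS/LAPACK sections: those indicate
# Apple's Accelerate framework, which is known to crash after fork().
np.show_config()

If rebuilding NumPy against a different BLAS is not an option and you can move to Python 3.4+, a commonly reported workaround (not applicable to the Python 2 code above) is the 'spawn' start method, so that worker processes are started fresh instead of fork()ed after Accelerate has been initialised:

import multiprocessing as mp

def work(i):
    return i * i

if __name__ == '__main__':
    # 'spawn' launches fresh interpreter processes instead of fork()ing
    # the parent, sidestepping the crash when Accelerate has already
    # been initialised in the parent. Python 3.4+ only.
    mp.set_start_method('spawn')
    pool = mp.Pool()
    print(pool.map(work, range(5)))
    pool.close()
    pool.join()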