segfault using numpy's lapack_lite with multiprocessing on osx, not linux

The following test code segfaults for me on OSX 10.7.3, but not on other machines:

from __future__ import print_function

import numpy as np
import multiprocessing as mp
import scipy.linalg

def f(a):
    print("about to call")

    ### these all cause crashes
    sign, x = np.linalg.slogdet(a)
    #x = np.linalg.det(a)
    #x = np.linalg.inv(a).sum()

    ### these are all fine
    #x = scipy.linalg.expm3(a).sum()
    #x = np.dot(a, a.T).sum()

    print("result:", x)
    return x

def call_proc(a):
    print("\ncalling with multiprocessing")
    p = mp.Process(target=f, args=(a,))
    p.start()
    p.join()


if __name__ == '__main__':
    import sys
    n = int(sys.argv[1]) if len(sys.argv) > 1 else 50

    a = np.random.normal(0, 2, (n, n))
    f(a)

    call_proc(a)
    call_proc(a)

Example output for one of the segfaulting variants:

$ python2.7 test.py
about to call
result: -4.96797718087

calling with multiprocessing
about to call

calling with multiprocessing
about to call

with an OSX "problem report" popping up complaining about a segfault like KERN_INVALID_ADDRESS at 0x0000000000000108; here's a full one.

If I run it with n <= 32, it runs fine; for any n >= 33, it crashes.

If I comment out the f(a) call that's done in the original process, both calls to call_proc are fine. It still segfaults if I call f on a different large array; if I call it on a different small array, or if I call f(large_array) and then pass f(small_array) off to a different process, it works fine. They don't actually need to be the same function: np.linalg.inv(large_array) followed by passing np.linalg.slogdet(different_large_array) off to another process also segfaults.

All of the commented-out np.linalg calls in f cause crashes; np.dot(a, a.T).sum() and scipy.linalg.expm3 work fine. As far as I can tell, the difference is that the former go through numpy's lapack_lite and the latter don't.


This happens for me on my desktop with

  • python 2.6.7, numpy 1.5.1
  • python 2.7.1, numpy 1.5.1, scipy 0.10.0
  • python 3.2.2, numpy 1.6.1, scipy 0.10.1

The 2.6 and 2.7 versions are, I think, the default system installs; I installed the 3.2 versions manually from the source tarballs. All of those numpy builds are linked against the system Accelerate framework:

$ otool -L `python3.2 -c 'from numpy.core import _dotblas; print(_dotblas.__file__)'`
/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/site-packages/numpy/core/_dotblas.so:
    /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate (compatibility version 1.0.0, current version 4.0.0)
    /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 125.2.1)
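
(Not part of the original post, but a complementary check: numpy can also report the BLAS/LAPACK it was built against from Python, assuming the generated numpy.__config__ module is present, as it normally is for builds like these.)

# Quick check of which BLAS/LAPACK numpy was built against, complementing the
# otool output above. On the affected machines this should mention the
# Accelerate / vecLib framework.
import numpy as np
np.__config__.show()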

I get the same behavior on another Mac with a similar setup.

But all of the options for f work on other machines running

  • OSX 10.6.8 with Python 2.6.1 and numpy 1.2.1 linked to Accelerate 4 and vecLib 268 (except that it doesn't have scipy or slogdet)
  • Debian 6 with Python 3.2.2, numpy 1.6.1, and scipy 0.10.1 linked to the system ATLAS
  • Ubuntu 11.04 with Python 2.7.1, numpy 1.5.1 and scipy 0.8.0 linked to system ATLAS

Am I doing something wrong here? What could possibly be causing this? I don't see how running a function on a numpy array that's getting pickled and unpickled can possibly cause it to later segfault in a different process.


Update: when I do a core dump, the backtrace is inside dispatch_group_async_f, the Grand Central Dispatch interface. Presumably this is a bug in the interaction between numpy/GCD and multiprocessing. I've reported this as a numpy bug, but if anyone has any ideas about workarounds or, for that matter, how to fix the bug, it'd be greatly appreciated. :)
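
(Not in the original question, but a debugging aid for anyone trying to see where the child process dies: the faulthandler module, in the standard library from Python 3.3 and available as a PyPI backport for older interpreters, can dump the Python-level traceback when the segfault hits. A minimal sketch, assuming faulthandler is importable:)

import faulthandler
faulthandler.enable()   # on SIGSEGV, dump the Python traceback to stderr

# The signal handler installed here is inherited by forked child processes,
# so the crashing call inside f() should show up in the child's traceback
# before the OSX crash report appears.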

1 Answer

It turns out that the Accelerate framework used by default on OSX simply doesn't support making BLAS calls on both sides of a fork. There's no real way to deal with this other than linking against a different BLAS, and it doesn't seem to be something Apple is interested in fixing.
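
(One mitigation worth noting, which is not part of the original answer and requires Python 3.4+, i.e. newer interpreters than those listed above: multiprocessing's "spawn" start method launches children as fresh interpreters instead of forking, so Accelerate never sees BLAS calls on both sides of a fork. A minimal sketch:)

import multiprocessing as mp
import numpy as np

def f(a):
    sign, x = np.linalg.slogdet(a)
    print("result:", x)

if __name__ == '__main__':
    a = np.random.normal(0, 2, (50, 50))
    f(a)                                  # BLAS call in the parent process

    ctx = mp.get_context('spawn')         # children start as fresh interpreters,
    p = ctx.Process(target=f, args=(a,))  # not forks, so Accelerate is never
    p.start()                             # used on both sides of a fork
    p.join()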
