I am trying to compile a C++ module to use in scipy.weave that is composed of several headers and C++ source files. These files contain classes and methods that extensively use the NumPy/C-API interface. But I am failing to figure out how to call import_array() successfully. I have been struggling with this for the past week and I am going nuts. I hope you can help me with it, because the weave help is not very explanatory.
In practice, I first have a module called pycapi_utils that contains some routines to interface C objects with Python objects. It consists of a header file pycapi_utils.h and a source file pycapi_utils.cpp, such as:
//pycapi_utils.h
#if ! defined _PYCAPI_UTILS_H
#define _PYCAPI_UTILS_H 1
#include <Python.h> //Must come before any standard headers
#include <numpy/arrayobject.h>
#include <stdlib.h>
#include <tuple>
#include <list>
typedef std::tuple<const char*,PyObject*> pykeyval; //Tuple type (string,Pyobj*) as dictionary entry (key,val)
typedef std::list<pykeyval> kvlist;
//Declaration of methods
PyObject* array_double_to_pyobj(double* v_c, long int NUMEL); //Convert from C array to Numpy array (double)
...
...
#endif
and
//pycapi_utils.cpp
#include "pycapi_utils.h"
PyObject* array_double_to_pyobj(double* v_c, long int NUMEL){
    //Convert a double array to a Numpy array
    npy_intp dims[1] = {NUMEL}; //PyArray_SimpleNew expects npy_intp dimensions
    PyObject* out_array = PyArray_SimpleNew(1, dims, NPY_DOUBLE);
    double* v_b = (double*) PyArray_DATA((PyArrayObject*) out_array);
    for (long int i=0;i<NUMEL;i++) v_b[i] = v_c[i];
    free(v_c); //This function takes ownership of the input buffer
    return out_array;
}
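Just for clarity: a caller hands this function a heap-allocated buffer and gets back a NumPy array holding an independent copy of the data (the input buffer is freed inside, so it must not be touched afterwards). A hypothetical usage:
double* v = (double*) calloc(3, sizeof(double));
v[0] = 1.0; v[1] = 2.0; v[2] = 3.0;
PyObject* arr = array_double_to_pyobj(v, 3); //arr owns a copy; v was freed inside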
Then I have a further module model that contains classes and routines dealing with some mathematical model. Again, it consists of a header and a source file:
//model.h
#if ! defined _MODEL_H
#define _MODEL_H 1
//model class
class my_model{
    int i,j;
public:
    my_model();
    ~my_model();
    double* update(double*);
};
//Simulator
PyObject* simulate(double* input);
#endif
and
//model.cpp
#include "pycapi_utils.h"
#include "model.h"
//Define class and methods
my_model::my_model(){
...
...
}
...
...
double* my_model::update(double* input){
    double* x = (double*)calloc(N,sizeof(double));
...
...
// Do something
...
...
return x;
}
PyObject* simulate(double* input){
    //Initialize Python interface
    Py_Initialize();
    import_array();
    my_model random_network;
    double* output;
    output = random_network.update(input);
    return array_double_to_pyobj(output); // from pycapi_utils.h
}
The above code is included in a scipy.weave module in Python with:
def model_py(input):
    support_code = """
    #include "model.h"
    """
    code = """
    return_val = simulate(input.data());
    """
    libs = ['gsl','gslcblas','m']
    vars = ['input']
    out = weave.inline(code,
                       vars,
                       support_code=support_code,
                       sources=source_files,
                       libraries=libs,
                       type_converters=converters.blitz,
                       compiler='gcc',
                       extra_compile_args=['-std=c++11'],
                       force=1)
It fails to compile giving:
error: int _import_array() was not declared in this scope
Noteworthy is that if I lump the source pycapi_utils.cpp into the header pycapi_utils.h, everything works fine. But I don't want to use this solution, as in practice my modules here need to be included in several other modules that also use PyObjects and need to call import_array().
I was looking at this post on Stack Exchange, but I cannot figure out if and how to properly define the #define directives in my case. Also, the example in that post is not exactly my case: there, import_array() is called within the global scope of main(), whereas in my case import_array() is called within my simulate routine, which is invoked by the main() built by scipy.weave.
I had a similar problem. As the link you've posted points out, the root of all evil is that PyArray_API is declared static, which means that each translation unit has its own copy of PyArray_API, initialized to PyArray_API = NULL by default. Thus import_array() must be called once for every *.cpp file. In your case it should be sufficient to call it in pycapi_utils.cpp and also once in model.cpp. You can also test whether import_array() is necessary before actually calling it:
if (PyArray_API == NULL)
{
    import_array();
}
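If you would rather share a single PyArray_API across all translation units instead (the #define route your linked post refers to), NumPy supports that: define PY_ARRAY_UNIQUE_SYM everywhere and NO_IMPORT_ARRAY in every file except one. A minimal sketch of the idea; the symbol name MYPROJ_ARRAY_API, the guard PYCAPI_UTILS_MAIN and the helper init_numpy() are names I made up here:
//pycapi_utils.h -- shared include logic for the NumPy C-API
#include <Python.h>
#define PY_ARRAY_UNIQUE_SYM MYPROJ_ARRAY_API //One shared PyArray_API symbol for the whole module
#if ! defined PYCAPI_UTILS_MAIN
#define NO_IMPORT_ARRAY //All other translation units only get an extern declaration
#endif
#include <numpy/arrayobject.h>
int init_numpy(); //Wrapper around import_array()

//pycapi_utils.cpp -- the ONE translation unit that defines PyArray_API
#define PYCAPI_UTILS_MAIN
#include "pycapi_utils.h"
int init_numpy(){
    if (PyArray_API == NULL){
        import_array1(-1); //Variant of import_array() that returns -1 on failure
    }
    return 0;
}
With this setup, model.cpp (and any other client) includes pycapi_utils.h as before and simply calls init_numpy() once, e.g. at the top of simulate(). The import_array() call itself has to live in pycapi_utils.cpp, because that is the only translation unit where NO_IMPORT_ARRAY is not defined.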