I have a C++ function that produces a std::vector, and I want to use it in Python, so I'm using the NumPy C API:
static PyObject *
py_integrate(PyObject *self, PyObject *args)
{
    ...
    std::vector<double> integral;
    cpp_function(integral); // This changes integral
    npy_intp size = integral.size();
    PyObject *out = PyArray_SimpleNewFromData(1, &size, NPY_DOUBLE, &(integral[0]));
    return out;
}
Here's how I call it from Python:
import matplotlib.pyplot as plt
a = py_integrate(parameters)
print a
fig = plt.figure()
ax = fig.add_subplot(111)
ax.plot(a)
print a
What happens is: the first print is OK, the values are correct. But when I plot a they are not; in the second print I see very strange values like 1E-308 1E-308 ... or 0 0 0 ..., as if the memory were uninitialized. I don't understand why the first print is OK.
Edit: following the suggestion in the answer below, here is my attempt with a heap-allocated vector and a destructor:

static void DeleteVector(void *ptr)
{
    std::cout << "Delete" << std::endl;
    std::vector<double> *v = static_cast<std::vector<double> *>(ptr);
    delete v;
}
static PyObject *
cppfunction(PyObject *self, PyObject *args)
{
    std::vector<double> *vector = new std::vector<double>();
    vector->push_back(1.);
    PyObject *py_integral = PyCObject_FromVoidPtr(vector, DeleteVector);
    npy_intp size = vector->size();
    PyArrayObject *out;
    ((PyArrayObject*) out)->base = py_integral;
    return (PyObject*)(out);
}
Your std::vector object appears to be local to that function. PyArray_SimpleNewFromData does not make a copy of the data you pass it; it just keeps a pointer. So once your py_integrate function returns, the vector is deallocated. The print works the first time because nothing has written over the freed memory yet, but by the time you get to the next print, something else has used that memory, causing the values to be different.
You need to make a NumPy array that owns its own storage space and then copy the data into it.
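A minimal sketch of that copy-based approach (assuming the same cpp_function(std::vector<double> &) as in your question, and memcpy from <cstring>) might look like:

static PyObject *
py_integrate(PyObject *self, PyObject *args)
{
    std::vector<double> integral;
    cpp_function(integral); // fills the vector

    npy_intp size = integral.size();
    // PyArray_SimpleNew allocates an array that owns its own buffer.
    PyObject *out = PyArray_SimpleNew(1, &size, NPY_DOUBLE);
    if (out == NULL)
        return NULL;

    // Copy the vector's contents into that buffer; the local vector
    // can then be destroyed safely when the function returns.
    if (size > 0)
        memcpy(PyArray_DATA((PyArrayObject *)out),
               &integral[0], size * sizeof(double));
    return out;
}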
Alternatively, allocate your vector on the heap and store a pointer to it in a CObject, providing a destructor that deletes the vector. Then take a look at the C-level PyArrayObject type: it has a PyObject * member called base. Store your CObject there. When the NumPy array is garbage collected, the reference count on this base object will be decremented, and assuming you haven't taken a copy of it elsewhere, your vector will be deleted thanks to the destructor you provided.
You forgot to actually create the PyArray. Try this (assuming your DeleteVector above is correct):
static PyObject *
cppfunction(PyObject *self, PyObject *args)
{
    std::vector<double> *vector = new std::vector<double>();
    vector->push_back(1.);
    PyObject *py_integral = PyCObject_FromVoidPtr(vector, DeleteVector);
    npy_intp size = vector->size();
    // This is the missing step: actually create the array around the data.
    PyObject *out = PyArray_SimpleNewFromData(1, &size, NPY_DOUBLE, &((*vector)[0]));
    // Hand ownership of the vector to the array via its base member.
    ((PyArrayObject*) out)->base = py_integral;
    return out;
}
Note: I'm not a C++ programmer, so I can only assume that &((*vector)[0]) works as intended with a pointer to a vector. I do know that a vector may reallocate its storage area if you grow it, so don't increase its size after getting that pointer or it won't be valid anymore.
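As a side note beyond the original answer: on Python 3 the PyCObject API is gone, and NumPy 1.7+ no longer wants you writing to base directly. The modern equivalents, as far as I know, are PyCapsule and PyArray_SetBaseObject; here is a rough, untested sketch of the same ownership trick with those APIs (the capsule name "vector_owner" is an arbitrary choice):

static void DeleteVectorCapsule(PyObject *capsule)
{
    // The name must match the one given to PyCapsule_New.
    void *ptr = PyCapsule_GetPointer(capsule, "vector_owner");
    delete static_cast<std::vector<double> *>(ptr);
}

static PyObject *
cppfunction(PyObject *self, PyObject *args)
{
    std::vector<double> *vec = new std::vector<double>();
    vec->push_back(1.);

    npy_intp size = vec->size();
    PyObject *out = PyArray_SimpleNewFromData(1, &size, NPY_DOUBLE,
                                              &((*vec)[0]));
    if (out == NULL) {
        delete vec;
        return NULL;
    }

    // Wrap the vector in a capsule whose destructor deletes it.
    PyObject *owner = PyCapsule_New(vec, "vector_owner", DeleteVectorCapsule);
    if (owner == NULL) {
        Py_DECREF(out);
        delete vec;
        return NULL;
    }

    // PyArray_SetBaseObject steals the reference to owner.
    if (PyArray_SetBaseObject((PyArrayObject *)out, owner) < 0) {
        Py_DECREF(out);
        return NULL;
    }
    return out;
}

PyArray_SetBaseObject steals the reference to the capsule, so the array ends up as its sole owner; once the array is garbage collected, the capsule is released and its destructor deletes the vector.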