I am getting what I think is a strange seg fault when I try to pass a boost::numpy::ndarray as an argument:
#include <iostream>
#include <boost/python.hpp>
#include <boost/numpy.hpp>

void say_hello(boost::numpy::ndarray& my_array)
//void say_hello(int x)   // This works fine
{
    std::cout << "Hello" << std::endl;
}

BOOST_PYTHON_MODULE(hello_ext)
{
    using namespace boost::python;
    def("say_hello", say_hello);
}
I know the example is silly, but I should not be getting a seg fault there, and this is the smallest example I was able to reduce the problem to. Maybe I need to specify the ndarray type or number of dimensions, but I could not find any documentation on it. I was looking at this, but it didn't seem very helpful. My gut feeling is that I am missing something simple, but I just don't see it.
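(For what it's worth, boost::numpy does not encode the element type or the number of dimensions in the C++ signature, so a plain ndarray& accepts any array; as far as I can tell, those properties can only be checked at runtime. Below is a minimal sketch of such checks, assuming the standalone boost::numpy API; check_1d_double is a hypothetical helper name.)

#include <boost/python.hpp>
#include <boost/numpy.hpp>

namespace p = boost::python;
namespace np = boost::numpy;

// Hypothetical helper: reject anything that is not a 1-D float64 array.
void check_1d_double(np::ndarray const& a)
{
    if (a.get_nd() != 1) {
        PyErr_SetString(PyExc_TypeError, "expected a 1-D array");
        p::throw_error_already_set();
    }
    if (a.get_dtype() != np::dtype::get_builtin<double>()) {
        PyErr_SetString(PyExc_TypeError, "expected dtype float64");
        p::throw_error_already_set();
    }
}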
When I run this:
In [1]: from hello_ext import *
In [2]: import numpy as np
In [3]: say_hello(np.array([3,4,5]))
Segmentation fault (core dumped)
My Makefile:
PYTHON_VERSION = 2.7
PYTHON_INCLUDE = /usr/include/python$(PYTHON_VERSION)
BOOST_INC = /usr/include
BOOST_LIB = /usr/lib
TARGET = hello_ext

$(TARGET).so: $(TARGET).o
	g++ -std=c++11 -shared -Wl,--export-dynamic $(TARGET).o -L$(BOOST_LIB) -lboost_python -lboost_numpy -L/usr/lib/python$(PYTHON_VERSION)/config -lpython$(PYTHON_VERSION) -o $(TARGET).so

$(TARGET).o: $(TARGET).cpp
	g++ -std=c++11 -I$(PYTHON_INCLUDE) -I$(BOOST_INC) -fPIC -c $(TARGET).cpp
I knew it was something simple. I needed to add these two lines:

    Py_Initialize();
    boost::numpy::initialize();

as explained here: a seg fault results from any attempt to use boost::numpy::ndarray if the above lines have not been run.
Therefore, my file becomes:
#include <iostream>
#include <boost/python.hpp>
#include <boost/numpy.hpp>

void say_hello(boost::numpy::ndarray& my_array)
//void say_hello(int x)   // This works fine
{
    std::cout << "Hello" << std::endl;
}

BOOST_PYTHON_MODULE(hello_ext)
{
    using namespace boost::python;
    Py_Initialize();
    boost::numpy::initialize();
    def("say_hello", say_hello);
}
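With the initialization in place, the ndarray argument can actually be used. Here is a minimal sketch of reading the data back out, walking the buffer with strides() so non-contiguous views also work; it assumes the same standalone boost::numpy API, and sum_array is a hypothetical name:

#include <boost/python.hpp>
#include <boost/numpy.hpp>

namespace p = boost::python;
namespace np = boost::numpy;

// Hypothetical example: sum the elements of a 1-D float64 array.
double sum_array(np::ndarray const& a)
{
    if (a.get_nd() != 1 || a.get_dtype() != np::dtype::get_builtin<double>()) {
        PyErr_SetString(PyExc_TypeError, "expected a 1-D float64 array");
        p::throw_error_already_set();
    }
    char const* data = a.get_data();
    double total = 0.0;
    for (int i = 0; i < a.shape(0); ++i)
        total += *reinterpret_cast<double const*>(data + i * a.strides(0));
    return total;
}

After registering it with def("sum_array", sum_array) inside the module block, calling sum_array(np.array([3.0, 4.0, 5.0])) from Python should return 12.0.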