I recently used Cython to speed up an application, and now I'm struggling with passing a 3D NumPy array from Cython to a C++ function. I can call my function from a Python test script, but it segfaults. When I test my C++ code on its own, it does not. Therefore, I assume I am doing something wrong when passing the array.
What is going wrong there?
>> python test_harvest.py
(100, 100) I twerk
[1] 6771 segmentation fault (core dumped) python test_harvest.py
logic.pyx
import cython
import numpy as np
cimport numpy as np
cdef extern from "fast_harvest.h":
    void start_harvest(int ***data, int x, int y, int t, int n)

def harvest(np.ndarray[int, ndim=3, mode="c"] data not None,
            int goal_x,
            int goal_y,
            int mission_time,
            int number_of_robots):
    m, n, o = data.shape[0], data.shape[1], data.shape[2]
    assert m == mission_time
    assert n == number_of_robots
    assert o == 2

    start_harvest(<int ***> data.data,
                  goal_x, goal_y,
                  mission_time,
                  number_of_robots)
fast_harvest.cpp
#include <iostream>
#include <cstdio>
#include "fast_harvest.h"
#include "Harvester.h"
using std::cout;
using std::endl;
void start_harvest(int ***data, int x, int y, int mission_time, int number_of_robots) {
    Point p(x, y);
    p.dump();

    cout << "I twerk" << endl;

    for (int n = 0; n < number_of_robots; n++) {
        int xpos = data[0][n][0];
        int ypos = data[0][n][1];
        printf("(%d, %d)\n", xpos, ypos);
    }
}
test_harvest.py
import numpy as np
import fharvest.logic as fhl
ROBO_COUNT = 2
MISSION_TIME = 20
GOAL_X = 100
GOAL_Y = 100
data = np.zeros([MISSION_TIME, ROBO_COUNT, 2], dtype=int)
data[0][0] = 0, 200
data[0][1] = 200, 0
fhl.harvest(data, GOAL_X, GOAL_Y, MISSION_TIME, ROBO_COUNT)
print(data)
The cast to <int ***> turned out to be the problem: a C-contiguous NumPy array is a single flat block of ints, not an array of pointers to pointer tables, so the C++ code dereferenced integer data as if it were addresses. What I finally ended up doing was passing a 1D array to C++ and writing a wrapper class around it that translates 3D coordinates into 1D offsets. My actual program had its own internal data structures; after finishing its work, it passed the wrapper to the main class, which copied its state over into the wrapped 1D buffer.
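For reference, here is a minimal sketch of that approach. The View3D class and its at() accessor are my own naming, not the original wrapper, and the sketch assumes a C-contiguous buffer of shape (mission_time, number_of_robots, 2), which mode="c" in the Cython signature guarantees:

#include <cstdio>

// Sketch of a 3D view over a flat buffer. Assumes row-major
// (C-contiguous) layout with shape (dim0, dim1, dim2).
class View3D {
public:
    View3D(int *buf, int dim0, int dim1, int dim2)
        : buf_(buf), dim0_(dim0), dim1_(dim1), dim2_(dim2) {}

    // Map the 3D index (i, j, k) onto an offset into the flat buffer.
    int &at(int i, int j, int k) {
        return buf_[(i * dim1_ + j) * dim2_ + k];
    }

private:
    int *buf_;
    int dim0_, dim1_, dim2_;  // dim0_ kept only for bounds checks one might add
};

// start_harvest now takes a plain int* instead of int***.
// (Point and p.dump() from Harvester.h are omitted here to keep the sketch self-contained.)
void start_harvest(int *data, int x, int y, int mission_time, int number_of_robots) {
    View3D view(data, mission_time, number_of_robots, 2);
    for (int n = 0; n < number_of_robots; n++) {
        printf("(%d, %d)\n", view.at(0, n, 0), view.at(0, n, 1));
    }
}

On the Cython side, the extern declaration then becomes void start_harvest(int *data, int x, int y, int t, int n), and the call passes <int *> data.data instead of <int ***> data.data. It is also worth allocating the array with dtype=np.intc, which matches C int exactly; plain dtype=int maps to a 64-bit integer on many platforms.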