I have written the following function in Cython to estimate the log-likelihood:
@cython.boundscheck(False)
@cython.wraparound(False)
def likelihood(double m,
               double c,
               np.ndarray[np.double_t, ndim=1, mode='c'] r_mpc not None,
               np.ndarray[np.double_t, ndim=1, mode='c'] gtan not None,
               np.ndarray[np.double_t, ndim=1, mode='c'] gcrs not None,
               np.ndarray[np.double_t, ndim=1, mode='c'] shear_err not None,
               np.ndarray[np.double_t, ndim=1, mode='c'] beta not None,
               double rho_c,
               np.ndarray[np.double_t, ndim=1, mode='c'] rho_c_sigma not None):

    cdef double rscale = rscaleConstM(m, c, rho_c, 200)
    cdef Py_ssize_t ngals = r_mpc.shape[0]

    cdef np.ndarray[DTYPE_T, ndim=1, mode='c'] gamma_inf = Sh(r_mpc, c, rscale, rho_c_sigma)
    cdef np.ndarray[DTYPE_T, ndim=1, mode='c'] kappa_inf = Kap(r_mpc, c, rscale, rho_c_sigma)

    cdef double delta = 0.
    cdef double modelg = 0.
    cdef double modsig = 0.
    cdef Py_ssize_t i
    cdef DTYPE_T logProb = 0.

    # calculate logprob
    for i from ngals > i >= 0:
        modelg = (beta[i]*gamma_inf[i] / (1 - beta[i]*kappa_inf[i]))
        delta = gtan[i] - modelg
        modsig = shear_err[i]
        logProb = logProb - .5*(delta/modsig)**2 - logsqrt2pi - log(modsig)

    return logProb
but when I run the compiled version of this function, I get the following error message:
File "Tools.pyx", line 3, in Tools.likelihood def likelihood(double m, ValueError: ndarray is not C-contiguous
I don't quite understand why this problem occurs. I would appreciate any useful tips.
Just before you get the error, try printing the flags attribute of the numpy array(s) you're passing to likelihood. You'll probably see something like:
In [2]: foo.flags
Out[2]:
  C_CONTIGUOUS : False
  F_CONTIGUOUS : True
  OWNDATA : True
  WRITEABLE : True
  ALIGNED : True
  UPDATEIFCOPY : False
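A common way to end up with an array like that is to take a column slice or a transpose of a larger 2-D array. Here's a hypothetical illustration (the names catalog, r_mpc and foo here are made up for this sketch, not taken from the question's code):

import numpy as np

# Hypothetical example: a column slice of a 2-D array is a strided view,
# so its elements are not laid out consecutively in memory.
catalog = np.zeros((100, 3))          # made-up 2-D table of per-galaxy values
r_mpc = catalog[:, 0]                 # 1-D view with a stride of 3 elements

print(r_mpc.flags['C_CONTIGUOUS'])    # False

# A transposed C-ordered 2-D array is F-contiguous but not C-contiguous,
# which matches the flags output shown above.
foo = np.zeros((4, 5)).T
print(foo.flags['C_CONTIGUOUS'], foo.flags['F_CONTIGUOUS'])   # False True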
Note where it says C_CONTIGUOUS : False, because that's the issue. To fix it, simply convert it to C order:
In [6]: foo = foo.copy(order='C')

In [7]: foo.flags
Out[7]:
  C_CONTIGUOUS : True
  F_CONTIGUOUS : False
  OWNDATA : True
  WRITEABLE : True
  ALIGNED : True
  UPDATEIFCOPY : False
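If you'd rather avoid an unconditional copy, np.ascontiguousarray returns the input unchanged when it is already C-ordered and only copies when it has to. A minimal sketch, assuming the argument names from the question's likelihood signature:

import numpy as np

# Ensure every array argument is C-contiguous double precision before the call.
# np.ascontiguousarray is a no-op for arrays that already satisfy this.
r_mpc       = np.ascontiguousarray(r_mpc,       dtype=np.double)
gtan        = np.ascontiguousarray(gtan,        dtype=np.double)
gcrs        = np.ascontiguousarray(gcrs,        dtype=np.double)
shear_err   = np.ascontiguousarray(shear_err,   dtype=np.double)
beta        = np.ascontiguousarray(beta,        dtype=np.double)
rho_c_sigma = np.ascontiguousarray(rho_c_sigma, dtype=np.double)

logp = likelihood(m, c, r_mpc, gtan, gcrs, shear_err, beta, rho_c, rho_c_sigma)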