Given a file docprep.pyx as simple as
from spacy.structs cimport TokenC
print("loading")
And trying to cythonize it via
cythonize -3 -i docprep.pyx
I get the following error message:
docprep.c:613:10: fatal error: ios: No such file or directory
 #include "ios"
          ^~~~~
compilation terminated.
As you can tell from the paths, this system has an Anaconda installation with Python 3.7. numpy, spacy and cython are all installed through conda.
<ios> is a C++ header. The error message shows that you are trying to compile C++ code as C code.
By default, Cython produces a file with the extension *.c, which the compiler will later treat as C code.
Cython can also produce a file with the right extension for C++, i.e. *.cpp, and there are multiple ways to trigger this behavior:
- # distutils: language = c++ at the beginning of the pyx file
- language="c++" in the Extension definition in the setup.py file
- cython with the option --cplus
- the %%cython magic with -+, i.e. %%cython -+
- pyximport, see this SO question

Actually, for cythonize there is no command-line option to trigger C++ generation, so the first option looks like the best way to go:
# distutils: language = c++
from spacy.structs cimport TokenC
print("loading") 
The reason C++ is needed at all is that spacy/structs.pxd uses C++ constructs, for example vectors or anything else cimported from libcpp:
...
from libcpp.vector cimport vector
...
and thus the C++ libraries/headers are also needed for the build.
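To see the requirement in isolation, here is a minimal sketch (the function name is made up for illustration): any pyx file that cimports from libcpp only compiles with the C++ directive, otherwise it fails with exactly this kind of missing-header error:
# distutils: language = c++
from libcpp.vector cimport vector

def sum_three():
    # vector[int] maps to std::vector<int>, which exists only in C++
    cdef vector[int] v
    v.push_back(1)
    v.push_back(2)
    v.push_back(3)
    cdef int total = 0
    cdef int x
    for x in v:
        total += x
    return total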
In my case, it worked using @mountrix's tip: just add language="c++" to your setup.py. An example:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize
import numpy
extensions = [
    Extension(
        "processing_module",
        sources=["processing_module.pyx"],
        include_dirs=[numpy.get_include()],  # numpy headers, needed by spacy's pxd files
        extra_compile_args=["-O3"],
        language="c++",  # compile the generated file as C++
    )
]
setup(
    name="processing_module",
    ext_modules=cythonize(extensions),
)
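The module can then be built in place with the usual distutils command, after which processing_module can be imported directly:
python setup.py build_ext --inplace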