Is it possible to create a Python 2.7 package using `__init__.pyx` (compiled to `__init__.so`)? If so, how? I haven't had any luck getting it to work.
Here is what I have tried:
`setup.py`:

```python
#!/usr/bin/env python
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

foo = Extension(name='foo.__init__', sources=['foo/__init__.pyx'])
bar = Extension(name='foo.bar', sources=['foo/bar.pyx'])

setup(name='foo',
      packages=['foo'],
      cmdclass={'build_ext': build_ext},
      ext_modules=[foo, bar])
```
`foo/__init__.pyx`:

```python
import foo.bar

cpdef hello_world():
    print "hello world"
    foo.bar.blah()
```
`foo/bar.pyx`:

```python
cpdef blah():
    print "blah"
```
The above has the following behavior:

```
$ python -c 'import foo; foo.hello_world()'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ImportError: No module named foo
```
I saw Python issue #15576 which was fixed by this Hg commit. Looking at the equivalent Git commit in the Git mirror of the Python Hg repository, I see that the commit is reachable from the Python v2.7.5 tag (as well as all subsequent v2.7.x versions). Was there a regression?
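As an aside, on modern Python 3 you can check whether the import machinery can locate a package at all, without triggering the import itself, using `importlib.util.find_spec` (the rough successor to the deprecated `imp` APIs the Python 2.7 era relied on). A minimal sketch; `some_missing_pkg` is a placeholder name:

```python
import importlib.util

# find_spec returns None when the finders cannot locate a top-level
# name -- the same situation that produces "No module named ..." at
# import time.
print(importlib.util.find_spec("json"))              # a ModuleSpec
print(importlib.util.find_spec("some_missing_pkg"))  # None
```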
According to this really old mailing list post, it works if you also have an `__init__.py` file (the `__init__.py` file is not used, but it seems to be necessary for the directory to be treated as a package, and hence for the `__init__.so` file to be loaded).
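The "directory needs `__init__.py` to be treated as a package" half of this can be demonstrated with plain Python, no Cython required. A minimal sketch; the package name `demo_pkg` is hypothetical:

```python
import importlib
import os
import sys
import tempfile

# Build a throwaway package on disk: demo_pkg/__init__.py
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "demo_pkg")
os.mkdir(pkg)
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("value = 42\n")

# With __init__.py present, the directory imports as a regular package.
# On Python 2, removing __init__.py would make the import fail outright;
# on Python 3 the directory would instead become a namespace package.
sys.path.insert(0, tmp)
mod = importlib.import_module("demo_pkg")
print(mod.value)     # 42
print(mod.__file__)  # .../demo_pkg/__init__.py
```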
If I add `__init__.py`:

```python
# an exception just to confirm that the .so file is loaded instead of the .py file
raise ImportError("__init__.py loaded when __init__.so should have been loaded")
```
then your example works on Linux Python 2.7.3:

```
$ python -c 'import foo; foo.hello_world()'
hello world
blah
```
This has all the signs of a buggy corner case, so it probably isn't recommended. Note that on Windows this doesn't seem to work for me, giving:

```
ImportError: DLL load failed: %1 is not a valid Win32 application.
```
Addendum (for a little extra context):
This behaviour doesn't seem to be explicitly documented. The original description of packages, from around the Python 1.5 era, says:

> without the `__init__.py`, a directory is not recognized as a package

and

> Tip: the search order is determined by the list of suffixes returned by the function `imp.get_suffixes()`. Usually the suffixes are searched in the following order: ".so", "module.so", ".py", ".pyc". Directories don't explicitly occur in this list, but precede all entries in it.
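On Python 3 the `imp.get_suffixes()` API quoted above is gone, but the equivalent suffix lists live in `importlib.machinery`, and the path-based finder still tries extension modules before source files, which matches the observed `.so`-over-`.py` precedence. A quick way to inspect them (output shown is for a typical Linux build and is an assumption, not guaranteed):

```python
import importlib.machinery as machinery

# Compiled extension modules: '.so' on Linux/macOS, '.pyd' on Windows,
# usually preceded by an ABI-tagged variant.
print(machinery.EXTENSION_SUFFIXES)

print(machinery.SOURCE_SUFFIXES)    # ['.py']
print(machinery.BYTECODE_SUFFIXES)  # ['.pyc']
```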
The observed behaviour is certainly consistent with this (`__init__.py` is needed to treat a directory as a package, but the `.so` file is loaded in preference to the `.py` file), but it's hardly unambiguous.
From a Cython point of view, this behaviour seems to have been used to compile the standard library (in which case `__init__.py` would always have been present), and it appears in the test cases at https://github.com/cython/cython/blob/master/tests/build/package_compilation.srctree (and a few other examples too). In these, the "srctree" file looks to be expanded into a variety of folders containing `__init__.py` (and other files) and then compiled. It's possible that having only `__init__.so` was simply never tested.