 

project structure for wrapping many c++ classes in cython to a single shared object

I have found partial answers across the docs, the mailing lists, and this question here, but I wanted to get a more direct answer addressing my specifics...

I'm learning cython by trying to wrap small parts, little by little, of a library that I am already using that is currently wrapped in boost::python. I have contributed a tiny bit to this boost wrapper, and am using it as a c++ reference, while at the same time I am using ZeroMQ Python bindings as a cython reference.

My question is about project structure. The current boost version of this lib compiles to a single .so, and that is my goal. I quickly found out that you cannot directly compile multiple .pyx modules to a single .so. I then started going down the route of defining the cppclass declarations in pxd files and their corresponding Python-exported implementation classes in pxi files, and including them all into a single pyx for compilation. While it worked at first, once I wrote a little more I hit conflicting multiple-definition problems because the pxi files were included in different places.

I would love to hear a proper organizational approach that addresses the following questions and goals:

  • Naming the public classes the same as the cppclass (I am doing this now by keeping the cppclass in a differently named pxd and using the imported namespace to handle the identical names, as in Using cimport to resolve naming conflicts)
  • Single .so as the compiled output (acceptable approach?)
  • Should the main pyx exist solely to hold the includes of the other pyx files, or should it contain anything else?
  • Where to centrally define constants that will be exported in python?
  • Is there a preferred folder structure? Right now I have everything in a big src directory beneath my setup.py. It gets confusing seeing so many pxi, pxd, pyx files.
  • Are pxi files completely unnecessary now? If not, do I need to use a cython-style ifndef guard to handle the multiple inclusions between different modules?
  • I know the ZeroMQ python bindings build multiple modules and use the package approach by including them through __init__.py. Is that really the proper approach with cython?
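
To illustrate the naming approach from the first bullet, here is a minimal sketch of the pattern (file, header, and class names are placeholders, not the real library's):

```cython
# c_widget.pxd -- declaration-only extern for the C++ class
cdef extern from "Widget.h" namespace "lib":
    cdef cppclass Widget:
        void doWork()

# widget.pyx -- the Python-facing class reuses the name "Widget";
# the cimport alias avoids the collision with the C++ declaration
from c_widget cimport Widget as c_Widget

cdef class Widget:
    cdef c_Widget *thisptr
```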

For reference, the project I am practicing to re-wrap is PyOpenNI (openni). The pattern this boost project takes is to collect the common objects in one place, define a 1-to-1 header definition with the source, and then have a huge wrapper that collects all of the definitions into a single location, along with custom exception handling and utilities.

asked Jul 28 '12 by jdi
1 Answer

While waiting for a definitive answer, I kept playing around with organizing my code. Including the pyx files into a single pyx for compilation has been working so far.

My setup.py is simple:

    ext_modules = [
        Extension(
            "openni",
            ["src/openni.pyx"],
            language="c++",
            include_dirs=['src/', '/usr/include/ni'],
            libraries=['OpenNI'],
        )
    ]

The main openni.pyx looks like:

    include "constants.pyx"
    include "exceptions.pyx"
    include "context.pyx"
    ...

I have a common libopenni.pxd to provide declaration-only externs to the rest of the modules.
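
As a sketch, such a shared pxd just holds extern declarations and no implementation (the exact contents of mine differ; the header and type names below are illustrative):

```cython
# libopenni.pxd -- declaration-only externs cimported by the other modules
# (names here are illustrative, not the exact OpenNI ones)
cdef extern from "XnCppWrapper.h" namespace "xn":
    cdef cppclass Version:
        pass
```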

I put my cppclass declarations in differently named pxd files than the pyx class definitions to avoid name collisions:

xncontext.pxd:

    cdef extern from "XnCppWrapper.h" namespace "xn":
        cdef cppclass Context:
            ...

context.pyx:

    from libopenni cimport *
    from xncontext cimport Context as c_Context

    cdef class Context:
        cdef c_Context *handle
        ...
answered Oct 26 '22 by jdi