Python extension modules are just dynamic libraries, so I assume it's possible to dynamically link one Python extension to another. The problem is that on Windows, Python extensions are given the .pyd extension instead of .dll, so I can't get distutils to link to them when I run the setup script. (I don't think this is a problem on UNIX, because Python extensions use the .so file extension there.)
Assume I have an extension bar.pyd which needs to link to foo.pyd. Basically, what I did in the setup script was:
from distutils.core import setup, Extension
foo = Extension("foo", sources=["foo.c"])
bar = Extension("bar", libraries=["foo"], sources=["bar.c"])
setup(ext_modules=[foo, bar])
So far this isn't working. Is this even possible? I assume it is, but I haven't been able to find anything online. I'm using MinGW on Windows, but I would like this to work with MSVC++ and on other systems as well.
Edit: Previously, I solved this problem by passing the object file (foo.o) created when foo was compiled to the extra_objects option of the extension (this would only work if I defined prototypes of all foo symbols in bar):
bar = Extension("bar", sources=["bar.c"], extra_objects=["build/.../foo.o"])
This didn't seem to be the right solution, but it worked. I don't understand dynamic linking that well, so this may be the right way to do it. It feels very wrong, though.
Then, I tried passing some explicit arguments to the linker to make it emit an import library (note these are link-time flags, so they belong in extra_link_args rather than extra_compile_args):
foo = Extension("foo", sources=["foo.c"], extra_link_args=["-Wl,--out-implib,foo.lib,--export-all-symbols"])
And then I linked bar against the new import library:
bar = Extension("bar", libraries=["foo"], sources=["bar.c"])
This compiled without complaint, but there were some issues with some of the symbols (specifically, I had a few global PyTypeObjects in foo that seemed to be redefined in bar; I need the PyTypeObjects in both modules to refer to the same definition).
Edit 2: So, I singled out the problem. Function symbols were exported correctly after I built and linked against the import libraries, but the PyTypeObjects were getting redeclared. Assume there is a PyTypeObject Foo_Type in foo. I declared it in foo.h, which was included in both foo.c and bar.c:
PyTypeObject Foo_Type;
I took that out, and put this near the top of foo.c:
PyTypeObject __declspec(dllexport) Foo_Type;
and this near the top of bar.c:
PyTypeObject __declspec(dllimport) Foo_Type;
That fixed the problem. I could then use Foo_Type in both foo and bar, and it referred to the same definition. The problem is, this isn't going to work on non-Windows systems. I assume that if I just take the __declspecs out, it'll work fine on other systems.
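The usual way to keep this portable is to wrap the storage-class attribute in a macro that expands to dllexport when building foo, to dllimport when consuming it, and to nothing on non-Windows platforms. A minimal sketch of that pattern follows; the FOO_API and FOO_EXPORTS names and the stand-in struct are assumptions for illustration (in the real header the object would be a PyTypeObject, which requires Python.h):

```c
/* Cross-platform export macro. Assumption: the build defines
 * FOO_EXPORTS only while compiling foo itself. */
#if defined(_WIN32)
#  if defined(FOO_EXPORTS)
#    define FOO_API __declspec(dllexport)
#  else
#    define FOO_API __declspec(dllimport)
#  endif
#else
#  define FOO_API  /* ELF shared objects export symbols by default */
#endif

/* In foo.h: declare (not define) the shared object. A stand-in
 * struct is used here so the sketch compiles without Python.h. */
typedef struct { const char *tp_name; } StandInTypeObject;
extern FOO_API StandInTypeObject Foo_Type;

/* In foo.c, and only there, provide the single definition: */
StandInTypeObject Foo_Type = { "foo.Foo" };
```

The build for foo would then define FOO_EXPORTS (for example, via define_macros=[("FOO_EXPORTS", None)] on the foo Extension), while bar compiles the same header without it and so picks up the dllimport declaration.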
If you're using the normal Python import mechanisms, then there's no need to link against the other extension. If you're calling functions within the other extension directly, presumably because you've gotten hold of the header file, then you'll need to generate an import lib from the DLL before you can link against it.