I would like to modify a setup.py file such that the command "python setup.py build" compiles a C-based extension module that is statically (rather than dynamically) linked to a library.
The extension is currently dynamically linked to a number of libraries. I would like to leave everything unchanged except for statically linking to just one library. I have successfully done this by manually modifying the gcc call that distutils runs, although it required explicitly listing the dependent libraries.
Perhaps this is too much information, but for clarity, this is the final linking command that was executed during "python setup.py build":
gcc -pthread -shared -L/system/lib64 -L/system/lib/ -I/system/include build/temp.linux-x86_64-2.7/src/*.o -L/system/lib -L/usr/local/lib -L/usr/lib -ligraph -o build/lib.linux-x86_64-2.7/igraph/core.so
And this is my manual modification:
gcc -pthread -shared -L/system/lib64 -L/system/lib/ -I/system/include build/temp.linux-x86_64-2.7/src/*.o -L/system/lib -L/usr/local/lib -L/usr/lib /system/lib/libigraph.a -lxml2 -lz -lgmp -lstdc++ -lm -ldl -o build/lib.linux-x86_64-2.7/igraph/core.so
Section 2.3.4 of Distributing Python Modules discusses the specification of libraries, but only "library_dirs" is appropriate and those libraries are dynamically linked.
I'm using a Linux environment for development but the package will also be compiled and installed on Windows, so a portable solution is what I'm after.
Can someone tell me where to look for instructions, or how to modify the setup.py script? (Thanks in advance!)
I'm new to StackOverflow, so my apologies if I haven't correctly tagged this question, or if I have made some other error in this posting.
It is possible to produce a fully statically linked executable embedding Python on Linux. The produced binary will have no external library dependencies nor will it even support loading dynamic libraries. In theory, the executable can be copied between Linux machines and it will just work.
In computer science, a static library or statically-linked library is a set of routines, external functions and variables which are resolved in a caller at compile-time and copied into a target application by a compiler, linker, or binder, producing an object file and a stand-alone executable.
Dynamic linking means that the code for some external routines is located and loaded when the program is first run.
You can't do this directly. One workaround is to recompile the library as a shared library and then use ctypes to call its functions from the dynamically loaded shared library.
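The ctypes route mentioned above can be sketched as follows; libm is used here purely as a stand-in for the library you would load, and the fallback filename is a Linux assumption:

```python
import ctypes
import ctypes.util

# Load a shared library at runtime and call a function from it.
# find_library locates the library portably; fall back to a Linux soname.
path = ctypes.util.find_library('m') or 'libm.so.6'
libm = ctypes.CDLL(path)

# Declare the C signature so ctypes converts arguments correctly.
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(9.0))  # -> 3.0
```

Note that this bypasses the extension-module mechanism entirely: the library is resolved at run time, not at link time.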
6-7 years later, static linking with Python extensions is still poorly documented. As the OP points out in a comment, the usage is OS dependent.

On Linux/Unix, static libraries are linked just like object files: each one should go, with its full path and extension, into extra_objects.

On Windows, the compiler sees whether a linked library is static or dynamic, so the static library name goes into the libraries list and its directory into library_dirs.

For the example below, I will use the same library scenario as the OP: linking igraph statically, and z, xml2 and gmp dynamically. This solution is a bit hackish, but at least it does the right thing on each platform.
import sys
from setuptools import Extension  # or: from distutils.core import Extension

static_libraries = ['igraph']
static_lib_dir = '/system/lib'
libraries = ['z', 'xml2', 'gmp']
library_dirs = ['/system/lib', '/system/lib64']

if sys.platform == 'win32':
    libraries.extend(static_libraries)
    library_dirs.append(static_lib_dir)
    extra_objects = []
else:  # POSIX: pass the .a archives as extra object files
    extra_objects = ['{}/lib{}.a'.format(static_lib_dir, l)
                     for l in static_libraries]

ext = Extension('igraph.core',
                sources=source_file_list,   # defined elsewhere in setup.py
                libraries=libraries,
                library_dirs=library_dirs,
                include_dirs=include_dirs,  # defined elsewhere in setup.py
                extra_objects=extra_objects)
I guess this works also for MacOS (using the else path), but I have not tested it.
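To see what the POSIX branch above actually computes, the list comprehension expands each static library name into the full path of its .a archive, which distutils passes to the linker verbatim:

```python
# Mirror of the POSIX branch: each static library name becomes the full
# path of its .a archive, handed to the linker via extra_objects.
static_libraries = ['igraph']
static_lib_dir = '/system/lib'

extra_objects = ['{}/lib{}.a'.format(static_lib_dir, l)
                 for l in static_libraries]
print(extra_objects)  # -> ['/system/lib/libigraph.a']
```

This matches the OP's manual gcc command, where /system/lib/libigraph.a was given as a plain argument rather than via -ligraph.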
If all else fails, there are always the little-documented extra_compile_args and extra_link_args options to the Extension builder. You might need to hack in some OS-dependent code to get the right argument format for a particular platform, though.
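As one OS-dependent sketch of that approach: with GNU ld, a single library can be bracketed with -Wl,-Bstatic / -Wl,-Bdynamic so that only it is resolved from a static archive. These flags are an assumption for GNU toolchains and will not work with MSVC on Windows:

```python
from setuptools import Extension

# GNU-ld-only sketch: -Wl,-Bstatic switches the linker to static archives
# until -Wl,-Bdynamic restores the default, so only igraph is linked
# statically. The sources list is a placeholder.
ext = Extension(
    'igraph.core',
    sources=['src/core.c'],
    library_dirs=['/system/lib', '/system/lib64'],
    libraries=['z', 'xml2', 'gmp'],
    extra_link_args=['-Wl,-Bstatic', '-ligraph', '-Wl,-Bdynamic'],
)
```

On Windows you would need a different branch, since MSVC picks static vs. dynamic by the library file it finds rather than by linker flags.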