Using MinGW, I've managed to build, on Linux, some DLLs that are necessary for my Python extension. Something along these lines:
import subprocess

from setuptools.command.build_py import build_py

class BuildGo(build_py):
    def run(self):
        if ...:  # need to build windows binaries
            self.build_win()
        build_py.run(self)

    def build_win(self):
        if ...:  # compilers and toolchain available
            try:
                ...  # builds extra libraries necessary for this extension
            except subprocess.CalledProcessError as e:
                print(e.stderr)
                raise
        try:
            result = subprocess.check_output([
                'x86_64-w64-mingw32-gcc-win32',
                '-shared',
                '-pthread',
                '-o',
                EXTRA_DLL,
                FAKE_WIN_BINDINGS,
                ARCHIVE_GENERATED_IN_PREVIOUS_STEP,
                '-lwinmm',
                '-lntdll',
                '-lws2_32',
            ], stderr=subprocess.PIPE)  # capture stderr so e.stderr is set
            print(result)
        except subprocess.CalledProcessError as e:
            print(e.stderr)
            raise
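One way the elided "compilers and toolchain available" check could be sketched (this is an assumption, not part of the original code) is to probe PATH for the cross-compiler:

```python
import shutil

def mingw_cross_available() -> bool:
    # True if the mingw-w64 cross-compiler is somewhere on PATH
    return shutil.which("x86_64-w64-mingw32-gcc") is not None
```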
I was now hoping I could avoid extending build_ext in the same painful way to get it to cross-compile Cython code for Windows... I looked into the abyss of the "elegant interplay of setuptools, distutils and cython", and before the abyss has a chance to look back into me... Isn't there a way to just specify some flag, like the name of a compiler and a Python binary for the desired platform, and have it just do it?
I've read this article: http://whatschrisdoing.com/blog/2009/10/16/cross-compiling-python-extensions/ - it's almost 10 years old, and it just made me want to cry... Has anything changed since it was written? Or are these steps more or less what I'll have to do to compile for a platform other than the one I'm running on?
Or, is there an example project on the web which does it?
My ultimate goal is to produce an egg package which will contain both PE and ELF binaries and will install them in the correct location on either platform when installed by pip or pipenv. It should compile on Linux (compiling it on MS Windows isn't necessary).
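Whatever the packaging mechanism ends up being, the package will also need to pick the right binary at import time. A minimal sketch of that runtime selection (file names here are hypothetical placeholders, not from the question):

```python
import ctypes
import os
import sys

def load_extra_library(pkg_dir):
    """Load whichever prebuilt binary matches the running platform.

    'extra.dll' / 'libextra.so' are placeholder names for the PE/ELF
    pair shipped inside the package.
    """
    name = "extra.dll" if sys.platform == "win32" else "libextra.so"
    return ctypes.CDLL(os.path.join(pkg_dir, name))
```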
mingw32 exists as a package for Linux. You can cross-compile and cross-link Windows applications with it.
Explanation: GCC, a free software collection of compilers, can also be used as a cross compiler. It supports many languages and platforms.
I'm posting this as community wiki because it's a pretty unsatisfactory answer: it only tells you why it's very hard rather than offering real solutions.
The official Python distributions on Windows are compiled with Microsoft Visual C (MSVC), and when compiling a Python extension it's generally necessary to use the same compiler version that Python itself was compiled with. This shows you that an exact compiler match is pretty important.
It is possible to get versions of Python compiled with Mingw, and these would then be compatible with modules compiled with Mingw. This could probably be made to work as a cross-compiler on Linux, but the modules would only be useful to the very small subset of people who have this custom build of Python (so it doesn't help create a useful distributable .egg file).
A reasonable effort has also gone into making a version of Mingw that can build compatible Python extensions on Windows: https://mingwpy.github.io/ (and I think also https://anaconda.org/msys2/m2w64-toolchain). The main driver for this seems to be the lack of a freely available Fortran compiler for Windows that is compatible with MSVC, hence the ability to build Fortran modules is very useful. The mingwpy toolchain worked pretty well in my experience, until Python 3.4, when the switch to a more recent version of MSVC brought a whole exciting new set of compatibility issues.
My feeling would be that any viable solution would probably be based around these mostly-working Mingw compilers for Windows.
According to https://docs.python.org/3/distutils/builtdist.html, distutils only supports cross-compiling between win32 and win_amd64 as of this writing (3.7).
Moreover, building extensions with compilers other than the MSVC that Python is built with is not officially supported.
It is theoretically possible by getting a Linux toolchain targeting win32/64 (including the necessary headers and link libraries) and a set of the necessary Python for Windows binaries to link against, then forging compiler and linker paths and/or options in setup.py -- though it will still be an unsupported setup.
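A minimal sketch of what "forging compiler and linker paths in setup.py" could look like, assuming the mingw-w64 cross-compiler is installed; as the answer says, this is unsupported, and the include/library paths for the Windows headers and DLLs would still have to be supplied by hand:

```python
from setuptools.command.build_ext import build_ext

CROSS_CC = "x86_64-w64-mingw32-gcc"  # assumed mingw-w64 cross-compiler

class CrossBuildExt(build_ext):
    def build_extensions(self):
        # Re-point the (Unix) compiler wrapper at the cross toolchain;
        # set_executables() is part of the distutils CCompiler API.
        self.compiler.set_executables(
            compiler_so=f"{CROSS_CC} -O2 -DMS_WIN64",
            linker_so=f"{CROSS_CC} -shared",
        )
        super().build_extensions()
```

A setup() call would then pass cmdclass={"build_ext": CrossBuildExt} and point include_dirs/library_dirs at the Windows headers and python DLL.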
So you'll be better off using a Windows VM or an online build service like AppVeyor.
I had the same issue once, and I just used a virtual machine to compile my most painfully Microsoft-dependent programs.
https://developer.microsoft.com/en-us/windows/downloads/virtual-machines
If you don't have access to a Windows machine, or your programs use very specific machinery like an optimized Fortran compiler, some POSIX-dependent stuff, or the newest features of the VS redistributable versions, you should give a virtual-machine-based compilation setup a try.
Here is a proof of concept for cross-compiling (Cython) extensions for Windows on Linux, which follows more or less the steps for building with mingw-w64 on Windows.
But first a word of warning: while possible, this workflow is not really supported (it starts with the fact that the only supported Windows compiler is MSVC), so it may break with future versions. I use Python 3.7 for 64bit; things might be (slightly) different for other versions.
There might be legitimate scenarios for cross-compiling for Windows, but the Python world seems to get along quite well without it, so cross-compilation is probably not the right direction in most cases.
Preliminaries:
- mingw-w64 needs to be installed (e.g. via sudo apt-get install mingw-w64); the compiler for 64bit is x86_64-w64-mingw32-gcc.
- distutils does not support mingw-w64, so we will perform all steps manually.
1. C code generation
Let's take the following simple Cython extension, foo.pyx:
print("It is me!")
which can be transformed to C-code via:
>>> cython -3 foo.pyx
which creates the file foo.c.
2. Compilation
The compilation step is:
>>> x86_64-w64-mingw32-gcc -c foo.c -o foo.o -I <path_to_windows_includes> -DMS_WIN64 -O2 <other compile flags>
One can probably be minimalistic and use only the -O2 compile flag in most cases. It is, however, important to define the MS_WIN64 macro (e.g. via -DMS_WIN64). It must be set in order to build for x64 on Windows, but it works out of the box only for MSVC (defining _WIN64 instead could have slightly different outcomes):
#ifdef _WIN64
#define MS_WIN64
#endif
3. Linking
The linking command is:
>>> x86_64-w64-mingw32-gcc -shared foo.o -o foo.pyd -L <path_to_windows_dll> -lpython37
It is important that the Python library (python37) be the DLL itself and not the lib file (see this SO-post).
One should probably add the proper suffix to the resulting pyd-file; I use the old convention for simplicity here.
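The "proper suffix" can be queried from the interpreter's sysconfig. Run this on the Windows Python you are targeting; on the build machine it shows the local (Linux) suffix instead:

```python
import sysconfig

# On 64-bit Windows CPython 3.7 this is typically '.cp37-win_amd64.pyd';
# on Linux it is a '.cpython-37m-x86_64-linux-gnu.so'-style suffix.
print(sysconfig.get_config_var("EXT_SUFFIX"))
```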
4. Running:
Copy the pyd-file to Windows, and then:
import foo
# prints "It is me!"
Done!
Embedded Python:
In case Python should be embedded, i.e. the C code is generated via cython -3 --embed foo.pyx, the compilation step stays as above.
The linker step becomes:
>>> x86_64-w64-mingw32-gcc foo.o -o foo.exe -L <path_to_windows_dll> -lpython37 -municode
There are two noticeable differences:
- -shared should no longer be used, as the result is no longer a dynamic library (which is what a *.pyd-file is, after all) but an executable.
- -municode is needed because, for Windows, Cython defines int wmain(int argc, wchar_t **argv) instead of int main(int argc, char **argv). Without this option, an error message like

    In function 'main': /build/mingw-w64-_1w3Xm/mingw-w64-4.0.4/mingw-w64-crt/crt/crt0_c.c:18: undefined reference to 'WinMain' collect2: error: ld returned 1 exit status

would appear (see this SO-post for more information).
Note: for the resulting executable to run, a whole Python distribution (and not only the dll) is needed (see also this SO-post).