So the story goes: I need an MPI wrapper for Python.
I know there's mpi4py. For my current work I (mostly) use Python on Windows, and I'd like to use the Microsoft HPC Cluster Pack, since I have access to a few pretty "strong" machines running Windows Server 2008. Just to mention: besides the Windows experience, I do have a bit of *nix experience with MPI and related tools, but that's pretty much a moot point for this problem.
My interest in mpi4py was renewed when I ran into Python Tools for Visual Studio. That's some seriously great stuff; anyone who's a fan of Visual Studio and Python should try it. Good work, and a great debugger.
The PTVS doc pages state that installing mpi4py is easy... and for ActiveState Python that seems to be true. However, if you don't use ActiveState's Python and instead use the "normal" Python distribution from python.org, you seem to be a bit out of luck.
My development machine is a laptop with Win7 64-bit and Python 2.6, in both 64-bit and 32-bit flavors. I have installed MS MPI and the SDK from the MS HPC Pack 2008 R2, plus Visual Studio 2008 and 2010, everything dutifully patched.
There's no binary installer, and knowing how exceedingly picky Unix MPI implementations can be about the MPI version they're linked against, I wanted to build my own mpi4py. mpi4py basically relies on a compiled extension (a .pyd, which is really a .dll) that binds the Python calls to the MPI libraries.
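For context, this is roughly the kind of script I ultimately want to run on the cluster. It's just a minimal sketch of standard mpi4py usage, nothing specific to my setup:

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# broadcast a small object from rank 0 to all ranks
data = {"answer": 42} if rank == 0 else None
data = comm.bcast(data, root=0)
print("rank %d of %d received %r" % (rank, size, data))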
easy_install-ing mpi4py failed to build that library because it couldn't locate the MPI libs. OK, no problem: I downloaded the mpi4py tarball, extracted it, and altered the mpi.cfg file so that it points to the correct folders:
# Microsoft MPI example
# ---------------------
[msmpi]
define_macros = MS_MPI=1
mpi_dir = $CCP_HOME
include_dirs = %(mpi_dir)s\Inc
libraries = msmpi
library_dirs = %(mpi_dir)s\lib\i386
The MS MPI installer registers an environment variable, CCP_HOME, pointing to the exact install location of the Pack (the "CCP" name must be a leftover from the days when it was called Microsoft Compute Cluster Pack). Got to hand it to the original mpi4py developer for anticipating this.
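A quick way to confirm that the variable is actually visible to the build environment (just an illustrative check; the printed path depends on your install):

import os

# CCP_HOME is registered by the MS MPI / HPC Pack installer; the [msmpi]
# section above expands it to locate the MPI headers and libraries.
print(os.environ.get("CCP_HOME", "CCP_HOME is not set"))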
After this, the compilation passes fine, but linking fails with three unresolved externals:
MPI.obj : error LNK2019: unresolved external symbol _MPI_Type_create_f90_integer@8 referenced in ...
MPI.obj : error LNK2019: unresolved external symbol _MPI_Type_create_f90_real@12 ...
MPI.obj : error LNK2019: unresolved external symbol _MPI_Type_create_f90_complex@12 ...
It seems that the msmpi.lib shipped with MS MPI from HPC Pack 2008 R2 does not implement these, so I can't build MPI.pyd.
I could try commenting these out in the mpi4py C source, but I don't think that's the right path.
Thanks in advance!
I was in touch with @Hrvoje and the current maintainer of the source code at https://code.google.com/p/mpi4py/
Thanks guys for all your help.
I used Visual Studio 2012, Python 2.7.3 (64-bit), and mpi4py 1.3.
Here are the changes:
The MPI headers and libs are now in a different location (MS HPC Pack 2008 R2), so the [msmpi] part of my mpi.cfg now looks like this:
[msmpi]
mpi_dir = $ProgramFiles\Microsoft HPC Pack 2008 R2
include_dirs = %(mpi_dir)s\inc
libraries = msmpi
#library_dirs = %(mpi_dir)s\lib\i386
library_dirs = %(mpi_dir)s\lib\amd64
Because the Python build environment (distutils for Python 2.7) looks for Visual Studio 2008, I had to manually add the environment variable it expects. It actually points to a VS 2012 directory, but the build tools are compatible enough that it worked. This is what I added:
VS90COMNTOOLS = C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\Tools\
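If you'd rather not set that variable machine-wide, a sketch of one alternative (my suggestion, not part of the original recipe) is to export it only for the build by launching setup.py from a small driver script:

import os
import subprocess
import sys

# Point distutils' Visual Studio 2008 lookup (VS90COMNTOOLS) at the VS 2012
# tools directory; the path below is the default VS 2012 install location.
os.environ["VS90COMNTOOLS"] = (
    "C:\\Program Files (x86)\\Microsoft Visual Studio 11.0\\Common7\\Tools\\"
)

# The build runs in a child process, which inherits the modified environment.
subprocess.check_call([sys.executable, "setup.py", "build"])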
Other than that, there were no changes and the setup.py produced both .exe and .msi without any issues.
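As a quick sanity check after installing the result (just a suggestion, assuming the package was installed into the interpreter you are running):

from mpi4py import MPI

# get_vendor() reports the underlying MPI implementation and its version,
# so it should mention Microsoft MPI here.
print(MPI.get_vendor())
print(MPI.COMM_WORLD.Get_size())

Running the same snippet under mpiexec from the HPC Pack should then report a size greater than one.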
I know it's a bit strange to answer my own question, but it may help someone. A similar problem occurs on Linux, since not all MPI implementations actually provide every declared call.
It seems the mpi4py author also had quite a workload when he worked all of this out...
If you take the union of the missing/broken MPI-2 features in MPICH1/LAM/Open MPI/MPICH2 (and derived implementations like Deino, Microsoft, Sun, and SGI), you end up having to test for a lot of things...
For the reasons stated above, you can compile the mpi4py library without some functions; a "missing.h" header in the mpi4py tarball handles these situations.
So I have defined these:
PyMPI_MISSING_MPI_Type_create_f90_integer
PyMPI_MISSING_MPI_Type_create_f90_real
PyMPI_MISSING_MPI_Type_create_f90_complex
The wrapper library compiled this way will raise an error if any of these missing functions ever gets called; it's missing.h from mpi4py that takes care of this. You can add the defines either directly in the relevant source file, or by adding this at the very end of the setup.cfg file in the mpi4py tarball:
[build_ext]
define = PyMPI_MISSING_MPI_Type_create_f90_integer, PyMPI_MISSING_MPI_Type_create_f90_real, PyMPI_MISSING_MPI_Type_create_f90_complex
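To illustrate what that buys you, here is a sketch of how such a build behaves at runtime (based on my understanding; the exact exception type may differ, so the example catches broadly):

from mpi4py import MPI

try:
    # Compiled out via PyMPI_MISSING_MPI_Type_create_f90_integer, so instead of
    # failing at link time, the call fails cleanly at runtime.
    dt = MPI.Datatype.Create_f90_integer(9)
except Exception as exc:
    print("Create_f90_integer not available in this build: %r" % exc)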
So, good luck using mpi4py with MS MPI... I hope this helps someone other than me...