I have a Python module which wraps a C++ library. The library uses MPI and is compiled with mpicxx. Everything works great on some machines, but on others I get this:
ImportError: ./_pyCombBLAS.so: undefined symbol: _ZN3MPI3Win4FreeEv
So there's an undefined symbol from the MPI library. As far as I can tell mpicxx should link everything in, yet it doesn't. Any ideas why?
It turns out that the fault was that the wrong libraries were being loaded.
As you know, a cluster is likely to have several versions of MPI installed, and sometimes the same version is compiled with several compilers. These builds are all likely to have the same filenames. In my case, even though I compiled with, say, MPICH GNU, the default library path pointed to the PGI-compiled OpenMPI libraries. I didn't realize this; I assumed that compiling with MPICH GNU meant the MPICH GNU libraries would be found at runtime.
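A quick way to see which wrapper and libraries you are actually getting (the flag differs between MPI distributions):

which mpicxx
mpicxx -show     # MPICH: prints the underlying compiler and link line
mpicxx --showme  # Open MPI equivalent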
Of course I couldn't actually use PGI-compiled OpenMPI because Python was compiled with GCC and PGI doesn't emit binaries fully compatible with GCC.
The solution is to set the LD_LIBRARY_PATH environment variable to match the MPI distribution you used to compile your code.
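For example (the install path here is hypothetical; adjust it to your cluster):

export LD_LIBRARY_PATH=/opt/mpich-gnu/lib:$LD_LIBRARY_PATH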
It's a shared library problem. Try running ldd on the extension module on both the system where it works and the system where it fails.
ldd _extension.so
This should show you all the libraries your extension depends on so you can make sure they're available.
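For example, to compare just the MPI entries on the two machines (module name as in the question):

ldd _pyCombBLAS.so | grep -i mpi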
A simple way to work around it may be to statically link the dependencies into your extension.
The symbol _ZN3MPI3Win4FreeEv is defined in libmpi_cxx.so, so one has to link with -lmpi_cxx rather than just -lmpi.
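You can verify where the symbol lives before relinking (the library path is system-dependent):

nm -D /usr/lib/libmpi_cxx.so | grep _ZN3MPI3Win4FreeEv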
I have always been able to use MPI with C++, compiling programs via mpicxx. Some time ago, I also had to use MPI with Python. I installed the Python package mpi4py via Anaconda, and my Python MPI codes worked fine.
Now, however, I have gone back to C++. When trying to compile with mpicxx, I get an error:
"
mpicxx -o TEST.exe TEST.cpp
The Open MPI wrapper compiler was unable to find the specified compiler
x86_64-conda_cos6-linux-gnu-c++ in your PATH.
Note that this compiler was either specified at configure time or in
one of several possible environment variables."
Apparently, installing mpi4py with Anaconda messed something up, as the wrapper now looks for a compiler with "conda" in its name.
Can someone guide me to a fix that lets me freely use both C++ and Python with MPI?
edit: Using Ubuntu 18.04.
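For reference, here is how to check which wrapper is being picked up, plus Open MPI's documented override for the wrapped compiler (assuming the system g++ is what you want):

which mpicxx        # likely points into the Anaconda environment
export OMPI_CXX=g++ # Open MPI wrapper compilers honor this variable
mpicxx -o TEST.exe TEST.cpp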
I'm trying to dig a bit deeper into binary packages with python with the wheel format, and I thought I might try to give wheelifying one of my dependencies as an exercise.
To maximize the reusability of the resulting package I want to build it using the manylinux containers, which compiles the binaries included in the package with the lowest common denominator that I'm likely to encounter. However I'm unsure how to avoid linking against a specific libpython.so. The docs describe this in a rather abstract way:
Note that libpythonX.Y.so.1 is not on the list of libraries that a manylinux1 extension is allowed to link to. Explicitly linking to libpythonX.Y.so.1 is unnecessary in almost all cases: the way ELF linking works, extension modules that are loaded into the interpreter automatically get access to all of the interpreter's symbols, regardless of whether or not the extension itself is explicitly linked against libpython. Furthermore, explicit linking to libpython creates problems in the common configuration where Python is not built with --enable-shared. In particular, on Debian and Ubuntu systems, apt install pythonX.Y does not even install libpythonX.Y.so.1, meaning that any wheel that did depend on libpythonX.Y.so.1 could fail to import.
So I'm wondering: SWIG generates the Python wrappers for the C functions and obviously makes use of the libpython functions, so the linker will try to link in libpython, which manylinux doesn't provide. If I understand correctly, I am supposed to somehow leave the references dangling/unlinked, and the interpreter will provide those symbols at runtime when it loads the shared library; but how am I supposed to compile this in the first place?
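This is roughly the command I expect to run inside the manylinux container (file names are hypothetical); is leaving out -lpythonX.Y really all it takes, given that the linker leaves those symbols undefined in a shared object by default?

swig -python example.i
gcc -shared -fPIC example_wrap.c example.c \
    -I/opt/python/cp38-cp38/include/python3.8 \
    -o _example.so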
I'm wondering if anyone has experience with cross-compiling to Windows with pyo3 and maturin.
The pyo3 docs say:
Cross compiling PyO3 modules is relatively straightforward and requires a few pieces of software:
- A toolchain for your target.
- The appropriate options in your Cargo config for the platform you're targeting and the toolchain you are using.
- A Python interpreter that's already been compiled for your target.
- The headers that match the above interpreter.
I've found the interpreter from here, but I don't understand how to get a Windows Python interpreter and libraries, or even what that means exactly.
I'm using maturin to build the Python wheels, and it works well on OS X, but I don't know how to start cross-compiling for Windows.
Can anyone help me out?
The goal is a wheel per Python version / architecture / OS.
An easy way to achieve this is through external runners (e.g. GitHub runners) and an action that compiles the package using maturin for each of the platforms.
Here is how hyperjson does it.
pyo3 now has an experimental Windows feature that bypasses this process entirely using python3-dll-a. Read more here.
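A rough sketch of what the cross-build can look like from the maturin side; this assumes a mingw-w64 toolchain is installed, and should be treated as a starting point rather than a verified recipe:

rustup target add x86_64-pc-windows-gnu
maturin build --release --target x86_64-pc-windows-gnu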
I'm compiling a SWIG-wrapped C++ library into a Python module that should ideally be distributable, so individuals can use the library transparently like any other module. I'm building the library using CMake and SWIG on OSX 10.8.2 (System framework: Apple Python 2.7.2; installed framework: python.org Python 2.7.5).
The trouble I'm running into is that after linking with the framework, the compiled library is very selective about the version of Python that is being run, even though otool -L shows that it is compiled with "compatibility version 2.7.0". It appears that the different distributions have slightly different linker symbols, and stuff starts to break.
The most common problem is that it crashes the Python kernel with Fatal Python error: PyThreadState_Get: no current thread (according to this question, indicative of a linking incompatibility). I can get my library to work in the Python it was compiled against.
Unfortunately this library is for use in an academic laboratory, with computers of all different ages and operating systems, many of them kept permanently out of date in order to run proprietary software that hasn't been updated in years, and I certainly don't have time to play I.T. and fix all of them. Currently I've just been compiling against the version of Python that comes with the latest Enthought distribution, since most computers can get that one way or another. A lot of the researchers I work with use a Python IDE specific to their field that comes with a built-in interpreter, but it is not modifiable and not a Framework build (so I can't build against it); for the time being they can run their experiment scripts in Enthought as a stop-gap, but it's not ideal. Even when I link against the python.org distribution that is the same version as the IDE's built-in Python (2.7.2, I think; it even has the same damn release number), it still breaks the same way.
In any case, the question is: is there any way to link a SWIG-wrapped Python library so that it will run (at least on OSX) regardless of which interpreter is importing it (given certain minimum conditions, like being guaranteed >= 2.7.0)?
EDIT
Compiling against the Canopy-installed Python with the following linker flags in CMake:
set (CMAKE_SHARED_LINKER_FLAGS "-L ~/Library/Enthought/Canopy_32bit/User/lib -ldl -framework CoreFoundation -lpython2.7 -u _PyMac_Error ~/Library/Enthought/Canopy_32bit/User/lib")
This results in an @rpath symbol path when examining the linked library with otool. It seems to work fine with Enthought/Canopy on other OSX systems. The -lpython seems to be optional; it adds an additional Python reference in the library pointing at the python.org Python (not the system Python).
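To see exactly which Python a built module ended up bound to (module name hypothetical):

otool -L _example.so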
Compiling against the system Python with the following linker flags:
set (CMAKE_SHARED_LINKER_FLAGS "-L /Library/Frameworks/Python.framework/Versions/Current/lib/python2.7/config -ldl -framework CoreFoundation -u _PyMac_Error /Library/Frameworks/Python.framework/Versions/Current/Python")
This works in Enthought and the system Python.
Neither of these works with the Python bundled with PsychoPy, which is the target environment; compiling against the bundled Python works with PsychoPy but with no other Python.
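One direction I haven't fully explored: on OS X the linker can leave the Python symbols unresolved until import time via -undefined dynamic_lookup, so the module is not bound to any particular interpreter. A minimal sketch with hypothetical object files:

cc -bundle -undefined dynamic_lookup example_wrap.o example.o -o _example.so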
I've been getting the same error/having the same problem. I'd be interested if you've found a solution.
I have found that if I compile against the native python include directory and run the native OS X python binary /usr/bin/python that it works just fine, always. Even when I compile against some other python library (like the one I find at /Applications/Canopy.app/appdata/canopy-1.0.3.1262.macosx-x86_64/Canopy.app/Contents/include ) I can get the native OS X interpreter to work just fine.
I can't seem to get the Enthought version to work, ever. What directory are you compiling against for use with Enthought/Canopy?
There also seems to be some question of configuring SWIG at installation to know about a particular python library, but this might not be related: http://swig.10945.n7.nabble.com/SWIG-Installation-mac-osx-10-8-3-Message-w-o-attachments-td13183.html
RDFLib needs C extensions to be compiled to install on ActiveState Python 2.5; as far as I can tell, there's no binary installer anywhere obvious on the web. On attempting to install with python setup.py install, it produces the following message:
error: Python was built with Visual Studio 2003;
extensions must be built with a compiler than can generate compatible binaries.
Visual Studio 2003 was not found on this system. If you have Cygwin installed,
you can try compiling with MingW32, by passing "-c mingw32" to setup.py.
There are various resources on the web about configuring a compiler for distutils that discuss using MinGW, although I haven't got this to work yet. As an alternative I have VS2005.
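For reference, the MinGW route from the error message boils down to the standard distutils compiler flag (this is what I've been attempting, without success so far):

python setup.py build --compiler=mingw32
python setup.py install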
Can anyone categorically tell me whether you can use the C compiler in VS2005 to build Python extension modules for a VS2003 compiled Python (in this case ActiveState Python 2.5). If this is possible, what configuration is needed?
The main problem is the C run-time library. Python 2.4/2.5 is linked against msvcr71.dll, and therefore all C extensions should be linked against this DLL.
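If you want to double-check which C run-time an extension (or Python itself) is bound to, MinGW's objdump can list the DLL imports (the file name here is just an example):

objdump -p _extension.pyd | grep "DLL Name"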
Another option is to use gcc (MinGW) instead of VS2005; you can use it to compile Python extensions only. There is a decent installer that allows you to configure gcc as the default compiler for your Python version:
http://www.develer.com/oss/GccWinBinaries
I can't tell you categorically, but I don't believe you can. I've only run into this problem in the inverse situation (Python built with VS2005, trying to build with VS2003). Searching the web did not turn up any way to hack around it. My eventual solution was to get VC Express, since VC2005 is when Microsoft started releasing the free editions. But that's obviously not an option for you.
I don't use ActiveState Python, but is there a newer version you could use? The source ships with project files for VS2008, and I'm pretty sure the python.org binary builds stopped using VS2003 a while ago.
As of today, March 2012, I can categorically say it is possible with Python 2.4.4 (the only one I've tested) and Visual Studio 2005 and 2008. I'm just installing VS10 to check that as well. I don't know why it works, and I have problems using distutils, so I have to compile manually.