Compile Python extension for a different C++ runtime - python

I am developing a plugin for Plex Media Server. Plex uses Python 2.7 for its plugin system. I need to use the python-Levenshtein package, which needs to be compiled for different systems.
I found python-Levenshtein wheels (manylinux, macOS, ARM, Windows), but the Windows wheel is compiled with VS 2008 (msvcr90.dll), while Plex's embedded Python is compiled with VS 2013 (msvcr130.dll).
Being on Windows 10, how can I compile a Python package against a different VS runtime?

Note: compiling extensions for versions of Python other than the official releases is not supported by the Python team, and has to be supported by the product - in this case, Plex. So assuming Plex hasn't provided a way to do this, everything below is hacks and workarounds.
Also, if the library developer is not doing the builds then you're trying to cover for them, which means you are basically at their "level" and will need to know their code as well as they do. Welcome to open source software :)
Finally, this only applies to legacy Python 2. If anyone reading this is on Python 3.5 or later, just install the latest Visual Studio and use the latest compiler - the entire 14.x line is compatible and will work fine with recent Python versions.
Assuming you've been careful about how the extension interacts with the C Runtime, it's often safe to compile your extension against the "wrong" C runtime version. Things to watch out for here are:
allocating memory that will be freed by Python
freeing memory that was allocated by Python
passing/getting file descriptors or FILE* pointers to/from Python
setting any global settings like encoding or locale
writing to standard streams
If all of these are isolated, then you'll just end up with two C runtimes in memory and they won't try to interact.
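If you want to confirm what you actually ended up with, one quick sanity check (just a sketch - it assumes dumpbin is on PATH, i.e. you are in a Visual Studio developer prompt, and the .pyd path is a placeholder) is to list the DLLs the built extension depends on:
# Sketch: list the CRT DLLs a built extension depends on, using dumpbin.
# Assumes a VS developer prompt (dumpbin.exe on PATH); the .pyd path is a placeholder.
import subprocess
output = subprocess.check_output(
    ["dumpbin", "/DEPENDENTS", r"build\lib.win32-2.7\Levenshtein.pyd"])
for line in output.splitlines():
    if "msvcr" in line.lower():
        print(line.strip())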
Now, if for some reason that isolation can't be achieved, you can try compiling the extension using a later toolset. The easiest way to do this is:
get the source code (I'm assuming setup.py based build via wheel)
open VS 2013 Developer Command prompt
run set DISTUTILS_USE_SDK=1 (this will bypass MSVC detection)
run set MSSdk=1 (also needed to bypass detection on legacy Python 2)
run python setup.py bdist_wheel (you may need to install wheel into Python first)
Now you can take the wheel file and install it however you normally would, perhaps by extracting/copying the files or passing the full filename to pip.
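If you prefer to script those steps, a minimal sketch (assuming it is launched from the VS 2013 Developer Command Prompt, in the source directory, with the Python 2.7 you are targeting on PATH and wheel installed) could look like this:
# Sketch: run the bdist_wheel build with the detection-bypass variables set.
# Must be launched from the VS 2013 Developer Command Prompt, in the source directory.
import os
import subprocess
env = os.environ.copy()
env["DISTUTILS_USE_SDK"] = "1"  # bypass MSVC version detection
env["MSSdk"] = "1"              # also needed on legacy Python 2
subprocess.check_call(["python", "setup.py", "bdist_wheel"], env=env)
# The wheel lands in .\dist; install it by passing the full filename to pip.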
In general, there's a very high chance that simply building the extension will not succeed. In that case, you'll have to modify the source code to make it build - a lot of names were deprecated and then removed across the three MSVC versions between VC 9.0 and VC 12.0, and because you're skipping those intermediate versions you'll hit hard errors without ever having seen the deprecation warnings.
If the library developer has already made the library work with Python 3, many of the fixes will probably be in the code already but may not be enabled for your compiler (Python 3 builds used either VC 10.0, which didn't need the fixes, or VC 14.x, which is typically guarded by _MSC_VER >= 1900 and so doesn't cover VC 12.0). They may appreciate it if you contribute any fixes you make so that the next person is helped, but many library developers are dropping support for versions of Python prior to 3.5 now, so they may not be interested in maintaining legacy support.

Related

OS X Python: can I explicitly set MACOSX_DEPLOYMENT_TARGET for extensions?

I use the python.org framework build of Python under recent versions of OS X (i.e., 10.11 El Capitan). I need to build some extensions that rely on recent versions of compilers (e.g., C++-11 features). However the python.org python is built to work on older systems as well, for backward compatibility.
Hence, it has the environment variable MACOSX_DEPLOYMENT_TARGET=10.6. This means that extensions are built by default with a toolchain that, I think, mimics gcc-4.2, in particular in terms of what stdlib it searches.
In the past, I have fixed this by installing more recent compilers with homebrew and explicitly setting CC, CXX, etc before installation.
However, I have tried just setting MACOSX_DEPLOYMENT_TARGET=10.11, and that seems to work. Is this safe? Are there any downsides? (I don't need to distribute these builds, just use them locally.)
The python.org OS X 64-bit/32-bit framework builds for recent Python releases were built with MACOSX_DEPLOYMENT_TARGET set to 10.6, on Mac OS X 10.6, to ensure compatibility with a wide range of OS X releases - currently 10.6 Snow Leopard through 10.11 El Capitan. When building C or C++ extension modules using Python's built-in Distutils, or high-level tools that use Distutils (such as pip), the deployment target for the compile and link environment defaults to that of the interpreter build - in this case 10.6 - so that the resulting extension modules work with the same range of OS X releases as the interpreter build itself.
There is seldom a need to change this, since Apple is generally very good about maintaining backwards compatibility for the system libraries and frameworks used by Python itself. However, as you discovered, you can override the deployment target to a newer release by setting the MACOSX_DEPLOYMENT_TARGET environment variable prior to building the extension module. (Distutils checks for, and does not allow, setting the deployment target to a release older than that used for the interpreter build.) Distutils also honors overrides of various other build values via the corresponding "standard" environment variables, such as CC, CXX, CFLAGS, LDSHARED, etc.
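As a concrete illustration, here is a minimal sketch of overriding the deployment target (and, if needed, the compiler variables) for a Distutils-based source build; the 10.11 value and the commented-out CC/CXX overrides are just examples:
# Sketch: build an extension with a newer deployment target than the interpreter's default.
import os
import subprocess
env = os.environ.copy()
env["MACOSX_DEPLOYMENT_TARGET"] = "10.11"  # may not be older than the interpreter's own target
# env["CC"] = "clang"      # other "standard" variables Distutils honors, if needed
# env["CXX"] = "clang++"
subprocess.check_call(["python", "setup.py", "build_ext", "--inplace"], env=env)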
One case where you might need to change the deployment target is if you are dealing with C++ code. As has been widely discussed elsewhere, in recent releases Apple has been migrating from the GCC-based libstdc++ to the Clang/LLVM libc++ standard library for C++ programs, and has been shipping both. The Python interpreter and its supplied standard library do not use C++ at all, so this issue does not affect Python itself. But if you use extension modules that are written in C++ (or link to a third-party library written in C++), either your own or from a third-party package (downloaded from PyPI, for example), you may need to be careful that all of the C++ code is either built against the same C++ standard library or, if not, that the differing C++ modules do not share objects. I don't have personal experience with such C++ situations, so I'm not sure what the best way is to avoid any problems that might arise. But a quick-and-dirty check is to use Apple's otool command-line utility on all extension modules and the shared libraries and frameworks they link with, to find all references to either libstdc++ or libc++ - so something like starting by inspecting the output from:
find -E . -regex '(.*\.so)|(.*\.dylib)' -exec otool -L '{}' ';'

Can I develop C extensions for Python 2.7 through 3.1 on Windows *without* installing Visual Studio 2008?

I'm aware that Python extensions on Windows typically have to be built with the same version of Visual Studio used to compile Python itself, and I'm further aware that Python 2.7 through 3.1 are built using Visual Studio 2008. But the machine I'm on already has VS 2013 installed, and, as I've discovered countless times, one way to rapidly mess up your Windows development environment is to install Visual Studio versions in any order other than oldest to newest. Besides which, installing VS 2008 on a brand-new Windows 8.1 box seems silly. Python extensions are the only thing I have that needs 2008; if I can avoid installing it, I'd really prefer not to.
Can I avoid installing VS 2008 and still build against the official Python distributions, perhaps by installing a specific Platform SDK? If not, is there an alternative build of Python that might go with e.g. MinGW, or something that does not require I install VS 2008?
I can suggest a few possible solutions to your problem, from potentially the easiest to probably the hardest:
Just use Visual Studio 2013 to compile your extension modules. For this to work your extension module mustn't access any C runtime objects created by the Python interpreter, nor may it pass any C runtime objects it creates to the interpreter. In particular you can't use any FILE * or file descriptor objects provided by Python. You can still read and write to files in your module, just not files that Python has opened.
Uninstall Visual Studio 2013, install Visual Studio 2008, reinstall Visual Studio 2013. As silly as this sounds, it's probably going to be quicker and a lot less frustrating than either of the following solutions. This will let you build extension modules pretty much normally, and you won't have to worry about what C runtime objects you use.
Use mingw32 and employ various hacks to get it to work. This page explains how one person got it to work: https://lists.launchpad.net/kicad-developers/msg09473.html
Copy the appropriate msvcrt*.lib file from VS 2008 installed on another machine. Manually edit your linker options to use this library instead of VS 2013's msvcrt*.lib of the same name. If that doesn't work, copy the include files and other libraries as well, and modify your compiler and linker options to use them instead. If that still doesn't work, copy the VS 2008 command line compiler and all of its dependent DLLs, set the PATH correctly, and then modify your build process to use that compiler instead.
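If you go that route with a distutils-based build, one hypothetical way to point the build at the copied headers and libraries is through the Extension arguments; every path and module name below is a placeholder:
# Sketch: a setup.py that points distutils at copied VS 2008 headers/libraries.
# All paths and names here are placeholders.
from distutils.core import setup, Extension

ext = Extension(
    "mymodule",
    sources=["mymodule.c"],
    include_dirs=[r"C:\vc9\include"],   # copied VS 2008 headers, if required
    library_dirs=[r"C:\vc9\lib"],       # directory holding the copied msvcrt*.lib
)
setup(name="mymodule", version="0.1", ext_modules=[ext])
# Build with: python setup.py build_ext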

Compiling shared library for python to distribute

I'm compiling a SWIG-wrapped C++ library into a Python module that should ideally be distributable, so that individuals can use the library transparently, like any other module. I'm building the library using CMake and SWIG on OS X 10.8.2 (system framework - Apple Python 2.7.2, installed framework - python.org Python 2.7.5).
The trouble I'm running into is that after linking with the framework, the compiled library is very selective about the version of Python being run, even though otool -L shows that it is compiled with "compatibility version 2.7.0". It appears that the different distributions have slightly different linker symbols, and things start to break.
The most common problem is that it crashes the Python kernel with a Fatal Python error: PyThreadState_Get: no current thread (according to this question, indicative of a linking incompatibility). I can get my library to work in the Python it was compiled against.
Unfortunately this library is for use in an academic laboratory, with computers of all different ages and operating systems, many of them kept in permanent deprecation in order to run proprietary software that hasn't been updated in years, and I certainly don't have time to play I.T. and fix all of them. Currently I've just been compiling against the version of Python that comes with the latest Enthought distribution, since most computers can get that one way or another. A lot of the researchers I work with use a Python IDE specific to their field that comes with a built-in interpreter, which is not modifiable and is not a framework build (so I can't build against it); for the time being they can run their experiment scripts in Enthought as a stop-gap, but it's not ideal. Even when I link against the python.org distribution that is the same version as the built-in IDE Python (2.7.2 I think - it even has the same damn release number), it still breaks the same way.
In any case, the question is: is there any way to link a SWIG-wrapped Python library so that it will run (at least on OS X) regardless of which interpreter is importing it (given certain minimum conditions, like being guaranteed to be >= 2.7.0)?
EDIT
Compiling against the Canopy-installed Python version with the following linker flags in CMake:
set (CMAKE_SHARED_LINKER_FLAGS "-L ~/Library/Enthought/Canopy_32bit/User/lib -ldl -framework CoreFoundation -lpython2.7 -u _PyMac_Error ~/Library/Enthought/Canopy_32bit/User/lib")
This results in an @rpath entry when examining the linked library with otool, and it seems to work fine with Enthought/Canopy on other OS X systems. The -lpython seems to be optional; it adds an additional Python symbol in the library, referencing the OS X Python (not the system Python).
Compiling against the system Python with the following linker flags:
set (CMAKE_SHARED_LINKER_FLAGS "-L /Library/Frameworks/Python.framework/Versions/Current/lib/python2.7/config -ldl -framework CoreFoundation -u _PyMac_Error /Library/Frameworks/Python.framework/Versions/Current/Python")
This works in Enthought and the system Python.
Neither of these works with the Python bundled with PsychoPy, which is the target environment; compiling against the bundled Python works with PsychoPy but with no other Python.
I've been getting the same error/having the same problem. I'd be interested if you've found a solution.
I have found that if I compile against the native Python include directory and run the native OS X Python binary /usr/bin/python, it works just fine, always. Even when I compile against some other Python library (like the one I find at /Applications/Canopy.app/appdata/canopy-1.0.3.1262.macosx-x86_64/Canopy.app/Contents/include), I can get the native OS X interpreter to work just fine.
I can't seem to get the Enthought version to work, ever. What directory are you compiling against for use with Enthought/Canopy?
There also seems to be some question of configuring SWIG at installation to know about a particular python library, but this might not be related: http://swig.10945.n7.nabble.com/SWIG-Installation-mac-osx-10-8-3-Message-w-o-attachments-td13183.html
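One diagnostic that may help when comparing these builds is to check which Python library or framework each compiled module actually references; a small sketch (the module filename is a placeholder):
# Sketch: show which Python framework/library a SWIG-built module links against.
import subprocess
print(subprocess.check_output(["otool", "-L", "_mylib.so"]))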

How to obtain pre-built *debug* version of Python library (e.g. Python27_d.dll) for Windows

Firstly, I should state that my current development environment is MSYS + mingw-w64 + ActivePython under Windows 7 and that on a normal day I am primarily a Linux developer. I am having no joy obtaining, or compiling, a version of the Python library with debug symbols.
I need both 32bit and 64bit debug versions of the Python27.dll file, ideally. I want to be able to embed Python and implement Python extensions in C++, and be able to call upon a seamless debugging facility using the gdb-7.4 I have built for mingw-w64, and WingIDE for the pure Python side of things.
Building Python 2.7.3 from source with my mingw-w64 toolchain is proving too problematic -- and before anyone flames me for trying: I acknowledge that this environment is unsupported, but I thought I might be able to get this working with a few judicious patches (hacks) and:
make OPT='-g -DMS_WIN32 -DWIN32 -DNDEBUG -D_WINDOWS -DUSE_DL_EXPORT'
I was wrong... I gave up at posixmodule.c since the impact of my changes became uncertain; ymmv.
I have tried building with Visual C++ 2010 Express but being primarily a Linux developer the culture-shock is too much for me to bear today; the Python project does not even import successfully. Apparently, I need Visual C++ 2008, yet I am already convinced I don't want to go down this road if at all possible...
It's really surprising to me that there is not a zip-file providing the requisite .dlls somewhere on the Internet. ActiveState should really provide these as an optional download with each release of ActivePython that they make -- perhaps that's where the paid support comes in ;-).
What is the best way to obtain the Python debug library files given my environment?
I've just built CPython 2.7.5 in debug mode with Visual Studio 2012 Express (free).
I documented the process on a wiki page: https://wiki.python.org/moin/VS2012
The best way to create a debug version of Python under Windows is to use the Debug build in the Visual Studio projects that come with the Python source, using the compiler version needed for the specific Python release, i.e. VS 2008.
There may be other ways, but this is certainly the best way.
If you really need a 64-bit debug build also, the best way is to buy a copy of VS 2008 (i.e. not use the Express version). It may be possible to create an AMD64 debug build using the SDK 64-bit compiler, but again, using the officially-supported procedures is the best way.
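Once you have a debug interpreter, a quick way to confirm you are actually running it (a sketch relying on sys.gettotalrefcount, which only exists in Py_DEBUG builds) is:
# Sketch: verify that the running interpreter is a debug (Py_DEBUG) build.
import sys
if hasattr(sys, "gettotalrefcount"):
    print("debug build: " + sys.version)
else:
    print("release build: " + sys.version)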
The libs can be acquired from the official Python site, https://www.python.org/ftp/python/. Just navigate to the specific version you like and then download the respective installer.
For installation you can call the installers like this:
msiexec /i your_installer.msi TARGETDIR="the_path_to_your_directory_of_choice"
or simply run them as administrator.
You may also use the Python installer to download the debug symbols as well (use the web-install version).

Can I use VS2005 to build extensions for a Python system built with VS2003

RDFLib needs C extensions to be compiled to install on ActiveState Python 2.5; as far as I can tell, there's no binary installer anywhere obvious on the web. On attempting to install with python setup.py install, it produces the following message:
error: Python was built with Visual Studio 2003;
extensions must be built with a compiler than can generate compatible binaries.
Visual Studio 2003 was not found on this system. If you have Cygwin installed,
you can try compiling with MingW32, by passing "-c mingw32" to setup.py.
There are various resources on the web about configuring a compiler for distutils that discuss using MinGW, although I haven't got this to work yet. As an alternative I have VS2005.
Can anyone categorically tell me whether you can use the C compiler in VS2005 to build Python extension modules for a VS2003 compiled Python (in this case ActiveState Python 2.5). If this is possible, what configuration is needed?
The main problem is the C run-time library. Python 2.4/2.5 are linked against msvcr71.dll, and therefore all C extensions should be linked against this DLL as well.
Another option is to use gcc (MinGW) instead of VS 2005; you can use it to compile Python extensions only. There is a decent installer that allows you to configure gcc as the default compiler for your Python version:
http://www.develer.com/oss/GccWinBinaries
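To confirm which compiler (and hence which CRT) the target interpreter was built with, you can ask it directly; a VS 2003 build reports MSC v.1310:
# Sketch: report the compiler the running interpreter was built with.
import platform
import sys
print(sys.version)                 # version string ends with e.g. "[MSC v.1310 32 bit (Intel)]"
print(platform.python_compiler())  # e.g. "MSC v.1310 32 bit (Intel)" for VS 2003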
I can't tell you categorically, but I don't believe you can. I've only run into this problem in the inverse situation (Python built with VS2005, trying to build with VS2003). Searching the web did not turn up any way to hack around it. My eventual solution was to get VC Express, since VC2005 is when Microsoft started releasing the free editions. But that's obviously not an option for you.
I don't use ActiveState Python, but is there a newer version you could use? The source ships with project files for VS2008, and I'm pretty sure the python.org binary builds stopped using VS2003 a while ago.
As of today, March 2012, I can categorically say it is possible with Python 2.4.4 (the only one I've tested) and Visual Studio 2005 and 2008. I'm just installing VS10 to check that as well. I don't know why it works, and I have problems using distutils, so I have to compile manually.
