Building a PEP 517 package with MinGW - python

I am a package maintainer for a Python package that provides an interface to an academic project. The main code is written in C and there are some Python scripts to add some extra functionality. The core C code is relatively portable (for C) but does not currently build using the MS Visual Studio Compiler. I do not have enough Windows users that this is a problem right now, and I am fine with simply telling them that they need to do some extra work to use MinGW to install the Python package (as they need to do to use the core package).
The package uses PEP 517 because deprecation warnings from prior standards gave me the impression that I needed to migrate. However, I have now discovered that the --global-option flag (suggested in this post) is unavailable for PEP 517 packages, meaning there is no clear way to specify MinGW as the compiler.
I discovered that there is a --config-settings flag for PEP 517 packages, but trying the obvious settings (compiler=mingw64) did not change anything.
How can I tell pip to use mingw64 as the compiler for my package on Windows? If there were a way to put this information in the package itself, that would be even better, but I am happy with just being able to install it with some extra command-line parameters.
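One avenue worth trying (a sketch, assuming the build backend is setuptools): the distutils-style commands that setuptools runs during a PEP 517 build still read a setup.cfg placed next to pyproject.toml, and the distutils name for the MinGW toolchain is mingw32 (even with 64-bit MinGW-w64), so shipping the compiler choice inside the package itself might look like:
[build]
compiler = mingw32
[build_ext]
compiler = mingw32
Be aware that setup.cfg applies on every platform, so an unconditional setting like this could break Linux/macOS builds; a Windows-only alternative is to put the same section in the interpreter's Lib\distutils\distutils.cfg on the affected machines.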

Related

Check if Python Package requires Visual Studio Build Tools

From this previous SO question, I understand that some Python packages rely on Visual Studio Build Tools to compile C sources on Windows.
I want to avoid using Python packages that require Visual Studio Build Tools in my open-source projects. I do not want to force my users to install Build Tools for my Python app; it is a major decrease in usability and makes the app harder to install.
How can I determine which open-source Python packages require Visual Studio Build Tools? I have checked pypi.org and libraries.io, as well as looked at the requirements.txt in each package's GitHub repo. I cannot find any stated requirement for Visual Studio Build Tools, but when I try pip install on the package, it says it requires Build Tools...
I did find this website of pre-compiled Python libraries from another SO post. Is it safe to assume that any library on that list requires Build Tools? I assume the list is not exhaustive, though. Is there a better source of information?
Please note that I'm NOT asking how to install Visual Studio Build Tools. Thanks in advance for any help!
I posted this on Reddit and received the following answer, which I believe is correct. Feel free to add another answer if needed.
Basically, the answer is: any time a package must compile C code to build. It is not obvious from a package's requirements, because the build commands provided by setuptools automatically look for Visual Studio on Windows.
In the distutils configuration file you can set the compiler of your choice: https://docs.python.org/3/install/index.html#using-non-microsoft-compilers-on-windows . Then, when installing via pip, you can use the --no-binary option to make sure you don't get binaries pre-compiled with Visual Studio: https://pip.pypa.io/en/stable/reference/pip_install/#cmdoption-no-binary
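For example, forcing a source build so that the configured compiler is actually used looks like this (SomePackage is a placeholder):
pip install --no-binary :all: SomePackage
The compiler itself is selected in the distutils configuration file described in the first link.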
You might also be interested in conda-forge which provides an open source way of providing many package binaries via the conda package manager: https://conda-forge.org/
Follow-up on whether there is any way to detect the use of compilers in an arbitrary Python package:
I assume a lot of packages will have distutils.core.Extension called at some point in their setup.py: https://docs.python.org/3/extending/building.html#building-c-and-c-extensions-with-distutils
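For reference, such an extension usually appears as an ext_modules entry; a minimal sketch with hypothetical names, following the linked documentation:
# a setup.py that forces a C compile at install time
from distutils.core import setup, Extension

setup(
    name="example",
    version="0.1",
    ext_modules=[
        Extension("example._speedups", sources=["speedups.c"]),
    ],
)
Grepping a project's setup.py for Extension or ext_modules is therefore a reasonable first check.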
Though setup.py allows you to call arbitrary Python code so the actual C compilation could be hidden somewhere completely different or even use an entirely different build process.
Basically, if the license or adherence to FOSS principles is important to your application, you either need to base your dependency chain on work other people have already done (e.g. conda-forge) or you need to spend time really investigating the status of every dependency you use :/

How to include VTK for Python3 installation into setup.py?

The VTK library cannot be installed via pip, though it can be compiled and installed from source.
My Python project depends on VTK.
I want it to install VTK automatically by calling pip install . from root directory of the project.
In this case, the setup.py file should be able to:
download VTK sources of needed version from GitHub
call cmake in order to prepare build
compile sources and create Python bindings
install the needed files into the currently used site-packages (e.g., they should not go into /usr/local/lib/python3/site-packages if I use virtualenv, pipenv or pyenv)
Is it possible?
If yes, how can I do this?
In principle, you can include any executable code in the setup file. However, nowhere in the setuptools documentation could I find information that would solve the problem here.
Also, the installation procedure for VTK is a bit complex, which is why Kitware uses CMake in the first place.
So, the short answer would be "no" or "don't do that".
Further, the problems you will encounter:
Users will expect a transparent install, but building VTK cross-platform on the basis of its CMake build instructions will keep you from exposing the usual customization options (path to VTK, path to the Python interpreter, platform-specific C flags).
The install process will be harder to debug, and users will come to you with VTK build problems.
Kitware themselves do not publish VTK on PyPI. This suggests that achieving this goal is too time-intensive, impossible, or too fragile to maintain.
If you wish to see a popular Python project that relies on VTK, there is Mayavi. Its installation instructions ask you to install VTK beforehand.
It looks like VTK now publishes its official bindings on PyPI, so I can use it in my setup.py file simply by appending it to the install_requires list.
This works well in my project; no compilation is needed anymore.
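In setup.py this amounts to nothing more than listing it as a dependency; a minimal sketch with placeholder project metadata:
from setuptools import setup

setup(
    name="myproject",
    version="0.1",
    install_requires=["vtk"],  # pulls the official VTK wheel from PyPI
)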
There is, however, a caution in the Mayavi documentation:
The latest VTK wheels are available on all the major platforms (Windows, MacOS, and Linux), but only for 64 bit machines. Python 3.x is fully supported on all these operating systems and Python 2.7.x on MacOS and Linux. If you are out of luck, and your platform is not supported then you will need to install VTK yourself using your particular distribution as discussed in the General Build and Installation instructions.

Building software installer with built-in python on Windows

Our command line utility is written with Python.
On Linux/OS X this is usually not a problem, since both come with Python 2.x pre-installed. On Windows, however, Python isn't installed by default.
An additional problem is that a few of our dependencies require compilation, which again is not trivial for Windows users since it requires tinkering with MSVC/Cygwin/etc.
Up until now we solved this issue by using PyInstaller to create a "frozen" Python package with pre-installed dependencies. This worked well, but it made our utility non-extendable: we cannot add Python modules with utilities such as pip. Since our CLI depends on this ability for additional functionality, this limitation became a blocker for us and we would like to rethink our approach.
Searching around, I found how RhodeCode solves this. Basically, their installer brings Python and everything else (including pre-compiled dependencies).
This seems like a good idea for us. The only limitation I see is that their installer actually installs Python from an .msi, which puts things in the Windows Registry, so there can be only one Python of version X.Y installed on Windows (from an .msi).
For a server application this might be reasonable since server application tends to act like it's the only thing installed on the PC, but for command line utility, this is completely unacceptable.
Looking around, I found a few projects that claim to make Python portable - for example Portable Python. However, I'm not sure how "portable" it really is, especially after issues like this.
So questions are:
Is it possible to install the same Python version multiple times on Windows without creating collisions between the instances?
Would you choose a different workaround for this problem? (Please, no "smart" solutions such as dropping Windows support or not using Python.)
Thanks!
Frankly I would stick with PyInstaller or something similar. That will always provide you with the correct version of Python whether or not the target machine has Python installed. It also protects you from clobbering a previously installed version of Python.
If you need to add plugins, then you should build that into your app. There are a few projects that might help you with that. Here are a couple of examples:
http://yapsy.sourceforge.net/
https://pypi.python.org/pypi/Plugins/
You might also take a look at how Django or Flask handles extensions. For example, you can add an extension to Flask to allow it to work with SQLAlchemy. You should be able to do something similar with your own application. The pip utility won't work with a frozen application after all. Alternatively, have you looked at conda? It might work for your purposes.
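As a minimal illustration of rolling your own plugin support (a sketch using only the standard library; myapp.plugins is a hypothetical package name, and this is not how yapsy or Flask implement it):
# discover and import every module found in the application's plugins package
import importlib
import pkgutil

import myapp.plugins  # hypothetical package that holds the plugin modules

def load_plugins():
    plugins = {}
    for _finder, name, _ispkg in pkgutil.iter_modules(myapp.plugins.__path__):
        module = importlib.import_module("myapp.plugins." + name)
        plugins[name] = module
    return plugins
In a frozen application you would more likely scan a plugins directory on disk, since code bundled by PyInstaller is not always visible to pkgutil.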
This is an answer to my second question; sadly, I still haven't figured out a better solution for number 1.
For now here's how we changed our approach for creating setup file:
We package our code and its dependencies as a set of pre-built Python wheels. It's relatively easy to create pre-built wheels on Windows since the release of the Microsoft Visual C++ Compiler for Python 2.7.
We package the Python setup MSI together with pip, setuptools and virtualenv wheels.
Before the install starts, we check whether Python, pip and virtualenv are already installed (by looking in the registry and \Scripts); if not, we install them from the packaged wheels. The pip wheel is installed using the get-pip.py script, which we bundle as well.
We create a separate virtualenv and install our wheels into it.
Uninstall is done by removing the virtualenv folder. The Python, pip and virtualenv installs are left behind.
The installer is created with Inno Setup.
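In terms of concrete commands, the wheel-building and offline-install steps above look roughly like this (a sketch; mycliapp and the paths are placeholders):
rem on the build machine, where the compiler is available
pip wheel -r requirements.txt -w wheels

rem on the target machine, run by the installer
virtualenv C:\MyApp\env
C:\MyApp\env\Scripts\pip install --no-index --find-links=wheels mycliapp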
I'm still not fully satisfied with this solution, since some components are globally installed and might collide with what the user had already installed before (an older/newer version of Python, pip, setuptools or virtualenv). This creates the potential for unpredictable bugs during install or at runtime, or the possibility that the user upgrades one of the components in the future and somehow breaks the application.
Also, uninstall is dirty and leaves stuff behind.

Installing QuTIP 2.2.0 with existing Python distribution on Windows

Has anyone managed to install QuTIP 2.2.0 with an existing Python 2.7.5 distribution (on Win7)? The instruction manual suggests that I need to install Python(x,y) first, but the instructions are pretty vague. I'm still a Python newbie.
Understanding installation instructions
The installation instructions are pretty clear, but I remember the times I got lost in such short lines because they assumed I knew something obvious.
I will try to translate them:
install Python(X,Y) - do it. Follow the link, download the exe file and run it.
Do not forget to set the options; following the defaults will fail, as the Cython option must be included.
edit the distutils.cfg file as instructed
download the tar.gz archive for QuTIP from PyPI, unpack it in some directory, cd into the directory where you see setup.py, and run python setup.py install
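Spelled out, that last step is just (the version in the folder name is only an example):
cd qutip-2.2.0
python setup.py install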
The distutils.cfg refers to mingw32; this is needed for compilation. If it is not installed with Python(X,Y), you will have to install it separately. Be careful to install the proper version: even on 64-bit systems, use the 32-bit one (I assume this from the proposed compiler name in the config).
Good luck. I have not been on Windows for about two years, so I cannot confirm that this works, but I hope it moves you forward.
I also had a lot of problems installing it correctly.
Here is my working solution.
As the installation instructions suggest:
Install PythonXY (I am using 2.7.6.1) (including Cython package) (Edit: The newer versions of PythonXY do not include a compiler. Try installing from here instead: https://code.google.com/p/pythonxy/wiki/AdditionalPlugins)
Edit C:\Python27\Lib\distutils\distutils.cfg to include:
[build]
compiler = mingw32
[build_ext]
compiler = mingw32
Add C:/MinGW32-xy/bin to your PATH. It has to be before other paths with e.g. gcc in them. You can do this:
set PATH=C:/MinGW32-xy/bin;%PATH%
for a temporary add (in that console) or use the answer here.
For a permanent change, go to the properties of your Computer, then Advanced system settings --> Environment Variables. Change the system variable so that the MinGW path is the first entry. It doesn't work if it is last or in the user path!
Run a Python interpreter:
import qutip.testing
qutip.testing.run()
If it doesn't crash on the 7th test you probably have a working copy of qutip.
I get 320 tests in 2194.236s with SKIP=7 and errors=5.
Details of which tests failed for me can be seen here.
I would use Anaconda 2.7 with the added mingw and libpython libraries and then edit distutils.cfg as stated. The skipped tests mentioned are Fortran tests that you cannot run on Windows, while the errors are time-dependent tests that generate Cython code at run time. If you follow the above suggestions, those tests will pass.
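For reference, those pieces are the mingw and libpython conda packages mentioned above, so (assuming the package names are still current) the setup is roughly:
conda install mingw libpython
followed by the distutils.cfg edit already shown.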
I tried to install QuTiP for several hours, unsuccessfully. Fortunately, kind people from the University of California have a solution:
wheels for lots of packages for Python computing
This resource is not official and is provided 'as is', but it works better!
Download the wheel you need and type pip install package.whl to install it.

Python packages installation in Windows

I recently began learning Python, and I am a bit confused about how packages are distributed and installed.
I understand that the official way of installing packages is distutils: you download the source tarball, unpack it, and run: python setup.py install, then the module will automagically install itself
I also know about setuptools, which comes with the easy_install helper script. It uses eggs for distribution and, from what I understand, is built on top of distutils and does the same thing as above, plus it takes care of any required dependencies, all fetched from PyPI.
Then there is also pip, and I'm still not sure how it differs from the others.
Finally, as I am on a Windows machine, a lot of packages also offer binary builds through a Windows installer, especially the ones that require compiling C/Fortran code, which would otherwise be a nightmare to compile manually on Windows (assuming you have an MSVC or MinGW/Cygwin dev environment with all the necessary libraries set up... nonetheless, try to build numpy or scipy yourself and you will understand!).
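For concreteness, the three mechanisms correspond to commands like these (SomePackage is a placeholder):
python setup.py install      (distutils, from an unpacked source tarball)
easy_install SomePackage     (setuptools)
pip install SomePackage      (pip)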
So can someone help me make sense of all this, and explain the differences, pros/cons of each method. I'd like to know how each keeps track of packages (Windows Registry, config files, ..). In particular, how would you manage all your third-party libraries (be able to list installed packages, disable/uninstall, etc..)
I use pip, and not on Windows, so I can't provide comparison with the Windows-installer option, just some information about pip:
Pip is built on top of setuptools, and requires it to be installed.
Pip is a replacement (improvement) for setuptools' easy_install. It does everything easy_install does, plus a lot more (make sure all desired distributions can be downloaded before actually installing any of them to avoid broken installs, list installed distributions and versions, uninstall, search PyPI, install from a requirements file listing multiple distributions and versions...).
Pip currently does not support installing any form of precompiled or binary distributions, so any distributions with extensions requiring compilation can only be installed if you have the appropriate compiler available. Supporting installation from Windows binary installers is on the roadmap, but it's not clear when it will happen.
Until recently, pip's Windows support was flaky and untested. Thanks to a lot of work from Dave Abrahams, pip trunk now passes all its tests on Windows (and there's a continuous integration server helping us ensure it stays that way), but a release has not yet been made including that work. So more reliable Windows support should be coming with the next release.
All the standard Python package installation mechanisms store all metadata about installed distributions in a file or files next to the actual installed package(s). Distutils uses a distribution_name-X.X-pyX.X.egg-info file, pip uses a similarly-named directory with multiple metadata files in it. Easy_install puts all the installed Python code for a distribution inside its own zipfile or directory, and places an EGG-INFO directory inside that directory with metadata in it. If you import a Python package from the interactive prompt, check the value of package.__file__; you should find the metadata for that package's distribution nearby.
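For example (somepackage stands for any installed third-party package):
>>> import somepackage
>>> somepackage.__file__
The .egg-info file or directory for that distribution lives near the path printed here (for pip and distutils installs, in the same site-packages directory).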
Info about installed distributions is only stored in any kind of global registry by OS-specific packaging tools such as Windows installers, Apt, or RPM. The standard Python packaging tools don't modify or pay attention to these listings.
Pip (or, in my opinion, any Python packaging tool) is best used with virtualenv, which allows you to create isolated per-project Python mini-environments into which you can install packages without affecting your overall system. Every new virtualenv automatically comes with pip installed in it.
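A typical workflow looks like this (SomePackage is a placeholder):
virtualenv myenv
source myenv/bin/activate
pip install SomePackage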
A couple other projects you may want to be aware of as well (yes, there's more!):
distribute is a fork of setuptools which has some additional bugfixes and features.
distutils2 is intended to be the "next generation" of Python packaging. It is (hopefully) adopting the best features of distutils/setuptools/distribute/pip. It is being developed independently and is not ready for use yet, but eventually should replace distutils in the Python standard library and become the de facto Python packaging solution.
Hope all that helped clarify something! Good luck.
I use Windows and Python. It is somewhat frustrating, because pip doesn't always manage to install things. Python is moving to pip, so I still use it. Pip is nice because you can uninstall items and use:
pip freeze > requirements.txt
pip install -r requirements.txt
Another reason I like pip is for virtual environments like venv with Python 3.4. I have found venv a lot easier to use on Windows than virtualenv.
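On Windows that looks roughly like this (SomePackage is a placeholder):
python -m venv env
env\Scripts\activate
pip install SomePackage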
If you cannot install a package, you have to find the binary for it: http://www.lfd.uci.edu/~gohlke/pythonlibs/
I have found these binaries to be very useful.
Pip is trying to make something called a wheel for binary installations.
pip install wheel
wheel convert path\to\binary.exe
pip install converted_wheel.whl
You will also have to do this for any libraries that the package requires but that do not install on their own.
