From this previous SO question, I understand that some Python packages rely on Visual Studio Build Tools to compile C sources on Windows.
I want to avoid depending on such packages in my open-source projects: I do not want to force users to install Build Tools just to run my Python app, which seems like a major usability hit and makes installation harder.
How can I determine which open-source Python packages require Visual Studio Build Tools? I have checked pypi.org and libraries.io, and looked at the requirements.txt in each package's GitHub repo. I cannot find any stated requirement for Visual Studio Build Tools, but when I try to pip install the package, it says Build Tools are required...
Another SO post pointed me to this website of pre-compiled Python libraries. Is it safe to assume that any library on that list requires Build Tools? Even then, I assume the list is not exhaustive. Is there a better source of information?
Please note that I'm NOT asking how to install Visual Studio Build Tools. Thanks in advance for any help!
I posted this on Reddit and received the following answer, which I believe is correct. Feel free to add another answer if needed.
Basically, the answer is: any time a package must compile C code as part of its build. It's not obvious from looking at a package's requirements because the build commands provided by setuptools automatically look for Visual Studio on Windows.
In the distutils configuration file you can set the compiler of your choice: https://docs.python.org/3/install/index.html#using-non-microsoft-compilers-on-windows . And when installing via pip you can use the --no-binary option to make sure you don't get binaries pre-compiled with Visual Studio: https://pip.pypa.io/en/stable/reference/pip_install/#cmdoption-no-binary
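For example, to force pip to build everything from source instead of using pre-built wheels (somepackage is a placeholder name):

pip install --no-binary :all: somepackage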
You might also be interested in conda-forge, which provides open-source builds of many package binaries via the conda package manager: https://conda-forge.org/
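For example (somepackage is again a placeholder):

conda install -c conda-forge somepackage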
Follow-up on whether there is any way to detect the use of compilers in an arbitrary Python package:
I assume a lot of packages will have distutils.core.Extension called at some point in their setup.py: https://docs.python.org/3/extending/building.html#building-c-and-c-extensions-with-distutils
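A minimal sketch of what that typically looks like (the package and file names here are hypothetical):

# setup.py -- hypothetical package with a C extension
from distutils.core import Extension, setup

setup(
    name="mypkg",
    version="1.0",
    ext_modules=[
        # each Extension names a compiled module and its C sources;
        # building it triggers the compiler search described above
        Extension("mypkg._speedups", sources=["src/speedups.c"]),
    ],
)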
However, setup.py can run arbitrary Python code, so the actual C compilation could be hidden somewhere completely different or even use an entirely different build process.
Basically, if licensing or adherence to FOSS principles is important to your application, you either need to base your dependency chain on work other people have already done (e.g. conda-forge) or you need to spend time really investigating the status of every dependency you use :/
Related
I am the maintainer of a Python package that provides an interface to an academic project. The main code is written in C, with some Python scripts adding extra functionality. The core C code is relatively portable (for C) but does not currently build with the MS Visual Studio compiler. I do not have enough Windows users for this to be a problem right now, and I am fine with simply telling them they need to do some extra work with MinGW to install the Python package (as they already must to use the core package).
The package uses PEP 517 because deprecation warnings from prior standards gave me the impression that I needed to migrate. However, I have now discovered that the --global-option flag (described in this post) is unavailable for PEP 517 packages, meaning there is no clear way to specify MinGW as the compiler.
I discovered that there is a --config-settings flag for PEP 517 packages, but trying the obvious settings (compiler=mingw64) did not change anything.
How can I tell pip to use mingw64 as the compiler for my package on Windows? If there was a way to put this information in the package itself, that would be even better, but I am happy with just being able to install it with some extra command-line parameters.
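One detail worth checking (a sketch, not a confirmed fix for PEP 517 builds): the compiler name that distutils/setuptools recognizes for MinGW is mingw32, not mingw64, and the classic way to make it the default is a [build] section in a setup.cfg next to setup.py:

[build]
compiler=mingw32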
I created a Python library using setuptools that contains .so files. When I try to pip install the library, the .so files aren't installed into my virtual environment.
Your question, as it stands, is somewhat vague, so I can't be sure I'm answering it. However, from the setup.py you've pasted as a comment, I think you haven't specified how to build the extension locally. Shipping the .so directly as part of your package is not very wise, since it isn't cross-platform. The opposite approach is to compile it on the target machine, but then users need a compiler toolchain installed locally.
Please refer to the example here, which describes how to include C extensions in your project. The official Python packaging page on this topic is incomplete, and there's an issue describing that over here; you might find something useful there as well.
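For reference, if you really do want to ship a pre-built .so inside the package (with the cross-platform caveat above), it has to be declared explicitly, e.g. via package_data; a sketch with hypothetical names:

# setup.py -- ship a pre-built shared object with the package
from setuptools import setup

setup(
    name="mylib",
    version="1.0",
    packages=["mylib"],
    # without this, setuptools omits non-Python files such as .so
    package_data={"mylib": ["*.so"]},
)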
Assume that someone wants to package a Python (Cython) library that depends on the C++ Boost library.
What is the best way to configure setup.py so that the user is properly informed that the Boost library must be installed first (e.g., apt-get install libboost-dev on Ubuntu, and the equivalent on other OSes)? Or is it better practice to include the Boost library in the Python package distribution?
The question is better asked as: what is the best way to distribute a Python extension that includes an external library dependency?
This is best dealt with using binary wheel packages.
The user does not need to know anything about setup.py, which is for building and installing from source; they just need to download and install a binary wheel package.
Including just the header files does not solve the problem: you still need the library itself to build against and link to, and it opens up version-incompatibility issues.
So setup.py need not do anything special here; it just needs to know where to find the headers (a sub-directory of your project, if the library is bundled) and which libraries to link with.
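For instance, a sketch with hypothetical names, assuming the Boost headers are bundled under third_party/ and the Boost runtime library is available to link against:

# setup.py -- extension that compiles against and links to Boost
from setuptools import Extension, setup

setup(
    name="mypkg",
    version="1.0",
    ext_modules=[
        Extension(
            "mypkg._core",
            sources=["src/core.cpp"],
            include_dirs=["third_party/boost"],  # where to find the headers
            libraries=["boost_system"],          # which libraries to link with
        ),
    ],
)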
The documentation should include instructions on how to build from source, for which more than just Boost is needed (Python header files, an appropriate compiler, etc.).
Tools like auditwheel then take care of bundling external library dependencies into the binary wheel, so end-users need not have the library installed to use your package.
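For example, on a Linux build machine (the wheel filename below is hypothetical):

# build the wheel, then bundle its external shared-library dependencies
pip wheel . -w dist/
auditwheel repair dist/mypkg-1.0-cp39-cp39-linux_x86_64.whl -w wheelhouse/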
See also manylinux for distributing binary Python extensions and this demo project.
I installed distribute and pip using the links I have just given. I also installed the Microsoft Visual C++ 2008 redistributable package. However when I try to use pip.exe I get
error: Unable to find vcvarsall.bat
How can I fix this?
Installing the Microsoft Visual C++ 2008 Redistributable Package is not sufficient to compile packages. You need to install a compiler, not just the support files.
There are three ways to do this:
Install Visual C++.
Use MinGW's port of gcc instead of Visual C++.
Use Cygwin's port of gcc instead of either, and a Cygwin build of Python instead of the native one.
If you want to go with option 1, you need to install Visual C++ itself. The free version should work just as well as the paid version, as long as you're not going to build binary packages to redistribute to others. Unfortunately, I'm not sure where to find the 2008 version anymore. As of May 2013, the download page only has 2010 and 2012.
When you install this, it will create a batch file called vcvarsall.bat (not vcvarshall.bat!), and give you the option of putting that batch file in your PATH. Running that batch file sets up a DOS prompt for building with that version of Visual C++. (This is handy if you have multiple versions of Visual C++, or other compilers, around.) If you skip that option, you will have to do it manually.
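For example, with a default Visual C++ 2008 install, running the batch file manually looks like this (the exact path may differ on your machine):

"C:\Program Files (x86)\Microsoft Visual Studio 9.0\VC\vcvarsall.bat" x86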
This question shows how to use a newer Visual Studio with older Python, and also shows how to point distutils at a vcvarsall.bat that's not on your PATH, and has links to a whole lot of other relevant questions and blog posts.
Many people find option 2 simpler. Install MinGW, add C:\MinGW\bin (or wherever you choose to install it) to your PATH, and pass -c mingw32 whenever you run a setup.py script.
The problem is that it's not as clearly documented how to tell easy_install and pip to use mingw instead of VC++. To do that, you need to find or create a distutils.cfg file, find or create a [build] section within it, and add compiler=mingw32. Not too hard. This blog post looks like it explains things pretty well, or see this answer.
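Concretely, the one-off command-line form is:

python setup.py build --compiler=mingw32

and the persistent configuration-file form (in distutils.cfg, a per-user pydistutils.cfg, or a project-local setup.cfg) is:

[build]
compiler=mingw32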
Option 3 is by far the simplest. Install Cygwin, tell it to install the Python and gcc packages, and you're done.
The problem is that you don't get a native Windows Python; you get a Unix Python running in a fake Unix environment on top of Windows. If you like Cygwin, you'll love this; otherwise, you won't.
You'll receive this error only for packages (or their dependencies) that have CPython extensions. Internally, pip:
downloads the source
runs python setup.py install via distutils
the install step prepares setup files and tries to build the CPython extensions in the Windows environment
the Windows environment calls MS Visual Studio's vcvarsall.bat script, which sets up environment variables to enable MS Visual Studio's C compiler in the shell
if vcvarsall.bat is not found, you get this error message
Usual solution
For Python libraries with CPython extensions that are portable to Windows, it is usual to have a Windows binary package, downloadable from PyPI or the library's website.
In such cases it is more suitable (and painless) to install the library by downloading and running the Windows binary package.
There is a feature request for pip to Add support for installation of binary distutils packages on Windows.
New way to do it - wheels
Thanks to a comment from @warren-p: that feature request has been superseded by Wheels support in PIP.
Official description: A wheel is a ZIP-format archive with a specially formatted filename and the .whl extension.
As I understand it, if there is a Windows binary package with the .whl extension, start by installing wheel support first:
# Make sure you have the latest pip that supports wheel
pip install --upgrade pip
pip install wheel
and then install .whl like this:
pip install full-path-or-url-to-your-library.whl
References:
pythonwheels.com
https://pypi.python.org/pypi/wheel
http://wheel.readthedocs.org/en/latest/
You can download Visual Studio 2008 Express SP1 from
http://visual-studio-2008.en.malavida.com/
You can deselect the two add-to-browser options the installer offers.
I found these links on microsoft.com that still work for installing Visual C++:
Visual C++ 2008 Express: http://download.microsoft.com/download/8/B/5/8B5804AD-4990-40D0-A6AA-CE894CBBB3DC/VS2008ExpressENUX1397868.iso
Visual C++ 2008 Express with SP1: http://download.microsoft.com/download/E/8/E/E8EEB394-7F42-4963-A2D8-29559B738298/VS2008ExpressWithSP1ENUX1504728.iso
I'm the author of a pure-Python library that aims to also be convenient to use from the command line. For Windows users it would be nice to just install the package from an .exe or .msi installer.
However, I cannot get the installer to install the package's dependencies (especially the dependency on setuptools itself, so running the software fails with an import error on pkg_resources). I don't believe that providing an easy .exe installer makes much sense if the user then needs to manually install setuptools and other libraries on top of it. I'd rather tell them how to add easy_install to their PATH and go that route (http://stackoverflow.com/questions/1449494/how-do-i-install-python-packages-on-windows).
I've built .exe packages in the past, but I don't remember whether that ever worked the way I'd have preferred.
It is quite common to distribute packages that have dependencies, especially dependencies like yours, but I understand your wish to make installation as simple as possible.
Have a look at deployment bootstrapper, a tool dedicated to solving the problem of delivering software including its prerequisites.
Regardless of which packaging method you eventually choose, maintain your sanity by staying away from embedding MSIs inside other MSIs in any way. That just does not work, because of transactional installation requirements and locking of the Windows Installer database.