What can we put in a setup.py file to prevent pip from collecting and attempting to install a package when using an unsupported Python version?
For example, magicstack is a project listed with the trove classifier:
Programming Language :: Python :: 3 :: Only
So I expect the following behaviour if pip is tied to Python 2.7:
$ pip install magicstack
Collecting magicstack
Could not find a version that satisfies the requirement magicstack (from versions: )
No matching distribution found for magicstack
But the actual behavior is that pip collects a release, downloads it, attempts to install it, and fails. There are other Python 3-only releases, curio for example, which actually install fine - because the setup.py didn't use anything Python 3 specific - only to fail at import time when some Python 3-only syntax is used. And I'm sure there are packages which install OK, import OK, and maybe only fail at runtime!
What is the correct method to specify your supported Python versions in a way that pip will respect? I've found a workaround, involving uploading only a wheel file and refusing to upload a .tar.gz distribution, but I would be interested to know the correct fix.
Edit: How does pip know not to download the wheel distribution if the Python/os/architecture is not matching? Does it just use the .whl filename convention or is there something more sophisticated than that happening behind the scenes? Can we somehow give the metadata to a source distribution to make pip do the right thing with .tar.gz uploads?
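Regarding the edit: for wheels, pip does rely on the compatibility tags encoded in the filename itself (PEP 427 defines the filename format, PEP 425 the tags). A minimal hand-rolled sketch of that convention, just for illustration (real tooling uses the packaging library for this):

```python
# Hedged sketch: decomposing a wheel filename per the PEP 427 naming
# convention: {name}-{version}(-{build})?-{python}-{abi}-{platform}.whl
# pip compares the last three tags against the tags the running
# interpreter supports to decide whether the wheel is installable.
def split_wheel_filename(filename):
    parts = filename[:-len(".whl")].split("-")
    # An optional build tag makes this 6 parts instead of 5,
    # so take the fixed fields from the two ends.
    name, version = parts[0], parts[1]
    python_tag, abi_tag, platform_tag = parts[-3:]
    return name, version, python_tag, abi_tag, platform_tag

print(split_wheel_filename("magicstack-1.0-py3-none-any.whl"))
# ('magicstack', '1.0', 'py3', 'none', 'any')
```

For a .tar.gz upload there is nothing comparable in the filename, which is why the metadata route described in the answers below was needed.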
There is a correct way to do this, but unfortunately pip only started supporting it in version 9.0.0 (released 2016-11-02), and so users with older versions of pip will continue to download packages willy-nilly regardless of what Python version they're for.
In your setup.py file, pass setup() a python_requires argument that lists your package's supported Python versions as a PEP 440 version specifier. For example, if your package is for Python 3+ only, write:
setup(
    ...
    python_requires='>=3',
    ...
)
If your package is for Python 3.3 and up but you're not willing to commit to Python 4 support yet, write:
setup(
    ...
    python_requires='~=3.3',
    ...
)
If your package is for Python 2.6, 2.7, and all versions of Python 3 starting with 3.3, write:
setup(
    ...
    python_requires='>=2.6, !=3.0.*, !=3.1.*, !=3.2.*, <4',
    ...
)
And so on.
Once you've done that, you will need to upgrade your version of setuptools to at least 24.2.0 in order for the python_requires argument to be processed; earlier versions will just ignore it with a warning. All of your project's sdists and wheels built afterwards will then contain the relevant metadata that tells PyPI to tell pip what Python versions they're for.
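To see where that metadata ends up, you can look inside a built wheel: setuptools writes the specifier as a Requires-Python header in the .dist-info/METADATA file (and in an sdist's PKG-INFO), and PyPI republishes it so pip can filter releases. A hedged sketch, faking a tiny wheel in memory just to show the header format:

```python
# Hedged sketch: where the python_requires value ends up. setuptools
# writes it as a "Requires-Python" header in the METADATA file inside
# the wheel's .dist-info directory. Here we build a minimal fake wheel
# in memory purely to show how that header can be read back.
import io
import zipfile
from email.parser import Parser

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as whl:
    whl.writestr(
        "demo-1.0.dist-info/METADATA",
        "Metadata-Version: 2.1\n"
        "Name: demo\n"
        "Version: 1.0\n"
        "Requires-Python: >=3.3\n",
    )

# METADATA uses the RFC 822 header format, so the stdlib email parser
# can read it.
with zipfile.ZipFile(buf) as whl:
    meta = Parser().parsestr(whl.read("demo-1.0.dist-info/METADATA").decode())

print(meta["Requires-Python"])  # >=3.3
```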
The magicstack distribution on PyPI is broken. It's failing because the source distribution doesn't contain a magicstack package, even though the setup.py in the source distribution says it should.
As long as PyPI contains a source distribution (e.g. .tar.gz, .zip), pip will download it if it can't find a matching binary distribution (e.g. .egg, .whl) for your version of Python/OS/architecture.
One option is to upload only binary distributions (preferably wheels) to PyPI. The other option is to check sys.version_info in your setup.py and raise an exception for incompatible versions.
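The second option might look like this (a minimal sketch; the version bound here is an assumption, adjust it to your package):

```python
# Hedged sketch: a guard at the top of setup.py for clients whose pip
# is too old to honor python_requires. It runs whenever the sdist's
# setup.py is executed, so an unsupported interpreter aborts early.
import sys

def ensure_python(minimum=(3, 3)):
    """Abort the installation if the running interpreter is too old."""
    if sys.version_info < minimum:
        raise SystemExit(
            "This package requires Python %d.%d or later." % minimum
        )

ensure_python()  # call this before setup(...)
```

Note that this only fires after pip has already downloaded the sdist, so it is a fallback, not a replacement for proper metadata.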
I was just preparing to make a voice assistant when an error occurred while I was installing the ecapture module in Python. I used pip to install it, and the error is shown below.
Failed to build scikit-image
ERROR: Could not build wheels for scikit-image, which is required to install pyproject.toml-based projects
I have tried installing it from PyPI. I have even tried restarting my computer, reinstalling Python, etc., but it just doesn't work.
Note: only use this answer if you trust binaries built by Christoph Gohlke, who maintains an excellent index of binaries here https://www.lfd.uci.edu/~gohlke/pythonlibs/
You can either grab the needed packages from there manually, or use this package (which I wrote, full disclosure):
pip install gohlkegrabber
ggrab . scikit-image
pip install scikit_image-0.19.0-cp310-cp310-win_amd64.whl
pip install ecapture
Note that the package you were lacking is scikit-image - you may be able to find binaries elsewhere as well, the site above is only provided as a suggestion. Again, only use if you trust the author.
Also note that the package was called scikit_image-0.19.0-cp310-cp310-win_amd64.whl for me, as I'm on Python 3.10 on 64-bit Windows. Yours may have a different name (if available), but the ggrab command will tell you.
Finally note that 0.19.0 just happens to be the most recent build on that site - it's not guaranteed to have the latest build, or to have the latest build for your OS/version of Python.
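If you are picking a file from such an index by hand, you can work out which CPython tag your own interpreter corresponds to (a quick sketch; cp-style tags only apply to CPython builds):

```python
# Hedged sketch: derive the cpXY tag and pointer size of the running
# interpreter, to match against wheel filenames such as
# scikit_image-0.19.0-cp310-cp310-win_amd64.whl.
import platform
import struct
import sys

cp_tag = "cp%d%d" % (sys.version_info.major, sys.version_info.minor)
bits = struct.calcsize("P") * 8  # 64 on a 64-bit build, 32 on 32-bit

print(cp_tag, platform.system(), "%d-bit" % bits)
```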
I'm writing a source bundle (not a fully packaged module, but some scripts with dependencies) to be installed and executed inside a framework application (Specifically, Amazon SageMaker's TensorFlow serving container - running Python 3.5).
One of my dependencies is matplotlib, which in turn needs kiwisolver, which has C++ components.
It seems like my target container doesn't have wheel installed by default, because when I supply just a requirements.txt file I get the error described in "Why is python setup.py saying invalid command 'bdist_wheel' on Travis CI?".
I think I got it working by supplying a setup.py instead, with setup_requires=["wheel"] as advised in the answers to that Travis CI question.
My Python packaging-fu is weak, so my question is: Who should be specifying this dependency, because it seems like it shouldn't be me?
Should kiwisolver be advertising that it needs wheel?
Does a framework application/environment installing user code modules via requirements.txt have an implicit contract to make wheel available in the environment, for some reason in Python's packaging ethos?
Maybe it really is on me to know that, since I'm indirectly consuming a module like kiwisolver, my package requires wheel for setup and a straight pip install -r requirements.txt won't work?
Even better if somebody can explain whether this answer is changing with PEP 518 and the deprecation of setup_requires :S
Usually wheel could be considered a build-time dependency and not an install-time dependency. But actually, wheel is just a way of distributing Python projects (libraries or applications), so it usually isn't a mandatory dependency.
The system building the library (kiwisolver) might need to have the wheel tool installed. But if I am not mistaken, recent versions of pip already bundle wheel-building support, so nowadays there is often no need to install it explicitly.
In many cases there are wheels already built available on PyPI. But sometimes there are no wheels compatible with the target system (Python interpreter version, operating system, CPU bitness). In your case here, kiwisolver has a wide range of wheels available but not for Python 3.5.
So it seems like the system you want to install kiwisolver on is not compatible with any of the wheels available on PyPI, so pip has to build it locally. Usually pip tries to build a wheel first, but as far as I know it's not a deal-breaker: if a wheel cannot be built, pip usually just continues and installs the project without going through the wheel intermediary step.
But still pip has to be able to build the library, which might require some C/C++ compilers or that other unusual conditions are met on the local system. Which is why distributing libraries as wheel is very comfortable, since the build step is already done.
So to sum it up, from my point of view, no one really has to declare wheel as a dependency or install wheel unless they actually want to build wheels. wheel really is just an optional intermediary step: a way of distributing Python projects (libraries or applications). I don't see an absolute need for adding wheel to setuptools' setup_requires (which is deprecated, or close to it) nor to pyproject.toml's build-system.requires; it's more of a (very common, quasi-standard) convenience.
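Regarding PEP 518: the modern replacement for setup_requires is a build-system table in pyproject.toml. A hedged sketch of what that declaration looks like (whether you list wheel there explicitly is, as argued above, mostly optional with a recent pip):

```toml
# pyproject.toml -- PEP 518 build requirements (sketch; adjust to your
# project). pip reads this table, installs the listed projects into an
# isolated build environment, and only then runs the build.
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"
```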
Now what would I do in your situation?
Before installing from the requirements.txt file that contains kiwisolver (directly or indirectly) either make sure that pip is up-to-date or explicitly install wheel, and:
Use a version of Python for which wheels are already available on PyPI.
If you want to stay on Python 3.5:
Make sure the target system is able to build kiwisolver itself (maybe it requires a C/C++ compiler plus some other native libraries).
The installation information page of PyCryptodome says the following under the "Windows (pre-compiled)" section:
Install PyCryptodome as a wheel:
pip install pycryptodomex
To make sure everything works fine, run the test suite:
python -m Cryptodome.SelfTest
There are several problems with this though:
Contrary to what these instructions say, this will not install PyCryptodome as a wheel; rather, it will download it and try to build it, resulting in an error if you don't have the correct build environment installed for the C components included in this package (and avoiding that entire mess is the biggest benefit of using a wheel in the first place).
Even if I instead download the correct wheel file from PyCryptodome's PyPI page, I must (as far as I know?) use a command line as follows to install it:
pip install c:\some\path\name-of-wheel-file.whl
This in turn makes it install under the default "Crypto" package instead of the "Cryptodome" package explicitly mentioned in the instructions (and therefore colliding in a breaking fashion with any pre-existing installations of the PyCrypto package).
So, my question is:
Is there any way to install a wheel file under a different package name than the default one?
PyCryptodome does not seem to provide any specific wheel files for installing under this alternative package name, so if this is impossible, I have a big problem (because I already have PyCrypto installed). :-(
PS.
Some more context regarding the need for the alternative package name can be provided by the following quote from the same installation page that is linked above:
PyCryptodome can be used as:
1.
a drop-in replacement for the old PyCrypto library. You install it with:
pip install pycryptodome
In this case, all modules are installed under the Crypto package. You can test everything is right with:
python -m Crypto.SelfTest
One must avoid having both PyCrypto and PyCryptodome installed at the same time, as they will interfere with each other.
This option is therefore recommended only when you are sure that the whole application is deployed in a virtualenv.
2.
a library independent of the old PyCrypto. You install it with:
pip install pycryptodomex
You can test everything is right with:
python -m Cryptodome.SelfTest
In this case, all modules are installed under the Cryptodome package. PyCrypto and PyCryptodome can coexist.
So, again, all I want is to install it as described under alternative 2 in this quote, from a wheel file, but the problem is that the provided wheel files seem to only default to the package name described under alternative 1 in this quote (i.e. "Crypto").
As far as I know this is not possible. The only way to achieve it is by rebuilding the wheel yourself after modifying the package name in its setup.py.
I'm trying to maintain dependencies using pip install -r requirements.txt. However, some of the required packages do not support Python 3 directly, but can be converted manually using 2to3.
Is there a way to force pip to run 2to3 on those packages automagically when doing pip install -r requirements.txt?
No, it needs to be part of the package setup configuration instead. See Supporting both Python 2 and 3 with Distribute.
You add metadata to your package installer:
setup(
    name='your.module',
    version='1.0',
    description='This is your awesome module',
    author='You',
    author_email='your#email',
    package_dir={'': 'src'},
    packages=['your', 'your.module'],
    test_suite='your.module.tests',
    use_2to3=True,
    convert_2to3_doctests=['src/your/module/README.txt'],
    use_2to3_fixers=['your.fixers'],
    use_2to3_exclude_fixers=['lib2to3.fixes.fix_import'],
)
Such a package would then automatically run 2to3 on installation into a Python 3 system.
2to3 is a tool, not a magic bullet, you cannot apply it to an arbitrary package pip downloads from PyPI. The package needs to support it in the way it is coded. Thus, running it automatically from pip is not going to work; the responsibility lies with the package maintainer.
Note that just because 2to3 runs successfully on a package, it does not necessarily follow the package will work in Python 3. Assumptions about bytes vs. unicode usually crop up when you actually start using the package.
Contact the maintainers of the packages you are interested in and ask what the status is for that package for Python 3. Supplying patches to them usually helps. If such requests and offers for help fall on deaf ears, for Open Source packages you can always fork them and apply the necessary changes yourself.
I have a Python library. Unfortunately I have not updated it to work with Python 3 yet.
In its setup.py, I added
install_requires=['python<3'],
My intent was to not allow this package to be installed/used under Python 3, because I know it doesn't (yet) work. I don't think this is the right way to do it, because pip then tries to download and install python 2.7.3 (which is already the installed version!).
How should I specify my library dependency on a particular range of Python interpreter versions? Should I add a Programming Language :: Python :: 2 :: Only tag? Will this actually prevent installation under Python 3? What if I also want to restrict the minimum version to Python 2.6?
I'd prefer a solution that works everywhere, but would settle for one that only works in pip (and hopefully doesn't cause easy_install to choke).
As of version 9.0.1, pip honors a new python_requires string specifying the Python version required for installation. For example, to enforce a minimum Python version of 3.3:
setup(
    ...,
    python_requires=">=3.3"
)
See here for more details. See also this answer on SO.
A possible solution is to test for the Python version in setup.py, since pip can't satisfy a Python version requirement other than the one it is currently running under (it installs into the current Python environment):
import sys

if sys.version_info[0] != 2:
    sys.exit("Sorry, Python 3 is not supported (yet)")

setup(...)
After commenting on the answer above and receiving feedback, I decided to turn my comment into an answer. The answers above are all fine, but in my experience one thing is missing from them that needs pointing out, so I will illustrate the issue here.
For simplicity and completeness of illustration, I have composed a very minimal and simple Python 3 project. The only 3rd party package it uses, is the famous SSH client package paramiko (it's official PyPi page can be found here).
The Python interpreter in the virtual environment of my project is of version 3.6.9.
Now, in order to check the python_requires attribute "in action", I have added it to the project's setup.py script, which looks as follows:
from setuptools import setup, find_packages
setup(name='mySampleProject',
      version='1.0',
      description='Sample project in Python 3',
      author='Guy Avraham',
      license='MIT',
      packages=find_packages(),
      include_package_data=True,
      python_requires='>=3.8',
      install_requires=['paramiko'])
Note that I "required" that the Python version will be 3.8+. This of course should NOT work with the current Python version in the project's virtual environment which is 3.6.9.
Now, when I built the project the "normal" way, by running python3 setup.py install, it was built successfully. See the following output of the pip3 list command after running python3 setup.py install:
(mySampleProject_env) guya#ubuntu:~/mySampleProject$ pip3 list
DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
bcrypt (3.2.0)
cffi (1.14.3)
cryptography (3.1.1)
mySampleProject (1.0)
paramiko (2.7.2)
pip (9.0.1)
pkg-resources (0.0.0)
pycparser (2.20)
PyNaCl (1.4.0)
setuptools (39.0.1)
six (1.15.0)
As you can see, the project, along with all its "sub-dependencies", was installed EVEN though I was NOT expecting it to be.
On the other hand, when I installed the project using the command: pip3 install -e . (note the . to indicate the "current working directory"), I got the following output:
(mySampleProject_env) guya#ubuntu:~/mySampleProject$ pip3 install -e .
Obtaining file:///home/guya/mySampleProject
mySampleProject requires Python '>=3.8' but the running Python is 3.6.9
Which now, indeed, "considers" the python_requires attribute, thus "failing" the build of the project.
This behavior is detailed in the very first paragraph of the tutorial on this page, and also around minutes ~09:00 - 11:00 in this video.
NOTE: I did NOT check all the above for Python 2 (or pip for Python 2).