I have a Python library. Unfortunately I have not updated it to work with Python 3 yet.
In its setup.py, I added
install_requires=['python<3'],
My intent was to not allow this package to be installed/used under Python 3, because I know it doesn't (yet) work. I don't think this is the right way to do it, because pip then tries to download and install python 2.7.3 (which is already the installed version!).
How should I specify my library dependency on a particular range of Python interpreter versions? Should I add a Programming Language :: Python :: 2 :: Only tag? Will this actually prevent installation under Python 3? What if I also want to restrict the minimum version to Python 2.6?
I'd prefer a solution that works everywhere, but would settle for one that only works in pip (and hopefully doesn't cause easy_install to choke).
As of version 9.0.1, pip honors a new python_requires string that specifies the Python version required for installation. For example, to enforce a minimum Python version of 3.3:
setup(
...,
python_requires=">=3.3"
)
See here for more details. See also this answer on SO.
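Applied to the question's case (a 2.x-only package that must also require at least Python 2.6), a minimal sketch might look like this; the specifier string follows PEP 440, and the package name is a placeholder:

from setuptools import setup

# Sketch: restrict installation to Python >= 2.6 and < 3, per the question.
# pip >= 9.0.1 and setuptools >= 24.2.0 are needed for this to be enforced.
setup(
    name='mylibrary',  # hypothetical package name
    version='1.0',
    python_requires='>=2.6, <3',
)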
A possible workaround is to test the Python version in setup.py, since pip cannot satisfy a requirement on the interpreter version; it can only install into the Python environment it is currently running in:
import sys

if sys.version_info[0] != 2:
    sys.exit("Sorry, Python 3 is not supported (yet)")

setup(...
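To also enforce the minimum version of Python 2.6 asked about in the question, the same guard can compare against tuples; a small sketch, extending the snippet above:

import sys

# Sketch: refuse anything outside 2.6 <= version < 3.0.
if not ((2, 6) <= sys.version_info < (3, 0)):
    sys.exit("Sorry, only Python >= 2.6 and < 3 is supported (yet)")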
After commenting on the answer above and receiving feedback, I decided to turn my comment into an answer. The answers above are all fine, but in my experience one thing is missing from them that needs to be pointed out, so I will illustrate the issue here.
For simplicity and completeness of illustration, I composed a minimal Python 3 project. The only third-party package it uses is the well-known SSH client package paramiko (its official PyPI page can be found here).
The Python interpreter in the project's virtual environment is version 3.6.9.
Now, to see the python_requires attribute "in action", I added it to the project's setup.py script, which looks as follows:
from setuptools import setup, find_packages

setup(name='mySampleProject',
      version='1.0',
      description='Sample project in Python 3',
      author='Guy Avraham',
      license='MIT',
      packages=find_packages(),
      include_package_data=True,
      python_requires='>=3.8',
      install_requires=['paramiko'])
Note that I "required" that the Python version will be 3.8+. This of course should NOT work with the current Python version in the project's virtual environment which is 3.6.9.
Now, when I build the project using the "normal" use in the setup.py, meaning by running: python3 setup.py install, the project was built successfully. See the following output of the pip3 list command after running the python3 setup.py install command:
(mySampleProject_env) guya#ubuntu:~/mySampleProject$ pip3 list
DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
bcrypt (3.2.0)
cffi (1.14.3)
cryptography (3.1.1)
mySampleProject (1.0)
paramiko (2.7.2)
pip (9.0.1)
pkg-resources (0.0.0)
pycparser (2.20)
PyNaCl (1.4.0)
setuptools (39.0.1)
six (1.15.0)
As you can see, the project, along with all its "sub-dependencies", was installed EVEN though I was NOT expecting it to be.
On the other hand, when I installed the project using the command: pip3 install -e . (note the . to indicate the "current working directory"), I got the following output:
(mySampleProject_env) guya#ubuntu:~/mySampleProject$ pip3 install -e .
Obtaining file:///home/guya/mySampleProject
mySampleProject requires Python '>=3.8' but the running Python is 3.6.9
This time pip does consider the python_requires attribute, and the installation fails as expected.
This behavior is detailed in the very first paragraph of the tutorial on this page, and also around minutes 09:00-11:00 in this video.
NOTE: I did NOT check all the above for Python 2 (or pip for Python 2).
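One way to cover that gap (my own suggestion; it is not part of the page or video referenced above) is to combine python_requires with the manual interpreter check from the earlier answer, so that even a direct python3 setup.py install fails fast:

import sys
from setuptools import setup, find_packages

# Manual guard: direct `python3 setup.py install` runs ignore python_requires,
# as demonstrated above, so fail fast here as well.
if sys.version_info < (3, 8):
    sys.exit("mySampleProject requires Python >= 3.8; found %d.%d"
             % sys.version_info[:2])

setup(name='mySampleProject',
      version='1.0',
      packages=find_packages(),
      python_requires='>=3.8',  # still declared, so pip can refuse up front
      install_requires=['paramiko'])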
How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
There is plenty of information on the basic setup and configuration required:
SetupTools Documentation - setup() and setup.py configuration
Python Packaging User Guide - Installing Packages
PBR v3.1.1 documentation
StackOverflow: How to use version info generated using setuptools and pbr
But where is the simple info on how to actually create the distro?
i.e., I'm looking for whatever command finds the git tag with the version info and pulls it into the configuration, so the source can be distributed with that new version info, and the version is discoverable from the scripts using a method like the one described in this answer.
Additional details
I'm working on a project that will be distributed to other developers only through a git repo, not through PyPi. The project will be released to users as an executable using pyinstaller, so this package distribution will only serve a few key purposes:
Install/Setup the package for other developers so that dependencies/environment can be recreated cleanly.
Manage versioning - Current plan is to use pbr to generate versions from the Git repo tags, so those tags can be our source of truth for versioning
Use pbr for other auto-generation of mundane items from Git, such as the authors list, the manifest.in file, release notes, etc.
Since setuptools docs focus on setting up a fully distributable and reusable package with PyPi and pip, and pbr docs only really tell you how to modify setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
I'm sure it exists somewhere in the documentation, but after several false starts I'm asking here. It is implied everywhere I look that everyone either knows how to do this or it just magically happens as a part of the process.
Am I just missing the obvious?
Update:
Based on sinoroc's answer, it appears I need to look into development mode installs. That is, anyone developing the project will clone from git and then install using setuptools' development mode.
This wasn't directly a part of the original question, but implied, and I believe will be of interest to people in the same situation (info I couldn't easily find).
More info is available in his answer on updating some of the metadata, and via this setuptools documentation link about working in "Development Mode".
In short:
python3 setup.py sdist
python3 setup.py bdist_wheel
How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
The usual commands to create (source and wheel) distributions of your Python package with setuptools are: python3 setup.py sdist and python3 setup.py bdist_wheel. The distributions can then be found in the dist directory by default.
Since setuptools docs focus on setting up a fully distributable and reusable package with PyPi and pip, and pbr docs only really tell you how to modify setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
It is true that setuptools does not document this. It only documents the differences from distutils, and it is confusing indeed. See below for actual documentation...
But where is the simple info on how to actually create the distro?
https://packaging.python.org/tutorials/packaging-projects/#generating-distribution-archives
https://docs.python.org/3/distutils/sourcedist.html
https://docs.python.org/3/distutils/builtdist.html
Update
Since you don't plan on publishing distributions of your project on an index such as PyPI, and you plan on using pyinstaller instead, then you can indeed most likely disregard the setuptools commands such as sdist and bdist_wheel.
Still you might want to know these commands for the development phase:
Use commands such as python3 setup.py --version and python3 setup.py --fullname to check whether setuptools (and in your case pbr) is picking up the right info.
Use python3 setup.py develop (or pip install --editable .) to place a pseudo link (egg-link) in your site-packages that points at your work in progress. This way your changes are always installed and importable. Important: don't use python3 setup.py install; this would copy the current version to site-packages, and newer changes would not be importable.
Now I don't know how all this will work once you move on to pyinstaller. Especially since you mentioned that you want the meta info (such as the version number) to be discoverable from within your scripts. The technique with setuptools pkg_resources may or may not work in the pyinstaller context.
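For completeness, the pkg_resources technique mentioned above boils down to something like this minimal sketch ('myproject' is a placeholder distribution name; whether this works inside a pyinstaller bundle is exactly the open question):

# Sketch: read the installed version at runtime via pkg_resources.
import pkg_resources

version = pkg_resources.get_distribution('myproject').version
print(version)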
This is how I solved the same issue, also having read several different links.
I have created a setup.py file with this content:
from setuptools import setup, find_packages

def readme():
    with open('README.rst') as f:
        return f.read()

def read_other_requirements(other_type):
    with open(other_type + '-requirements.txt') as f:
        return f.read()

setup(
    setup_requires=read_other_requirements('setup'),
    pbr=True,
    packages=find_packages('src'),
    package_dir={'': 'src'},
    include_package_data=True,
    zip_safe=True
)
I have the source code in ./src. Also, I have a setup-requirements.txt, with content:
pip==18.1
pbr==5.1.1
setuptools==40.7.0
And a setup.cfg with this content:
[metadata]
name = XXXXX
description = XXXXX
description-file = README.rst
home-page = https://github.com/XXXXX/XXXXX
So first, you install the setup-requirements:
pip install -r setup-requirements.txt
Then, whenever you have a local commit that was tagged in GitHub, you can install it using:
python setup.py install
and it will be installed with the tagged version.
You can check it by doing:
python setup.py --version
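If you also want the version to be discoverable from your scripts, pbr ships a small runtime API; a minimal sketch, assuming the distribution name XXXXX from the setup.cfg above:

from pbr.version import VersionInfo

# Reads the version pbr derived from the git tag for this distribution.
print(VersionInfo('XXXXX').release_string())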
What can we put in a setup.py file to prevent pip from collecting and attempting to install a package when using an unsupported Python version?
For example, magicstack is a project listed with the trove classifier:
Programming Language :: Python :: 3 :: Only
So I expect the following behaviour if pip --version is tied to python 2.7:
$ pip install magicstack
Collecting magicstack
Could not find a version that satisfies the requirement magicstack (from versions: )
No matching distribution found for magicstack
But the actual behavior is that pip collects a release, downloads it, attempts to install it, and fails. There are other Python3-only releases, curio for example, which actually install fine - because the setup.py didn't use anything Python 3 specific - only to fail at import time when some Python 3 only syntax is used. And I'm sure there are packages which install OK, import OK, and maybe only fail at runtime!
What is the correct method to specify your supported Python versions in a way that pip will respect? I've found a workaround, involving uploading only a wheel file and refusing to upload a .tar.gz distribution, but I would be interested to know the correct fix.
Edit: How does pip know not to download the wheel distribution if the Python/os/architecture is not matching? Does it just use the .whl filename convention or is there something more sophisticated than that happening behind the scenes? Can we somehow give the metadata to a source distribution to make pip do the right thing with .tar.gz uploads?
There is a correct way to do this, but unfortunately pip only started supporting it in version 9.0.0 (released 2016-11-02), and so users with older versions of pip will continue to download packages willy-nilly regardless of what Python version they're for.
In your setup.py file, pass setup() a python_requires argument that lists your package's supported Python versions as a PEP 440 version specifier. For example, if your package is for Python 3+ only, write:
setup(
...
python_requires='>=3',
...
)
If your package is for Python 3.3 and up but you're not willing to commit to Python 4 support yet, write:
setup(
...
python_requires='~=3.3',
...
)
If your package is for Python 2.6, 2.7, and all versions of Python 3 starting with 3.3, write:
setup(
...
python_requires='>=2.6, !=3.0.*, !=3.1.*, !=3.2.*, <4',
...
)
And so on.
Once you've done that, you will need to upgrade your version of setuptools to at least 24.2.0 in order for the python_requires argument to be processed; earlier versions will just ignore it with a warning. All of your project's sdists and wheels built afterwards will then contain the relevant metadata that tells PyPI to tell pip what Python versions they're for.
The magicstack distribution on pypi is broken. It's failing because the source distribution doesn't contain a magicstack package even though the setup.py for the source distribution says it should.
As long as pypi contains a source distribution (e.g. .tar.gz, .zip), pip will download that if it can't find a matching binary distribution (e.g. .egg, .whl) for your version of python/os/architecture.
One option is to upload only binary distributions (preferably wheels) to PyPI. The other option is to check sys.version_info in your setup.py for compatible versions and raise an exception otherwise.
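A short sketch of that second option (my illustration), mirroring the '>=2.6, !=3.0.*, !=3.1.*, !=3.2.*, <4' range shown earlier:

import sys

# Fail setup.py early on unsupported interpreters:
# allow 2.6 <= v < 3.0 and 3.3 <= v < 4.0.
v = sys.version_info
if not ((2, 6) <= v < (3, 0) or (3, 3) <= v < (4, 0)):
    raise RuntimeError('Unsupported Python version: %d.%d' % v[:2])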
The installation information page of PyCryptodome says the following under the "Windows (pre-compiled)" section:
Install PyCryptodome as a wheel:
pip install pycryptodomex
To make sure everything works fine, run the test suite:
python -m Cryptodome.SelfTest
There are several problems with this though:
Contrary to what these instructions say, this will not install PyCryptodome as a wheel; rather, it will download it and try to build it, resulting in an error if you don't have the correct build environment installed for the C components included in this package (and avoiding that whole mess is the biggest benefit of using a wheel in the first place).
Even if I instead download the correct wheel file from PyCryptodome's PyPI page, I must (as far as I know?) use a command line like the following to install it:
pip install c:\some\path\name-of-wheel-file.whl
This in turn makes it install under the default "Crypto" package instead of the "Cryptodome" package explicitly mentioned in the instructions (and therefore colliding in a breaking fashion with any pre-existing installations of the PyCrypto package).
So, my question is:
Is there any way to install a wheel file under a different package name than the default one?
PyCryptodome does not seem to provide any specific wheel files for installing under this alternative package name, so if this is impossible, I have a big problem (because I already have PyCrypto installed). :-(
PS.
Some more context regarding the need for the alternative package name can be provided by the following quote from the same installation page that is linked above:
PyCryptodome can be used as:
1.
a drop-in replacement for the old PyCrypto library. You install it with:
pip install pycryptodome
In this case, all modules are installed under the Crypto package. You can test everything is right with:
python -m Crypto.SelfTest
One must avoid having both PyCrypto and PyCryptodome installed at the same time, as they will interfere with each other.
This option is therefore recommended only when you are sure that the whole application is deployed in a virtualenv.
2.
a library independent of the old PyCrypto. You install it with:
pip install pycryptodomex
You can test everything is right with:
python -m Cryptodome.SelfTest
In this case, all modules are installed under the Cryptodome package. PyCrypto and PyCryptodome can coexist.
So, again, all I want is to install it as described under alternative 2 in this quote, from a wheel file, but the problem is that the provided wheel files seem to only default to the package name described under alternative 1 in this quote (i.e. "Crypto").
As far as I know this is not possible. The only way to achieve this is by recompiling the wheel yourself after modifying its name in the setup.py.
My python package contains a lot of files compiled by python-protobuf (python2-protobuf-2.5.0 on Arch Linux). I installed the package on an Ubuntu 12.04.3 server (which has python-protobuf-2.4.1), tried to run the code, and hit the following error:
from google.protobuf.internal import enum_type_wrapper
ImportError: cannot import name enum_type_wrapper
I think it's because the protobuf modules in my package were compiled by protobuf-2.5.0 and do not work with protobuf-2.4.1.
I have no idea of the environments in which my code may run; the version of protobuf may vary. How can I make my package work with both protobuf 2.4 and 2.5?
(A possible way: include two different sets of protobuf libraries in my package, one compiled by 2.4.1 and the other by 2.5.0, get the google.protobuf version at runtime, and select which set to import. Is this possible?)
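A minimal sketch of what I have in mind; the _pb24/_pb25 subpackage layout is hypothetical:

# Select the bundled generated code matching the installed protobuf runtime.
# mypkg._pb24 / mypkg._pb25 would hold the files generated by protoc 2.4.1
# and 2.5.0 respectively.
import google.protobuf

if google.protobuf.__version__.startswith('2.4'):
    from mypkg._pb24 import myproto_pb2
else:
    from mypkg._pb25 import myproto_pb2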
You need to specify the version of protobuf that your package works with in your setup.py, via the install_requires list: install_requires=['protobuf>=2.5.0']. For each dependency you can give just the name, or pin exact versions with ==; I believe you can also exclude specific versions with !=.
If you are not packaging it with a setup.py, you should set up a virtualenv and put a requirements.txt file with all the specific python packages and versions in the root of the project.
That might look like:
$ cd ../project
$ virtualenv project_venv
$ source project_venv/bin/activate
$ cd project
$ pip install 'protobuf>=2.5.0'
$ pip freeze > ./requirements.txt
Then someone you distribute to can activate their virtualenv and do:
$ pip install -r requirements.txt
Make sure your package works from a fresh virtualenv by installing it with that method. This is also good to check before installing via a setup.py: you want to be sure that anyone who runs a fresh sudo python setup.py install, or python setup.py install in a virtualenv context, ends up with working requirements.
You can exit a virtualenv context with:
$ deactivate
Your best bet may be to include a copy of the protobuf runtime library with your package, maybe under a different package name. Then you can make sure that it matches the version of your generated code.
Another option is to invoke protoc as part of the installation process, so you get whatever version is available on the host.
I don't think packaging multiple versions of your generated code sounds like a good idea -- you'll just have problems again when the next protobuf release comes out.
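For the protoc-at-install option, a hedged sketch (the custom build_py command, the .proto path, and the package name are all illustrative, not an established recipe):

import subprocess
from setuptools import setup
from setuptools.command.build_py import build_py

class ProtocBuildPy(build_py):
    """Regenerate *_pb2.py with whatever protoc the host provides."""
    def run(self):
        subprocess.check_call(
            ['protoc', '--python_out=mypkg', 'protos/myproto.proto'])
        build_py.run(self)

setup(
    name='mypkg',
    version='1.0',
    packages=['mypkg'],
    cmdclass={'build_py': ProtocBuildPy},
)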
I'm trying to maintain dependencies using pip install -r requirements.txt. However, some of required packages do not support Python 3 directly, but can be converted manually using 2to3.
Is there a way to force pip to run 2to3 on those packages automagically when doing pip install -r requirements.txt?
No, it needs to be part of the package setup configuration instead. See Supporting both Python 2 and 3 with Distribute.
You add metadata to your package installer:
from setuptools import setup

setup(
    name='your.module',
    version='1.0',
    description='This is your awesome module',
    author='You',
    author_email='your@email',
    package_dir={'': 'src'},
    packages=['your', 'your.module'],
    test_suite='your.module.tests',
    use_2to3=True,
    convert_2to3_doctests=['src/your/module/README.txt'],
    use_2to3_fixers=['your.fixers'],
    use_2to3_exclude_fixers=['lib2to3.fixes.fix_import'],
)
Such a package would then automatically run 2to3 on installation into a Python 3 system.
2to3 is a tool, not a magic bullet; you cannot apply it to an arbitrary package pip downloads from PyPI. The package needs to support it in the way it is coded. Thus, running it automatically from pip is not going to work; the responsibility lies with the package maintainer.
Note that just because 2to3 runs successfully on a package, it does not necessarily follow that the package will work in Python 3. Assumptions about bytes vs. unicode usually crop up when you actually start using the package.
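To make that concrete, here is a tiny illustration (mine, not from the original answer) of a bytes-vs-unicode assumption that 2to3 leaves untouched but Python 3 rejects:

# 2to3 does not change this code, yet it breaks on Python 3: the file is
# opened in binary mode, so data is bytes, and str + bytes raises TypeError.
with open('notes.txt', 'rb') as f:
    data = f.read()

# message = 'header: ' + data                 # works on Python 2, TypeError on 3
message = 'header: ' + data.decode('utf-8')   # explicit decode works on both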
Contact the maintainers of the packages you are interested in and ask what the status is for that package for Python 3. Supplying patches to them usually helps. If such requests and offers for help fall on deaf ears, for Open Source packages you can always fork them and apply the necessary changes yourself.