Python packaging: specify Python version requirements after multiple releases

I have a package published on PyPI, currently versioned 3.0.0.
setup.py has never specified the python_requires directive.
Release 2.5.0 contained a change that made the package incompatible with Python 2, which went unnoticed until now.
Since 2.5.0 there have been plenty of releases of the package published on PyPI.
Now, if I try to install the package using Python 2, pip installs the latest release, 3.0.0, which won't work.
I need pip to install version 2.4.0, which has no compatibility issues. But how exactly can I accomplish that? (Without prior knowledge of pip install package==2.4.0 - something like relying on pip's backtracking mechanism.)
If I only specify the directive python_requires=">=3.6" in a new release 3.1.0, pip will backtrack to release 3.0.0, installing a package which won't work.
I can think of:
The cx_Oracle way: raising an exception in setup.py if the Python version does not meet the minimum required for installation, with a message explaining how to install the correct release.
Creating 2 new releases: one which is essentially 2.4.0, versioned as 3.1.0 with python_requires=">=2.7,<3.6", and one which is 3.0.0, versioned as 3.1.1 with python_requires=">=3.6".
Is there a better way?
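The cx_Oracle-style option can be sketched as a guard at the top of setup.py. A minimal sketch, assuming a hypothetical package name mypackage and a 3.6 minimum for the new release:

```python
import sys

MIN_PYTHON = (3, 6)  # assumed minimum for the new release

def check_python(version_info=sys.version_info):
    """Abort installation early with a hint at the last compatible release."""
    if version_info < MIN_PYTHON:
        raise RuntimeError(
            "This release requires Python >= %d.%d; on older interpreters "
            "run: pip install 'mypackage==2.4.0'" % MIN_PYTHON
        )

# In setup.py this runs at import time, before setup() is even called.
check_python()
```

Note that pip still downloads the sdist before this error fires, so it is a fallback for old installers, not a replacement for python_requires metadata.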

There is a relatively new feature on PyPI: you could "yank" the releases which are incompatible with Python 2 but do not correctly specify that in their metadata.
A yanked release is a release that is always ignored by an installer, unless it is the only release that matches a version specifier (using either == or ===).
See PEP 592 -- Adding "Yank" Support to the Simple API for more information. In fact, what you've described is the main scenario described in the motivation section of the PEP.

Related

Install two versions of a certain package in anaconda

I have the following issue:
One package that I want to install through conda demands a certain version of another package. However, I want to install a newer version of that second package.
Specifically, the package energysim 2.1.5 demands fmpy 0.2.14, and when I try to install a newer version of fmpy I get the error:
ERROR: energysim 2.1.5 has requirement fmpy==0.2.14, but you'll have fmpy 0.3.0 which is incompatible.
Is something like that possible, and if so, how?
In my answer, I am assuming the following case:
You want to install packageA, which requires packageB==v1
You also want to install packageB at version v2
Your goal: install packageB at both version v1 and v2 to make this possible
I don't know of any way this can be achieved, and I also don't see how it could even technically work. Say you do import packageB in your code. Which version should be imported? How should Python know that import packageB done by packageA should resolve to v1, but import packageB done by you should resolve to v2?
I see these options:
not using packageA, so that you can have packageB at the version you need
If possible, have one environment where you install packageA and packageB, and another with only packageB at the version you want
Fork packageA and create your own custom version that works with your required version of packageB
Wouldn't you just be able to do:
conda install packageB==2.0.0
conda install packageA
The package energysim is pinned [1] to fmpy 0.2.14 and no other version (older or newer). It looks like this was done intentionally [2]; the maintainer may have good reasons to enforce this pin. pip won't let you install 0.3.0 because of it.
I would reach out to them via a GitHub issue to ask whether their package is compatible with newer versions. For what it's worth, fmpy 0.2.14 is about a year and a half old. The package may work fine with 0.3.x, but IMHO that should be tested and released before relying on it.
[1] https://github.com/dgusain1/energysim/blob/07282257073058119664f9a5e8fd4300e138a64d/setup.py#L25-L29
[2] https://github.com/dgusain1/energysim/commit/f84dad3ab913b43eea3187da54c132319c23d1a7
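The ERROR line quoted above comes from a simple consistency check over installed versions and declared pins. A stdlib-only sketch of the idea, with the names and the pin taken from the question (real pip parses the full PEP 440 grammar via the packaging library, not string equality):

```python
installed = {"energysim": "2.1.5", "fmpy": "0.3.0"}
requires = {"energysim": [("fmpy", "==", "0.2.14")]}  # the pin from setup.py

def conflicts(installed, requires):
    """Report every package whose exact `==` pin is not satisfied."""
    problems = []
    for pkg, reqs in requires.items():
        for dep, op, wanted in reqs:
            have = installed.get(dep)
            if op == "==" and have is not None and have != wanted:
                problems.append(
                    "%s %s has requirement %s%s%s, but you'll have %s %s "
                    "which is incompatible."
                    % (pkg, installed[pkg], dep, op, wanted, dep, have)
                )
    return problems

print(conflicts(installed, requires)[0])
```

With the question's versions this reproduces the ERROR message verbatim; with fmpy at 0.2.14 the check comes back empty.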

pip's dependency resolver takes way too long to solve the conflict

I've been trying to install a package through pip on my Raspberry Pi 3 Model B.
My operating system is Raspbian (Debian-based), my pip version is 21.0.1, and my Python version is 3.7.4.
The command I'm using is:
python3 -m pip install librosa
The problem is that the dependency resolver takes way too long to resolve the conflicts,
and after a few hours it keeps repeating the lines below over and over (I even left the installation running for two days):
INFO: pip is looking at multiple versions of <Python from Requires-Python> to determine which version is compatible with other requirements. This could take a while.
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so.
I've tried providing stricter constraints, such as adding "numpy > 1.20.0", but the same message popped up again and I have no clue what I can do now.
So: as of pip 20.3, a new (not always working) resolver has been introduced, and as of pip 21.0 the old (working) resolver is unsupported and slated for removal, dependent on pip team resources.
Changes to the pip dependency resolver in 20.3
I hit the same issue trying to build jupyter; my solution was to pin pip back to the 20.2 release, which is the last release with the old resolver. This got my builds past the point where they were choking under the new resolver in pip 21.1.1.
A second approach that might work (untested) is to use the flag:
--use-deprecated=legacy-resolver
which appears to have been added when 20.3 switched over to the new resolver. This would let you keep the benefits of newer pip releases until the backtracking issue is resolved, assuming it works.
What is happening, according to the devs on this GitHub issue, is that "pip downloads multiple versions [of each package] because the versions it downloaded conflict with other packages you specified, which is called backtracking and is a feature. The versions need to be downloaded to detect the conflicts." But it takes a very long time to download all of these versions. pip explains this in detail, along with ways to resolve it or speed it up, at https://pip.pypa.io/en/latest/topics/dependency-resolution/.
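What backtracking means can be sketched with a toy resolver over a made-up index (hypothetical package names, versions, and conflicts; pip's real algorithm is far more involved, and it also has to download each candidate just to read its dependency metadata, which is where the time goes):

```python
INDEX = {                       # candidate versions, newest first
    "soundfile": ["0.10", "0.9"],
    "numpy": ["1.21", "1.19", "1.16"],
}
# A known-bad combination, stated as an unordered pair of picks:
CONFLICTS = {frozenset({("soundfile", "0.10"), ("numpy", "1.21")})}

def resolve(names, chosen=()):
    """Depth-first search: try newest candidates first, backtrack on conflict."""
    if not names:
        return dict(chosen)
    name, rest = names[0], names[1:]
    for version in INDEX[name]:
        pick = (name, version)
        if all(frozenset({pick, c}) not in CONFLICTS for c in chosen):
            solution = resolve(rest, chosen + (pick,))
            if solution is not None:
                return solution
        # otherwise: backtrack and try the next-older version of `name`
    return None

print(resolve(["soundfile", "numpy"]))  # {'soundfile': '0.10', 'numpy': '1.19'}
```

Here the resolver first tries numpy 1.21, hits the conflict, and backtracks to 1.19. Stricter constraints speed this up precisely because they shrink the candidate lists the search has to walk through.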
If you run
pip install -r requirements.txt --use-deprecated=legacy-resolver
you will not get this backtracking behavior, but your install will complete, and you will see an error at the end that is useful for troubleshooting:
ERROR: pip's legacy dependency resolver does not consider dependency conflicts when selecting packages. This behaviour is the source of the following dependency conflicts.
apache-airflow-providers-amazon 2.6.0 requires boto3<1.19.0,>=1.15.0, but you'll have boto3 1.9.253 which is incompatible.
package_xyz 0.0.1 requires PyJWT==2.1.0, but you'll have pyjwt 1.7.1 which is incompatible.
Upgrading my pip to 21.3.1 worked:
python.exe -m pip install --upgrade pip

Why pip install colorama~=0.3 installs colorama-0.4.0

I have trouble installing my project which depends on Colorama.
In the setup.py, I specified:
'colorama ~= 0.3'
But I’m surprised to see that the version 0.4 is installed (this version is new).
How to reproduce?
Create and activate a virtualenv and run:
pip install colorama~=0.3
And then look at the logs or run:
pip list
What’s wrong with the ~= operator?
note: I'm using pip v18.1 and setuptools v40.4.3
The operator ~= means "compatible release" (PEP 440). When using semantic versioning, a compatible version is one where the first number in the sequence stays the same (0 in this case): ~= 0.3 is equivalent to >= 0.3, == 0.*, which is why 0.4.0 matches.
From the link above:
Given a version number MAJOR.MINOR.PATCH, increment the:
MAJOR version when you make incompatible API changes,
MINOR version when you add functionality in a backwards-compatible manner, and
PATCH version when you make backwards-compatible bug fixes.
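The compatible-release rule can be sketched in plain Python. An illustrative stdlib-only approximation of PEP 440's ~= clause (ignoring pre/post/dev releases, which the real rules also handle):

```python
def compatible(candidate, spec):
    """True if `candidate` satisfies `~= spec`: at least spec, with all
    but the last listed component of spec held fixed."""
    spec_parts = [int(p) for p in spec.split(".")]
    cand_parts = [int(p) for p in candidate.split(".")]
    prefix = spec_parts[:-1]            # everything but the last component
    return cand_parts[:len(prefix)] == prefix and cand_parts >= spec_parts

print(compatible("0.4.0", "0.3"))    # True  -> why colorama 0.4.0 matches
print(compatible("1.0.0", "0.3"))    # False -> major bump is excluded
print(compatible("0.4.0", "0.3.1"))  # False -> ~= 0.3.1 stays within 0.3.x
```

So to stay on the 0.3.x line, the question's specifier would need one more component: colorama ~= 0.3.0 (or equivalently colorama >= 0.3, < 0.4).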

Allow "pip install" to ignore local versions

I'm using versioneer to manage versions, à la PEP 440.
I have uploaded a few versions to a private repository:
0.0.1
0.0.2
0.0.2+0.0.2+18.g5a127f2.dirty
My problem is that now when I use
pip install mypackage==0.0.2
I get version 0.0.2+0.0.2+18.g5a127f2.dirty when I expected to get 0.0.2.
Is there a way to have pip ignore the "local version" and just install the exact version, without me having to upload to different indices (i.e. staging and stable)?
Edit:
I have tried using the --no-cache-dir and -I flags, but the issue persists; pip is preferring the 0.0.2+ version to the 0.0.2 version.
Additional Edit:
I'm using pip 18.0 and Python 2.7
According to distutils:
Following a release number, you can have either a pre-release or post-release tag. Pre-release tags make a version be considered older than the version they are appended to. So, revision 2.4 is newer than revision 2.4c1, which in turn is newer than 2.4b1 or 2.4a1. Postrelease tags make a version be considered newer than the version they are appended to. So, revisions like 2.4-1 and 2.4pl3 are newer than 2.4, but are older than 2.4.1 (which has a higher release number).
So, while it's not the solution I'm looking for (full answer below), it looks like this works:
pip install "mypackage<=0.0.2"
The distutils blurb about post-releases seems to go against what is specified in PEP 440:
[Examples: ...] == 3.1: specifically version 3.1 (or 3.1.0), excludes all pre-releases, post releases, developmental releases and any 3.1.x maintenance releases.
...but I'm still a little fuzzy on how it's determined whether something is a "post" or "pre" release.
Nevertheless, the answer to my problem appears to be: use arbitrary equality:
pip install mypackage===0.0.2
This gives me exactly the version specified, ignoring versions with any pre/post/dev details.
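The difference between == and === here can be sketched as follows. An illustrative stdlib-only approximation of the PEP 440 rules: a + suffix is a "local version label", a plain == clause with a public version ignores the candidate's local label when matching, and === compares the exact string:

```python
def matches_eq(candidate, spec):
    """`==` with a public spec: ignore the candidate's local version label."""
    return candidate.split("+", 1)[0] == spec

def matches_arbitrary_eq(candidate, spec):
    """`===` (arbitrary equality): exact string comparison."""
    return candidate == spec

dirty = "0.0.2+18.g5a127f2.dirty"   # a versioneer-style local version
print(matches_eq(dirty, "0.0.2"))            # True  -> why ==0.0.2 matched it
print(matches_arbitrary_eq(dirty, "0.0.2"))  # False -> why ===0.0.2 does not
print(matches_arbitrary_eq("0.0.2", "0.0.2"))  # True
```

This also explains why the local version "won" under ==0.0.2: both candidates matched the specifier, and the local version sorts higher, so pip preferred it.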

Pip Install Semantic Version with Metadata not upgrading

I have two versions of my python build:
16.1206.43542
17.0817.221945+f4cc396
The only real difference I can see is the trailing metadata. When I run pip install package, version 16.1206.43542 is installed and not the latest. Is this the proper behavior? I would have thought pip would honor the metadata and install the later package.
Thoughts? Ideas? Any would be welcome. For transparency, I am adding the SHA from a git build into the version.
I looked into this and found that, under Semantic Versioning, anything appended after the normal version labels that build as a pre-release, which must then be explicitly requested to be installed; pre-releases are normally identified with a '-'. See https://semver.org/ (item 9).
For pip install, the version must be a non-pre-release build to be installed automatically.
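The selection rule invoked above can be sketched as follows. An illustrative stdlib-only sketch using SemVer's conventions, where a hyphen marks a pre-release and a + suffix marks build metadata; pip exposes the opt-in for pre-releases as the --pre flag (version strings here are assumed examples):

```python
def is_prerelease(version):
    """SemVer: a hyphen in the core version (before any +metadata) marks a
    pre-release, e.g. 1.0.0-rc.1; a bare +f4cc396 is build metadata only."""
    return "-" in version.split("+", 1)[0]

def pick_latest(versions, allow_pre=False):
    """Newest version, skipping pre-releases unless explicitly allowed.
    Plain string max is sufficient for these demo versions."""
    pool = [v for v in versions if allow_pre or not is_prerelease(v)]
    return max(pool) if pool else None

versions = ["1.0.0", "1.1.0-rc.1"]
print(pick_latest(versions))                  # 1.0.0
print(pick_latest(versions, allow_pre=True))  # 1.1.0-rc.1
```

Note the nuance visible in is_prerelease: in strict SemVer terms, 17.0817.221945+f4cc396 carries build metadata rather than a pre-release tag, so whether an installer treats it as a pre-release depends on the versioning rules it applies.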
