requirements.txt: any version < 1.5 (including dev and rc versions) - Python

I am looking for a pattern for Python's requirements.txt (for use with pip and Python 3.10) that covers every version available up to version 1.5, including dev and rc releases, e.g.
1.4.2.dev5+g470a8b8
1.4.dev22+g2be722f
1.4
1.4rc0
1.5rc1
And: is there a clever way to test this without actually running "pip install" in a fresh venv?

You should be able to test with python -m pip install --dry-run without actually installing anything.
You can also install packaging, the library pip uses internally, and test with that.
The specifier !=1.5,<=1.5 should be what you want: per PEP 440, <1.5 would exclude pre-releases of 1.5 such as 1.5rc1, while <=1.5 admits them (once pre-releases are enabled), and !=1.5 then rules out 1.5 itself.
import packaging.specifiers
import packaging.version

version_strings = [
    '1',
    '1.1.1',
    '1.4',
    '1.4.2',
    '1.4.2.dev5+g470a8b8',
    '1.4.dev22+g2be722f',
    '1.4',
    '1.4rc0',
    '1.5rc1',
    '1.5',
    '1.5.post0',
    '1.5.0',
    '1.5.1',
    '1.6',
    '2',
    '2.0',
]
versions = [packaging.version.Version(v) for v in version_strings]
specifier = packaging.specifiers.SpecifierSet('!=1.5,<=1.5', prereleases=True)
for version in versions:
    print(f'{version} in {specifier}: {version in specifier}')
$ python test.py
1 in !=1.5,<=1.5: True
1.1.1 in !=1.5,<=1.5: True
1.4 in !=1.5,<=1.5: True
1.4.2 in !=1.5,<=1.5: True
1.4.2.dev5+g470a8b8 in !=1.5,<=1.5: True
1.4.dev22+g2be722f in !=1.5,<=1.5: True
1.4 in !=1.5,<=1.5: True
1.4rc0 in !=1.5,<=1.5: True
1.5rc1 in !=1.5,<=1.5: True
1.5 in !=1.5,<=1.5: False
1.5.post0 in !=1.5,<=1.5: False
1.5.0 in !=1.5,<=1.5: False
1.5.1 in !=1.5,<=1.5: False
1.6 in !=1.5,<=1.5: False
2 in !=1.5,<=1.5: False
2.0 in !=1.5,<=1.5: False
So you could use it like this:
python -m pip install --pre 'somepackage!=1.5,<=1.5'
or in a requirements.txt file:
--pre
somepackage !=1.5, <=1.5
Related:
https://github.com/pypa/packaging/issues/617


How do I check if any of my Python dependencies have an update available? [duplicate]

Given the name of a Python package that can be installed with pip, is there any way to find out a list of all the possible versions of it that pip could install? Right now it's trial and error.
I'm trying to install a version for a third party library, but the newest version is too new, there were backwards incompatible changes made. So I'd like to somehow have a list of all the versions that pip knows about, so that I can test them.
For pip >= 21.2 use:
pip index versions pylibmc
Note that this command is experimental, and might change in the future!
For pip >= 21.1 use:
pip install pylibmc==
For pip >= 20.3 use:
pip install --use-deprecated=legacy-resolver pylibmc==
For pip >= 9.0 use:
$ pip install pylibmc==
Collecting pylibmc==
Could not find a version that satisfies the requirement pylibmc== (from
versions: 0.2, 0.3, 0.4, 0.5.1, 0.5.2, 0.5.3, 0.5.4, 0.5.5, 0.5, 0.6.1, 0.6,
0.7.1, 0.7.2, 0.7.3, 0.7.4, 0.7, 0.8.1, 0.8.2, 0.8, 0.9.1, 0.9.2, 0.9,
1.0-alpha, 1.0-beta, 1.0, 1.1.1, 1.1, 1.2.0, 1.2.1, 1.2.2, 1.2.3, 1.3.0)
No matching distribution found for pylibmc==
The available versions will be printed without actually downloading or installing any packages.
For pip < 9.0 use:
pip install pylibmc==blork
where blork can be any string that is not a valid version number.
(Update: as of March 2020, many people have reported that yolk, installed via pip install yolk3k, only returns the latest version. Chris's answer seems to have the most upvotes and worked for me.)
The script at pastebin does work. However, it's not very convenient if you're working with multiple environments/hosts, because you will have to copy/create it every time.
A better all-around solution would be to use yolk3k, which is available to install with pip. E.g. to see what versions of Django are available:
$ pip install yolk3k
$ yolk -V django
Django 1.3
Django 1.2.5
Django 1.2.4
Django 1.2.3
Django 1.2.2
Django 1.2.1
Django 1.2
Django 1.1.4
Django 1.1.3
Django 1.1.2
Django 1.0.4
yolk3k is a fork of the original yolk, which ceased development in 2012. Though yolk is no longer maintained (as indicated in comments below), yolk3k appears to be, and it supports Python 3.
Note: I am not involved in the development of yolk3k. If something doesn't seem to work as it should, leaving a comment here should not make much difference. Use the yolk3k issue tracker instead and consider submitting a fix, if possible.
You don't need a third-party package to get this information. PyPI provides simple JSON feeds for all packages under
https://pypi.org/pypi/{PKG_NAME}/json
Here's some Python 2 code, using only the standard library, which gets all versions:
import json
import urllib2
from distutils.version import StrictVersion

def versions(package_name):
    url = "https://pypi.org/pypi/%s/json" % (package_name,)
    data = json.load(urllib2.urlopen(urllib2.Request(url)))
    versions = data["releases"].keys()
    versions.sort(key=StrictVersion)
    return versions

print "\n".join(versions("scikit-image"))
That code prints (as of Feb 23rd, 2015):
0.7.2
0.8.0
0.8.1
0.8.2
0.9.0
0.9.1
0.9.2
0.9.3
0.10.0
0.10.1
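On Python 3, an equivalent sketch swaps urllib2 for urllib.request and, since distutils' StrictVersion is deprecated, sorts with packaging.version (packaging ships alongside pip, so treat that import as an assumed dependency):
import json
from urllib.request import urlopen

# assumed available: packaging is installed alongside pip
from packaging.version import Version


def versions(package_name):
    url = "https://pypi.org/pypi/%s/json" % (package_name,)
    with urlopen(url) as response:
        data = json.load(response)
    # Version() raises InvalidVersion for the rare non-PEP 440 release string
    return sorted(data["releases"], key=Version)


print("\n".join(versions("scikit-image")))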
Update:
As of Sep 2017 this method no longer works: --no-install was removed in pip 7.
Use pip install -v and you can see all available versions:
root@node7:~# pip install web.py -v
Downloading/unpacking web.py
Using version 0.37 (newest of versions: 0.37, 0.36, 0.35, 0.34, 0.33, 0.33, 0.32, 0.31, 0.22, 0.2)
Downloading web.py-0.37.tar.gz (90Kb): 90Kb downloaded
Running setup.py egg_info for package web.py
running egg_info
creating pip-egg-info/web.py.egg-info
To avoid installing any package, use one of the following solutions:
root@node7:~# pip install --no-deps --no-install flask -v
Downloading/unpacking flask
Using version 0.10.1 (newest of versions: 0.10.1, 0.10, 0.9, 0.8.1, 0.8, 0.7.2, 0.7.1, 0.7, 0.6.1, 0.6, 0.5.2, 0.5.1, 0.5, 0.4, 0.3.1, 0.3, 0.2, 0.1)
Downloading Flask-0.10.1.tar.gz (544Kb): 544Kb downloaded
or
root@node7:~# cd $(mktemp -d)
root@node7:/tmp/tmp.c6H99cWD0g# pip install flask -d . -v
Downloading/unpacking flask
Using version 0.10.1 (newest of versions: 0.10.1, 0.10, 0.9, 0.8.1, 0.8, 0.7.2, 0.7.1, 0.7, 0.6.1, 0.6, 0.5.2, 0.5.1, 0.5, 0.4, 0.3.1, 0.3, 0.2, 0.1)
Downloading Flask-0.10.1.tar.gz (544Kb): 4.1Kb downloaded
Tested with pip 1.0
root@node7:~# pip --version
pip 1.0 from /usr/lib/python2.7/dist-packages (python 2.7)
I came up with a dead-simple bash script. Thanks to jq's author.
#!/bin/bash
set -e
PACKAGE_JSON_URL="https://pypi.org/pypi/${1}/json"
curl -L -s "$PACKAGE_JSON_URL" | jq -r '.releases | keys | .[]' | sort -V
Update:
Add sorting by version number.
Add -L to follow redirects.
You can use this small Python 3 script (which needs only the standard library plus setuptools' pkg_resources) to grab the list of available versions for a package from PyPI via the JSON API and print them newest first. Unlike some other Python solutions posted here, it doesn't break on loose versions like django's 2.2rc1 or uwsgi's 2.0.17.1:
#!/usr/bin/env python3
import json
import sys
from urllib import request

from pkg_resources import parse_version


def versions(pkg_name):
    url = f'https://pypi.python.org/pypi/{pkg_name}/json'
    releases = json.loads(request.urlopen(url).read())['releases']
    return sorted(releases, key=parse_version, reverse=True)


if __name__ == '__main__':
    print(*versions(sys.argv[1]), sep='\n')
Save the script and run it with the package name as an argument, e.g.:
python versions.py django
3.0a1
2.2.5
2.2.4
2.2.3
2.2.2
2.2.1
2.2
2.2rc1
...
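pkg_resources is deprecated in recent setuptools releases, so if you prefer to avoid that import, a variant of the same script using packaging as the sort key (assumed to be installed; it ships alongside pip) looks like this:
#!/usr/bin/env python3
import json
import sys
from urllib import request

# assumed available: packaging is installed alongside pip
from packaging.version import Version


def versions(pkg_name):
    url = f'https://pypi.org/pypi/{pkg_name}/json'
    releases = json.loads(request.urlopen(url).read())['releases']
    # Version() raises InvalidVersion for the rare non-PEP 440 release string
    return sorted(releases, key=Version, reverse=True)


if __name__ == '__main__':
    print(*versions(sys.argv[1]), sep='\n')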
After looking at pip's code for a while, it looks like the code responsible for locating packages can be found in the PackageFinder class in pip.index. Its method find_requirement looks up the versions of an InstallRequirement, but unfortunately only returns the most recent version.
The code below is almost a 1:1 copy of the original function, with the return in line 114 changed to return all versions.
The script expects one package name as its first and only argument and returns all versions.
http://pastebin.com/axzdUQhZ
I can't vouch for its correctness, as I'm not familiar with pip's code, but hopefully this helps.
Sample output
python test.py pip
Versions of pip
0.8.2
0.8.1
0.8
0.7.2
0.7.1
0.7
0.6.3
0.6.2
0.6.1
0.6
0.5.1
0.5
0.4
0.3.1
0.3
0.2.1
0.2 dev
The code:
import posixpath
import pkg_resources
import sys

from pip.download import url_to_path
from pip.exceptions import DistributionNotFound
from pip.index import PackageFinder, Link
from pip.log import logger
from pip.req import InstallRequirement
from pip.util import Inf


class MyPackageFinder(PackageFinder):

    def find_requirement(self, req, upgrade):
        url_name = req.url_name
        # Only check main index if index URL is given:
        main_index_url = None
        if self.index_urls:
            # Check that we have the url_name correctly spelled:
            main_index_url = Link(posixpath.join(self.index_urls[0], url_name))
            # This will also cache the page, so it's okay that we get it again later:
            page = self._get_page(main_index_url, req)
            if page is None:
                url_name = self._find_url_name(Link(self.index_urls[0]), url_name, req) or req.url_name
        # Combine index URLs with mirror URLs here to allow
        # adding more index URLs from requirements files
        all_index_urls = self.index_urls + self.mirror_urls

        def mkurl_pypi_url(url):
            loc = posixpath.join(url, url_name)
            # For maximum compatibility with easy_install, ensure the path
            # ends in a trailing slash. Although this isn't in the spec
            # (and PyPI can handle it without the slash) some other index
            # implementations might break if they relied on easy_install's behavior.
            if not loc.endswith('/'):
                loc = loc + '/'
            return loc

        if url_name is not None:
            locations = [
                mkurl_pypi_url(url)
                for url in all_index_urls] + self.find_links
        else:
            locations = list(self.find_links)
        locations.extend(self.dependency_links)
        for version in req.absolute_versions:
            if url_name is not None and main_index_url is not None:
                locations = [
                    posixpath.join(main_index_url.url, version)] + locations
        file_locations, url_locations = self._sort_locations(locations)
        locations = [Link(url) for url in url_locations]
        logger.debug('URLs to search for versions for %s:' % req)
        for location in locations:
            logger.debug('* %s' % location)
        found_versions = []
        found_versions.extend(
            self._package_versions(
                [Link(url, '-f') for url in self.find_links], req.name.lower()))
        page_versions = []
        for page in self._get_pages(locations, req):
            logger.debug('Analyzing links from page %s' % page.url)
            logger.indent += 2
            try:
                page_versions.extend(self._package_versions(page.links, req.name.lower()))
            finally:
                logger.indent -= 2
        dependency_versions = list(self._package_versions(
            [Link(url) for url in self.dependency_links], req.name.lower()))
        if dependency_versions:
            logger.info('dependency_links found: %s' % ', '.join([link.url for parsed, link, version in dependency_versions]))
        file_versions = list(self._package_versions(
            [Link(url) for url in file_locations], req.name.lower()))
        if not found_versions and not page_versions and not dependency_versions and not file_versions:
            logger.fatal('Could not find any downloads that satisfy the requirement %s' % req)
            raise DistributionNotFound('No distributions at all found for %s' % req)
        if req.satisfied_by is not None:
            found_versions.append((req.satisfied_by.parsed_version, Inf, req.satisfied_by.version))
        if file_versions:
            file_versions.sort(reverse=True)
            logger.info('Local files found: %s' % ', '.join([url_to_path(link.url) for parsed, link, version in file_versions]))
            found_versions = file_versions + found_versions
        all_versions = found_versions + page_versions + dependency_versions
        applicable_versions = []
        for (parsed_version, link, version) in all_versions:
            if version not in req.req:
                logger.info("Ignoring link %s, version %s doesn't match %s"
                            % (link, version, ','.join([''.join(s) for s in req.req.specs])))
                continue
            applicable_versions.append((link, version))
        applicable_versions = sorted(applicable_versions, key=lambda v: pkg_resources.parse_version(v[1]), reverse=True)
        existing_applicable = bool([link for link, version in applicable_versions if link is Inf])
        if not upgrade and existing_applicable:
            if applicable_versions[0][1] is Inf:
                logger.info('Existing installed version (%s) is most up-to-date and satisfies requirement'
                            % req.satisfied_by.version)
            else:
                logger.info('Existing installed version (%s) satisfies requirement (most up-to-date version is %s)'
                            % (req.satisfied_by.version, applicable_versions[0][1]))
            return None
        if not applicable_versions:
            logger.fatal('Could not find a version that satisfies the requirement %s (from versions: %s)'
                         % (req, ', '.join([version for parsed_version, link, version in found_versions])))
            raise DistributionNotFound('No distributions matching the version for %s' % req)
        if applicable_versions[0][0] is Inf:
            # We have an existing version, and its the best version
            logger.info('Installed version (%s) is most up-to-date (past versions: %s)'
                        % (req.satisfied_by.version, ', '.join([version for link, version in applicable_versions[1:]]) or 'none'))
            return None
        if len(applicable_versions) > 1:
            logger.info('Using version %s (newest of versions: %s)' %
                        (applicable_versions[0][1], ', '.join([version for link, version in applicable_versions])))
        return applicable_versions


if __name__ == '__main__':
    req = InstallRequirement.from_line(sys.argv[1], None)
    finder = MyPackageFinder([], ['http://pypi.python.org/simple/'])
    versions = finder.find_requirement(req, False)
    print 'Versions of %s' % sys.argv[1]
    for v in versions:
        print v[1]
You could use the yolk3k package instead of yolk. yolk3k is a fork of the original yolk and it supports both Python 2 and 3.
https://github.com/myint/yolk
pip install yolk3k
You can try to install a package version that does not exist. Then pip will list the available versions:
pip install hell==99999
ERROR: Could not find a version that satisfies the requirement hell==99999
(from versions: 0.1.0, 0.2.0, 0.2.1, 0.2.2, 0.2.3, 0.2.4, 0.3.0,
0.3.1, 0.3.2, 0.3.3, 0.3.4, 0.4.0, 0.4.1)
ERROR: No matching distribution found for hell==99999
This works for me on OSX:
pip install docker-compose== 2>&1 \
| grep -oE '(\(.*\))' \
| awk -F:\ '{print$NF}' \
| sed -E 's/( |\))//g' \
| tr ',' '\n'
It returns the list one per line:
1.1.0rc1
1.1.0rc2
1.1.0
1.2.0rc1
1.2.0rc2
1.2.0rc3
1.2.0rc4
1.2.0
1.3.0rc1
1.3.0rc2
1.3.0rc3
1.3.0
1.3.1
1.3.2
1.3.3
1.4.0rc1
1.4.0rc2
1.4.0rc3
1.4.0
1.4.1
1.4.2
1.5.0rc1
1.5.0rc2
1.5.0rc3
1.5.0
1.5.1
1.5.2
1.6.0rc1
1.6.0
1.6.1
1.6.2
1.7.0rc1
1.7.0rc2
1.7.0
1.7.1
1.8.0rc1
1.8.0rc2
1.8.0
1.8.1
1.9.0rc1
1.9.0rc2
1.9.0rc3
1.9.0rc4
1.9.0
1.10.0rc1
1.10.0rc2
1.10.0
Or to get the latest version available:
pip install docker-compose== 2>&1 \
| grep -oE '(\(.*\))' \
| awk -F:\ '{print$NF}' \
| sed -E 's/( |\))//g' \
| tr ',' '\n' \
| gsort -r -V \
| head -1
1.10.0rc2
Keep in mind that gsort has to be installed (on OSX) to parse the versions. You can install it with brew install coreutils.
My project luddite has this feature.
Example usage:
>>> import luddite
>>> luddite.get_versions_pypi("python-dateutil")
('0.1', '0.3', '0.4', '0.5', '1.0', '1.1', '1.2', '1.4', '1.4.1', '1.5', '2.0', '2.1', '2.2', '2.3', '2.4.0', '2.4.1', '2.4.2', '2.5.0', '2.5.1', '2.5.2', '2.5.3', '2.6.0', '2.6.1', '2.7.0', '2.7.1', '2.7.2', '2.7.3', '2.7.4', '2.7.5', '2.8.0')
It lists all versions of a package available, by querying the JSON API of https://pypi.org/
Update:
Maybe the solution is not needed anymore, check comments to this answer.
Original Answer
With pip versions above 20.3 you can use the legacy resolver in order to get back all the available versions:
$ pip install --use-deprecated=legacy-resolver pylibmc==
ERROR: Could not find a version that satisfies the requirement pylibmc== (from
versions: 0.2, 0.3, 0.4, 0.5, 0.5.1, 0.5.2, 0.5.3, 0.5.4, 0.5.5, 0.6, 0.6.1,
0.7, 0.7.1, 0.7.2, 0.7.3, 0.7.4, 0.8, 0.8.1, 0.8.2, 0.9, 0.9.1, 0.9.2, 1.0a0,
1.0b0, 1.0, 1.1, 1.1.1, 1.2.0, 1.2.1, 1.2.2, 1.2.3, 1.3.0, 1.4.0, 1.4.1,
1.4.2, 1.4.3, 1.5.0, 1.5.1, 1.5.2, 1.5.100.dev0, 1.6.0, 1.6.1)
ERROR: No matching distribution found for pylibmc==
The pip-versions package does an excellent job:
$ pip3 install pip-versions
$ pip-versions latest rsyncy
0.0.4
$ pip-versions list rsyncy
0.0.1
0.0.2
0.0.3
0.0.4
And this even works behind a Nexus (sonatype) proxy!
I usually run pip install packagename==somerandomstring. This returns an error saying Could not find a version that satisfies the requirement packagename==somerandomstring, and along with that error pip will also list the available versions on the server.
e.g.
$ pip install flask==aksjflashd
Collecting flask==aksjflashd
Could not find a version that satisfies the requirement flask==aksjflashd
(from versions: 0.1, 0.2, 0.3, 0.3.1, 0.4, 0.5, 0.5.1, 0.5.2, 0.6, 0.6.1, 0.7, 0.7.1, 0.7.2, 0.8, 0.8.1, 0.9, 0.10, 0.10.1, 0.11, 0.11.1, 0.12, 0.12.1,
0.12.2, 0.12.3, 0.12.4, 0.12.5, 1.0, 1.0.1, 1.0.2, 1.0.3, 1.0.4, 1.1.0, 1.1.1, 1.1.2)
No matching distribution found for flask==aksjflashd
$
You would have to be extremely unlucky for a random string like 'aksjflashd' to turn out to be an actual package version!
Of course, you can use this trick with pip download too.
https://pypi.python.org/pypi/Django/ - works for packages whose maintainers choose to show all versions
https://pypi.python.org/simple/pip/ - should do the trick anyhow (lists all links)
An alternative solution is to use the Warehouse APIs:
https://warehouse.readthedocs.io/api-reference/json/#release
For instance for Flask:
import requests
r = requests.get("https://pypi.org/pypi/Flask/json")
print(r.json()['releases'].keys())
will print:
dict_keys(['0.1', '0.10', '0.10.1', '0.11', '0.11.1', '0.12', '0.12.1', '0.12.2', '0.12.3', '0.12.4', '0.2', '0.3', '0.3.1', '0.4', '0.5', '0.5.1', '0.5.2', '0.6', '0.6.1', '0.7', '0.7.1', '0.7.2', '0.8', '0.8.1', '0.9', '1.0', '1.0.1', '1.0.2'])
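The keys come back in whatever order the JSON object uses, not version order. If you want them sorted, one option is to reuse packaging (installed alongside pip) as the sort key, e.g.:
import requests
# assumed available: packaging is installed alongside pip
from packaging.version import Version

r = requests.get("https://pypi.org/pypi/Flask/json")
# Version() raises InvalidVersion for the rare non-PEP 440 release string
print(sorted(r.json()['releases'], key=Version))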
Here's my answer that sorts the list inside jq (for those who use systems where sort -V is not available):
$ pythonPackage=certifi
$ curl -Ls https://pypi.org/pypi/$pythonPackage/json | jq -r '.releases | keys_unsorted | sort_by( split(".") | map(tonumber) )'
.............
"2019.3.9",
"2019.6.16",
"2019.9.11",
"2019.11.28",
"2020.4.5",
"2020.4.5.1",
"2020.4.5.2",
"2020.6.20",
"2020.11.8"
]
And to fetch the latest version number of the package:
$ curl -Ls https://pypi.org/pypi/$pythonPackage/json | jq -r '.releases | keys_unsorted | sort_by( split(".") | map(tonumber) )[-1]'
2020.11.8
Or a bit faster:
$ curl -Ls https://pypi.org/pypi/$pythonPackage/json | jq -r '.releases | keys_unsorted | max_by( split(".") | map(tonumber) )'
2020.11.8
Or even simpler :)
$ curl -Ls https://pypi.org/pypi/$pythonPackage/json | jq -r .info.version
2020.11.8
Simple bash script that relies only on Python itself (I assume that in the context of the question it is installed) and one of curl or wget. It assumes that the setuptools package is installed, to sort versions (it almost always is). It doesn't rely on external dependencies such as:
jq, which may not be present;
grep and awk, which may behave differently on Linux and macOS.
curl --silent --location https://pypi.org/pypi/requests/json | python -c "import sys, json, pkg_resources; releases = json.load(sys.stdin)['releases']; print(' '.join(sorted(releases, key=pkg_resources.parse_version)))"
A little bit longer version with comments.
Put the package name into a variable:
PACKAGE=requests
Get versions (using curl):
VERSIONS=$(curl --silent --location https://pypi.org/pypi/$PACKAGE/json | python -c "import sys, json, pkg_resources; releases = json.load(sys.stdin)['releases']; print(' '.join(sorted(releases, key=pkg_resources.parse_version)))")
Get versions (using wget):
VERSIONS=$(wget -qO- https://pypi.org/pypi/$PACKAGE/json | python -c "import sys, json, pkg_resources; releases = json.load(sys.stdin)['releases']; print(' '.join(sorted(releases, key=pkg_resources.parse_version)))")
Print sorted versions:
echo $VERSIONS
I didn't have any luck with yolk, yolk3k or pip install -v, so I ended up using this (adapted to Python 3 from Eric Chiang's answer):
import json
import requests
from distutils.version import StrictVersion

def versions(package_name):
    url = "https://pypi.python.org/pypi/{}/json".format(package_name)
    data = requests.get(url).json()
    return sorted(list(data["releases"].keys()), key=StrictVersion, reverse=True)
>>> print("\n".join(versions("gunicorn")))
19.1.1
19.1.0
19.0.0
18.0
17.5
0.17.4
0.17.3
...
Works with recent pip versions, no extra tools necessary:
pip install pylibmc== -v 2>/dev/null | awk '/Found link/ {print $NF}' | uniq
This is a Python 3.9+ version of Limmy's and Eric Chiang's solutions.
import json
import urllib.request
from distutils.version import StrictVersion


# print PyPI versions of package
def versions(package_name):
    url = "https://pypi.org/pypi/%s/json" % (package_name,)
    data = json.load(urllib.request.urlopen(url))
    versions = list(data["releases"])
    sortfunc = lambda x: StrictVersion(x.replace('rc', 'b').translate(str.maketrans('cdefghijklmn', 'bbbbbbbbbbbb')))
    versions.sort(key=sortfunc)
    return versions
From what I understood from this and this, the releases key will be dropped in the near future, so solutions using GET "https://pypi.python.org/pypi/{package_name}/json" will not work anymore.
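If that happens, the Simple API is the documented alternative: PEP 691 defines a JSON form of it and PEP 700 adds a versions key to each project page. A standard-library sketch, assuming the index serves the PEP 700 versions field (pypi.org does; other indexes may not):
import json
from urllib.request import Request, urlopen


def simple_api_versions(package_name, index="https://pypi.org/simple"):
    # Request the JSON form of the Simple API (PEP 691)
    req = Request(
        f"{index}/{package_name}/",
        headers={"Accept": "application/vnd.pypi.simple.v1+json"},
    )
    with urlopen(req) as response:
        data = json.load(response)
    # PEP 700 exposes the version list directly; fall back to an empty list if the index omits it
    return data.get("versions", [])


print(simple_api_versions("flask"))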
To find all available (even incompatible) versions as well, use the -vv flag with pip >= 21.x.
pip install sklearn== --dry-run -vv
Incompatible versions will be listed in the log like this:
Skipping link: none of the wheel's tags (cp27-cp27m-win32) are compatible
My take is a combination of a couple of the posted answers, with some modifications to make them easier to use from within a running Python environment.
The idea is to provide an entirely new command (modeled after the install command) that gives you an instance of the package finder to use. The upside is that it works with, and uses, any indexes that pip supports and reads your local pip configuration files, so you get the same correct results as you would with a normal pip install.
I've made an attempt at making it compatible with both pip 9.x and 10.x, but I have only tried it on 9.x.
https://gist.github.com/kaos/68511bd013fcdebe766c981f50b473d4
#!/usr/bin/env python
# When you want an easy way to get at all (or the latest) versions of a certain python package from a PyPI index.
import sys
import logging

try:
    from pip._internal import cmdoptions, main
    from pip._internal.commands import commands_dict
    from pip._internal.basecommand import RequirementCommand
except ImportError:
    from pip import cmdoptions, main
    from pip.commands import commands_dict
    from pip.basecommand import RequirementCommand

from pip._vendor.packaging.version import parse as parse_version

logger = logging.getLogger('pip')


class ListPkgVersionsCommand(RequirementCommand):
    """
    List all available versions for a given package from:

    - PyPI (and other indexes) using requirement specifiers.
    - VCS project urls.
    - Local project directories.
    - Local or remote source archives.
    """
    name = "list-pkg-versions"

    usage = """
      %prog [options] <requirement specifier> [package-index-options] ...
      %prog [options] [-e] <vcs project url> ...
      %prog [options] [-e] <local project path> ...
      %prog [options] <archive url/path> ..."""

    summary = 'List package versions.'

    def __init__(self, *args, **kw):
        super(ListPkgVersionsCommand, self).__init__(*args, **kw)

        cmd_opts = self.cmd_opts

        cmd_opts.add_option(cmdoptions.install_options())
        cmd_opts.add_option(cmdoptions.global_options())
        cmd_opts.add_option(cmdoptions.use_wheel())
        cmd_opts.add_option(cmdoptions.no_use_wheel())
        cmd_opts.add_option(cmdoptions.no_binary())
        cmd_opts.add_option(cmdoptions.only_binary())
        cmd_opts.add_option(cmdoptions.pre())
        cmd_opts.add_option(cmdoptions.require_hashes())

        index_opts = cmdoptions.make_option_group(
            cmdoptions.index_group,
            self.parser,
        )

        self.parser.insert_option_group(0, index_opts)
        self.parser.insert_option_group(0, cmd_opts)

    def run(self, options, args):
        cmdoptions.resolve_wheel_no_use_binary(options)
        cmdoptions.check_install_build_global(options)

        with self._build_session(options) as session:
            finder = self._build_package_finder(options, session)

            # do what you please with the finder object here... ;)
            for pkg in args:
                logger.info(
                    '%s: %s', pkg,
                    ', '.join(
                        sorted(
                            set(str(c.version) for c in finder.find_all_candidates(pkg)),
                            key=parse_version,
                        )
                    )
                )


commands_dict[ListPkgVersionsCommand.name] = ListPkgVersionsCommand

if __name__ == '__main__':
    sys.exit(main())
Example output
./list-pkg-versions.py list-pkg-versions pika django
pika: 0.5, 0.5.1, 0.5.2, 0.9.1a0, 0.9.2a0, 0.9.3, 0.9.4, 0.9.5, 0.9.6, 0.9.7, 0.9.8, 0.9.9, 0.9.10, 0.9.11, 0.9.12, 0.9.13, 0.9.14, 0.10.0b1, 0.10.0b2, 0.10.0, 0.11.0b1, 0.11.0, 0.11.1, 0.11.2, 0.12.0b2
django: 1.1.3, 1.1.4, 1.2, 1.2.1, 1.2.2, 1.2.3, 1.2.4, 1.2.5, 1.2.6, 1.2.7, 1.3, 1.3.1, 1.3.2, 1.3.3, 1.3.4, 1.3.5, 1.3.6, 1.3.7, 1.4, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6, 1.4.7, 1.4.8, 1.4.9, 1.4.10, 1.4.11, 1.4.12, 1.4.13, 1.4.14, 1.4.15, 1.4.16, 1.4.17, 1.4.18, 1.4.19, 1.4.20, 1.4.21, 1.4.22, 1.5, 1.5.1, 1.5.2, 1.5.3, 1.5.4, 1.5.5, 1.5.6, 1.5.7, 1.5.8, 1.5.9, 1.5.10, 1.5.11, 1.5.12, 1.6, 1.6.1, 1.6.2, 1.6.3, 1.6.4, 1.6.5, 1.6.6, 1.6.7, 1.6.8, 1.6.9, 1.6.10, 1.6.11, 1.7, 1.7.1, 1.7.2, 1.7.3, 1.7.4, 1.7.5, 1.7.6, 1.7.7, 1.7.8, 1.7.9, 1.7.10, 1.7.11, 1.8a1, 1.8b1, 1.8b2, 1.8rc1, 1.8, 1.8.1, 1.8.2, 1.8.3, 1.8.4, 1.8.5, 1.8.6, 1.8.7, 1.8.8, 1.8.9, 1.8.10, 1.8.11, 1.8.12, 1.8.13, 1.8.14, 1.8.15, 1.8.16, 1.8.17, 1.8.18, 1.8.19, 1.9a1, 1.9b1, 1.9rc1, 1.9rc2, 1.9, 1.9.1, 1.9.2, 1.9.3, 1.9.4, 1.9.5, 1.9.6, 1.9.7, 1.9.8, 1.9.9, 1.9.10, 1.9.11, 1.9.12, 1.9.13, 1.10a1, 1.10b1, 1.10rc1, 1.10, 1.10.1, 1.10.2, 1.10.3, 1.10.4, 1.10.5, 1.10.6, 1.10.7, 1.10.8, 1.11a1, 1.11b1, 1.11rc1, 1.11, 1.11.1, 1.11.2, 1.11.3, 1.11.4, 1.11.5, 1.11.6, 1.11.7, 1.11.8, 1.11.9, 1.11.10, 1.11.11, 1.11.12, 2.0, 2.0.1, 2.0.2, 2.0.3, 2.0.4
pypi-has() { set -o pipefail; curl -sfL https://pypi.org/pypi/$1/json | jq -e --arg v $2 'any( .releases | keys[]; . == $v )'; }
Usage:
$ pypi-has django 4.0x ; echo $?
false
1
$ pypi-has djangos 4.0x ; echo $?
22
$ pypi-has djangos 4.0 ; echo $?
22
$ pypi-has django 4.0 ; echo $?
true
0
Providing a programmatic approach to Chris's answer using pip install <package_name>==
import re
import subprocess

from packaging.version import VERSION_PATTERN as _VERSION_PATTERN

VERSION_PATTERN = re.compile(_VERSION_PATTERN, re.VERBOSE | re.IGNORECASE)


def get_available_versions(package_name):
    process = subprocess.run(['pip', 'install', f'{package_name}=='], stdout=subprocess.DEVNULL, stderr=subprocess.PIPE)
    versions = []
    for line in process.stderr.decode('utf-8').splitlines():
        if 'Could not find a version that satisfies the requirement' in line:
            for match in VERSION_PATTERN.finditer(line.split('from versions:')[1]):
                versions.append(match.group(0))
    return versions
It can be used like
>>> get_available_versions('tensorflow')
['2.2.0rc1', '2.2.0rc2', '2.2.0rc3', '2.2.0rc4', '2.2.0', '2.2.1', '2.2.2', '2.2.3', '2.3.0rc0', '2.3.0rc1', '2.3.0rc2', '2.3.0', '2.3.1', '2.3.2', '2.3.3', '2.3.4', '2.4.0rc0', '2.4.0rc1', '2.4.0rc2', '2.4.0rc3', '2.4.0rc4', '2.4.0', '2.4.1', '2.4.2', '2.4.3', '2.4.4', '2.5.0rc0', '2.5.0rc1', '2.5.0rc2', '2.5.0rc3', '2.5.0', '2.5.1', '2.5.2', '2.5.3', '2.6.0rc0', '2.6.0rc1', '2.6.0rc2', '2.6.0', '2.6.1', '2.6.2', '2.6.3', '2.7.0rc0', '2.7.0rc1', '2.7.0', '2.7.1', '2.8.0rc0', '2.8.0rc1', '2.8.0']
and return a list of versions.
Note: it seems to provide compatible releases rather than all releases. To get the full list, use the JSON approach from Eric.
To fetch the latest version of a GitLab private package, the command below works.
pip index versions package-name --index-url https://<personal_access_token_name>:<personal_access_token>@gitlab.com/api/v4/projects/<project-id>/packages/pypi/simple/ | grep 'LATEST:' | sed -E 's/LATEST:| //g'

Broke my conda. Can't install packages from custom channel any longer

I've inherited a machine, with an Anaconda installation, on which I must manage my python environments.
I have a custom channel, on a network share, that I host my conda packages on. It has the highest priority.
Everything was working to my satisfaction, except that conda update -n base -c defaults conda has become extremely slow. Sometimes I have to kill it after a few hours.
After looking for solutions to speed up the environment solve, I edited my conda config by doing:
conda config --set channel_priority strict
conda config --set default_threads 4
Neither of these helped, so I reverted to my previous conda config settings.
Now, when I try to update/install a package from my channel into any environment, it fails. Installing standard packages like numpy still works.
To keep things simple, I have created a "scratch" environment with python only. It contains:
# packages in environment at C:\ProgramData\Anaconda3\envs\scratch:
#
# Name Version Build Channel
ca-certificates 2021.10.26 haa95532_4
certifi 2021.10.8 py39haa95532_2
openssl 1.1.1m h2bbff1b_0
pip 21.2.4 py39haa95532_0
python 3.9.7 h6244533_1
setuptools 58.0.4 py39haa95532_0
sqlite 3.37.2 h2bbff1b_0
tzdata 2021e hda174b7_0
vc 14.2 h21ff451_1
vs2015_runtime 14.27.29016 h5e58377_2
wheel 0.37.1 pyhd3eb1b0_0
wincertstore 0.2 py39haa95532_2
I have a package foo_package, which conda sees on my channel:
(scratch) PS C:\Users\griffin> conda search foo_package
Loading channels: done
# Name Version Build Channel
foo_package 1.0 py36_0 custom_packages
foo_package 1.0 py37_0 custom_packages
foo_package 1.0 py38_0 custom_packages
foo_package 1.0 py39_0 custom_packages
When I try to install it:
(scratch) PS C:\Users\griffin> conda install foo_package
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: |
Found conflicts! Looking for incompatible packages.
failed
UnsatisfiableError: The following specifications were found
to be incompatible with the existing python installation in your environment:
Specifications:
- foo_package -> python[version='>=2.7,<2.8.0a0|>=3.5,<3.6.0a0']
Your python: python=3.9
If python is on the left-most side of the chain, that's the version you've asked for.
When python appears to the right, that indicates that the thing on the left is somehow
not available for the python version you are constrained to. Note that conda will not
change your python version to a different minor version unless you explicitly specify
that.
I have no idea where it is getting that python[version='>=2.7,<2.8.0a0|>=3.5,<3.6.0a0'] dependency from. The metadata for my package is very simple:
package:
  name: foo_package
  version: 1.0

source:
  path: .

requirements:
  host:
    - python
  run:
    - python
    - redis-py
The same setup continues to work on my other machines, so I feel it must be something to do with the configuration.
I looked at all the possible places a condarc file could be. I found two, and they seemed contradictory, so I deleted them both and re-added my custom channel to the list.
But the issue persists. What can I do?
My current conda config looks like so:
(base) PS C:\Users\griffin> conda config --show
add_anaconda_token: True
add_pip_as_python_dependency: True
aggressive_update_packages:
- ca-certificates
- certifi
- openssl
allow_conda_downgrades: False
allow_cycles: True
allow_non_channel_urls: False
allow_softlinks: False
always_copy: False
always_softlink: False
always_yes: None
anaconda_upload: None
auto_activate_base: True
auto_stack: 0
auto_update_conda: True
bld_path:
changeps1: True
channel_alias: https://conda.anaconda.org
channel_priority: flexible
channels:
- N:\path\to\channel\custom_packages
- defaults
client_ssl_cert: None
client_ssl_cert_key: None
clobber: False
conda_build: {}
create_default_packages: []
croot: C:\ProgramData\Anaconda3\conda-bld
custom_channels:
pkgs/main: https://repo.anaconda.com
pkgs/r: https://repo.anaconda.com
pkgs/msys2: https://repo.anaconda.com
pkgs/pro: https://repo.anaconda.com
custom_multichannels:
defaults:
- https://repo.anaconda.com/pkgs/main
- https://repo.anaconda.com/pkgs/r
- https://repo.anaconda.com/pkgs/msys2
local:
debug: False
default_channels:
- https://repo.anaconda.com/pkgs/main
- https://repo.anaconda.com/pkgs/r
- https://repo.anaconda.com/pkgs/msys2
default_python: 3.7
default_threads: None
deps_modifier: not_set
dev: False
disallowed_packages: []
download_only: False
dry_run: False
enable_private_envs: False
env_prompt: ({default_env})
envs_dirs:
- C:\ProgramData\Anaconda3\envs
- C:\Users\griffin\.conda\envs
- C:\Users\griffin\AppData\Local\conda\conda\envs
error_upload_url: https://conda.io/conda-post/unexpected-error
execute_threads: 1
extra_safety_checks: False
force: False
force_32bit: False
force_reinstall: False
force_remove: False
ignore_pinned: False
json: False
local_repodata_ttl: 1
migrated_channel_aliases: []
migrated_custom_channels: {}
non_admin_enabled: True
notify_outdated_conda: True
offline: False
override_channels_enabled: True
path_conflict: clobber
pinned_packages: []
pip_interop_enabled: False
pkgs_dirs:
- C:\ProgramData\Anaconda3\pkgs
- C:\Users\griffin\.conda\pkgs
- C:\Users\griffin\AppData\Local\conda\conda\pkgs
proxy_servers: {}
quiet: False
remote_backoff_factor: 1
remote_connect_timeout_secs: 9.15
remote_max_retries: 3
remote_read_timeout_secs: 60.0
repodata_fns:
- current_repodata.json
- repodata.json
repodata_threads: None
report_errors: None
restore_free_channel: False
rollback_enabled: True
root_prefix: C:\ProgramData\Anaconda3
safety_checks: warn
sat_solver: pycosat
separate_format_cache: False
shortcuts: True
show_channel_urls: None
signing_metadata_url_base: None
solver_ignore_timestamps: False
ssl_verify: True
subdir: win-64
subdirs:
- win-64
- noarch
target_prefix_override:
track_features: []
unsatisfiable_hints: True
unsatisfiable_hints_check_depth: 2
update_modifier: update_specs
use_index_cache: False
use_local: False
use_only_tar_bz2: True
verbosity: 0
verify_threads: 1
whitelist_channels: []

Import error for python lxml module

I am using a conda virtual environment and I have installed the lxml module using conda install lxml. The package was successfully installed, but when I open python in my terminal and do import lxml, I get the following error:
>>> import lxml
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named 'lxml'
I don't understand what the issue is here.
My installed packages are as follows:
# packages in environment at /users/deepayan/miniconda3/envs/pytorch:
#
hdf5 1.8.17 1
jbig 2.1 0
jpeg 8d 2
libiconv 1.14 0 anaconda
libpng 1.6.27 0
libtiff 4.0.6 2
libxml2 2.9.4 0 anaconda
libxslt 1.1.29 0 anaconda
lxml 3.8.0 py36_0
mkl 2017.0.1 0
numpy 1.12.1 py36_0
opencv 3.1.0 np112py36_1
openssl 1.0.2l 0
pip 9.0.1 py36_1
python 3.6.1 2
readline 6.2 2
setuptools 27.2.0 py36_0
sqlite 3.13.0 0
tk 8.5.18 0
wheel 0.29.0 py36_0
xz 5.2.2 1
zlib 1.2.8 3
All other packages can be imported error-free; the problem is only with lxml. If anyone can suggest a way out, I'd be very grateful.
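For reference, a quick way to see which interpreter and search path the failing session is actually using (a small diagnostic sketch):
import sys

print(sys.executable)  # should point inside .../miniconda3/envs/pytorch
print(sys.version)
print(sys.path)        # lxml has to live under one of these directories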
Thanks

Installing self-made package in conda environment does not make it importable

I want to install a package I made, called spbusiness, into one of my conda environments (for Python 3.5). When I use conda list, I can find it:
...
sockjs-tornado 1.0.3 py35_0
spbusiness 0.1 <pip>
sphinx 1.4.6 py35_0
spyder 3.0.0 py35_0
...
Now, when I want to use it with IPython, it raises:
ImportError: No module named 'spbusiness'
When I import pip and ask for all installed packages:
import pip
pip.get_installed_distributions()
It is found in the \anaconda3\lib\site-packages folder:
...
SQLAlchemy 1.0.13 (c:\users\martin\anaconda3\lib\site-packages),
spyder 3.0.0 (c:\users\martin\anaconda3\lib\site-packages),
spbusiness 0.1 (c:\users\martin\anaconda3\lib\site-packages),
sockjs-tornado 1.0.3 (c:\users\martin\anaconda3\lib\site-packages),
snowballstemmer 1.2.1 (c:\users\martin\anaconda3\lib\site-packages),
...
I used the Python interpreter from the Anaconda directory, ~\Anaconda3\python.exe, for setup.py install. It returned:
running install
running build
running build_py
running install_lib
running install_egg_info
Removing [...]\Anaconda3\Lib\site-packages\spbusiness-0.1-py3.5.egg-info
Writing [...]\Anaconda3\Lib\site-packages\spbusiness-0.1-py3.5.egg-info
Still, I can't use it in IPython or in Jupyter notebooks. I probably did something wrong, but I can't figure out what.
Thanks in advance for your help.
Martin

numpy asarray float32 works on Ubuntu but not Windows 7

I followed the instructions here.
Steps 1 and 2 have been checked. My Intel(R) Core(TM) i7-2860QM CPU @ 2.50GHz can support 64-bit and Intel virtualization tech, and it is currently using virtualization tech according to my BIOS. Checking step 3: I'm on Ubuntu, so no antivirus software, and I'm not running any system-level debugging. Now see the attached image: even though I set the VM to be 64-bit on the left, it is still 32-bit on the right.
I know that the settings are merely for organizational purposes and that they can't actually change the bitness of the VM. I downloaded the VM here - https://developer.microsoft.com/en-us/microsoft-edge/tools/vms/linux/. None of them are marked as 64-bit, so I do not know how to guarantee that I have a 64-bit Windows image.
This is not the main issue I'm trying to solve, though; it has been inferred as the cause of my main issue.
The same code works on Ubuntu 14.04 but not on the Windows 7 VM. Below you'll see me debugging, and all variables look identical.
Next I type the error-causing line into the console, and sure enough, on one OS we have no issues and on the other we blow up:
>>> np.asarray(frames, dtype=np.float32)
Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm 2016.1.2\helpers\pydev\_pydevd_bundle\pydevd_exec.py", line 3, in Exec
    exec exp in global_vars, local_vars
  File "<input>", line 1, in <module>
  File "C:\Python27\lib\site-packages\numpy\core\numeric.py", line 482, in asarray
    return array(a, dtype, copy=False, order=order)
MemoryError
Now the part that gives me the creeps: when I go back to the debugging tab from the console in Ubuntu, I see that many new variables have spontaneously been created, even though I only typed one line into the Python console.
I figure there may be a package problem.
I'm using Python 2.7.11 on the Windows 7 VM, and these packages are installed:
FITS-tools 0.0.dev0
Pillow 3.2.0 3.2.0
Pillow-PIL 0.1.dev0 0.1dev
PyQt4 4.11.4 4.11.4
astropy 1.1.2 1.1.2
cycler 0.10.0
image-registration 0.2.2.dev272
matplotlib 1.5.1 1.5.1
numpy 1.11.0 1.11.0
parmap 1.2.3 1.2.3
pip 8.1.1 8.1.1
pyfits 3.4 3.4
pyparsing 2.1.1 2.1.1
pyqtgraph 0.9.10 0.9.10
python-dateutil 2.5.3 2.5.3
pytz 2016.4 2016.4
scipy 0.17.0 0.17.0
setuptools 20.10.1 21.0.0
six 1.10.0 1.10.0
wheel 0.29.0 0.29.0
On Ubuntu - to my surprise - I'm using Python 2.7.6 (and for some reason I get make: *** [libinstall] Error 1 when I try to upgrade to 2.7.11, but that's another issue). Here are the packages I have installed on the working Ubuntu side:
BeautifulSoup 3.2.1 3.2.1
CherryPy 3.2.2 5.3.0
Cython 0.22 0.24
Django 1.9.1 1.9.6
Markdown 2.4 2.6.6
PAM 0.4.2
Pillow 2.3.0 3.2.0
PyOpenGL 3.0.2 3.1.1a1
Pygments 1.6 2.1.3
Routes 2.0 2.3.1
Twisted-Core 13.2.0
Twisted-Web 13.2.0
VTK 5.8.0
WebOb 1.3.1 1.6.0
adium-theme-ubuntu 0.3.4
amqplib 1.0.2 1.0.2
apptools 4.3.0 4.4.0
apsw 3.8.2-r1 3.9.2-r1
apt-xapian-index 0.45
argparse 1.2.1 1.4.0
astropy 1.1.2 1.1.2
cffi 0.8.6 1.6.0
chardet 2.0.1 2.3.0
colorama 0.2.5 0.3.7
command-not-found 0.3
configobj 5.0.6 5.0.6
cssselect 0.9.1 0.9.1
cssutils 0.9.10 1.0.1
debtagshw 0.1
defer 1.0.6 1.0.4
deluge 1.3.6
dirspec 13.10 13.08
dnspython 1.11.1 1.12.0
duplicity 0.6.23
envisage 4.1.0 4.5.1
feedparser 5.1.3 5.2.1
h5py 2.2.1 2.6.0
html5lib 0.999 0.9999999
httplib2 0.8 0.9.2
image-registration 0.2.2.dev272
ipython 3.1.0 4.2.0
libtfr 1.0.4 2.0.0b4
lockfile 0.8 0.12.2
lxml 3.3.3 3.6.0
matplotlib 1.4.3 1.5.1
mayavi 4.4.3 4.4.4
mechanize 0.2.5 0.2.5
mock 1.0.1 2.0.0
netifaces 0.8 0.10.4
nose 1.3.7 1.3.7
numexpr 2.2.2 2.5.2
numpy 1.9.2 1.11.0
oauthlib 0.6.1 1.1.1
oneconf 0.3.7.14.04.1 0.0.1.dev0
pandas 0.16.1 0.18.1
parmap 1.2.3 1.2.3
pexpect 3.1 4.0.1
pip 1.5.4 8.1.1
piston-mini-client 0.7.5 0.7.5
plotly 1.6.17 1.9.10
ply 3.4 3.8
py 1.4.31 1.4.31
pyFFTW 0.9.2 0.10.1
pyOpenSSL 0.13 16.0.0
pycparser 2.10 2.14
pycrypto 2.6.1 2.6.1
pycups 1.9.66 1.9.73
pyface 5.0.0 5.1.0
pygame 1.9.1release
pygobject 3.12.0
pygpgme 0.3 0.3
pyparsing 2.0.3 2.1.1
pyqtgraph 0.9.10 0.9.10
pyserial 2.6 3.0.1
pysmbc 1.0.14.1 1.0.15.5
pytest 2.9.1 2.9.1
python-apt 0.9.3.5ubuntu2 0.7.8
python-dateutil 2.4.2 2.5.3
python-debian 0.1.21-nmu2ubuntu2 0.1.23
python-libtorrent 0.16.13 1.1.0
pytz 2015.4 2016.4
pyxdg 0.25 0.25
pyzmq 14.7.0 15.2.0
reportlab 3.0 3.3.0
repoze.lru 0.6 0.6
requests 2.2.1 2.10.0
scikit-learn 0.17.1 0.17.1
scipy 0.15.1 0.17.0
sessioninstaller 0.0.0
setuptools 3.3 21.0.0
simplejson 3.7.3 3.8.2
six 1.5.2 1.10.0
sklearn 0.0 0.0
software-center-aptd-plugins 0.0.0
system-service 0.1.6
tables 3.1.1 3.2.2
traits 4.5.0 4.5.0
traitsui 5.0.0 5.1.0
uTidylib 0.2 0.2
unity-lens-photos 1.0
urllib3 1.7.1 1.15.1
vboxapi 1.0 1.0
wheel 0.24.0 0.29.0
wsgiref 0.1.2 0.1.2
wxPython 2.8.12.1 2.9.1.1
wxPython-common 2.8.12.1 2.6.3.3
xdiagnose 3.6.3build2
xppy 0.7.0
zope.interface 4.0.5 4.1.3
You've got a MemoryError. That means you requested an allocation beyond the memory available on your VM. Some potential reasons:
not enough memory allocated for the VM (try to increase it)
the 2 GB per-process limit on 32-bit Windows (run a LARGEADDRESSAWARE python or move to 64-bit; see the quick check below)
memory corruption, so your heap is corrupted (debug your code)
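To tell whether the Windows interpreter itself is a 32-bit build (and therefore capped regardless of how much RAM the VM has), a quick check from the Python console:
import struct
import sys

print(struct.calcsize("P") * 8)  # pointer size in bits: 32 on a 32-bit Python, 64 on a 64-bit build
print(sys.maxsize > 2**32)       # True only on a 64-bit Python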
Similar discussion: Memory errors and list limits?
