pip install dependency links - python

I am using Python 2.7 and pip 1.5.6.
I want extra libraries from URLs, such as a Git repo, to be installed while my setup.py is being installed.
I have been putting those extras in the install_requires parameter in setup.py; that is, my library requires these extra libraries and they must be installed along with it.
...
install_requires=[
    "Django",
    ....
],
...
But URLs like Git repos are not valid strings in install_requires in setup.py. Assume I want to install a library from GitHub. I searched for this issue and found that such libraries can be listed in dependency_links in setup.py, but that still doesn't work. Here is my dependency_links definition:
dependency_links=[
    "https://github.com/.../tarball/master/#egg=1.0.0",
    "https://github.com/.../tarball/master#egg=0.9.3",
],
The links are valid; I can download them with a web browser using these URLs. Yet these extra libraries are still not installed by my setup. I also tried the --process-dependency-links option to force pip, but the result is the same. I get no error from pip.
After installation, none of the libraries listed in dependency_links appear in the pip freeze output.
How can I make them get downloaded and installed along with my setup.py installation?
Edited:
Here is my complete setup.py:
from setuptools import setup

try:
    long_description = open('README.md').read()
except IOError:
    long_description = ''

setup(
    name='esef-sso',
    version='1.0.0.0',
    description='',
    url='https://github.com/egemsoft/esef-sso.git',
    keywords=["django", "egemsoft", "sso", "esefsso"],
    install_requires=[
        "Django",
        "webservices",
        "requests",
        "esef-auth==1.0.0.0",
        "django-simple-sso==0.9.3"
    ],
    dependency_links=[
        "https://github.com/egemsoft/esef-auth/tarball/master/#egg=1.0.0.0",
        "https://github.com/egemsoft/django-simple-sso/tarball/master#egg=0.9.3",
    ],
    packages=[
        'esef_sso_client',
        'esef_sso_client.models',
        'esef_sso_server',
        'esef_sso_server.models',
    ],
    include_package_data=True,
    zip_safe=False,
    platforms=['any'],
)
Edited 2:
Here is the pip log:
Downloading/unpacking esef-auth==1.0.0.0 (from esef-sso==1.0.0.0)
Getting page https://pypi.python.org/simple/esef-auth/
Could not fetch URL https://pypi.python.org/simple/esef-auth/: 404 Client Error: Not Found
Will skip URL https://pypi.python.org/simple/esef-auth/ when looking for download links for esef-auth==1.0.0.0 (from esef-sso==1.0.0.0)
Getting page https://pypi.python.org/simple/
URLs to search for versions for esef-auth==1.0.0.0 (from esef-sso==1.0.0.0):
* https://pypi.python.org/simple/esef-auth/1.0.0.0
* https://pypi.python.org/simple/esef-auth/
Getting page https://pypi.python.org/simple/esef-auth/1.0.0.0
Could not fetch URL https://pypi.python.org/simple/esef-auth/1.0.0.0: 404 Client Error: Not Found
Will skip URL https://pypi.python.org/simple/esef-auth/1.0.0.0 when looking for download links for esef-auth==1.0.0.0 (from esef-sso==1.0.0.0)
Getting page https://pypi.python.org/simple/esef-auth/
Could not fetch URL https://pypi.python.org/simple/esef-auth/: 404 Client Error: Not Found
Will skip URL https://pypi.python.org/simple/esef-auth/ when looking for download links for esef-auth==1.0.0.0 (from esef-sso==1.0.0.0)
Could not find any downloads that satisfy the requirement esef-auth==1.0.0.0 (from esef-sso==1.0.0.0)
Cleaning up...
Removing temporary dir /Users/ahmetdal/.virtualenvs/esef-sso-example/build...
No distributions at all found for esef-auth==1.0.0.0 (from esef-sso==1.0.0.0)
Exception information:
Traceback (most recent call last):
File "/Users/ahmetdal/.virtualenvs/esef-sso-example/lib/python2.7/site-packages/pip/basecommand.py", line 122, in main
status = self.run(options, args)
File "/Users/ahmetdal/.virtualenvs/esef-sso-example/lib/python2.7/site-packages/pip/commands/install.py", line 278, in run
requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
File "/Users/ahmetdal/.virtualenvs/esef-sso-example/lib/python2.7/site-packages/pip/req.py", line 1177, in prepare_files
url = finder.find_requirement(req_to_install, upgrade=self.upgrade)
File "/Users/ahmetdal/.virtualenvs/esef-sso-example/lib/python2.7/site-packages/pip/index.py", line 277, in find_requirement
raise DistributionNotFound('No distributions at all found for %s' % req)
DistributionNotFound: No distributions at all found for esef-auth==1.0.0.0 (from esef-sso==1.0.0.0)
It seems pip does not use the sources in dependency_links.

The --process-dependency-links option to enable dependency_links was removed in Pip 19.0.
Instead, you can use a PEP 508 URL to specify your dependency, which has been supported since Pip 18.1. Here's an example excerpt from setup.py:
install_requires=[
    "numpy",
    "package1 @ git+https://github.com/user1/package1",
    "package2 @ git+https://github.com/user2/package2@branch1",
],
Note that Pip does not support installing packages with such dependencies from PyPI and in the future you will not be able to upload them to PyPI (see news entry for Pip 18.1).
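As a rough sketch of how that would look applied to the setup.py from the question (assuming the GitHub tarball URLs from the question still resolve, and a pip new enough for direct references, i.e. >= 18.1):
install_requires=[
    "Django",
    "webservices",
    "requests",
    # PEP 508 direct references; URLs taken from the question's dependency_links
    "esef-auth @ https://github.com/egemsoft/esef-auth/tarball/master",
    "django-simple-sso @ https://github.com/egemsoft/django-simple-sso/tarball/master",
],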

Pip removed support for dependency_links a while back. The latest version of pip that supports dependency_links is 1.3.1; to install it:
pip install pip==1.3.1
Your dependency links should work at that point. Please note that dependency_links were always a last resort for pip, i.e. if a package with the same name exists on PyPI it will be chosen over yours.
Note that https://github.com/pypa/pip/pull/1955 seems to allow dependency_links again; pip kept them, but you might need to use some command-line switches with a newer version of pip.
EDIT: As of pip 7, dependency links were rethought and re-enabled; even though the deprecation notice hasn't been removed, from the discussions they seem to be here to stay. With pip>=7, here is how you can install things:
pip install -e . --process-dependency-links --allow-all-external
Or add the following to a pip.conf, e.g. /etc/pip.conf
[install]
process-dependency-links = yes
allow-all-external = yes
trusted-host =
    bitbucket.org
    github.com
EDIT
A trick I have learnt is to bump the version number up to something really high to make sure that pip doesn't prefer the non-dependency-link version (if that is what you want). From the example above, make the dependency link look like:
"https://github.com/egemsoft/django-simple-sso/tarball/master#egg=999.0.0",
Also make sure the version either looks like the example or is a date-based version; any other versioning scheme will make pip think it's a dev version and it won't install it.
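Applied to the question, that trick would look roughly like this (a sketch; 999.0.0 is an arbitrarily inflated version advertised in the egg fragment so the link wins over the PyPI release, the requirement is left unpinned so the inflated number can actually be preferred, and the egg fragment also carries the package name, as the next answer points out):
install_requires=[
    "django-simple-sso",
],
dependency_links=[
    "https://github.com/egemsoft/django-simple-sso/tarball/master#egg=django-simple-sso-999.0.0",
],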

You need to make sure you include the dependency in your install_requires too.
Here's an example setup.py
#!/usr/bin/env python
from setuptools import setup

setup(
    name='foo',
    version='0.0.1',
    install_requires=[
        'balog==0.0.7'
    ],
    dependency_links=[
        'https://github.com/balanced/balog/tarball/master#egg=balog-0.0.7'
    ]
)
Here's the issue with your example setup.py:
You're missing the egg name in the dependency links you set up.
You have
https://github.com/egemsoft/esef-auth/tarball/master/#egg=1.0.0.0
You need
https://github.com/egemsoft/esef-auth/tarball/master/#egg=esef-auth-1.0.0.0
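Putting both fixes together for the question's setup.py, the dependency_links block would look something like this (a sketch; it assumes the two tarballs really declare versions 1.0.0.0 and 0.9.3 in their own setup.py files):
dependency_links=[
    "https://github.com/egemsoft/esef-auth/tarball/master#egg=esef-auth-1.0.0.0",
    "https://github.com/egemsoft/django-simple-sso/tarball/master#egg=django-simple-sso-0.9.3",
],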

I faced a similar situation where I wanted to use Shapely as one of my package dependencies. Shapely, however, has a caveat: if you are on Windows, you have to use the .whl file from http://www.lfd.uci.edu/~gohlke/pythonlibs/. Otherwise, you have to install a C compiler, which is something I didn't want. I wanted the user to simply run pip install mypackage instead of installing a bunch of other stuff.
And if you have the typical setup with dependency_links:
setup(
    name='streettraffic',
    packages=find_packages(),  # this must be the same as the name above
    version='0.1',
    description='A random test lib',
    author='Costa Huang',
    author_email='Costa.Huang@outlook.com',
    install_requires=['Shapely==1.5.17'],
    dependency_links=['http://www.lfd.uci.edu/~gohlke/pythonlibs/ru4fxw3r/Shapely-1.5.17-cp36-cp36m-win_amd64.whl']
)
and run pip install ., it is simply going to pick the Shapely on PyPI and cause trouble on the Windows installation. After hours of researching, I found this link, Force setuptools to use dependency_links to install mysqlclient, and basically used from setuptools.command.install import install as _install to manually install Shapely:
from setuptools.command.install import install as _install
from setuptools import setup, find_packages
import pip

class install(_install):
    def run(self):
        # run the normal egg install first
        _install.do_egg_install(self)
        # then just go ahead and install the wheel directly with pip
        pip.main(['install', 'http://localhost:81/Shapely-1.5.17-cp36-cp36m-win_amd64.whl'])

setup(
    name='mypackage',
    packages=find_packages(),  # this must be the same as the name above
    version='0.1',
    description='A random test lib',
    author='Costa Huang',
    author_email='test@outlook.com',
    cmdclass={'install': install}
)
And the script works out nicely. Hope it helps.
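One caveat: pip.main() was removed from pip's public API in pip 10, so on newer pip the same idea has to shell out to pip instead. A minimal sketch, assuming the same wheel URL:
import subprocess
import sys

from setuptools import setup, find_packages
from setuptools.command.install import install as _install

class install(_install):
    def run(self):
        _install.do_egg_install(self)
        # call pip in a subprocess instead of pip.main(), which newer pip no longer exposes
        subprocess.check_call([
            sys.executable, '-m', 'pip', 'install',
            'http://localhost:81/Shapely-1.5.17-cp36-cp36m-win_amd64.whl',
        ])

setup(
    name='mypackage',
    packages=find_packages(),
    version='0.1',
    cmdclass={'install': install},
)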

Related

'requests' distribution not found when listed in install_requires together with requests_oauthlib

I have a setup.py file for a Python package, and when I try to run python setup.py develop it gets stuck on installing the requests package, apparently because requests_oauthlib is also a requirement. The following minimal example setup.py file still results in the same error:
from setuptools import setup

setup(
    name='fakepackage',
    install_requires=[
        'requests_oauthlib',
        'requests',
    ]
)
The error is "error: The 'requests' distribution was not found and is required by fakepackage". This happens regardless of the order in which I list the elements of the install_requires list.
If I run pip install requests manually, it installs fine, so I'm not sure what it's not finding. If I don't list requests at all in the setup.py file, and only list requests_oauthlib, it also installs fine (including installing requests, which is a dependency of requests_oauthlib).
I'm using Python 3.8, setuptools version 44.0.0.
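For context, python setup.py develop resolves install_requires through easy_install rather than pip; a commonly suggested workaround (an assumption here, not something stated in the question) is to install the package in editable mode so pip resolves the dependencies instead:
# editable install; dependencies are fetched by pip instead of easy_install
pip install -e .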

Specify setup time dependency with `--global-option` for a python package

I'm trying to package a Python library that has setup-time (and also run-time) dependencies: it imports those modules so that they can inform the setup process of the location of some provided C headers:
from distutils.extension import Extension
from pybedtools.helpers import get_includes as pybedtools_get_includes
from pysam import get_include as pysam_get_include

# [...]
extensions = [
    Extension(
        "bam25prime.libcollapsesam", ["bam25prime/libcollapsesam.pyx"],
        include_dirs=pysam_get_include()),
    Extension(
        "bam25prime.libcollapsebed", ["bam25prime/libcollapsebed.pyx"],
        include_dirs=pybedtools_get_includes(),
        language="c++"),
]
# [...]
However, one of the dependencies (pybedtools) needs to be installed with a specific --global-option pip option (see at the end of the post what happens when the option is not provided).
If I understand correctly, the currently up-to-date way to automatically have some dependencies available before setup.py is used is to indicate them in the [build-system] section of a pyproject.toml file.
I tried the following pyproject.toml:
[build-system]
requires = [
    "pysam",
    "pybedtools @ git+https://github.com/blaiseli/pybedtools.git@fix_missing_headers --global-option='cythonize'",
]
build-backend = "setuptools.build_meta"
(By the way, it took me quite some time to figure out how to specify the build-backend, the documentation is not easily discoverable.)
However, this generates the following error upon pip install:
ERROR: Invalid requirement: "pybedtools @ git+https://github.com/blaiseli/pybedtools.git@fix_missing_headers --global-option='cythonize'"
Hint: It looks like a path. File 'pybedtools @ git+https://github.com/blaiseli/pybedtools.git@fix_missing_headers --global-option='cythonize'' does not exist.
How can I correctly specify options for dependencies?
If I simply specify the package and its URL ("pybedtools @ git+https://github.com/blaiseli/pybedtools.git@fix_missing_headers"), the install fails as follows:
Exception:
Cython-generated file 'pybedtools/cbedtools.cpp' not found.
Please install Cython and run
python setup.py cythonize
It was while trying to tackle the above error that I found out about the --global-option pip option.
I can manually run pip install --global-option="cythonize" git+https://github.com/blaiseli/pybedtools.git@fix_missing_headers, and the install works, provided the dependencies of that package are already installed; otherwise their install fails because of an unrecognized "cythonize" option (which is another issue...).
Note that this option is only needed when installing "from source" (for instance when installing from a fork on github, as is my case here).
Same thing as in your other question, I suspect cythonize is a setuptools command and not a global option.
If it's indeed the case, then you would be better off setting an alias in your setup.cfg. If you run python setup.py alias install cythonize install, this should add the following to your setup.cfg:
[aliases]
install = cythonize install
When running pip install later, pip will honor this alias and the cythonize command will be executed right before the install command.
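Assuming the alias is added to the forked pybedtools' own setup.cfg (the answer does not say whose setup.cfg, so this is an assumption), the build requirement in pyproject.toml could then drop the option entirely, roughly:
[build-system]
requires = [
    "pysam",
    # hypothetical: relies on the fork's setup.cfg aliasing "install" to "cythonize install"
    "pybedtools @ git+https://github.com/blaiseli/pybedtools.git@fix_missing_headers",
]
build-backend = "setuptools.build_meta"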

pip install on archive fails to install git dependency

I have a package that depends on the following GitHub project: https://github.com/jbittencourt/python-escpos.git (not to be confused with the python-escpos package that exists on PyPI).
So in my package's requirements.txt I have git+https://github.com/jbittencourt/python-escpos.git#egg=escpos.
When I install my package using either pip install -r requirements.txt or with python setup.py install (which parses the same requirements.txt file for its install_requires parameter) - everything works great.
The problem is when I archive my package into mypackage.tar.gz. After archiving the very same code that worked a second ago, and running pip install mypackage.tar.gz I get the following error:
Collecting escpos (from shopic==1.10.0)
Could not find a version that satisfies the requirement escpos (from shopic==1.10.0) (from versions: )
No matching distribution found for escpos (from shopic==1.10.0)
From playing around with it, I noticed that whatever I type after the "#egg=" part of git+https://github.com/jbittencourt/python-escpos.git#egg=escpos, it simply tries to get it off PyPI (and fails, in this case). E.g. if I wrote (for some odd reason) git+https://github.com/jbittencourt/python-escpos.git#egg=requests, it'd install the requests package.
I don't understand why it works differently when all I do is pack the files in a .tar.gz, and I have no idea how to work around it. I cannot use one of the other installation methods because I have another package that has this one as its dependency, and when I run pip install on that package, it has this tar.gz file as one of its dependencies, and so it fails.
Any ideas why this happens and how to resolve it? Thanks.
Edit: Here's my setup.py for reference:
from setuptools import setup, find_packages
import pip

links = []
requires = []

requirements = pip.req.parse_requirements('requirements.txt', session=pip.download.PipSession())

for item in requirements:
    # we want to handle package names and also repo urls
    if getattr(item, 'url', None):  # older pip has url
        links.append(str(item.url))
    if getattr(item, 'link', None):  # newer pip has link
        links.append(str(item.link))
    if item.req:
        requires.append(str(item.req))

setup(
    name='mypackage',
    version='1.1.0',
    description='My Package',
    author='EK',
    author_email='me@example.com',
    url='http://www.example.com',
    license='MIT',
    packages=find_packages(),
    package_data={'': ['*.jar']},
    zip_safe=False,
    install_requires=requires,
    dependency_links=links)
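Given the first answer on this page, one workaround (a sketch, assuming a pip new enough for PEP 508 direct references, i.e. >= 18.1) is to skip the requirements.txt parsing and name the Git dependency directly in install_requires:
install_requires=[
    # direct reference; repository URL taken from the question's requirements.txt
    "escpos @ git+https://github.com/jbittencourt/python-escpos.git",
],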

How can I install packages hosted in a private PyPI using setup.py?

I'm trying to write the setup.py install file for a private project, which has both public and private dependencies. The public ones are hosted on PyPI, whereas the private ones are hosted on a server running simplepypi.
I would like both public and private dependencies to be resolved and fetched during installation.
I thus added my dependencies to setup.py:
setup(
    ...
    install_requires=[
        # public dependencies
        'argparse==1.2.1',
        'beautifulsoup4==4.1.3',
        'lxml==3.1.0',
        'mongoengine==0.8.2',
        'pymongo==2.5.2',
        'requests==1.1.0',
        'Cython==0.18',
        # private dependencies
        'myprivatepackage1',
        'myprivatepackage2'
    ],
    dependency_links=['http://pypi.myserver.com/packages'],
    ...
)
I build the package tarball using the command python setup.py sdist and install it in an activated virtualenv using pip install --verbose path/to/tarball.tar.gz.
However, the pip log lines do not mention my private PyPI server anywhere, and https://pypi.python.org/simple/ seems to have been queried twice.
Running setup.py egg_info for package from file:///home/b/code/mapado/mypackage/dist/mypackage-0.5.1.tar.gz
running egg_info
creating pip-egg-info/mypackage.egg-info
writing requirements to pip-egg-info/mypackage.egg-info/requires.txt
writing pip-egg-info/mypackage.egg-info/PKG-INFO
writing top-level names to pip-egg-info/mypackage.egg-info/top_level.txt
writing dependency_links to pip-egg-info/mypackage.egg-info/dependency_links.txt
writing manifest file 'pip-egg-info/mypackage.egg-info/SOURCES.txt'
warning: manifest_maker: standard file '-c' not found
reading manifest file 'pip-egg-info/mypackage.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'pip-egg-info/mypackage.egg-info/SOURCES.txt'
Downloading/unpacking myprivatepackage (from mypackage==0.5.1)
Could not fetch URL https://pypi.python.org/simple/myprivatepackage/: HTTP Error 404: Not Found (myprivatepackage does not have any releases)
Will skip URL https://pypi.python.org/simple/myprivatepackage/ when looking for download links for myprivatepackage (from mypackage==0.5.1)
Could not fetch URL https://pypi.python.org/simple/myprivatepackage/: HTTP Error 404: Not Found (myprivatepackage does not have any releases)
Will skip URL https://pypi.python.org/simple/myprivatepackage/ when looking for download links for myprivatepackage (from mypackage==0.5.1)
Could not find any downloads that satisfy the requirement myprivatepackage (from mypackage==0.5.1)
Cleaning up...
What am I missing?
It looks like you didn't specify your host. As the simplepypi docs say, you need to set up your ~/.pypirc with the right hostname, like so:
To use it run "simplepypi". You can upload packages by:
[...]
Not using twine yet? Here is the legacy way of uploading Python packages (not recommended):
Modify your ~/.pypirc so it looks like:
[distutils]
index-servers =
    pypi
    local

[local]
username: <whatever>
password: <doesn't matter, see above>
repository: http://127.0.0.1:8000

[pypi]
...
Then you upload your package to it:
python setup.py sdist upload -r local
and can install it from there:
pip install -i http://127.0.0.1:8000/pypi <your favorite package>
Hope this will help.
dependency_links is ignored by default (at least in pip 9.0.1).
In order for pip to reach out to your server, you need to add --process-dependency-links.
I believe pip 10 will bring a new mechanism, but for now this has got it working for me.
I also had to update dependency_links to include the package name, for example:
dependency_links=[
    "http://internal-pyp:5678/simple/your_package_name"
]
You could build your package as a normal pip package and publish it to the private repo. To install it, specify the global option extra-index-url in the config file:
$ cat ~/.pip/pip.conf
[global]
extra-index-url = https://...
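The same index can also be supplied per invocation on the command line or through an environment variable instead of the config file (the URL below is a placeholder):
pip install --extra-index-url https://pypi.myserver.com/simple/ mypackage
# or equivalently
PIP_EXTRA_INDEX_URL=https://pypi.myserver.com/simple/ pip install mypackage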

How to install py-yajl through setup.py?

I want to install py-yajl through setup.py, like this:
import os

from setuptools import setup, find_packages

here = os.path.abspath(os.path.dirname(__file__))

requires = [
    'yajl',
]

setup(
    ...
    install_requires=requires,
    tests_require=requires,
    ...
)
But this setup.py fails with the following error:
>>> Creating a symlink for compilationg: includes/yajl -> yajl/src/api
error: SandboxViolation: symlink('../yajl/src/api', 'includes/yajl') {}
The package setup script has attempted to modify files on your system
that are not within the EasyInstall build area, and has been aborted.
This package cannot be safely installed by EasyInstall, and may not
support alternate installation locations even if you run its setup
script by hand. Please inform the package's author and the EasyInstall
maintainers to find out if a fix or workaround is available.
I know I can install yajl using pip. If setup.py could be configured to use pip, I would install yajl that way, but I don't know how to do it.
Does anyone have a good idea?
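One possible approach, borrowing the custom install command shown in the Shapely answer above (a sketch only, not a verified fix for py-yajl's SandboxViolation):
import subprocess
import sys

from setuptools import setup, find_packages
from setuptools.command.install import install as _install

class install(_install):
    def run(self):
        _install.run(self)
        # hand yajl off to pip instead of letting easy_install build it in the sandbox
        subprocess.check_call([sys.executable, '-m', 'pip', 'install', 'yajl'])

setup(
    name='mypackage',
    version='0.1',
    packages=find_packages(),
    cmdclass={'install': install},
)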
