Python dependency resolution

I have previously created a Python package and uploaded it to PyPI. The package depends upon two other packages, declared in its setup.py file:
from setuptools import setup
from dominos.version import Version

def readme():
    with open('README.rst') as file:
        return file.read()

setup(name='dominos',
      version=Version('0.0.1').number,
      author='Tomas Basham',
      url='https://github.com/tomasbasham/dominos',
      license='MIT',
      packages=['dominos'],
      install_requires=[
          'ratelimit',
          'requests'
      ],
      include_package_data=True,
      zip_safe=False)
As both of these were already installed within my virtualenv, this package ran fine.
Now, trying to consume this package from another Python application (in a separate virtualenv), I have defined the following requirements.txt file:
dominos==0.0.1
geocoder==1.13.0
For reference, dominos is the package I uploaded to PyPI. Now running pip install --no-cache-dir -r requirements.txt fails because the dependencies of dominos are missing:
ImportError: No module named ratelimit
Surely pip should be resolving these dependencies since I have defined them in the setup.py file of dominos. Clarity on this would be great.
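One way to narrow this down (a quick diagnostic sketch, not part of the original question) is to check whether the dominos distribution that pip installed actually carries the install_requires metadata. If the uploaded sdist was built without that metadata, pip has nothing to resolve:

# Run inside the consuming virtualenv after `pip install dominos==0.0.1`.
# If requires() prints an empty list, the published distribution does not
# declare ratelimit/requests, so pip cannot be expected to install them.
import pkg_resources

dist = pkg_resources.get_distribution('dominos')
print(dist.requires())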

Related

'requests' distribution not found when listed in install_requires together with requests_oauthlib

I have a setup.py file for a Python package, and when I try to run python setup.py develop it gets stuck on installing the requests package, apparently because requests_oauthlib is also a requirement. The following minimal example setup.py file still produces the same error:
from setuptools import setup

setup(
    name='fakepackage',
    install_requires=[
        'requests_oauthlib',
        'requests',
    ]
)
The error is "error: The 'requests' distribution was not found and is required by fakepackage". This happens regardless of the order in which I list the elements of the install_requires list.
If I run pip install requests manually, it installs fine, so I'm not sure what it's not finding. If I don't list requests at all in the setup.py file, and only list requests_oauthlib, it also installs fine (including installing requests, which is a dependency of requests_oauthlib).
I'm using Python 3.8, setuptools version 44.0.0.
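A small diagnostic that may help localize the failure (a sketch, not from the original question; it only assumes pkg_resources from setuptools is importable): compare what setuptools itself can see with what pip reports. If requests is importable and pip finds it, but get_distribution() raises DistributionNotFound during python setup.py develop, the problem lies in setuptools' own easy_install-style resolution rather than in the package metadata.

# Print the versions setuptools' pkg_resources can currently resolve.
import pkg_resources

for name in ('setuptools', 'requests', 'requests_oauthlib'):
    try:
        print(name, pkg_resources.get_distribution(name).version)
    except pkg_resources.DistributionNotFound:
        print(name, 'not found')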

setup.py fails to install google-cloud-pubsub

I'm trying to prepare a setup.py that will install all necessary dependencies, including google-cloud-pubsub. However, python setup.py install fails with
pkg_resources.UnknownExtra: googleapis-common-protos 1.6.0b6 has no such extra feature 'grpc'
The weird thing is that I can install those dependencies through pip install in my virtualenv.
How can I fix it or get around it? I use Python 2.7.15.
Here's a minimal configuration to reproduce the problem:
setup.py
from setuptools import setup

setup(
    name='example',
    install_requires=['google-cloud-pubsub']
)
In your setup.py use the following:
from setuptools import setup

setup(
    name='example',
    install_requires=['google-cloud-pubsub', 'googleapis-common-protos==1.5.3']
)
That seems to get around it: pinning googleapis-common-protos to 1.5.3 presumably keeps the installer away from the 1.6.0b6 pre-release, which does not declare the grpc extra.

Unable to import dependency installed from git in setup.py

I am trying to use setup.py to install a Python package that is kept in a git repository, which we'll call my_dependency. In my_package, I have a setup.py file with:
setup(
    ...
    install_requires=[
        ...
        'my_dependency==VERSION'
    ],
    dependency_links=['git+https://...my_dependency.git#egg=my_dependency-VERSION'],
)
When I run my setup file (python setup.py develop), the dependency appears to install; it shows up as my_dependency==VERSION when I run pip freeze. However, when I start a python session and call import my_dependency, I get ImportError: No module named my_dependency.
I don't know if this is possibly the source of the problem, but when running setup.py, I get a warning:
Processing dependencies for my_package==0.1
Searching for my_dependency==VERSION
Doing git clone from https://.../my_dependency.git to /var/folders/.../.../T/easy_install-_rWjyp/my_dependency.git
Best match: my_dependency VERSION
Processing my_dependency.git
Writing /var/folders/.../my_dependency.git/setup.cfg
Running setup.py -q bdist_egg --dist-dir /var/folders/.../my_dependency.git/egg-dist-tmp-UMiNdL
warning: install_lib: 'build/lib' does not exist -- no Python modules to install
Copying my_dependency-VERSION-py2.7.egg to /.../my_package/venv/lib/python2.7/site-packages
Adding my_dependency VERSION to easy-install.pth file
However, I am able to use the package if I install it through pip, like this: pip install -e git+https://.../my_dependency.git#egg=my_dependency-VERSION
For reference, the dependency package structure looks like this:
my_dependency/
    my_dependency/
        __init__.py
    setup.py
And its setup.py contains this:
from setuptools import setup

setup(
    name='my_dependency',
    version='VERSION',
    description='...',
    author='...',
    url='https://...',
    license='MIT',
    install_requires=[
        'numpy',
    ],
    zip_safe=False,
)
The solution was (in retrospect) pretty silly. My dependency package was missing this line in its setup.py:
packages=['my_dependency'],
That meant the package was correctly building and installing, but it wasn't actually including the code in the package. This became apparent when I looked at the SOURCES.txt in the egg-info: it didn't include any of the Python source files in the package.
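With that line added, the dependency's setup.py would look roughly like this (a sketch assembled from the snippet above; VERSION and the elided fields are placeholders from the question):

from setuptools import setup

setup(
    name='my_dependency',
    version='VERSION',
    description='...',
    author='...',
    url='https://...',
    license='MIT',
    packages=['my_dependency'],  # the missing line: actually include the package code
    install_requires=[
        'numpy',
    ],
    zip_safe=False,
)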

pip install on archive fails to install git dependency

I have a package that depends on the following GitHub project: https://github.com/jbittencourt/python-escpos.git (not to be confused with the python-escpos package that exists on PyPI).
So in my package's requirements.txt I have git+https://github.com/jbittencourt/python-escpos.git#egg=escpos.
When I install my package using either pip install -r requirements.txt or with python setup.py install (which parses the same requirements.txt file for its install_requires parameter) - everything works great.
The problem is when I archive my package into mypackage.tar.gz. After archiving the very same code that worked a second ago, and running pip install mypackage.tar.gz I get the following error:
Collecting escpos (from shopic==1.10.0)
Could not find a version that satisfies the requirement escpos (from shopic==1.10.0) (from versions: )
No matching distribution found for escpos (from shopic==1.10.0)
From playing around with it, I noticed that whatever I type after the "#egg=" part of git+https://github.com/jbittencourt/python-escpos.git#egg=escpos, pip simply tries to fetch that name from PyPI (and fails, in this case). E.g. if I wrote (for some odd reason) git+https://github.com/jbittencourt/python-escpos.git#egg=requests, it would install the requests package.
I don't understand why it works differently when all I do is pack the files in a .tar.gz, and I have no idea how to work around it. I cannot use one of the other installation methods because I have another package that has this one as its dependency, and when I run pip install on that package, it has this tar.gz file as one of its dependencies, and so it fails.
Any ideas why this happens and how to resolve it? Thanks.
Edit: Here's my setup.py for reference:
from setuptools import setup, find_packages
import pip

links = []
requires = []
requirements = pip.req.parse_requirements('requirements.txt', session=pip.download.PipSession())
for item in requirements:
    # we want to handle package names and also repo urls
    if getattr(item, 'url', None):   # older pip has url
        links.append(str(item.url))
    if getattr(item, 'link', None):  # newer pip has link
        links.append(str(item.link))
    if item.req:
        requires.append(str(item.req))

setup(name='mypackage',
      version='1.1.0',
      description='My Package',
      author='EK',
      author_email='me#example.com',
      url='http://www.example.com',
      license='MIT',
      packages=find_packages(),
      package_data={'': ['*.jar']},
      zip_safe=False,
      install_requires=requires,
      dependency_links=links)
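On newer pip and setuptools versions, one way to sidestep the #egg= fragment and dependency_links handling entirely is a PEP 508 direct URL reference in install_requires. This is only a sketch of that approach, not the original poster's code; the package name and repository URL are taken from the question, the rest of the metadata is illustrative:

from setuptools import setup, find_packages

setup(name='mypackage',
      version='1.1.0',
      packages=find_packages(),
      package_data={'': ['*.jar']},
      zip_safe=False,
      install_requires=[
          # direct URL requirement; pip installs escpos straight from GitHub
          'escpos @ git+https://github.com/jbittencourt/python-escpos.git',
      ])

Note that a distribution declaring direct URL requirements cannot be uploaded to PyPI, but for an internal .tar.gz that is installed directly this is usually acceptable.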

Conditionally installing importlib on python2.6

I have a Python library that has a dependency on importlib. importlib is in the standard library in Python 2.7, but it is a third-party package for older Pythons. I typically keep my dependencies in a pip-style requirements.txt. Of course, if I put importlib in there, it will fail when installed on 2.7. How can I conditionally install importlib only when it's not available in the standard library?
I don't think this is possible with pip and a single requirements file. I can think of two options I'd choose from:
Multiple requirements files
Create a base.txt file that contains most of your packages:
# base.txt
somelib1
somelib2
And create a requirements file for Python 2.6:
# py26.txt
-r base.txt
importlib
and one for 2.7:
# py27.txt
-r base.txt
Requirements in setup.py
If your library has a setup.py file, you can check the version of Python, or simply check whether the library already exists, like this:
# setup.py
from setuptools import setup

install_requires = ['somelib1', 'somelib2']

try:
    import importlib
except ImportError:
    install_requires.append('importlib')

setup(
    ...
    install_requires=install_requires,
    ...
)
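A third option worth mentioning (not in the original answer): reasonably recent setuptools and pip understand PEP 508 environment markers, so the condition can live in the requirement string itself rather than in a try/except. A minimal sketch, assuming a setuptools version new enough to support markers; the package name is illustrative:

# setup.py
from setuptools import setup

setup(
    name='mylib',  # illustrative name, not from the question
    install_requires=[
        'somelib1',
        'somelib2',
        # only pulled in on interpreters older than 2.7
        'importlib; python_version < "2.7"',
    ],
)

The same marker syntax also works on a requirements.txt line, which would avoid maintaining separate base/py26/py27 files.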
