Specifying an internal PyPI repository for Python Setup

I've created two libraries, foo and bar, for use in proprietary projects. I have an internal PyPI repository that I publish these libraries to. Additionally, bar depends on foo, so I have added it to my requirements.txt and to the install_requires field in setup.py. Here's an example:
setup(
    name='bar',
    ...
    install_requires=['foo~=1.0.0'],
    dependency_links=[url_to_foo],
)
However, when I try to use bar in my other projects (let's call this one foobar), I get this error:
ERROR: No matching distribution found for foo==1.0.0
Unless I specify url_to_foo in foobar's dependency links as well, like so:
setup(
    name='foobar',
    ...
    install_requires=['bar~=1.0.0'],
    dependency_links=[url_to_bar, url_to_foo],
)
This would be really bad if I had other modules that depend on foobar, as I would have to specify the URLs of all transitive dependencies, and so on.
pip currently has a command-line argument, --extra-index-url, where I can simply specify the URL of the PyPI repository. Is there an equivalent argument I can pass to setuptools' setup() function?
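For reference, this is roughly how I currently point pip at the internal repository, either on the command line or at the top of a requirements file (the index URL below is just a placeholder, not my real repository):
pip install --extra-index-url https://pypi.internal.example.com/simple/ bar
# or, equivalently, in requirements.txt:
--extra-index-url https://pypi.internal.example.com/simple/
bar~=1.0.0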

Related

Python - Creating an egg file: what is the use of description and long_description

I am creating an egg file and I am able to do that successfully. However, the values I have provided in description and long_description are not visible.
setup.py
description = "desc"
long_description = "lond desc"
setup(
name="abc",
version="0.2",
packages=find_packages(),
description=description,
long_description=long_description,
author='Gaurang Shah',
author_email='gaurang.shah#abc.com'
)
Build script
rm -rf build dist dataplaform.egg-info
python setup.py bdist_egg
After installing the package, when I run the following commands, I don't see anything:
import abc
abc.__doc__
You would see description and/or long_description in the output of pip show abc or on the PyPI repository page; basically, in places that refer to the Python project abc.
When you type import abc; print(abc.__doc__) you refer to a Python top-level package (or module) abc that coincidentally has been made available by installing the distribution (in this case a bdist_egg) of a project bearing the same name abc.
Python projects and Python packages are not the same thing, though. The confusion comes from the fact that a Python project almost always contains a single top-level package of the same name, and so the two terms are used interchangeably, to great confusion. See beautifulsoup4 for a famous counter-example.
In your case, abc.__doc__ actually refers to the docstring of your abc/__init__.py (or possibly of a top-level abc.py).
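As a quick way to see where those values actually end up, you can read the installed distribution's metadata rather than the imported module; here is a minimal sketch using the stdlib importlib.metadata (Python 3.8+), assuming the abc distribution from the question is installed:
from importlib.metadata import metadata

meta = metadata("abc")   # metadata of the distribution, not of the imported module
print(meta["Summary"])   # this is where `description` ends up
# Depending on the metadata version, the long description is stored either in
# the "Description" field or in the message body:
print(meta["Description"] or meta.get_payload())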

How to determine the modules available in a PyPI package

Given a PyPI package name, like PyYAML, how can one programmatically
determine the modules available within the package (distribution package) that could be imported?
Detail
I'm not specifically interested in PyYAML; it's just a good example of a popular PyPI package whose package name (PyYAML) differs from its primary module name (yaml), such that you can't easily guess the module name from the package name.
I've seen other answers to questions that sound like this but are different, likely because of a naming collision:
- package, meaning a Python construct allowing for a collection of modules
- package, meaning a "Distribution Package", an archive file that contains Python packages, modules, and other resource files used to distribute a release
My question is about the relationship between distribution packages and the modules within them.
Possible Solution Spaces
Areas that seem like they might be fruitful (but which I've not had success with yet) are:
- The pydoc.help function (surfaced as the help built-in) outputs a complete list of all available modules when called as help('modules'). This shows modules that have not been imported but could be. It outputs in a human-readable form to stdout, and I've been unable to figure out how the pydoc code enumerates the modules. I could imagine calling this, gathering the module list, programmatically installing a new distribution package into a virtualenv with pip, calling it again, and diffing the results (a rough sketch of this idea follows the list).
- Programmatically installing a distribution package with pip in order to iterate through elements of the Python path to find modules.
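Roughly, the "install and diff" idea would look something like this (an untested sketch; it assumes running inside a throwaway virtualenv, and PyYAML is only an example):
import importlib
import pkgutil
import subprocess
import sys

def importable_top_level_names():
    # Top-level modules and packages currently importable from sys.path
    return {info.name for info in pkgutil.iter_modules()}

before = importable_top_level_names()
subprocess.check_call([sys.executable, "-m", "pip", "install", "PyYAML"])
importlib.invalidate_caches()  # let the path finders see the newly installed files
after = importable_top_level_names()
print(sorted(after - before))  # expected to include 'yaml'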
My project johnnydep provides exactly this feature:
$ johnnydep --fields=import_names PyYAML
name import_names
------ --------------
PyYAML yaml
Note that some distributions export multiple top-level names, some distributions export none at all, and there is not necessarily any obvious relationship between the distribution name (used with a pip install command) and the package name (used with an import statement) - though it is a common convention for them to be matched.
For example, the popular project setuptools exposes three top-level names:
$ johnnydep --fields=import_names setuptools
name import_names
---------- ---------------------------------------
setuptools easy_install, pkg_resources, setuptools
API usage is via attribute access:
>>> from johnnydep.lib import JohnnyDist
>>> jdist = JohnnyDist("setuptools")
>>> jdist.import_names
['easy_install', 'pkg_resources', 'setuptools']
If you are interested in submodule names, rather than top-level names, that's possible with the stdlib pkgutil, for example:
>>> import pkgutil, requests
>>> [name for finder, name, ispkg in pkgutil.walk_packages(requests.__path__)]
['__version__',
'_internal_utils',
'adapters',
'api',
'auth',
'certs',
'compat',
'cookies',
'exceptions',
'help',
'hooks',
'models',
'packages',
'sessions',
'status_codes',
'structures',
'utils']
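If a third-party tool is not an option, recent Pythons can also answer the top-level-name question from the standard library alone; this is a sketch assuming Python 3.10+ and that the distribution is already installed:
from importlib.metadata import packages_distributions

# Maps importable top-level names -> the distributions that provide them
mapping = packages_distributions()
print(sorted(name for name, dists in mapping.items() if "PyYAML" in dists))
# e.g. ['_yaml', 'yaml'], depending on the PyYAML version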

Configure setup.py to install requirement from repository URL

I am creating a module and need to prepare my setup.py file to have some requirements. One of the requirements is a fork of a package that is already on PyPI, so I want to reference my GitHub repository directly.
I tried two configurations, the first one is:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement',  # The dependency name
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement',  # This is my repository location
    ]
)
I create a local distribution of my module using python setup.py sdist, and when I run pip install path/to/module/dist/mymodule-0.1.tar.gz it ends up installing the version from PyPI and not from my repository.
In the other configuration, I tried changing the requirement name to force searching for a dependency link, like so:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement_alt',  # The dependency name with a suffix
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt',  # This is my repository location
    ]
)
But with this, I only end up getting an error that myrequirement_alt is not found...
So I ask, what is the right way to achieve this without using PyPI?
For dependency links to work you need to add the version number of the package to https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt, or pip will not know what to install.
e.g.:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement',  # The dependency name
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement-1.3'  # Link with the version at the end
    ]
)
Note that I wouldn't recommend using dependency links at all, as they're deprecated. You should probably use requirements files instead.
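For completeness: with recent pip and setuptools, the usual replacement for dependency_links is a PEP 508 direct URL reference in install_requires. Here is a sketch using the repository URL from the question (note that a distribution declaring such a dependency cannot be uploaded to PyPI):
from setuptools import setup

setup(
    name='mymodule',
    # other arguments
    install_requires=[
        # direct reference: <name> @ <VCS URL>
        'myrequirement @ git+https://github.com/ihhcarus/myrequirement.git',
    ],
)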

Usage of "provides" keyword-argument in python's setup.py

I am working on a fork of a Python project (tryton) which uses setuptools for packaging. I am trying to extend the server part of the project, and would like to be able to use the existing modules with my fork.
Those modules are distributed with setuptools packaging and require the base project for installation.
I need a way to make it so that my fork is considered an acceptable requirement for those modules.
EDIT: Here is what I used in my setup.py:
from setuptools import setup

setup(
    ...
    provides=["trytond (2.8.2)"],
    ...
)
The modules I want to be able to install have these requirements:
from setuptools import setup

setup(
    ...
    install_requires=["trytond>=2.8"],
    ...
)
As it is, with my package installed, trying to install a module triggers the installation of the trytond package.
Don't use provides; it comes from a packaging specification (a metadata PEP) that is not implemented by any tool. The requirements in the install_requires argument map to the name in your other setup.py. In other words, replace your provides with setup(name='trytond', version='2.8.2').
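In practice, the fork's setup.py would then look something like this minimal sketch, keeping the upstream distribution name so the modules' trytond>=2.8 requirement resolves to the fork:
from setuptools import setup

setup(
    name='trytond',    # same distribution name as the project being forked
    version='2.8.2',   # satisfies the modules' trytond>=2.8 requirement
    # ... the fork's packages and other metadata ...
)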
If you are building rpms, it is possible to use the setup.cfg as follows:
[bdist_rpm]
provides = your-package = 0.8
obsoletes = your-package

How to prepend a path to a buildout-generated script

Scripts generated by zc.buildout using zc.recipe.egg in our <package>/bin/ directory look like this:
#! <python shebang> -S
import sys
sys.path[0:0] = [
... # some paths derived from the eggs
... # some other paths included with zc.recipe.egg `extra-path`
]
# some user initialization code from zc.recipe.egg `initialization`
# import function, call function
What I have not been able to do is find a way to programmatically prepend a path to the sys.path construction introduced in every script. Is this possible?
Why: I have a version of my python project installed globally and another version of it installed locally (off-buildout tree). I want to be able to switch between these two versions.
Note: Clearly, one can use the zc.recipe.egg/initialization property to add something like:
initialization = sys.path[0:0] = ['/add/path/to/my/eggs']
But, is there any other way? Extra points for an example!
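For concreteness, the initialization note above would be wired into a zc.recipe.egg part roughly like this (the path is a placeholder):
[python]
recipe = zc.recipe.egg
interpreter = python
eggs = ${buildout:eggs}
initialization =
    sys.path[0:0] = ['/add/path/to/my/eggs']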
Finally, I got a working environment by creating my own buildout recipe, which you can find here: https://github.com/idiap/local.bob.recipe. The file that contains the recipe is this one: https://github.com/idiap/local.bob.recipe/blob/master/config.py. There are lots of checks in the class constructor which are specific to our software, and some extra improvements as well, but don't get bothered with that. The "real meat (TM)" is in the install() method of that class. It goes like this, more or less:
# Write an egg-link file that points buildout at the externally built package.
egg_link = os.path.join(self.buildout['buildout']['eggs-directory'], 'external-package.egg-link')
with open(egg_link, 'wt') as f:
    f.write(self.options['install-directory'] + '\n')
self.options.created(egg_link)
return self.options.created()
This will do the trick. My external (CMake-based) package now only has to create the right .egg-info file in parallel with the Python package(s) it builds. Then, I can tie in the usage of a specific package installation, using the above recipe, like this:
[buildout]
parts = external_package python
develop = .
eggs = my_project
       external_package
       recipe.as.above
[external_package]
recipe = recipe.as.above:config
install-directory = ../path/to/my/local/package/build
[python]
recipe = zc.recipe.egg
interpreter = python
eggs = ${buildout:eggs}
If you wish to switch installations, just change the install-directory property above. If you wish to use the default installation available system-wide, just remove the recipe.as.above constructions from your buildout.cfg file altogether. Buildout will then find the global installation without requiring any extra configuration. Uninstallation will work properly as well, so switching between builds will just work.
Here is a fully working buildout .cfg file that we use here: https://github.com/idiap/bob.project.example/blob/master/localbob.cfg
The question is: Is there an easier way to achieve the same w/o having this external recipe?
Well, what you are missing is probably the most useful buildout extension, mr.developer.
Typically the package, let's say foo.bar, will be in some repo, let's say git.
Your buildout will look like this:
[buildout]
extensions = mr.developer
[sources]
foo.bar = git git@github.com:foo/foo.bar.git
If you don't have your package in a repo, you can use fs instead of git; have a look at the documentation for details.
Activating the "local" version is done by
./bin/develop a foo.bar
Deactivating by
./bin/develop d foo.bar
There are quite a few other things you can do with mr.developer, do check it out!
