Multiple installation configurations for a Python package

I'm working on creating a Python package that is somewhat modular in nature, and I was wondering: what's the best way to handle multiple installation configurations?
Take the following setup.py for the package of a simple document parser:
setup(
    name = 'my_document_parser',
    ...
    entry_points = {
        'my_document_parser.parsers': [
            'markdown = my_document_parser.parsers.misaka:Parser [misaka]',
            'textile = my_document_parser.parsers.textile:Parser [textile]'
        ]
    },
    install_requires = [
        'some_required_package'
    ],
    extras_require = {
        'misaka': ['misaka'],
        'textile': ['textile']
    },
    ...
)
I'd like to make something like the following available:
A default install
python setup.py install or pip install my_document_parser
installs my_document_parser, some_required_package, misaka
A bare install
something like python setup.py install --bare
installs my_document_parser, some_required_package
A way to enable other dependencies
something like python setup.py install --bare --with-textile
installs my_document_parser, some_required_package, textile
This is my first time messing around with Python packages from a developer standpoint, so I'm open to anything anyone has to offer if I'm going about things the wrong way.
Thanks.

The default installation will install everything in your install_requires and packages sections. Packages listed under extras_require will never be installed by your setup.py script unless you add custom logic for that kind of installation.
By default, setuptools and distutils have no way to explicitly say "install using these sections of my setup call"; you would need to implement that yourself.
pip always runs setup.py with no parameters other than --single-version-externally-managed, which tells setuptools never to create a .egg file.
Conclusion: if you want people to install your package using pip, forget --bare and --with-textile. The best you can do is a full installation including misaka and textile as dependencies.
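For what it's worth, pip can opt into extras_require groups at install time using the bracket syntax, which is the closest equivalent to --with-textile (assuming the package is published under this name):
pip install my_document_parser                     # core dependencies only
pip install 'my_document_parser[textile]'          # core plus the textile extra
pip install 'my_document_parser[misaka,textile]'   # core plus both extras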
Maybe you should read The Hitchhiker's Guide To Packaging, specifically Creating a Package section and setuptool's specification -- it is a pretty good one, it covers everything setuptools can do.

Python Setuptools and PBR - how to create a package release using the git tag as the version?

How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
There is plenty of information on the basic setup and configuration required:
SetupTools Documentation - setup() and setup.py configuration
Python Packaging User Guide - Installing Packages
PBR v3.1.1 documentation
StackOverflow: How to use version info generated using setuptools and pbr
But where is the simple info on how to actually create the distro?
I.e. I'm looking for whatever command finds the git tag with the version info and pulls it into the configuration, so that the source can be distributed with that version info, and the version is discoverable from the scripts using a method like the one described in this answer.
Additional details
I'm working on a project that will be distributed to other developers only through a git repo, not through PyPI. The project will be released to users as an executable using pyinstaller, so this package distribution will only serve a few key purposes:
Install/Setup the package for other developers so that dependencies/environment can be recreated cleanly.
Manage versioning - Current plan is to use pbr to generate versions from the Git repo tags, so those tags can be our source of truth for versioning
Use pbr for other auto-generation of mundane items from Git, such as authors, the MANIFEST.in file, release notes, etc.
Since the setuptools docs focus on setting up a fully distributable and reusable package with PyPI and pip, and the pbr docs only really tell you how to modify a setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
I'm sure it exists somewhere in the documentation, but after several false starts I'm asking here. It is implied everywhere I look that everyone either knows how to do this or it just magically happens as a part of the process.
Am I just missing the obvious?
Update:
Based on sinoroc's answer, it appears I need to look into development mode installs, i.e. anyone developing the project will clone from git and then install using setuptools' development install mode.
This wasn't directly a part of the original question, but implied, and I believe will be of interest to people in the same situation (info I couldn't easily find).
More info is available in his answer on updating some of the metadata, and via this setuptools documentation link to working in "Development Mode"
In short:
python3 setup.py sdist
python3 setup.py bdist_wheel
How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
The usual commands to create (source and wheel) distributions of your Python package with setuptools are: python3 setup.py sdist and python3 setup.py bdist_wheel. The distributions can then be found in the dist directory by default.
Since the setuptools docs focus on setting up a fully distributable and reusable package with PyPI and pip, and the pbr docs only really tell you how to modify a setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
It is true that setuptools does not document this. It only documents the differences from distutils, and it is indeed confusing. See below for actual documentation...
But where is the simple info on how to actually create the distro?
https://packaging.python.org/tutorials/packaging-projects/#generating-distribution-archives
https://docs.python.org/3/distutils/sourcedist.html
https://docs.python.org/3/distutils/builtdist.html
Update
Since you don't plan on publishing distributions of your project on an index such as PyPI, and you plan on using pyinstaller instead, then you can indeed most likely disregard the setuptools commands such as sdist and bdist_wheel.
Still you might want to know these commands for the development phase:
Use commands such as python3 setup.py --version, python3 setup.py --fullname to figure out if setuptools (and in your case pbr) is catching the right info.
Use python3 setup.py develop (or pip install --editable .) to place a pseudo link (egg-link) in your site-packages that points at your work in progress. This way your changes are always installed and importable. Important: don't use python3 setup.py install; this would copy the current version to site-packages, and newer changes would not be importable.
Now I don't know how all this will work once you move on to pyinstaller. Especially since you mentioned that you want the meta info (such as the version number) to be discoverable from within your scripts. The technique with setuptools pkg_resources may or may not work in the pyinstaller context.
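For reference, the pkg_resources technique mentioned here usually looks something like this (a minimal sketch; the distribution name my_project is illustrative):
import pkg_resources

try:
    # Query the installed distribution's metadata for its version string
    __version__ = pkg_resources.get_distribution('my_project').version
except pkg_resources.DistributionNotFound:
    # Running from a source tree that was never installed
    __version__ = 'unknown'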
This is how I solved the same issue, also having read several different links.
I have created a setup.py file with this content:
from setuptools import setup, find_packages

def readme():
    with open('README.rst') as f:
        return f.read()

def read_other_requirements(other_type):
    with open(other_type + '-requirements.txt') as f:
        return f.read()

setup(
    setup_requires=read_other_requirements('setup'),
    pbr=True,
    packages=find_packages('src'),
    package_dir={'': 'src'},
    include_package_data=True,
    zip_safe=True
)
I have the source code in ./src. Also, I have a setup-requirements.txt, with content:
pip==18.1
pbr==5.1.1
setuptools==40.7.0
And a setup.cfg with this content:
[metadata]
name = XXXXX
description = XXXXX
description-file = README.rst
home-page = https://github.com/XXXXX/XXXXX
So first, you install the setup-requirements:
pip install -r setup-requirements.txt
Then, whenever you have a local commit that was tagged in GitHub, you can install it using:
python setup.py install
and it will be installed with the tagged version.
You can check it by doing:
python setup.py --version
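To sanity-check the tag-to-version flow, something like this should work (assuming pbr's default behavior of deriving the version from the most recent reachable tag, and that HEAD sits exactly on the tag):
git tag -a 1.2.3 -m "release 1.2.3"
python setup.py --version    # should now report 1.2.3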

How to install test requirements from setup.py [duplicate]

We've made a library which depends on other libraries. But there are necessary dependencies (e.g. for server batch processing) and optional ones (e.g. for clients with a GUI).
Is something like this possible:
pip install mylib.tar.gz # automatically downloads and installs with the minimal set of dependencies
pip install mylib.tar.gz --install-option="complete" # automatically installs with all dependencies
I've found the extras_require keyword, but how can I tell pip to use it? The setup.py looks like this:
from setuptools import setup
# ...
# Hard library dependencies:
requires = [
    "numpy>=1.4.1",
    "scipy>=0.7.2",
    "traits>=3.4.0"
]
# Soft library dependencies:
recommended = {
    "mpl": ["matplotlib>=0.99.3"],
    "bn": ["bottleneck>=0.6"]
}
# ...
# Installer parameters:
setup(
    name = "mylib",
    #...
    install_requires = requires,
    extras_require = recommended
)
You can install the packages in extras_require by appending the name of the recommended dependency in square brackets (i.e. [mpl] or [bn] in your case) to the package name in pip.
So to install 'mylib' with the additional requirements, you would call pip like this:
pip install 'mylib[mpl]'
pip install 'mylib[bn]'
This will first download and install the extra dependencies, and then mylib's core dependencies.
This is analogous to how you declare those dependencies with setuptools: http://pythonhosted.org/setuptools/setuptools.html#declaring-extras-optional-features-with-their-own-dependencies (see the install_requires value in the third example)
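If you want a single switch that installs everything, one common pattern is to define an extra that aggregates the optional groups (a sketch; the extra name "complete" is illustrative):
# Soft library dependencies:
recommended = {
    "mpl": ["matplotlib>=0.99.3"],
    "bn": ["bottleneck>=0.6"]
}
# Aggregate every optional group into a single "complete" extra
recommended["complete"] = sorted(
    dep for deps in recommended.values() for dep in deps
)
Users can then run pip install 'mylib[complete]' to get all optional dependencies in one go.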
So pip is actually quite picky about installing libraries with extra requirements:
pip install -e ".[extra,requirements]" # works with file paths
pip install "package[extra,requirements]" # works when downloading packages
pip install ".[extra,requirements]" # DOES NOT WORK
I think this is down to how the RequirementsSpec parser works, and pip does some extra magic with the -e flag. Anyhow, after much head banging, here's a mildly ugly workaround:
pip install "file:///path/to/your/python_code#egg=SomeName[extra,requirements]"
The egg=SomeName part is basically ignored, but pip correctly picks up the extra requirements
Caveats
Tested with pip 1.5.6, so make sure you're using a current version of pip.
As far as I can tell, the file:/// syntax is undocumented in pip, so I'm not sure if it'll change in the future. It looks a bit like the VCS Support syntax but I was a bit surprised it worked.
You could also get around this by running your own pypi server, but that's a bit out of scope.

How to include library dependencies with a python project?

I'm working on a Python program which a few other people are going to be using. They all have Python installed, but there are several libraries that this program imports from.
I would ideally like to be able to simply send them the Python files and have them just run it, instead of having to tell them each of the libraries they have to install and ask them to get each one manually using pip.
Is there any way I can include all the libraries that my project uses with my Python files (or perhaps set up an installer that installs it for them)? Or do I just have to give them a full list of python libraries they must install and get them to do each one manually?
That's the topic of the Python packaging tutorial.
In brief, you pass them as install_requires parameter to setuptools.setup().
You can then generate a package in many formats including wheel, egg, and even a Windows installer package.
Using the standard packaging infrastructure will also give you the benefit of easy version management.
You can build a package and specify dependencies in your setup.py. This is the right way:
http://python-packaging.readthedocs.io/en/latest/dependencies.html
from setuptools import setup

setup(name='funniest',
      version='0.1',
      description='The funniest joke in the world',
      url='http://github.com/storborg/funniest',
      author='Flying Circus',
      author_email='flyingcircus@example.com',
      license='MIT',
      packages=['funniest'],
      install_requires=[
          'markdown',
      ],
      zip_safe=False)
The need-it-now hack option is a library that lets your script drive the shell itself; pexpect is my favorite for automating shell interactions.
https://pexpect.readthedocs.io/en/stable/
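A minimal sketch of that hack (assuming pexpect is installed and pip is on the PATH; the package name markdown is just an example):
import pexpect

# Run pip through a pseudo-terminal and capture output plus exit status
output, exit_status = pexpect.run('pip install markdown', withexitstatus=True)
if exit_status != 0:
    raise RuntimeError('dependency install failed:\n' + output.decode())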
You should use virtualenv to manage the packages of your project. If you do, you can then run pip freeze > requirements.txt to save all dependencies of the project. After this, requirements.txt will contain all the requirements necessary to run your app. Add this file to your repository. All packages from requirements.txt can be installed with
pip install -r requirements.txt
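The round trip looks something like this (a sketch; the environment and package names are illustrative):
python -m virtualenv venv
source venv/bin/activate
pip install numpy scipy           # whatever your project needs
pip freeze > requirements.txt     # snapshot the exact versions
pip install -r requirements.txt   # later, on another machine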
The other option would be to create a PyPI package. You can find a tutorial on this topic here

Is there a way to run 2to3 with pip install?

I'm trying to maintain dependencies using pip install -r requirements.txt. However, some of the required packages do not support Python 3 directly, but can be converted manually using 2to3.
Is there a way to force pip to run 2to3 on those packages automagically when doing pip install -r requirements.txt?
No, it needs to be part of the package setup configuration instead. See Supporting both Python 2 and 3 with Distribute.
You add metadata to your package installer:
setup(
    name='your.module',
    version='1.0',
    description='This is your awesome module',
    author='You',
    author_email='your@email',
    package_dir={'': 'src'},
    packages=['your', 'your.module'],
    test_suite='your.module.tests',
    use_2to3=True,
    convert_2to3_doctests=['src/your/module/README.txt'],
    use_2to3_fixers=['your.fixers'],
    use_2to3_exclude_fixers=['lib2to3.fixes.fix_import'],
)
Such a package would then automatically run 2to3 on installation into a Python 3 system.
2to3 is a tool, not a magic bullet; you cannot apply it to an arbitrary package pip downloads from PyPI. The package needs to support it in the way it is coded. Thus, running it automatically from pip is not going to work; the responsibility lies with the package maintainer.
Note that just because 2to3 runs successfully on a package, it does not necessarily follow that the package will work in Python 3. Assumptions about bytes vs. unicode usually crop up when you actually start using the package.
Contact the maintainers of the packages you are interested in and ask what the status is for that package for Python 3. Supplying patches to them usually helps. If such requests and offers for help fall on deaf ears, for Open Source packages you can always fork them and apply the necessary changes yourself.

Including Bash autocompletion with setuptools

I've got a couple packages in PyPI, and I'd like to include autocompletion features with both of them. How would you check that Bash autocompletion should be installed at all (check for /etc/bash_completion, maybe?), and how would you install it with setup.py (preferably using setuptools)?
If you're going to require OS-level packages (e.g. bash-completion), then you should distribute your library as an OS-level package, that is, as a .deb, .rpm, etc. Some tips here:
Debian New Maintainer's Guide
Fedora Packaging Guidelines
As part of the package generation, you can call your setuptools script to install the Python code. To ensure bash-completion is installed, you can specify it is a required package.
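For example, in a Debian package the dependency on bash-completion would be declared in the Depends field of debian/control (an illustrative sketch; the package name is hypothetical):
Package: python3-mylib
Depends: python3, bash-completion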
You can use the data_files option:
from setuptools import setup

setup(
    ...
    data_files=[
        ('/etc/bash_completion.d/', ['extra/some_completion_script']),
    ]
)
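One caveat: absolute paths in data_files generally require the install to run with write access to /etc, and wheel-based installs do not reliably honor them, so this approach is best suited to traditional root installs from an sdist.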
