setup.py command to show install_requires?

Is there any way to get the list of packages specified in the install_requires parameter to the setup command (in setup.py)?
I'm looking for something similar to pip show pkgname | grep -i requires, but for local packages (and that reports version specifiers and filters).
The real task I'm solving is to check whether the versions specified in setup.py and requirements.txt have diverged; if there is a tool that can do this directly, even better.

For installed packages, use stdlib importlib.metadata.requires. Python 3.8+ is required, but there is a backport importlib-metadata for older Python versions. This is the easy case, because installed packages have already generated the metadata (including dependency specs) at installation time. For local source trees which aren't necessarily installed, read on.
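For the installed case, a minimal sketch ("requests" here is just a stand-in for any installed distribution name):

import importlib.metadata

# requires() returns the raw dependency specifiers recorded in the
# installed package's metadata, or None if it declares no dependencies.
for spec in importlib.metadata.requires("requests") or []:
    print(spec)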
There is a way to do this in Python APIs via distutils (which setuptools wraps):
import distutils.core
dist = distutils.core.run_setup("setup.py")
for req in dist.install_requires:
    print(req)
As a shell command that might look like:
python -c 'import distutils.core; print(*distutils.core.run_setup("setup.py").install_requires, sep="\n")'
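For the divergence check itself, a rough sketch (this assumes the third-party packaging library is installed and that requirements.txt contains plain requirement lines, no -r/-e options):

# Sketch: compare specifiers declared in setup.py against requirements.txt.
from distutils.core import run_setup
from packaging.requirements import Requirement  # third-party 'packaging'

setup_reqs = {r.name: r.specifier
              for r in map(Requirement, run_setup("setup.py").install_requires)}

with open("requirements.txt") as f:
    lines = [ln.strip() for ln in f if ln.strip() and not ln.startswith("#")]

for req in map(Requirement, lines):
    declared = setup_reqs.get(req.name)
    if declared is not None and str(declared) != str(req.specifier):
        print(f"{req.name}: setup.py has '{declared}', "
              f"requirements.txt has '{req.specifier}'")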
It is also possible to use the CLI to generate egg-info and then read deps from that:
python setup.py egg_info
cat myproject.egg-info/requires.txt
However, distutils is deprecated and setuptools is trying to be a library only, so using setup.py in this way is quite neglected.
You might consider moving to declarative dependencies specified in pyproject.toml and then just reading the requirements directly with a simple TOML parser.
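For example, with PEP 621-style metadata (tomllib is stdlib from Python 3.11; on older versions the third-party tomli package is the usual substitute):

import tomllib  # Python 3.11+; 'import tomli as tomllib' on older versions

with open("pyproject.toml", "rb") as f:
    pyproject = tomllib.load(f)

# PEP 621 keeps the dependency list under the [project] table
for dep in pyproject["project"]["dependencies"]:
    print(dep)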

Related

Python Setuptools and PBR - how to create a package release using the git tag as the version?

How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
There is plenty of information on the basic setup and configuration required:
SetupTools Documentation - setup() and setup.py configuration
Python Packaging User Guide - Installing Packages
PBR v3.1.1 documentation
StackOverflow: How to use version info generated using setuptools and pbr
But where is the simple info on how to actually create the distro?
i.e. I'm looking for whatever command finds the git tag with the version info and pulls it into the configuration info, so the source with that new version info can be distributed, and the version info is discoverable from the scripts, using a method like described in this answer.
Additional details
I'm working on a project that will be distributed to other developers only through a git repo, not through PyPI. The project will be released to users as an executable using pyinstaller, so this package distribution will only serve a few key purposes:
Install/Setup the package for other developers so that dependencies/environment can be recreated cleanly.
Manage versioning - Current plan is to use pbr to generate versions from the Git repo tags, so those tags can be our source of truth for versioning
Use pbr for other auto generation of mundane items from Git, such as authors, manifest.in file, release notes, etc.
Since the setuptools docs focus on setting up a fully distributable and reusable package with PyPI and pip, and the pbr docs only really tell you how to modify a setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
I'm sure it exists somewhere in the documentation, but after several false starts I'm asking here. It is implied everywhere I look that everyone either knows how to do this or it just magically happens as a part of the process.
Am I just missing the obvious?
Update:
Based on sinoroc's answer, it appears I need to look into development mode installs, i.e. anyone developing the project will clone it from git and then install it using setuptools' development install mode.
This wasn't directly part of the original question, but it was implied, and I believe it will be of interest to people in the same situation (info I couldn't easily find).
More info is available in his answer on updating some of the metadata, and via this setuptools documentation link on working in "Development Mode".
In short:
python3 setup.py sdist
python3 setup.py bdist_wheel
How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
The usual commands to create (source and wheel) distributions of your Python package with setuptools are: python3 setup.py sdist and python3 setup.py bdist_wheel. The distributions can then be found in the dist directory by default.
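(These days the PyPA build frontend is the usual alternative to invoking setup.py directly: pip install build once, then python3 -m build, which writes both the sdist and the wheel to dist/.)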
Since the setuptools docs focus on setting up a fully distributable and reusable package with PyPI and pip, and the pbr docs only really tell you how to modify a setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
It is true that setuptools does not document this. It only documents the differences from distutils, and that is indeed confusing. See below for the actual documentation...
But where is the simple info on how to actually create the distro?
https://packaging.python.org/tutorials/packaging-projects/#generating-distribution-archives
https://docs.python.org/3/distutils/sourcedist.html
https://docs.python.org/3/distutils/builtdist.html
Update
Since you don't plan on publishing distributions of your project on an index such as PyPI, and you plan on using pyinstaller instead, then you can indeed most likely disregard the setuptools commands such as sdist and bdist_wheel.
Still you might want to know these commands for the development phase:
Use commands such as python3 setup.py --version, python3 setup.py --fullname to figure out if setuptools (and in your case pbr) is catching the right info.
Use python3 setup.py develop (or pip install --editable .) to place a pseudo link (egg-link) in your site-packages that points at your work in progress. This way your changes are always installed and importable. Important: don't use python3 setup.py install; that would copy the current version to site-packages, and newer changes would not be importable.
Now I don't know how all this will work once you move on to pyinstaller. Especially since you mentioned that you want the meta info (such as the version number) to be discoverable from within your scripts. The technique with setuptools pkg_resources may or may not work in the pyinstaller context.
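For reference, the pkg_resources technique usually looks like this ("myproject" is a placeholder for your distribution name; whether the metadata survives into a pyinstaller bundle is exactly the open question):

import pkg_resources

# Reads the version from the installed metadata; raises
# DistributionNotFound if the project isn't installed.
print(pkg_resources.get_distribution("myproject").version)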
This is how I solved the same issue, also having read several different links.
I have created a setup.py file with this content:
from setuptools import setup, find_packages

def readme():
    with open('README.rst') as f:
        return f.read()

def read_other_requirements(other_type):
    with open(other_type + '-requirements.txt') as f:
        return f.read()

setup(
    setup_requires=read_other_requirements('setup'),
    pbr=True,
    packages=find_packages('src'),
    package_dir={'': 'src'},
    include_package_data=True,
    zip_safe=True
)
I have the source code in ./src. Also, I have a setup-requirements.txt, with content:
pip==18.1
pbr==5.1.1
setuptools==40.7.0
And a setup.cfg with this content:
[metadata]
name = XXXXX
description = XXXXX
description-file = README.rst
home-page = https://github.com/XXXXX/XXXXX
So first, you install the setup-requirements:
pip install -r setup-requirements.txt
Then, whenever you have a local checkout of a commit that was tagged in GitHub, you can install it using:
python setup.py install
and it will be installed with the tagged version.
You can check it by doing:
python setup.py --version
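As a hypothetical end-to-end check (the tag value is just an example):

git tag -a 1.2.3 -m "release 1.2.3"
python setup.py --version    # pbr should now report 1.2.3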

How to tell setuptools that package should be installed in platform-specific directory (e.g. /usr/lib64)?

I am attempting to change how the Python bindings for a "large" C++ software project (i.e. the bindings are a minor part) are shipped, so that the bindings are no longer installed as bare, metadata-free .so files in site-packages. The bindings' .so files are already built as part of the rest of the software's mess of cmake instructions, and I'm adding instructions and the proper module structure (i.e. <module>/__init__.py) so that setuptools can be in charge of the install step. In setup.py, I have the prebuilt .so files included as package_data, as was done in the related question Distributing pre-built libraries with python modules. And when cmake && make && make install gets around to invoking python setup.py build and python setup.py install --root=${CMAKE_INSTALL_PREFIX}, everything technically works. Hooray.
The problem is that python setup.py install ... drops everything under {PREFIX}/usr/lib/pythonN.M/site-packages even though I've included binaries built for 64-bit arch in the package data. I'm struggling to figure out how to get setuptools to nicely install to, e.g. for Linux, /usr/lib64. I could possibly add some cmake logic to figure out what the libdir should be and pass it in as --install-lib to setup.py install, but it seems like the right thing to do is somehow make setuptools aware, during the install step, that the contents of the package are platform-specific and to set the install location accordingly.
I assume this is largely because I'm including the bindings as package data rather than building them inside the setup script as extension modules. Is there some way to tell setuptools that a package without explicit ext_modules is platform-specific so that python setup.py install places files under /usr/lib64/... instead of /usr/lib/...?
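(One workaround I've seen suggested, sketched below and untested in my setup: subclass Distribution so setuptools believes the package contains extension modules, which makes it install to the platform-specific platlib directory.)

from setuptools import setup
from setuptools.dist import Distribution

class BinaryDistribution(Distribution):
    """Force setuptools to treat the package as platform-specific."""
    def has_ext_modules(self):
        return True

setup(
    # ... name, packages, package_data with the prebuilt .so files ...
    distclass=BinaryDistribution,
)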

Specifying additional dependencies in setup.py script based on implementation (PyPy / CPython support)

Preface
I have a package with PyPy support and for CPython users it has mypy as an additional dependency which I specify as
import platform

from setuptools import setup

...

install_requires = [...]
if platform.python_implementation() != 'PyPy':
    install_requires.append('mypy>=0.630')
setup(...,
      install_requires=install_requires)
and locally it works fine, but when I create source distribution via CPython like
> python setup.py sdist
and try to install it via PyPy
> pypy3 -m pip install path/to/package.tar.gz
it tries to install mypy (and fails, since mypy uses CPython-specific packages), so it looks like the dependencies are fixed to those of the CPython version the distribution was created with.
Problem
How can I specify dependencies and create source distribution once so it will work for both CPython & PyPy versions (and upload to PyPI afterwards)?
Your current script tests the platform at build time, and not at install time.
What you need to use is not the platform module, but the environment markers defined in PEP 508 (here platform_python_implementation, which matches what platform.python_implementation() returns):
from setuptools import setup
...
install_requires = [...,
                    'mypy>=0.630; platform_python_implementation != "PyPy"']
setup(...,
      install_requires=install_requires)
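You can sanity-check a marker in any interpreter with the third-party packaging library (an optional aside, not something setuptools requires):

from packaging.markers import Marker

# True under CPython, False under PyPy
print(Marker('platform_python_implementation != "PyPy"').evaluate())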
References:
install_requires based on python version
Building and Distributing Packages with Setuptools — setuptools 40.5.0 documentation

How to "include" third party modules into my python script to make it portable?

I've developed a little script that searches through an online wallpaper database and downloads the wallpapers. I want to give this script to another person who isn't exactly good with computers, and since I'm kind of a beginner with Python I don't know how to include the "imports" of the third-party modules in my program so it can be 100% portable. Is there something that can help me do this, or will I have to go through my third-party modules and copy & paste the functions that I use?
Worse thing to do
An easy thing you can do is simply bundle the other modules in with your code. That does not mean that you should copy/paste the functions from the other modules into your code--you should definitely not do that, since you don't know what dependencies you'll be missing. Your directory structure might look like:
/myproject
mycode.py
thirdpartymodule1.py
thirdpartymodule2.py
thirdpartymodule3/
<contents>
Better thing to do
The real best way to do this is to include a list of dependencies (usually called requirements.txt) in your Python package, which Python's package installer, pip, can use to download and install them automatically. Since that might be a little too complicated, you could give your friend these instructions, assuming Mac or Linux:
Run $ curl http://python-distribute.org/distribute_setup.py | python. This provides you with tools you'll need to install the package manager.
Run $ curl https://raw.github.com/pypa/pip/master/contrib/get-pip.py | python. This installs the package manager.
Give your friend a list of the names of the third-party Python modules you used in your code. For purposes of this example, we'll say you used requests, twisted, and boto.
Your friend should run from command line $ pip install <list of package names>. In our example, that would look like $ pip install requests twisted boto.
Run the Python code! The lines like import boto should then work, since your friend will have the packages installed on their computer.
The easi(er) way:
Start with a clean virtual environment.
Install packages that you need to develop your code.
Once you are done, create a list of requirements for your project.
Send this file (from step 3) to your friend.
Your friend simply does pip install -r thefile.txt to get all the requirements for your application.
Here's an example:
D:\>virtualenv --no-site-packages myproject
The --no-site-packages flag is deprecated; it is now the default behavior.
New python executable in myproject\Scripts\python.exe
Installing setuptools................done.
Installing pip...................done.
D:\>myproject\Scripts\activate.bat
(myproject) D:\>pip install requests
Downloading/unpacking requests
Downloading requests-0.14.1.tar.gz (523Kb): 523Kb downloaded
Running setup.py egg_info for package requests
warning: no files found matching 'tests\*.'
Installing collected packages: requests
Running setup.py install for requests
warning: no files found matching 'tests\*.'
Successfully installed requests
Cleaning up...
(myproject) D:\>pip freeze > requirements.txt
(myproject) D:\>type requirements.txt
requests==0.14.1

using setuptools with post-install and python dependencies

This is somewhat related to this question. Let's say I have a package that I want to deploy via rpm because I need to do some file copying on post-install and I have some non-python dependencies I want to declare. But let's also say I have some python dependencies that are easily available in PyPI. It seems like if I just package as an egg, an unzip followed by python setup.py install will automatically take care of my python dependencies, at the expense of losing any post-install functionality and non-python dependencies.
Is there any recommended way of doing this? I suppose I could specify this in a pre-install script, but then I'm getting into information duplication and not really using setuptools for much of anything.
(My current setup involves passing install_requires = ['dependency_name'] to setup, which works for python setup.py bdist_egg and unzip my_package.egg; python my_package/setup.py install, but not for python setup.py bdist_rpm --post-install post-install.sh and rpm --install my_package.rpm.)
I think it would be best if your python dependencies were available as RPMs also, and declared as dependencies in the RPM. If they aren't available elsewhere, create them yourself, and put them in your yum repository.
Running PyPI installations as a side effect of RPM installation is evil, as it won't support proper uninstallation (i.e. uninstalling your RPM will remove your package, but leave the dependencies behind, with no proper removal procedure).
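That said, if you stay with bdist_rpm, its options can at least keep the RPM-level dependencies and the post-install script in one place, e.g. in setup.cfg (a sketch; the RPM package names are hypothetical, distro-specific ones):

[bdist_rpm]
requires = python-requests
           python-boto
post-install = post-install.sh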
