I am trying to package my Python project, which comes with a configuration dotfile that I want copied into the user's home directory on installation. The quick guide to packaging says that this can be done using the data_files argument to setuptools.setup. So this is what I have:
data_files = [(os.path.expanduser("~"), [".my_config"])]
This appears to work fine if I use python setup.py install, but when I upload my package to PyPI and run pip install, the dotfile isn't copied.
FWIW, I've put the dotfile in the MANIFEST.in and also tried including the package_data argument to setup. Neither of these steps appears to make a difference. If I pip install and poke around the site-packages directory, just the source files are there.
How can I achieve what I'm looking for?
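For reference, the relevant part of the setup.py described above would look something like this (a sketch; the project name, version and package name are placeholders):

```python
# setup.py -- sketch of the configuration described in the question;
# "myproject" and the version number are placeholders
import os
from setuptools import setup

setup(
    name="myproject",
    version="0.1",
    packages=["myproject"],
    # attempt to copy the dotfile into the user's home directory
    data_files=[(os.path.expanduser("~"), [".my_config"])],
)
```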
This is an issue I once ran into myself. Its root cause is that when a wheel file is built, all absolute paths specified in data_files are relativized to the target site-packages directory; see this issue on GitHub. This affects installations performed by pip install, because pip builds a wheel out of any source package (.tar.gz, .tar.bz2 or .zip) and installs the resulting wheel:
$ pip install spam-0.1.tar.gz
Processing ./spam-0.1.tar.gz
Building wheels for collected packages: spam
Running setup.py bdist_wheel for spam ... done
Stored in directory: /Users/hoefling/Library/Caches/pip/wheels/d0/95/be/bc79f1d589d90d67139481a3e706bcc54578fdbf891aef75c0
Successfully built spam
Installing collected packages: spam
Successfully installed spam-0.1
Checking installed files yields:
$ pip show -f spam
Name: spam
Version: 0.1
...
Location: /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages
Requires:
Files:
Users/hoefling/.my_config
spam-0.1.dist-info/DESCRIPTION.rst
spam-0.1.dist-info/INSTALLER
spam-0.1.dist-info/METADATA
spam-0.1.dist-info/RECORD
spam-0.1.dist-info/WHEEL
spam-0.1.dist-info/metadata.json
spam-0.1.dist-info/top_level.txt
Note that the path meant to be absolute has become relative to the Location dir. In this example, .my_config would be placed under /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages/Users/hoefling/.my_config.
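To illustrate the effect (a sketch only, not pip's actual implementation): the "absolute" target effectively loses its leading slash and gets joined under site-packages:

```python
import os

# Illustration only -- not pip's real code. An "absolute" data_files
# target loses its leading separator when the wheel is built, so at
# install time it is joined under the site-packages directory.
site_packages = "/Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages"
target = "/Users/hoefling"           # what os.path.expanduser("~") gave on that machine
relativized = target.lstrip(os.sep)  # "Users/hoefling"
print(os.path.join(site_packages, relativized, ".my_config"))
```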
It gets even better: these built wheels are cached on your disk, so the next time you reinstall the package and the built wheel still exists in pip's cache, it will be used for the installation and you won't even see a mention of wheel building in the terminal log.
There is no real way to avoid this. The most decent workaround I found is to prohibit "binary" packages when installing, which forces the execution of the package's setup.py on installation:
$ pip install spam-0.1.tar.gz --no-binary=spam
Processing ./spam-0.1.tar.gz
Skipping bdist_wheel for spam, due to binaries being disabled for it.
Installing collected packages: spam
Running setup.py install for spam ... done
Successfully installed spam-0.1
The file is now placed correctly:
$ pip show -f spam
Name: spam
Version: 0.1
...
Location: /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages
Requires:
Files:
../../../../../.my_config
spam-0.1-py3.6.egg-info/PKG-INFO
spam-0.1-py3.6.egg-info/SOURCES.txt
spam-0.1-py3.6.egg-info/dependency_links.txt
spam-0.1-py3.6.egg-info/top_level.txt
Unfortunately, the user must be separately informed about calling pip install with the extra flag (via a readme, a website FAQ or similar), as there is no way to prohibit building the wheel in the package metadata.
As a result, I no longer include files with absolute paths. Instead, I install them alongside the Python sources in the site-packages dir. The Python code then needs additional logic for the existence check and the file copying:
import os
import shutil

# program entrypoint
if __name__ == '__main__':
    config = os.path.join(os.path.expanduser('~'), '.my_config')
    if not os.path.exists(config):
        # copy the template shipped next to this module into the home dir
        template = os.path.join(os.path.dirname(os.path.abspath(__file__)), '.my_config')
        shutil.copyfile(template, config)
    main.run()
Besides what @hoefling said, I suggest not using data_files at all, because it is really unpredictable where the files will be copied to. You can test this by giving the directory something like '', '/', or '/anything/you/want'.
I suggest using package_data instead, which simply copies the files under the distributed package root on installation. You can then copy them anywhere you want at run time.
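A minimal sketch of that approach (the package name my_pkg is a placeholder, and the dotfile is assumed to live inside the my_pkg/ directory):

```python
# setup.py -- sketch; "my_pkg" is a placeholder package name, and
# .my_config is assumed to sit inside the my_pkg/ directory
from setuptools import setup

setup(
    name="my_pkg",
    version="0.1",
    packages=["my_pkg"],
    # install the dotfile next to the package's Python sources,
    # then copy it into place at run time
    package_data={"my_pkg": [".my_config"]},
)
```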
For more on package_data, refer to the Python docs: https://docs.python.org/2/distutils/setupscript.html#installing-package-data
Related
In order to stage a Python project within our corporation I need to make an installable distribution.
This should include:
An egg or whl for my project
An egg or whl for every dependency of the project
(optionally) produce a requirements.txt file listing all the installable components for this release
Is there an easy plugin (e.g. an alternative to bdist_wheel) that will not only compile one wheel but also wheels for that project's dependencies?
Obviously I can script this, but I was hoping there might be a shortcut that builds the package plus its dependencies in fewer steps.
This needs to work on Python 2.7 on Windows + Linux.
You will need to create a setup.py file for your package. Make sure you have the latest setuptools and pip installed. Then run the following:
python setup.py bdist_wheel
This will create a wheel file for your package. This assumes you don't have C/C++ headers, DLLs, etc. If you do, then you'll probably have a lot more work to do.
To get dependencies, you will want to create a requirements.txt file and run the following:
pip wheel -r requirements.txt
If your package isn't on PyPI, then you'll have to manually copy your package's wheel file into the wheel folder that this command creates. For more information see the following excellent article:
http://lucumr.pocoo.org/2014/1/27/python-on-wheels/
With the latest pip and wheel, you can simply run
pip wheel .
within your project folder, even if your application isn't on PyPI. All wheels will be stored in the current directory (.).
To change the output directory (for example, to ./wheels), use the -w / --wheel-dir option:
pip wheel . -w wheels
All available options are listed in the pip documentation.
With poetry you can define your dependencies and metadata about your project in a file in the root of your project, called pyproject.toml:
[tool.poetry]
name = "my-project"
version = "0.1.0"
description = "some longer description"
authors = ["Some Author <some@author.io>"]
[tool.poetry.dependencies]
python = "*"
[tool.poetry.dev-dependencies]
pytest = "^3.4"
To build your project as a wheel, execute poetry build
$ poetry build
Building my-project (0.1.0)
- Building sdist
- Built my-project-0.1.0.tar.gz
- Building wheel
- Built my-project-0.1.0-py3-none-any.whl
A dist/ folder is created with a wheel for your project.
Trying to test editable installs out and I'm not sure how to interpret the results.
I intentionally made a typo in the egg= portion, but pip was still able to locate the egg without any help from me:
root@6be8ee41b6c9:/# pip3 install -e git+https://gitlab.com/jame/clientapp.git
Could not detect requirement name for 'git+https://gitlab.com/jame/clientapp.git', please specify one with #egg=your_package_name
root@6be8ee41b6c9:/# pip3 install -e git+https://gitlab.com/jame/clientapp.git#egg=
Could not detect requirement name for 'git+https://gitlab.com/jame/clientapp.git#egg=', please specify one with #egg=your_package_name
root@6be8ee41b6c9:/# pip3 install -e git+https://gitlab.com/jame/clientapp.git#egg=e
Obtaining e from git+https://gitlab.com/jame/clientapp.git#egg=e
Cloning https://gitlab.com/jame/clientapp.git to /src/e
Running setup.py (path:/src/e/setup.py) egg_info for package e produced metadata for project name clientapp. Fix your #egg=e fragments.
Installing collected packages: clientapp
Found existing installation: ClientApp 0.7
Can't uninstall 'ClientApp'. No files were found to uninstall.
Running setup.py develop for clientapp
Successfully installed clientapp
root@6be8ee41b6c9:/# pip3 freeze
asn1crypto==0.24.0
-e git+https://gitlab.com/jame/clientapp.git@5158712c426ce74613215e61cab8c21c7064105c#egg=ClientApp
cryptography==2.6.1
entrypoints==0.3
keyring==17.1.1
keyrings.alt==3.1.1
pycrypto==2.6.1
PyGObject==3.30.4
pyxdg==0.25
SecretStorage==2.3.1
six==1.12.0
So if I could mess up the egg name that badly, why is it considered an error to leave it blank or set it to something empty?
Hard to answer; maybe raise this as an issue on pip's bug tracker and get an accurate answer from the developers themselves.
My guess: the egg name matters if the project is a dependency of another project, for example in a case where one wants to install A from PyPI and Z from git, but Z is a dependency of A.
pip install 'A' 'git+https://example.local/Z.git#egg=Z'
egg= is the name that's used when uninstalling unpackaged libraries installed from a VCS repository, and the name used by the dependency resolver when searching for dependent packages.
If you don't care about those two use cases, they can essentially be set to anything.
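For example, a requirements file relying on that resolution might look like this (the project names are hypothetical):

```
A
git+https://example.local/Z.git#egg=Z
```

Without the #egg=Z fragment, pip's resolver could not tell that the VCS URL satisfies A's dependency on Z before cloning it.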
it found the egg via setup.py
It didn't find the egg via setup.py; pip found the setup.py and set the egg name for the setup.py install to whatever you specified. When you're installing from a VCS, there is no package, so there's no egg name configured; egg= configures the installation as if it had been installed from a package with that egg name.
How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
There is plenty of information on the basic setup and configuration required:
SetupTools Documentation - setup() and setup.py configuration
Python Packaging User Guide - Installing Packages
PBR v3.1.1 documentation
StackOverflow: How to use version info generated using setuptools and pbr
But where is the simple info on how to actually create the distro?
i.e. I'm looking for whatever command finds the git tag with the version info and pulls it into the configuration info, so the source with that new version info can be distributed, and the version info is discoverable from the scripts, using a method like the one described in this answer.
Additional details
I'm working on a project that will be distributed to other developers only through a git repo, not through PyPI. The project will be released to users as an executable using pyinstaller, so this package distribution will only serve a few key purposes:
Install/Setup the package for other developers so that dependencies/environment can be recreated cleanly.
Manage versioning - Current plan is to use pbr to generate versions from the Git repo tags, so those tags can be our source of truth for versioning
Use pbr for other auto generation of mundane items from Git, such as authors, manifest.in file, release notes, etc.
Since setuptools docs focus on setting up a fully distributable and reusable package with PyPi and pip, and pbr docs only really tell you how to modify setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
I'm sure it exists somewhere in the documentation, but after several false starts I'm asking here. It is implied everywhere I look that everyone either knows how to do this or it just magically happens as a part of the process.
Am I just missing the obvious?
Update:
Based on sinoroc's answer, it appears I need to look into development mode installs. i.e. Anyone developing the project will clone from git, and then install it using setuptools' development install mode.
This wasn't directly a part of the original question, but implied, and I believe will be of interest to people in the same situation (info I couldn't easily find).
More info is available in his answer on updating some of the metadata, and via this setuptools documentation link to working in "Development Mode"
In short:
python3 setup.py sdist
python3 setup.py bdist_wheel
How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
The usual commands to create (source and wheel) distributions of your Python package with setuptools are: python3 setup.py sdist and python3 setup.py bdist_wheel. The distributions can then be found in the dist directory by default.
Since setuptools docs focus on setting up a fully distributable and reusable package with PyPi and pip, and pbr docs only really tell you how to modify setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
It is true that setuptools does not document this. It only documents its differences from distutils, and it is confusing indeed. See below for actual documentation...
But where is the simple info on how to actually create the distro?
https://packaging.python.org/tutorials/packaging-projects/#generating-distribution-archives
https://docs.python.org/3/distutils/sourcedist.html
https://docs.python.org/3/distutils/builtdist.html
Update
Since you don't plan on publishing distributions of your project on an index such as PyPI, and you plan on using pyinstaller instead, then you can indeed most likely disregard the setuptools commands such as sdist and bdist_wheel.
Still you might want to know these commands for the development phase:
Use commands such as python3 setup.py --version, python3 setup.py --fullname to figure out if setuptools (and in your case pbr) is catching the right info.
Use python3 setup.py develop (or pip install --editable .) to place a pseudo link (egg-link) in your site-packages that points at your work in progress. This way your changes are always installed and importable. Important: don't use python3 setup.py install, this would copy the current version to site-packages and newer changes would not be importable.
Now I don't know how all this will work once you move on to pyinstaller. Especially since you mentioned that you want the meta info (such as the version number) to be discoverable from within your scripts. The technique with setuptools pkg_resources may or may not work in the pyinstaller context.
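That pkg_resources technique is roughly the following (a sketch; "my-project" is a placeholder distribution name, and it only works while the package is actually installed, e.g. via python3 setup.py develop):

```python
import pkg_resources

# Look up the version recorded in the installed distribution's metadata.
# "my-project" is a placeholder name; DistributionNotFound is raised when
# the package is not installed (e.g. a plain source checkout, or possibly
# inside a pyinstaller bundle).
try:
    version = pkg_resources.get_distribution("my-project").version
except pkg_resources.DistributionNotFound:
    version = "unknown"
print(version)
```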
This is how I solved the same issue, also having read several different links.
I have created a setup.py file with this content:
from setuptools import setup, find_packages

def readme():
    with open('README.rst') as f:
        return f.read()

def read_other_requirements(other_type):
    with open(other_type + '-requirements.txt') as f:
        return f.read()

setup(
    setup_requires=read_other_requirements('setup'),
    pbr=True,
    packages=find_packages('src'),
    package_dir={'': 'src'},
    include_package_data=True,
    zip_safe=True
)
I have the source code in ./src. Also, I have a setup-requirements.txt, with content:
pip==18.1
pbr==5.1.1
setuptools==40.7.0
And a setup.cfg with this content:
[metadata]
name = XXXXX
description = XXXXX
description-file = README.rst
home-page = https://github.com/XXXXX/XXXXX
So first, you install the setup-requirements:
pip install -r setup-requirements.txt
Then, whenever you have a local commit that was tagged in GitHub, you can install it using:
python setup.py install
and it will be installed with the tagged version.
You can check it by doing:
python setup.py --version
Both of the following commands successfully install my package without error.
pip install git+https://path_to_repo/repo_name.git@v17.8.0
pip install git+https://path_to_repo/repo_name.git@v17.8.0#egg=repo_name
What is the difference?
I'm using pip 7.1.0 and 9.0.1
Working Out the Name and Version
For each candidate item, pip needs to know the project name and version. For wheels (identified by the .whl file extension) this can be obtained from the filename, as per the Wheel spec. For local directories, or explicitly specified sdist files, the setup.py egg_info command is used to determine the project metadata. For sdists located via an index, the filename is parsed for the name and project version (this is in theory slightly less reliable than using the egg_info command, but avoids downloading and processing unnecessary numbers of files).
Any URL may use the #egg=name syntax to explicitly state the project name.
I have a project that has python-xlib as a requirement. Unfortunately python-xlib is not on pypi, so in my requirements file I use:
svn+https://python-xlib.svn.sourceforge.net/svnroot/python-xlib/tags/xlib_0_15rc1/ as per this advice:
How do you install Python Xlib with pip?
This works fine with pip, but I want to package it with setup.py. Only actual eggs are allowed in install_requires, so this answer:
How can I make setuptools install a package that's not on PyPI?
suggests using dependency_links, which I did.
svn+https did not work in a dependency_link, so instead I link to the tarball referenced from this page: http://python-xlib.svn.sourceforge.net/viewvc/python-xlib/tags/xlib_0_15rc1/. This tarball gets downloaded the way I expect it to, but when it is time to install it, I get:
Searching for pyxlib
Best match: pyxlib [unknown version]
Downloading http://python-xlib.svn.sourceforge.net/viewvc/python-xlib/tags/xlib_0_15rc1/?view=tar#egg=pyxlib
Processing xlib_0_15rc1
error: /tmp/easy_install-BDFVH3/xlib_0_15rc1/COPYING: Not a directory
I don't get it. Of course COPYING is not a directory. Why does setuptools (or is it distutils?) not run the setup.py that is in the python-xlib root? I suspect this must all be quite easy. How do I include python-xlib as a dependency for my egg?