Distribute a Python package without publishing it on PyPI

In order to stage a Python project within our corporation, I need to make an installable distribution.
This should include:
An egg or whl for my project
An egg or whl for every dependency of the project
(optionally) produce a requirements.txt file listing all the installable components for this release
Is there an easy plug-in (e.g. an alternative to bdist_wheel) that will build not only a wheel for my project but also wheels for its dependencies?
Obviously I can script this, but I was hoping that there might be a short-cut that builds the package + dependencies in fewer steps.
This needs to work on Python 2.7 on Windows + Linux.

You will need to create a setup.py file for your package. Make sure you have the latest setuptools and pip installed. Then run the following:
python setup.py bdist_wheel
This will create a wheel file for your package. This assumes you don't have C/C++ headers, DLLs, etc. If you do, then you'll probably have a lot more work to do.
To get dependencies, you will want to create a requirements.txt file and run the following:
pip wheel -r requirements.txt
If your package isn't on PyPI, then you'll have to manually copy your package's wheel file into the wheel folder that this command creates. For more information see the following excellent article:
http://lucumr.pocoo.org/2014/1/27/python-on-wheels/

With the latest pip and wheel, you can simply run
pip wheel .
within your project folder, even if your application isn't on PyPI. All the wheels will be stored in the current directory (.).
To change the output directory (for example, to ./wheels), use the -w / --wheel-dir option:
pip wheel . -w wheels
All the options available are listed at the pip documentation.

With poetry you can define your dependencies and metadata about your project in a file in the root of your project, called pyproject.toml:
[tool.poetry]
name = "my-project"
version = "0.1.0"
description = "some longer description"
authors = ["Some Author <some@author.io>"]
[tool.poetry.dependencies]
python = "*"
[tool.poetry.dev-dependencies]
pytest = "^3.4"
To build your project as a wheel, execute poetry build:
$ poetry build
Building my-project (0.1.0)
- Building sdist
- Built my-project-0.1.0.tar.gz
- Building wheel
- Built my-project-0.1.0-py3-none-any.whl
A dist/ folder is created with an sdist and a wheel for your project.

Related

Python Setuptools and PBR - how to create a package release using the git tag as the version?

How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
There is plenty of information on the basic setup and configuration required:
SetupTools Documentation - setup() and setup.py configuration
Python Packaging User Guide - Installing Packages
PBR v3.1.1 documentation
StackOverflow: How to use version info generated using setuptools and pbr
But where is the simple info on how to actually create the distro?
i.e. I'm looking for whatever command finds the git tag with the version info and pulls it into the configuration, so that the source can be distributed with that new version info, and the version is discoverable from the scripts using a method like the one described in this answer.
Additional details
I'm working on a project that will be distributed to other developers only through a git repo, not through PyPi. The project will be released to users as an executable using pyinstaller, so this package distribution will only serve a few key purposes:
Install/Setup the package for other developers so that dependencies/environment can be recreated cleanly.
Manage versioning - Current plan is to use pbr to generate versions from the Git repo tags, so those tags can be our source of truth for versioning
Use pbr for other auto-generation of mundane items from Git, such as authors, the MANIFEST.in file, release notes, etc.
Since the setuptools docs focus on setting up a fully distributable and reusable package with PyPI and pip, and the pbr docs only really tell you how to modify the setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
I'm sure it exists somewhere in the documentation, but after several false starts I'm asking here. It is implied everywhere I look that everyone either knows how to do this or it just magically happens as a part of the process.
Am I just missing the obvious?
Update:
Based on sinoroc's answer, it appears I need to look into development mode installs. i.e. Anyone developing the project will clone it from git and then install it using setuptools' development mode.
This wasn't directly a part of the original question, but implied, and I believe will be of interest to people in the same situation (info I couldn't easily find).
More info is available in his answer on updating some of the metadata, and via this setuptools documentation link to working in "Development Mode"
In short:
python3 setup.py sdist
python3 setup.py bdist_wheel
How do I actually create a release/distro of a python package that uses a git repo tag for the versioning, using setuptools and pbr?
The usual commands to create (source and wheel) distributions of your Python package with setuptools are: python3 setup.py sdist and python3 setup.py bdist_wheel. The distributions can then be found in the dist directory by default.
Since the setuptools docs focus on setting up a fully distributable and reusable package with PyPI and pip, and the pbr docs only really tell you how to modify the setuptools configuration to use pbr, I can't find the info on how to just run the distribution/release process.
It is true that setuptools does not document this. It only documents the differences from distutils, and it is confusing indeed. See below for the actual documentation...
But where is the simple info on how to actually create the distro?
https://packaging.python.org/tutorials/packaging-projects/#generating-distribution-archives
https://docs.python.org/3/distutils/sourcedist.html
https://docs.python.org/3/distutils/builtdist.html
Update
Since you don't plan on publishing distributions of your project on an index such as PyPI, and you plan on using pyinstaller instead, then you can indeed most likely disregard the setuptools commands such as sdist and bdist_wheel.
Still you might want to know these commands for the development phase:
Use commands such as python3 setup.py --version and python3 setup.py --fullname to figure out whether setuptools (and in your case pbr) is picking up the right info.
Use python3 setup.py develop (or pip install --editable .) to place a pseudo link (egg-link) in your site-packages that points at your work in progress. This way your changes are always installed and importable. Important: don't use python3 setup.py install, this would copy the current version to site-packages and newer changes would not be importable.
Now I don't know how all this will work once you move on to pyinstaller. Especially since you mentioned that you want the meta info (such as the version number) to be discoverable from within your scripts. The technique with setuptools pkg_resources may or may not work in the pyinstaller context.
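That pkg_resources technique can be sketched as follows. This is a minimal example, not the answer's own code: "setuptools" merely stands in for your real package name, since any installed distribution works the same way.

```python
# Minimal sketch: read an installed distribution's version at run time.
# "setuptools" is only a stand-in for your own package name.
import pkg_resources

def get_version(dist_name):
    try:
        return pkg_resources.get_distribution(dist_name).version
    except pkg_resources.DistributionNotFound:
        # e.g. the metadata did not survive into a frozen bundle
        return None

print(get_version("setuptools"))
```

Whether the distribution metadata survives into a pyinstaller bundle depends on how the bundle is built, which is why the answer hedges on this point.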
This is how I solved the same issue, also having read several different links.
I have created a setup.py file with this content:
from setuptools import setup, find_packages

def readme():
    with open('README.rst') as f:
        return f.read()

def read_other_requirements(other_type):
    with open(other_type + '-requirements.txt') as f:
        return f.read()

setup(
    setup_requires=read_other_requirements('setup'),
    pbr=True,
    packages=find_packages('src'),
    package_dir={'': 'src'},
    include_package_data=True,
    zip_safe=True
)
I have the source code in ./src. Also, I have a setup-requirements.txt, with content:
pip==18.1
pbr==5.1.1
setuptools==40.7.0
And a setup.cfg with this content:
[metadata]
name = XXXXX
description = XXXXX
description-file = README.rst
home-page = https://github.com/XXXXX/XXXXX
So first, you install the setup-requirements:
pip install -r setup-requirements.txt
Then, whenever you have locally a commit which was tagged in GitHub, you can install it using:
python setup.py install
and it will be installed with the tagged version.
You can check it by doing:
python setup.py --version

How to read the requirements from a setup.py file

I'm making a utility for the Python mobile app Pythonista. It is basically just a version of pip, which is not supported by default (Yes I am aware one already exists, I am making mine differently for personal use).
In packages, there exists a setup.py, but not always a requirements.txt. How can I read the file and find the dependencies from it? Alternatively, how would I fetch the dependencies? I know it should be possible, because pip itself finds the dependencies, and they don't always have a requirements.txt.
So how would I get the dependencies of a package by the setup.py, or however pip does it?
A solution not using setup.py or pip:
You can try the pipreqs package.
pipreqs - Generate requirements.txt file for any project based on imports
Another option is to use pip-tools:
The pip-compile command lets you compile a requirements.txt file from your dependencies, specified in either setup.py or requirements.in.

Python Re-packaging an Existing Package for Distribution

I'm using Python 3.6.3 on Windows 7 Enterprise and when I tried to pip install the Python package "bitarray", the output indicated the need for Microsoft Visual C++ Build Tools. I downloaded and installed the build tools and installed bitarray with no problems.
Here's where the problem comes in: I now need to distribute bitarray to other employees within the company who don't have Microsoft Visual C++ Build Tools installed, but do have Python installed (and can use pip).
Can I just simply "re-package" the bitarray folder in "C:\Python363\Lib\site-packages\bitarray" (which contains the already compiled .pyd file) and just make it a local package? This way I can use pip with "file:///" to pull down a local copy of the package without the need for the build tools step?
Also, do I need to incorporate the information in the folder "C:\Python363\Lib\site-packages\bitarray-0.8.1.dist-info" to re-package?
Thanks in advance for any help!
Scott
Instead of trying to work around the already installed package, why not build a distribution from source yourself? After all, you've already done the hardest part by setting up the C compiler; the rest is just a sequence of commands you have to type. This is what you can do:
1. Clone bitarray's repository:
$ git clone https://github.com/ilanschnell/bitarray
2. Navigate into the cloned repository:
$ cd bitarray
3. Check out the version tag you want to build (the latest one is 0.8.1):
$ git checkout 0.8.1
4. Ensure you have wheel installed to be able to build a static wheel:
$ pip install wheel
5. Build the static wheel:
$ python setup.py bdist_wheel
6. A new directory dist has been created in the current one; check what's inside:
$ ls dist
bitarray-0.8.1-cp36-cp36m-macosx_10_6_intel.whl
(Note: This is what I would enter on my system, list the directory with dir on Windows, also your file should be either bitarray-0.8.1-cp36-cp36m-win_amd64.whl if you are building on a 64 bit system, or bitarray-0.8.1-cp36-cp36m-win32.whl on a 32 bit one).
Now you have built a static wheel that contains the C extensions compiled for Python 3.6 on Windows. It can be installed on Windows without needing to setup the C compiler on the target machine. Just enter
$ pip install bitarray-0.8.1-cp36-cp36m-win_amd64.whl
Note, however, that this wheel file can be installed only on Windows and only with Python 3.6. Should you need to provide a wheel for another setup (like Python 3.5 on Windows 32 bit), you would need to build another wheel file using the correct Python version on a correct target system, but the steps would be just the same.
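As a rough sketch of where those filename tags come from, you can derive the interpreter and platform tags a wheel built on the current machine would carry. This is simplified for illustration; the real tag computation (see pip's vendored packaging.tags) handles more cases, such as the ABI tag and manylinux platforms.

```python
import sys
import sysconfig

# Simplified sketch of the tags a wheel built here would carry.
py_tag = "cp{}{}".format(sys.version_info.major, sys.version_info.minor)
plat_tag = sysconfig.get_platform().replace("-", "_").replace(".", "_")
print(py_tag, plat_tag)  # e.g. "cp36 win_amd64" on 64-bit Windows with Python 3.6
```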
Building without Git
If you don't have Git installed and you can't or don't want to install it, just download the zipped repository from GitHub, unzip it, navigate to the extracted directory and perform steps 4-6.

Copy configuration file on installation

I am trying to package my Python project, which comes with a configuration dotfile that I want copied into the user's home directory on installation. The quick guide to packaging says that this can be done using the data_files argument to setuptools.setup. So this is what I have:
data_files = [(os.path.expanduser("~"), [".my_config"])]
This appears to work fine if I use python setup.py install, but when I upload my package to PyPI and run pip install, the dotfile isn't copied.
FWIW, I've put the dotfile in MANIFEST.in and also tried including the package_data argument to setup. None of these steps appears to make a difference. If I pip install and poke around the site-packages directory, only the source files are there.
How can I achieve what I'm looking for?
This is an issue I once experienced myself. Its root cause is that when a wheel file is built, all the absolute paths specified in data_files are relativized to the target site-packages directory; see this issue on GitHub. This influences installations performed by pip install, as it will build a wheel out of any source package (.tar.gz, .tar.bz2 or .zip) and install the resulting wheel:
$ pip install spam-0.1.tar.gz
Processing ./spam-0.1.tar.gz
Building wheels for collected packages: spam
Running setup.py bdist_wheel for spam ... done
Stored in directory: /Users/hoefling/Library/Caches/pip/wheels/d0/95/be/bc79f1d589d90d67139481a3e706bcc54578fdbf891aef75c0
Successfully built spam
Installing collected packages: spam
Successfully installed spam-0.1
Checking installed files yields:
$ pip show -f spam
Name: spam
Version: 0.1
...
Location: /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages
Requires:
Files:
Users/hoefling/.my_config
spam-0.1.dist-info/DESCRIPTION.rst
spam-0.1.dist-info/INSTALLER
spam-0.1.dist-info/METADATA
spam-0.1.dist-info/RECORD
spam-0.1.dist-info/WHEEL
spam-0.1.dist-info/metadata.json
spam-0.1.dist-info/top_level.txt
Note the path meant to be absolute is relative to the Location dir. In the example, .my_config would be placed under /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages/Users/hoefling/.my_config.
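The relativization described above is easy to reproduce: the leading slash is stripped from the data_files target and the remainder is joined under site-packages. A sketch, using the paths from the example (posixpath keeps the demo platform-independent):

```python
import posixpath

# Sketch of how an "absolute" data_files target lands under site-packages
# when installed from a wheel (paths taken from the example above).
site = "/Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages"
target = "/Users/hoefling/.my_config"
installed = posixpath.join(site, target.lstrip("/"))
print(installed)
```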
It gets even better because these built wheels are cached on your disk, so next time you reinstall the package and the built wheel still exists in pip's cache, it will be used for the installation and you won't even see any mentions of building a wheel in the terminal log.
There is no real solution to avoid this. The most decent workaround I found is to prohibit "binary" packages when installing, to enforce the execution of the package's setup.py on installation:
$ pip install spam-0.1.tar.gz --no-binary=spam
Processing ./spam-0.1.tar.gz
Skipping bdist_wheel for spam, due to binaries being disabled for it.
Installing collected packages: spam
Running setup.py install for spam ... done
Successfully installed spam-0.1
The file is now placed correctly:
$ pip show -f spam
Name: spam
Version: 0.1
...
Location: /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages
Requires:
Files:
../../../../../.my_config
spam-0.1-py3.6.egg-info/PKG-INFO
spam-0.1-py3.6.egg-info/SOURCES.txt
spam-0.1-py3.6.egg-info/dependency_links.txt
spam-0.1-py3.6.egg-info/top_level.txt
Unfortunately, the user must be informed separately about calling pip install with the extra flag (via a readme, a webpage FAQ or similar), as there is no way to prohibit building the wheel in the package metadata.
As a result, I no longer include files with absolute paths. Instead, I install them alongside the Python sources in the site-packages dir. In the Python code, I add additional logic for the existence checks and file copying where necessary:
import os
import shutil

import main  # the application's own module providing run()

# program entrypoint
if __name__ == '__main__':
    config = os.path.join(os.path.expanduser('~'), '.my_config')
    if not os.path.exists(config):
        shutil.copyfile('.my_config', config)
    main.run()
Besides what @hoefling said, I suggest not using data_files at all, because it's really unpredictable where the files will be copied to. You can test this by giving the directory something like '', '/', or '/anything/you/want'.
I suggest you use package_data instead, which just copies the files under the distributed package root on installation. Then you can copy that to anywhere you want at run time.
For more on package_data, refer to the Python docs: https://docs.python.org/2/distutils/setupscript.html#installing-package-data
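The run-time copy this approach implies can be sketched as follows. The package directory and the .my_config filename are assumptions for illustration; the demo substitutes temporary directories for the real package dir and home dir so it runs anywhere.

```python
import os
import shutil
import tempfile

def ensure_config(package_dir, home):
    # Copy the bundled default config into the user's home, if missing.
    src = os.path.join(package_dir, ".my_config")
    dst = os.path.join(home, ".my_config")
    if not os.path.exists(dst):
        shutil.copyfile(src, dst)
    return dst

# Self-contained demo: temp dirs stand in for the installed package
# directory and the user's home directory.
pkg = tempfile.mkdtemp()
home = tempfile.mkdtemp()
with open(os.path.join(pkg, ".my_config"), "w") as f:
    f.write("key = value\n")
path = ensure_config(pkg, home)
print(open(path).read())  # -> key = value
```

In a real package you would derive package_dir from os.path.dirname(__file__) inside the package itself.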

What is the difference between pip installing a git repo with and without #egg=

Both of the following commands successfully install my package without error.
pip install git+https://path_to_repo/repo_name.git@v17.8.0
pip install git+https://path_to_repo/repo_name.git@v17.8.0#egg=repo_name
What is the difference?
I'm using pip 7.1.0 and 9.0.1
Working Out the Name and Version
For each candidate item, pip needs to know the project name and
version. For wheels (identified by the .whl file extension) this can
be obtained from the filename, as per the Wheel spec. For local
directories, or explicitly specified sdist files, the setup.py
egg_info command is used to determine the project metadata. For sdists
located via an index, the filename is parsed for the name and project
version (this is in theory slightly less reliable than using the
egg_info command, but avoids downloading and processing unnecessary
numbers of files).
Any URL may use the #egg=name syntax to explicitly state the project name.
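The wheel-filename convention the quote refers to (name-version-pythontag-abitag-platformtag.whl) can be sketched with a simplified parser. This is illustrative only: the real spec (PEP 427) also allows an optional build tag and normalizes distribution names.

```python
def parse_wheel_filename(filename):
    # Simplified: assumes no optional build tag (PEP 427 allows one).
    stem = filename[:-len(".whl")]
    name, version, py_tag, abi_tag, plat_tag = stem.rsplit("-", 4)
    return {"name": name, "version": version,
            "python": py_tag, "abi": abi_tag, "platform": plat_tag}

info = parse_wheel_filename("bitarray-0.8.1-cp36-cp36m-win_amd64.whl")
print(info["name"], info["version"])  # -> bitarray 0.8.1
```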
