The following is my setup.py:
from setuptools import setup, find_packages

packages = find_packages("src")

setup(name='myapp',
      version='0.2.0',
      url='http://loom.st',
      author='Loom',
      author_email='admin@loom.st',
      package_dir={'': 'src'},
      packages=packages,
      )
I built an RPM with the command python setup.py bdist_rpm and got these files:
myapp-0.2.0-1.noarch.rpm
myapp-0.2.0-1.src.rpm
myapp-0.2.0.tar.gz
Why do I have 1 in the RPM file names, and how can I control what appears in its place?
The 1 is called the release number. As you can see in the documentation, when you call setup.py you can pass it the --release option to define the release number, like this:
python setup.py bdist_rpm --release=0
For the same version (0.2.0 in your case) you can have several releases, e.g. because the ABI of some dependency changed and you need to rebuild against the updated dependency, or because you added a security patch. Part of the release number is usually the dist tag, e.g.: myapp-0.2.0-1.el6.noarch.rpm, myapp-0.2.0-1.el5.noarch.rpm. So the ".el5" and ".el6" are in fact part of the release number, and they help describe more precisely which release it actually is: because %{python_sitelib} on el5 is different from the path on el6, the binary RPMs are different.
Release numbers usually start at 1.
You can find more information at https://fedoraproject.org/wiki/Packaging:NamingGuidelines#Release_Tag
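In a hand-written spec file this corresponds to the Release: tag; the usual Fedora convention (described in the guidelines linked above) appends the dist tag macro:
Release: 1%{?dist}
bdist_rpm fills in this field for you, which is what the --release option shown above controls.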
BTW, you will get better results if you use pyp2rpm for generating RPM packages.
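For example, a minimal sketch (the exact command line is an assumption; check pyp2rpm --help for your version):
pyp2rpm myapp > python-myapp.spec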
Related
How do I add dependencies inside a setup.py file? I am writing this script on a VM and want to check whether certain dependencies, like the JDK or Docker, are present or not, and if a dependency is not installed, it should be installed automatically on the VM by this script.
Please do tell me as soon as possible, as it is required for my project.
In its simplest form, you can add (Python) dependencies which can be installed via pip as follows:
from setuptools import setup

setup(
    ...
    install_requires=["install-jdk", "docker>=4.3"],
    ...
)
Alternatively, write a requirements.txt file and then use it:
# read one requirement per line, stripping the trailing newlines
with open("requirements.txt") as requirements_file:
    requirements = [line.strip() for line in requirements_file]

setup(
    ...
    install_requires=requirements,
    ...
)
Whenever you execute python setup.py install, these dependencies are checked against the libraries available in your VM; any that are missing (or have a version mismatch) are installed (or replaced). More information can be found here.
Refer to https://github.com/boto/s3transfer/blob/develop/setup.py and check the requires variables. You can refer to many other open source projects in the same way.
You can add dependencies using setuptools; however, it can only check dependencies on Python packages.
Because of that, you could check the JDK and Docker installations manually, before calling setup().
You could call os.system as in the code below and check the response.
import os

# os.system returns the command's exit status: 0 means the tool is present
jdk_missing = os.system("java -version") != 0
docker_missing = os.system("docker version --format '{{.Server.Version}}'") != 0
I have a packaged project mytools which uses setuptools' setup to store its version in a setup.py project file e.g.
import setuptools

setuptools.setup(
    name='mytools',
    version='0.1.0'
)
I'd like to get the common mytools.__version__ feature based on the version value e.g.
import mytools
mytools.__version__
>>>'0.1.0'
Is there a native / simple way in setuptools to do so? I couldn't find a reference to __version__ in setuptools.
Furthermore, I don't want to store the version in __init__.py because I'd prefer to keep the version in its current place (setup.py).
The many answers to similar questions do not speak to my specific problem, e.g. How can I get the version defined in setup.py (setuptools) in my package?
Adding __version__ to all top-level modules and packages is a recommendation from PEP 396.
Lately I have seen growing concerns raised about this recommendation and its actual usefulness, for example here:
https://gitlab.com/python-devs/importlib_resources/-/issues/100
https://gitlab.com/python-devs/importlib_metadata/-/merge_requests/125
some more that I can't find right now...
With that said...
Such a thing is often solved like the following:
# my_top_level_module/__init__.py
import importlib.metadata
__version__ = importlib.metadata.version('MyProject')
References:
https://docs.python.org/3/library/importlib.metadata.html
https://importlib-metadata.readthedocs.io/en/latest/using.html#distribution-versions
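Note that importlib.metadata entered the standard library in Python 3.8; on older interpreters the importlib-metadata backport provides the same API. A common pattern is a fallback import (a sketch, assuming the backport is declared as a dependency for older Pythons):
# my_top_level_module/__init__.py
try:
    from importlib.metadata import version  # standard library, Python 3.8+
except ImportError:
    from importlib_metadata import version  # third-party backport

__version__ = version('MyProject')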
I am creating an egg file and I am able to do that successfully. However, the value I have provided in description and long_description is not visible.
setup.py
from setuptools import setup, find_packages

description = "desc"
long_description = "long desc"

setup(
    name="abc",
    version="0.2",
    packages=find_packages(),
    description=description,
    long_description=long_description,
    author='Gaurang Shah',
    author_email='gaurang.shah@abc.com'
)
Build script
rm -rf build dist dataplaform.egg-info
python setup.py bdist_egg
After installing the package, when I run the following commands I don't see anything:
import abc
abc.__doc__
You would see description and/or long_description in pip show abc or on the PyPI repository, i.e. in places that refer to the Python project abc.
When you type import abc; print(abc.__doc__) you refer to a Python top level package (or module) abc that coincidentally has been made available by installing the distribution (in this case a bdist_egg) of a project bearing the same name abc.
Python projects and Python packages are not the same thing though. The confusion comes from the fact that it is almost always the case that a Python project contains a single top level package of the same name, and so both are used interchangeably to great confusion. See beautifulsoup4 for a famous counter example.
In your case abc.__doc__ actually refers to the docstring of your abc/__init__.py (or eventually a top level abc.py).
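In other words, if you want abc.__doc__ to show something, give the top level package a docstring (a minimal sketch; the file layout is assumed from your build script):
# abc/__init__.py
"""This docstring is what import abc; abc.__doc__ returns."""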
I am working on a python2 package in which the setup.py contains some custom install commands. These commands actually build some Rust code and output some .dylib files that are moved into the python package.
An important point is that the Rust code is outside the python package.
setuptools is supposed to detect automatically if the python package is pure python or platform specific (if it contains some C extensions for instance).
In my case, when I run python setup.py bdist_wheel, the generated wheel is tagged as a pure python wheel: <package_name>-<version>-py2-none-any.whl.
This is problematic because I need to run this code on different platforms, and thus I need to generate one wheel per platform.
Is there a way, when building a wheel, to force the build to be platform-specific?
Here's the code that I usually look at from uwsgi
The basic approach is:
setup.py
# ...
try:
    from wheel.bdist_wheel import bdist_wheel as _bdist_wheel

    class bdist_wheel(_bdist_wheel):
        def finalize_options(self):
            _bdist_wheel.finalize_options(self)
            self.root_is_pure = False
except ImportError:
    bdist_wheel = None

setup(
    # ...
    cmdclass={'bdist_wheel': bdist_wheel},
)
The root_is_pure bit tells the wheel machinery to build a non-purelib (pyX-none-any) wheel. You can also get fancier by saying there are binary platform-specific components but no cpython abi specific components.
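For the fancier variant, here is a hedged sketch that overrides get_tag, the method wheel's bdist_wheel uses to compute the tag triple (the specific tags returned are an assumption for illustration):
from wheel.bdist_wheel import bdist_wheel as _bdist_wheel

class bdist_wheel(_bdist_wheel):
    def finalize_options(self):
        _bdist_wheel.finalize_options(self)
        self.root_is_pure = False  # contents are platform-specific

    def get_tag(self):
        python, abi, plat = _bdist_wheel.get_tag(self)
        # binary and platform-specific, but not tied to one CPython ABI
        return 'py2.py3', 'none', plat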
The modules setuptools, distutils and wheel decide whether a python distribution is pure by checking if it has ext_modules.
If you build an external module on your own, you can still list it in ext_modules so that the building tools know it exists. The trick is to provide an empty list of sources so that setuptools and distutils will not try to build it. For example,
setup(
    ...,
    ext_modules=[
        setuptools.Extension(
            name='your.external.module',
            sources=[]
        )
    ]
)
This solution worked better for me than patching the bdist_wheel command. The reason is that bdist_wheel calls the install command internally and that command checks again for the existence of ext_modules to decide between purelib or platlib install. If you don't list the external module, you end up with the lib installed in a purelib subfolder inside the wheel. That causes problems when using auditwheel repair, which complains about the extensions being installed in a purelib folder.
You can also specify/spoof a specific platform name when building wheels by specifying a --plat-name:
python setup.py bdist_wheel --plat-name=manylinux1_x86_64
Neither the root_is_pure trick nor the empty ext_modules trick worked for me, but after MUCH searching myself, I finally found a working solution in 'pip setup.py bdist_wheel' no longer builds forced non-pure wheels
Basically, you override the 'has_ext_modules' function in the Distribution class, and set distclass to point to the overriding class. At that point, setup.py will believe you have a binary distribution, and will create a wheel with the specific version of python, the ABI, and the current architecture. As suggested by https://stackoverflow.com/users/5316090/py-j:
from setuptools import setup
from setuptools.dist import Distribution

DISTNAME = "packagename"
DESCRIPTION = ""
MAINTAINER = ""
MAINTAINER_EMAIL = ""
URL = ""
LICENSE = ""
DOWNLOAD_URL = ""
VERSION = '1.2'
PYTHON_VERSION = (2, 7)


# Tested with wheel v0.29.0
class BinaryDistribution(Distribution):
    """Distribution which always forces a binary package with platform name"""
    def has_ext_modules(self):
        return True


setup(name=DISTNAME,
      description=DESCRIPTION,
      maintainer=MAINTAINER,
      maintainer_email=MAINTAINER_EMAIL,
      url=URL,
      license=LICENSE,
      download_url=DOWNLOAD_URL,
      version=VERSION,
      packages=["packagename"],
      # Include pre-compiled extension
      package_data={"packagename": ["_precompiled_extension.pyd"]},
      distclass=BinaryDistribution)
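With this in place, python setup.py bdist_wheel should produce a wheel tagged with the interpreter, ABI, and platform of the build environment, for example something like packagename-1.2-cp27-cp27m-win_amd64.whl (the exact tag depends on where you build).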
I find Anthony Sottile's answer great, but it didn't work for me.
My case is that I have to force the wheel to be created for x86_64 but any Python 3, so making the root impure actually caused my wheel to be py36-cp36 :(
A better way, IMO, in general is just to use sys.argv:
from setuptools import setup
import sys

# append the flag before setup() parses the command line
sys.argv.extend(['--plat-name', 'x86_64'])

setup(name='example-wheel')
SITUATION:
I have a Python library, which is controlled by git and bundled with distutils/setuptools. I want to automatically generate the version number based on git tags, both for setup.py sdist and similar commands, and for the library itself.
For the first task I can use git describe or similar solutions (see How can I get the version defined in setup.py (setuptools) in my package?).
And when, for example, I am at tag '0.1' and call 'setup.py sdist', I get 'mylib-0.1.tar.gz'; or 'mylib-0.1-3-abcd.tar.gz' if I altered the code after tagging. This is fine.
THE PROBLEM IS:
The problem comes when I want to have this version number available for the library itself, so it could send it in the User-Agent HTTP header as 'mylib/0.1-3-adcd'.
If I add a setup.py version command as in How can I get the version defined in setup.py (setuptools) in my package?, then this version.py is generated AFTER the tag is made, since it uses the tag as a value. But in this case I need to make one more commit after the version tag is made to keep the code consistent, which, in turn, requires a new tag for further bundling.
THE QUESTION IS:
How to break this circle of dependencies (generate-commit-tag-generate-commit-tag-...)?
You could also reverse the dependency: put the version in mylib/__init__.py, parse that file in setup.py to get the version parameter, and use git tag $(setup.py --version) on the command line to create your tag.
git tag -a v$(python setup.py --version) -m 'description of version'
Is there anything more complicated you want to do that I haven’t understood?
A classic issue when toying with keyword expansion ;)
The key is to realize that your tag is part of the release management process, not part of the development (and its version control) process.
In other words, you cannot include release management data in a development repository, because of the loop you illustrate in your question.
You need, when generating the package (which is the "release management part"), to write that information in a file that your library will look for and use (if said file exists) for its User-Agent HTTP header.
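As an illustration, here is a minimal hedged sketch of that approach (file and variable names are assumptions, not from the original answer): the release step writes the git-derived version into a generated module, and the library falls back gracefully when that module is absent:
# release step (e.g. run from setup.py): derive the version from git
# and write it into a module that ships inside the package
import subprocess

version_from_git = subprocess.check_output(
    ['git', 'describe', '--tags']).decode().strip()
with open('mylib/_release_version.py', 'w') as f:
    f.write('version = %r\n' % version_from_git)

# at runtime, inside mylib/__init__.py:
try:
    from mylib._release_version import version
except ImportError:
    version = 'unknown'  # plain development checkout, no generated file
USER_AGENT = 'mylib/%s' % version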
Since this topic is still alive and sometimes comes up in search results, I would like to mention another solution, which first appeared in 2012 and is now more or less usable:
https://github.com/warner/python-versioneer
It works in a different way than all the solutions mentioned: you add git tags manually, and the library (and setup.py) reads the tags and builds the version string dynamically.
The version string includes the latest tag, the distance from that tag, the current commit hash, "dirtiness", and some other info. It has a few different version formats.
But it still has no branch name for so-called "custom builds"; and the commit distance can sometimes be confusing when two branches are based on the same commit, so it is better to tag & release only one selected branch (master).
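For illustration, a hedged sketch of the usual setup.py wiring once versioneer is set up in a project (see the project README for the authoritative steps):
from setuptools import setup
import versioneer  # versioneer.py vendored into the project root

setup(
    name='mylib',
    version=versioneer.get_version(),    # built dynamically from git tags
    cmdclass=versioneer.get_cmdclass(),  # hooks sdist/build to bake the version in
)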
Eric's idea was the simple way to go; just in case it is useful, here is the code I used (Flask's team did it this way):
import re
import ast

from setuptools import setup

_version_re = re.compile(r'__version__\s+=\s+(.*)')

# pull the version assignment out of the package's __init__.py
with open('app_name/__init__.py', 'rb') as f:
    version = str(ast.literal_eval(_version_re.search(
        f.read().decode('utf-8')).group(1)))

setup(
    name='app-name',
    version=version,
    .....
)
If you found versioneer excessively convoluted, you can try bump2version.
Just add the simple bumpversion configuration file in the root of your library. This file indicates where in your repository the strings storing the version number live. Then, to update the version in all the indicated places for a minor release, just type:
bumpversion minor
Use patch or major if you want to release a patch or a major.
That is not all there is to bumpversion: there are other flags and config options, such as automatically tagging the repository, for which you can check the official documentation.
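For illustration, a minimal hedged .bumpversion.cfg sketch (the current version and the file list are assumptions; the official documentation covers the full option set):
[bumpversion]
current_version = 0.1.0
commit = True
tag = True

[bumpversion:file:setup.py]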
Following OGHaza's solution in a similar SO question, I keep a file _version.py that I parse in setup.py. With the version string from there, I git-tag in setup.py. Then I set the setup version variable to a combination of the version string plus the git commit hash. Here is the relevant part of setup.py:
from setuptools import setup, find_packages
from codecs import open
from os import path
import subprocess
import re, os

here = path.abspath(path.dirname(__file__))

VERSIONFILE = os.path.join(here, "_version.py")
verstrline = open(VERSIONFILE, "rt").read()
VSRE = r"^__version__ = ['\"]([^'\"]*)['\"]"
mo = re.search(VSRE, verstrline, re.M)
if mo:
    verstr = mo.group(1)
else:
    raise RuntimeError("Unable to find version string in %s." % (VERSIONFILE,))

if os.path.exists(os.path.join(here, '.git')):
    # short hash of the current commit (decoded, newline stripped)
    git_hash = subprocess.check_output(
        ['git', 'rev-parse', '--verify', '--short', 'HEAD']).decode().strip()
    # tag git, unless the tag already exists
    gitverstr = 'v' + verstr
    tags = subprocess.check_output(['git', 'tag']).decode().split()
    if gitverstr not in tags:
        subprocess.check_output(
            ['git', 'tag', '-a', gitverstr, git_hash,
             '-m', 'tagged by setup.py to %s' % verstr])
    # use the git hash in the setup
    verstr += ', git hash: %s' % git_hash

setup(
    name='a_package',
    version=verstr,
    ....
As was mentioned in another answer, this is related to the release process and not to the development process; as such, it is not a git issue in itself, but rather a question of how your release process works.
A very simple variant is to use this:
python setup.py egg_info -b ".`date '+%Y%m%d'`git`git rev-parse --short HEAD`" build sdist
The portion between the quotes is open to customization; however, I tried to follow the typical Fedora/RedHat package names.
Of note: even though egg_info implies a relation to .egg, it is actually used throughout the toolchain, for example for bdist_wheel as well, and it has to be specified at the beginning.
In general, your pre-release and post-release versions should live outside setup.py or any kind of imported version.py. The topic of versioning and egg_info is covered in detail here.
Example:
v1.3.4dev.20200813gitabcdef0
The v1.3.4 part lives in setup.py, or any other place you would like.
The dev and 20200813gitabcdef0 parts are generated during the build process (example above).
None of the files generated during the build are checked into git (they are usually filtered out by default in .gitignore); sometimes there is a separate "deployment" repository, or similar, completely separate from the source one.
A more complex way would be to have your release work process encoded in a Makefile, which is outside the scope of this question; however, a good source of inspiration can be found here and here. You will find good correspondences between Makefile targets and setup.py commands.