How to properly implement single source project versioning in Python?

I love the way CMake allows me to single source version my C/C++ projects, by letting me say:
project(Tutorial VERSION 1.0)
in my CMakeLists.txt file and, then, use placeholders of the form:
#define ver_maj @Tutorial_VERSION_MAJOR@
#define ver_min @Tutorial_VERSION_MINOR@
in my *.h.in file, which, when run through configure_file(), becomes:
#define ver_maj 1
#define ver_min 0
in my equivalent *.h file.
I'm then able to include that file anywhere I need access to my project version numbers.
This is decidedly different from my experience with Python projects.
There, I often find myself rebuilding because I forgot to sync up the version numbers in my pyproject.toml and <module>/__init__.py files.
What is the preferred way to achieve something similar to CMake style single source project versioning in a Python project?

In the __init__.py you can set the __version__ like this:
from importlib.metadata import version as _get_version
# Set the PEP 396 version attribute
__version__ = _get_version('<put-your-package-name-here>')
Or, if your package targets Python versions older than 3.8:
import sys

if sys.version_info >= (3, 8):
    from importlib.metadata import version as _get_version
else:
    from importlib_metadata import version as _get_version
# …
and a package requirement:
importlib-metadata = {version = ">= 3.6", markers="python_version < '3.8'"}
(use the proper syntax for your build tool)
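For example, with a PEP 621 style pyproject.toml (a sketch; it assumes your build backend reads the standard [project] table), the equivalent conditional requirement is:
[project]
dependencies = [
    "importlib-metadata >= 3.6; python_version < '3.8'",
]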
Also, some build-tool plugins (e.g., setuptools-scm or hatch-vcs) let you avoid setting an explicit version in pyproject.toml altogether; the version is derived from the VCS (e.g., from git tags), so the VCS becomes the single source of truth.
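For example, a minimal pyproject.toml using setuptools-scm might look like this (a sketch; "my-package" is a placeholder name, and the empty [tool.setuptools_scm] table is what enables the plugin):
[build-system]
requires = ["setuptools>=64", "setuptools-scm>=8"]
build-backend = "setuptools.build_meta"

[project]
name = "my-package"
dynamic = ["version"]

[tool.setuptools_scm]
With this in place, the version reported by importlib.metadata is computed from the most recent git tag at build time.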

Related

Erroneous SciPy 1.7 source build

When installing SciPy 1.7.1 from source on Linux using
python setup.py build
python setup.py install
(along with environment and site.cfg hacking as needed) I end up with a broken build; my particular build recipe works for SciPy <= 1.6.
Once SciPy 1.7.1 is built, importing e.g. scipy.optimize or scipy.special results in errors such as:
AttributeError: module 'scipy.special._ufuncs_cxx' has no attribute '__pyx_capi__'
ImportError: cannot import name 'levinson' from 'scipy.linalg._solve_toeplitz'
ImportError: cannot import name 'csgraph_to_dense' from 'scipy.sparse.csgraph._tools'
What has changed, and how do I solve this?
Looking at the site-packages directory, I see that SciPy 1.7 installs itself as a zipped Python egg, whereas previous versions installed as directories (though still Python eggs). This behaviour can be chosen by passing the zip_safe argument to setuptools.setup(), which is called from within setup.py. In SciPy 1.7 this is called as
setup(**metadata)
with metadata not including 'zip_safe', meaning that whether zipped eggs are safe to use is determined automatically. This might also be the case for older SciPy versions, but for whatever reason the process ends up declaring zipped eggs safe for 1.7 on my system, which does not seem to be the case.
Manually adding
metadata['zip_safe'] = False
above setup(**metadata) prior to executing setup.py results in the egg being a directory (as opposed to a zipped archive), and the build works.
To patch setup.py programmatically, use e.g. (GNU sed):
sed -i "s/\(^ *\)\(setup *(.*\)$/\1metadata['zip_safe'] = False; \2/" setup.py
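A quick smoke test of the repaired build (assuming the install succeeded) is to import the previously failing submodules:
python -c "import scipy.optimize, scipy.special, scipy.sparse.csgraph; print(scipy.__version__)"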

display __version__ using setuptools.setup values in setup.py

I have a packaged project mytools which uses setuptools' setup() to store its version in its setup.py project file, e.g.:
import setuptools

setuptools.setup(
    name='mytools',
    version='0.1.0',
)
I'd like to get the common mytools.__version__ feature based on that version value, e.g.:
>>> import mytools
>>> mytools.__version__
'0.1.0'
Is there a native/simple way in setuptools to do so? I couldn't find a reference to __version__ in setuptools.
Furthermore, I don't want to store the version in __init__.py because I'd prefer to keep the version in its current place (setup.py).
The many answers to similar questions do not speak to my specific problem, e.g. How can I get the version defined in setup.py (setuptools) in my package?
Adding __version__ to all top-level modules and packages is a recommendation from PEP 396.
Lately I have seen growing concerns raised about this recommendation and its actual usefulness, for example here:
https://gitlab.com/python-devs/importlib_resources/-/issues/100
https://gitlab.com/python-devs/importlib_metadata/-/merge_requests/125
some more that I can't find right now...
With that said...
Such a thing is often solved like the following:
# my_top_level_module/__init__.py
import importlib.metadata
__version__ = importlib.metadata.version('MyProject')
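A slightly more defensive variant (a sketch; 'MyProject' is the distribution name and the fallback string is an arbitrary choice) also covers the case where the code runs from a source checkout without being installed:
# my_top_level_module/__init__.py
from importlib.metadata import PackageNotFoundError, version

try:
    __version__ = version('MyProject')
except PackageNotFoundError:
    # package is not installed, e.g. running from a source tree
    __version__ = '0+unknown'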
References:
https://docs.python.org/3/library/importlib.metadata.html
https://importlib-metadata.readthedocs.io/en/latest/using.html#distribution-versions

cython setuptools change output filename

I am using Cython to cross-compile an external Python module. I am using Python 3.6 on the host and Python 3.5 on the target, and I am compiling on x86_64 for an aarch64 target.
My setup.py looks like:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize
from Cython.Distutils import build_ext
import builder_config
import os
os.environ["PATH"] = builder_config.PATH
os.environ["CC"] = builder_config.COMPILER
os.environ["LDSHARED"] = builder_config.COMPILER + " -lpython3.5m -shared"
os.environ["CFLAGS"] = builder_config.CFLAGS
os.environ["LDFLAGS"] = builder_config.LDFLAGS
os.environ["ARCH"] = "aarch64"
setup(
    ext_modules=cythonize(
        Extension("my_ext", ["file1.pyx", "file2.pyx", "file3.pyx",
                             "file4.pyx", "file5.pyx"])
    ),
)
When I run python3.6 setup.py build_ext -i, I get a file named my_ext.cpython-36m-x86_64-linux-gnu.so.
My problem is that on the target the library will not be loaded unless the name is changed to:
my_ext.cpython-35m-aarch64-linux-gnu.so
How can I change the generated filename?
As stated in the comments, what you are trying to achieve is unsafe.
You can work around the architecture tag with the environment variable _PYTHON_HOST_PLATFORM (e.g. you can change it in your sitecustomize.py). But, if the modules are actually incompatible (and they most likely are), you will only get core dumps later on.
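A minimal sketch of that workaround (the platform string is illustrative, and the exact effect depends on your distutils version; remember that only the file name changes, not the binary's contents):
# sitecustomize.py
import os
# distutils consults this variable when computing the platform tag
os.environ["_PYTHON_HOST_PLATFORM"] = "linux-aarch64"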
I don't think you can work around the major Python version.
To come back to safer ground, I would rely on portable solutions. For example, it doesn't look official, but there are articles on the web about Conda on aarch64 (e.g. search for 'Archiconda'). Again, you wouldn't be able to simply copy conda environments from one machine to another, but you can freeze an environment (via 'conda export') and build a similar one on the target machine.
An option is to upgrade the target interpreter to v3.6 if that's possible for you.
Another option is to install v3.5 on the machine you're building on and build with that interpreter. It's pretty uncomplicated to have several different versions of the Python interpreter installed on the same machine; a quick search will turn up instructions for your platform.
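For instance, one common approach (a sketch; the exact version and prefix are illustrative) is to build the matching interpreter from source as an "altinstall", so it does not shadow the system Python:
wget https://www.python.org/ftp/python/3.5.10/Python-3.5.10.tgz
tar xzf Python-3.5.10.tgz
cd Python-3.5.10
./configure --prefix=/opt/python3.5
make
sudo make altinstall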

Usage of "provides" keyword-argument in python's setup.py

I am working on a fork of a Python project (tryton) which uses setuptools for packaging. I am trying to extend the server part of the project, and would like to be able to use the existing modules with my fork.
Those modules are distributed with setuptools packaging and require the base project for installation.
I need a way to make it so that my fork is considered an acceptable requirement for those modules.
EDIT: Here is what I used in my setup.py:
from setuptools import setup

setup(
    ...
    provides=["trytond (2.8.2)"],
    ...
)
The modules I want to be able to install have those requirements :
from setuptools import setup

setup(
    ...
    install_requires=["trytond>=2.8"],
    ...
)
As it is, with my package installed, trying to install a module triggers the installation of the trytond package.
Don’t use provides; it comes from a packaging specification (a metadata PEP) that is not implemented by any tool. The requirements in the install_requires argument map to the name in the other setup.py. In other words, replace your provides with setup(name='trytond', version='2.8.2').
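That is, the fork's setup.py would look something like this (a sketch; only the name and version arguments matter here):
from setuptools import setup

setup(
    name='trytond',    # same distribution name as the original project
    version='2.8.2',   # satisfies install_requires=["trytond>=2.8"]
    ...
)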
If you are building rpms, it is possible to use the setup.cfg as follows:
[bdist_rpm]
provides = your-package = 0.8
obsoletes = your-package

Distributing pre-built libraries with python modules

I use the following script to distribute a module containing pure python code.
from distutils.core import setup, Extension
import os

setup(
    name='mtester',
    version='0.1',
    description='Python wrapper for libmtester',
    packages=['mtester'],
    package_dir={'mtester': 'module'},
)
The problem I have is that I modified one of the files so that it uses an external library (a .so file), which I now need to ship along with the existing module. It was suggested that I use package_data to include the library, so I modified the script as follows:
from distutils.core import setup, Extension
import os

data_dir = os.path.abspath('../lib64/')

setup(
    name='mtester',
    version='0.1',
    description='Python wrapper for libmtester',
    packages=['mtester'],
    package_dir={'mtester': 'module'},
    package_data={'mtester': [data_dir + 'mhelper.so']},
)
The problem is that adding package_data did not make any difference: mhelper.so is not installed anywhere (neither in site-packages nor in site-packages/mtester).
System info: Fedora 10, 64 bit, python 2.5 (Yes it is ancient. But it is our build machine, and it needs to stay that way to maintain backward compatibility)
Any suggestions that would help me resolve this would be well appreciated!
Unfortunately package_data looks for files relative to the top of the package. One fix is to move the helper library under the module dir with the rest of the code:
% mv lib64/mhelper.so module/
Then modify the package_data argument accordingly:
package_data = {'mtester': ['mhelper.so']}
...
Then test:
% python setup.py bdist
% tar tf dist/mtester-0.1.linux-x86_64.tar.gz | grep mhelper
./usr/local/lib/python2.5/dist-packages/mtester/mhelper.so
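At runtime the bundled library can then be loaded relative to the installed package, for example with ctypes (a sketch; ctypes has been in the standard library since Python 2.5):
# module/__init__.py, installed as mtester/__init__.py
import ctypes
import os

_pkg_dir = os.path.dirname(os.path.abspath(__file__))
mhelper = ctypes.CDLL(os.path.join(_pkg_dir, 'mhelper.so'))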
