Using pyproject.toml with flexible version from datetime - python

We version all our company packages with a simple datetime version. Now we are considering moving to pyproject.toml instead of setup.py. Is it possible to do flexible versioning there as well?
import datetime
from setuptools import setup, find_namespace_packages

version = datetime.datetime.now().strftime('%Y.%m.%d.%H%M')

# Actual setup
setup(
    name="some-package",
    version=version,
    description='Some description',
    packages=find_namespace_packages(where='src', include=['company.project.*']),
    package_dir={'': 'src'},
    python_requires='>=3.6',
    install_requires=[
        'numpy',
        'numba',
    ],
)
What syntax do I need to adjust the versioning in pyproject.toml? The example below uses Poetry, but Poetry is not a requirement.
[tool.poetry]
name = "some-package"
version = "0.1.0"
description = ""
readme = "README.md"

Poetry does not seem to support that, see issue #4299 for example.
But Flit does. Flit allows us to declare the version as "dynamic" in pyproject.toml:
[project]
name = 'some_package'
dynamic = ['version']
description = 'Description of the package.'

[build-system]
requires = ['flit_core>=3.2,<4']
build-backend = 'flit_core.buildapi'
The version is then determined by the package's __version__ attribute. For example, __init__.py could contain this:
import datetime
__version__ = datetime.datetime.now().strftime('%Y.%m.%d.%H%M')
Note, however, that in your specific case, when building the package with flit build, Flit will observe that the version number does not comply with PEP 440 and normalize it accordingly, i.e. strip leading zeros from the month, day, and time segments.
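If that normalization is a concern, the timestamp can be generated in already-normalized form. A minimal sketch (the helper name pep440_datetime_version is my own invention):

```python
import datetime

def pep440_datetime_version(now=None):
    """Date-based version string that is already PEP 440-normalized:
    no leading zeros in the month, day, or time segments."""
    now = now or datetime.datetime.now()
    # int() drops the leading zero that strftime('%H%M') would keep
    return f"{now.year}.{now.month}.{now.day}.{int(now.strftime('%H%M'))}"

# 2023-04-05 08:03 becomes '2023.4.5.803', which matches what PEP 440
# normalization would produce for '2023.04.05.0803'
```

This way the version written into the metadata is identical to the version pip reports after installation.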

Related

Python's distribution build won't recognise the version number in setup.cfg

I have a library on PyPI, but I've just come across a problem building the distribution for my updated version using
python3 -m build
This generates the source archive and the built distribution in the dist/ folder.
My Setup
Following this tutorial, which outlines the need for a setup.cfg and a pyproject.toml file: https://packaging.python.org/en/latest/tutorials/packaging-projects/
I have the following setup.cfg file (notice version = 1.2.6):
[metadata]
name = disagree
version = 1.2.6
author = Oliver Price
author_email = op.oliverprice@gmail.com
description = Visual and statistical assessment of annotator agreements
long_description = file: README.md
long_description_content_type = text/markdown
url = https://github.com/o-P-o/disagree/
classifiers =
    Programming Language :: Python :: 3
    License :: OSI Approved :: MIT License
    Operating System :: OS Independent

[options]
package_dir =
    = disagree
packages = find:
python_requires = >=3.6

[options.packages.find]
where = disagree
and the following pyproject.toml file:
[build-system]
requires = [
    "setuptools >= 42",
    "scipy >= 1.6.0",
    "pandas >= 1.4.2",
    "tqdm >= 4.51.0",
]
build-backend = "setuptools.build_meta"
as well as an empty dist/ folder. These are the only three files, plus the package itself, named disagree.
Problem
Once this builds, my dist/ folder keeps getting populated with source archives and built distributions for version 1.2.5, which is not specified anywhere in the files in this directory. So I end up with the following in dist/:
disagree-1.2.5-py3-none-any.whl
disagree-1.2.5.tar.gz
Has anyone come across this before? Am I doing something completely stupid?

Configure setup.py to install requirement from repository URL

I am creating a module and need to prepare my setup.py file to have some requirements. One of the requirements is a fork of a package that is already on PyPI, so I want to reference my GitHub repository directly.
I tried two configurations, the first one is:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement',  # The dependency name
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement',  # This is my repository location
    ],
)
I create a local distribution of my module using python setup.py sdist and when I run pip install path/to/module/dist/mymodule-0.1.tar.gz it ends up installing the version on PyPI and not my repository.
For the other configuration, I tried changing the requirement name to force searching for a dependency link, like so:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement_alt',  # The dependency name with a suffix
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt',  # This is my repository location
    ],
)
But with this, I only end up getting an error that myrequirement_alt is not found...
So I ask, what is the right way to achieve this without using PyPI?
For dependency links to work you need to add the version number of the package to https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt, or pip will not know what to install.
e.g.:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement',  # The dependency name
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt-1.3'  # Link with version at the end
    ],
)
Note that I wouldn't recommend using dependency links at all, as they are deprecated. You should probably use requirements files instead.
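With any reasonably recent pip, a PEP 508 direct reference inside install_requires replaces dependency_links entirely. A sketch using the question's names (the URL is the asker's repository, assumed installable as-is):

```python
# '<name> @ <url>' is the PEP 508 direct-reference form; pip installs
# straight from the URL instead of consulting PyPI for that name.
install_requires = [
    'myrequirement @ git+https://github.com/ihhcarus/myrequirement.git',
]

# The part before ' @ ' must match the name in the dependency's own setup.py.
```

Unlike the dependency_links approach, this needs no renaming tricks to keep pip from preferring the PyPI release.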

How to write setup.py to include a Git repository as a dependency

I am trying to write setup.py for my package. My package needs to specify a dependency on another Git repository.
This is what I have so far:
from setuptools import setup, find_packages

setup(
    name='abc',
    packages=find_packages(),
    url='https://github.abc.com/abc/myabc',
    description='This is a description for abc',
    long_description=open('README.md').read(),
    install_requires=[
        "requests==2.7.0",
        "SomePrivateLib>=0.1.0",
    ],
    dependency_links=[
        "git+git://github.abc.com/abc/SomePrivateLib.git#egg=SomePrivateLib",
    ],
    include_package_data=True,
)
When I run:
pip install -e https://github.abc.com/abc/myabc.git#egg=analyse
I get
Could not find a version that satisfies the requirement
SomePrivateLib>=0.1.0 (from analyse) (from versions: ) No matching
distribution found for SomePrivateLib>=0.1.0 (from analyse)
What am I doing wrong?
After digging through pip issue 3939 linked by @muon in the comments above, and then the PEP 508 specification, I found success getting my private repo dependency to install via setup.py using this specification pattern in install_requires (no more dependency_links):

install_requires=[
    'some-pkg @ git+ssh://git@github.com/someorgname/pkg-repo-name@v1.1#egg=some-pkg',
]

The @v1.1 indicates the release tag created on GitHub and could be replaced with a branch, commit, or different type of tag.
This answer has been updated regularly as Python has evolved over the years. Scroll to the bottom for the most current answer, or read through to see how this has evolved.
Unfortunately the other answer does not work with private repositories, which is one of the most common use cases for this. I eventually did get it working with a setup.py file that looks like this (a now-deprecated method):
from setuptools import setup, find_packages

setup(
    name='MyProject',
    version='0.1.0',
    url='',
    description='',
    packages=find_packages(),
    install_requires=[
        # GitHub private repository - needs entry in `dependency_links`
        'ExampleRepo',
    ],
    dependency_links=[
        # Make sure to include the `#egg` portion so `install_requires` recognizes the package
        'git+ssh://git@github.com/example_org/ExampleRepo.git#egg=ExampleRepo-0.1',
    ],
)
Newer versions of pip make this even easier by removing the need to use dependency_links:
from setuptools import setup, find_packages

setup(
    name='MyProject',
    version='0.1.0',
    url='',
    description='',
    packages=find_packages(),
    install_requires=[
        # GitHub private repository
        'ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git#egg=ExampleRepo-0.1',
    ],
)
However, with the very latest pip you'll run into issues with the egg fragment. While the egg fragment itself is ignored, pip now does direct URL matching and will consider two URLs, one with the egg fragment and one without, to be completely different versions even if they point to the same package. As such, it's best to leave egg fragments off.
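The fragment distinction can be illustrated with the standard library's urldefrag, which splits a URL at its # fragment (the repository URL below is the answer's placeholder example):

```python
from urllib.parse import urldefrag

with_egg = 'git+ssh://git@github.com/example_org/ExampleRepo.git#egg=ExampleRepo-0.1'
without_egg = 'git+ssh://git@github.com/example_org/ExampleRepo.git'

# Stripping the fragment shows both URLs point at the same repository,
# even though pip's direct-URL matching treats them as different.
base, fragment = urldefrag(with_egg)
```

Since the two strings differ only in the fragment, any caching or resolution keyed on the full URL sees two "different" requirements for one repository.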
June 2021 - setup.py
So, the best way (as of June 2021) to add a dependency from GitHub to your setup.py that will work with public and private repositories:
from setuptools import setup, find_packages

setup(
    name='MyProject',
    version='0.1.0',
    url='',
    description='',
    packages=find_packages(),
    install_requires=[
        # GitHub private repository
        'ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git',
    ],
)
February 2022 - setup.cfg
Apparently setup.py is being deprecated (although my guess is it'll be around for a while) and setup.cfg is the new thing.
[metadata]
name = MyProject
version = 0.1.1

[options]
packages = find:
install_requires =
    ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git
June 16th 2022 - pyproject.toml
setup.cfg is already "pre"-deprecated, as setuptools now has experimental support for pyproject.toml files.
This is the future, but since the support is still experimental it should not be used in real projects for now. Even though setup.cfg is on its way out, you should use it for a declarative format; otherwise setup.py is still the way to go. This answer will be updated once setuptools has stabilized its support for the new standard.
January 4th 2023 - pyproject.toml
It is now possible to define all of your dependencies in pyproject.toml. Other options such as setup.cfg still work.
[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"

[project]
dependencies = [
    'ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git',
]

[project.optional-dependencies]
dev = ['ExtraExample @ git+ssh://git@github.com/example_org/ExtraExample.git']
Note: this answer is now outdated. Have a look at this answer from @Dick Fox for up-to-date instructions: https://stackoverflow.com/a/54794506/2272172
You can find the right way to do it here.
dependency_links=['http://github.com/user/repo/tarball/master#egg=package-1.0']
The key is not to give a link to a Git repository, but a link to a tarball. GitHub creates a tarball of the master branch for you if you append /tarball/master as shown above.
A more general answer: To get the information from the requirements.txt file I do:

from setuptools import setup, find_packages
from os import path

loc = path.abspath(path.dirname(__file__))
with open(loc + '/requirements.txt') as f:
    requirements = f.read().splitlines()

required = []
dependency_links = []

# Do not add to required lines pointing to Git repositories
EGG_MARK = '#egg='
for line in requirements:
    if line.startswith('-e git:') or line.startswith('-e git+') or \
            line.startswith('git:') or line.startswith('git+'):
        if line.startswith('-e '):
            line = line[len('-e '):]  # strip a leading "-e " if present
        if EGG_MARK in line:
            package_name = line[line.find(EGG_MARK) + len(EGG_MARK):]
            repository = line[:line.find(EGG_MARK)]
            required.append('%s @ %s' % (package_name, repository))
            dependency_links.append(line)
        else:
            print('Dependency to a git repository should have the format:')
            print('git+ssh://git@github.com/xxxxx/xxxxxx#egg=package_name')
    else:
        required.append(line)

setup(
    name='myproject',  # Required
    version='0.0.1',  # Required
    description='Description here....',  # Required
    packages=find_packages(),  # Required
    install_requires=required,
    dependency_links=dependency_links,
)
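The parsing loop above can be condensed into a standalone helper that is easier to test in isolation (split_requirement is a hypothetical name; the modern ' @ ' direct-reference separator is assumed):

```python
EGG_MARK = '#egg='

def split_requirement(line):
    """Map one requirements.txt line to (install_requires entry, dependency_link).

    Git lines carrying an #egg= fragment become PEP 508 direct references;
    anything else passes through unchanged with no dependency link."""
    if line.startswith('-e '):
        line = line[len('-e '):]  # note: str.lstrip('-e ') would strip characters, not the prefix
    if line.startswith(('git:', 'git+')) and EGG_MARK in line:
        url, _, name = line.partition(EGG_MARK)
        return '%s @ %s' % (name, url), line
    return line, None
```

The slicing via str.partition replaces the two find() calls and makes the name/URL split explicit.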
Actually, if you'd like to make your packages installable recursively (YourCurrentPackage includes your SomePrivateLib), e.g. when you want to include YourCurrentPackage in another one (like OuterPackage → YourCurrentPackage → SomePrivateLib), you'll need both:

install_requires=[
    ...,
    "SomePrivateLib @ git+ssh://github.abc.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
],
dependency_links=[
    "git+ssh://github.abc.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
]
And make sure you have a tag created with your version number.
Also, if your Git project is private and you'd like to install it inside a container, e.g. Docker or a GitLab runner, you will need authorized access to your repository. Consider using Git over HTTPS with access tokens (like on GitLab: https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html):

import os
from setuptools import setup

TOKEN_VALUE = os.getenv('EXPORTED_VAR_WITH_TOKEN')

setup(
    ...
    install_requires=[
        ...,
        f"SomePrivateLib @ git+https://gitlab-ci-token:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ],
    dependency_links=[
        f"git+https://gitlab-ci-token:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
Updated:
You have to put #egg=SomePrivateLib at the end of the dependency line if you'd like to have this dependency in a requirements.txt file. Otherwise pip install -r requirements.txt won't work, and you will get something like:

ERROR: Could not detect requirement name for
'git+https://gitlab-ci-token:gitlabtokenvalue@gitlab.server.com/abc/SomePrivateLib.git@0.1.0',
please specify one with #egg=your_package_name
If you use requirements.txt, this part is responsible for the name of the dependency's folder that will be created inside python_home_dir/src and for the name of the egg-link in site-packages/.
You can use an environment variable in your requirements.txt to keep your dependency's token value out of your repo.
Example row in a requirements.txt file for this case:

...
-e git+https://gitlab-ci-token:${EXPORTED_VAR_WITH_TOKEN}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib
...
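pip expands ${VAR}-style environment variables in requirements files. The effect can be approximated with os.path.expandvars (only an approximation: pip itself expands just the ${VAR} form, and the token value below is a placeholder):

```python
import os

os.environ['EXPORTED_VAR_WITH_TOKEN'] = 'dummy-token-value'  # placeholder, set by CI in practice

line = ('-e git+https://gitlab-ci-token:${EXPORTED_VAR_WITH_TOKEN}'
        '@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib')

# After expansion the URL carries the real token instead of the variable name
expanded = os.path.expandvars(line)
```

This keeps the secret out of version control while still producing a fully authorized clone URL at install time.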
I was successful with these three options in GitLab. I am using version 11 of GitLab.
Option 1 - no token specified. The shell will prompt for username/password.

from setuptools import setup

setup(
    install_requires=[
        "SomePrivateLib @ git+https://gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
Option 2 - user access token specified. The token is generated by going to GitLab → account (top right) → Settings → Access Tokens. Create the token with read_repository rights.
Example:

import os
from setuptools import setup

TOKEN_VALUE = os.getenv('EXPORTED_VAR_WITH_TOKEN')

setup(
    install_requires=[
        f"SomePrivateLib @ git+https://gitlab-ci-token:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
Option 3 - repository-level token specified. The token is generated by going to the repository → Settings → Repository → Deploy Tokens. From here, create a token with read_repository rights.
Example:

import os
from setuptools import setup

TOKEN_USER = os.getenv('EXPORTED_TOKEN_USER')
TOKEN_VALUE = os.getenv('EXPORTED_VAR_WITH_TOKEN')

setup(
    install_requires=[
        f"SomePrivateLib @ git+https://{TOKEN_USER}:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
In all three cases, I was able to write simply "SomePrivateLib @ git+https://gitlab.server.com/abc/SomePrivateLib.git" without the #egg marker at the end.
This solution works for me when I run python setup.py install:

setuptools.setup(
    ...,
    install_requires=[
        'numpy',
        'pandas',
        'my_private_pkg'
    ],
    dependency_links=["git+https://github.com/[username]/[my_private_pkg].git@main#egg=my_private_pkg"],
    ...
)

multiple custom plugins in py.test

My question is regarding multiple custom plugins in pytest.
I have two (or more) pytest plugins that I created, which are installed using setuptools and the pytest11 entry point; each plugin has its own setup.py. It seems like only the first installed plugin is active. I have verified this via print statements in the pytest_configure hook. If the first installed plugin is uninstalled, then only the configure hook for the second plugin seems to get called. The same behavior is observed with the addoption hook: options for the plugin installed second are unrecognized.
I'm thoroughly confused because I've used third-party plugins and they seem to work just fine. Aren't hooks for all the installed plugins supposed to be called?
Could this be a problem with the way the plugins are installed, i.e. with setuptools? (The command I use is python setup.py -v install.) Pip correctly shows all the plugin modules as installed.
Edit:
The names are different; below are the setup files:
from setuptools import setup

setup(
    name="pytest_suite",
    version="0.1",
    packages=['suite_util'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'name_of_plugin = suite_util.conftest',
        ]
    },
)
and
from setuptools import setup

setup(
    name="pytest_auto_framework",
    version="0.1",
    packages=['automation_framework'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'name_of_plugin = automation_framework.conftest',
        ]
    },
)
If your pytest entry points both have the same name (as they do in your example above), only the first one will be loaded by pytest.
Note that this is not an inherent limitation of pkg_resources entry points, but due to the way plugins are registered in pytest. There can only be one plugin with a given name - which makes sense, imho.
The official pytest documentation is ambiguous here. Your code doesn't work because, following that doc, you wrote both setup.py files with the same plugin name.
In this code:

entry_points={
    'pytest11': [
        'name_of_plugin = automation_framework.conftest',
    ]
},

the name_of_plugin part is customizable and should be unique; otherwise pytest will load only one of the plugins registered under the same name (presumably the last one).
So, the solution to your question is:
setup.py 1:
from setuptools import setup

setup(
    name="pytest_auto_framework",
    version="0.1",
    packages=['automation_framework'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'automation_framework = automation_framework.conftest',
        ]
    },
)
setup.py 2:
from setuptools import setup

setup(
    name="pytest_suite",
    version="0.1",
    packages=['suite_util'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'suite_util = suite_util.conftest',
        ]
    },
)
POC
Two entry points with the same plugin name (left-hand value)
Two entry points with different plugin names

Usage of "provides" keyword-argument in python's setup.py

I am working on a fork of a Python project (tryton) which uses setuptools for packaging. I am trying to extend the server part of the project, and would like to be able to use the existing modules with my fork.
Those modules are distributed with setuptools packaging, and are requiring the base project for installation.
I need a way to make it so that my fork is considered an acceptable requirement for those modules.
EDIT: Here is what I used in my setup.py:
from setuptools import setup

setup(
    ...
    provides=["trytond (2.8.2)"],
    ...
)
The modules I want to be able to install have those requirements :
from setuptools import setup

setup(
    ...
    install_requires=["trytond>=2.8"],
    ...
)
As it is, with my package installed, trying to install a module triggers the installation of the trytond package.
Don’t use provides; it comes from a packaging specification (a metadata PEP) that is not implemented by any tool. The requirements in the install_requires argument map to the name in the other setup.py. In other words, replace your provides with setup(name='trytond', version='2.8.2').
If you are building RPMs, it is possible to use setup.cfg as follows:

[bdist_rpm]
provides = your-package = 0.8
obsoletes = your-package
