How do I specify tox + Python version specific requirements?

Currently I have the following:
[gh-actions]
python =
    3.7: py37
    3.8: py38
    3.9: py39
    3.10: py310
    pypy-3.7: pypy3
    pypy-3.8: pypy3

[tox]
minversion = 1.9
envlist =
    lint
    py{37,38,39,py3}-django22-{sqlite,postgres}
    py{37,38,39,310,py3}-django32-{sqlite,postgres}
    py{38,39,310,py3}-django40-{sqlite,postgres}
    py310-djangomain-{sqlite,postgres}
    docs
    examples
    linkcheck
toxworkdir = {env:TOX_WORKDIR:.tox}

[testenv]
deps =
    Pillow
    SQLAlchemy
    mongoengine
    django22: Django>=2.2,<2.3
    django32: Django>=3.2,<3.3
    django40: Django>=4.0,<4.1
    djangomain: https://github.com/django/django/archive/main.tar.gz
    py{37,38,39,310}-django{22,32,40,main}-postgres: psycopg2-binary
    py{py3}-django{22,32,40,main}-postgres: psycopg2cffi
I need to install a different psycopg2 depending on CPython vs PyPy. I've tried all kinds of combinations, and nothing works; it all ends in failure. I can't get any of the *-postgres envs to install their dependencies.
What am I doing wrong?

The issue is that you do not run the correct environments in your GitHub Actions.
For example, in your tox.ini you create an env with the name:
py37-django22-alchemy-mongoengine-postgres
Then you define the requirements as follows:
py{37,38,39,310}-postgres: psycopg2-binary
Which means: install psycopg2-binary when the env name contains the factors py37 and postgres. This matches the above env. So far so good.
But in your GitHub Actions workflow you run:
- python-version: "3.7"
  tox-environment: django22-postgres
... which does not contain the py37 factor - so no match, no installation.
The sqlite tests succeed because sqlite ships with Python.
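The direct fix is to include the interpreter factor in the environment you select, so the env name tox runs matches the name your conditional deps expect. A minimal sketch of the corrected matrix entry (the only change is the added py37 factor):
- python-version: "3.7"
  tox-environment: py37-django22-postgres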
I would suggest that you have a look at the Django projects in the jazzband GitHub organization. They all make heavy use of tox factors (the parts separated by dashes), and they also use GitHub Actions - mostly via https://github.com/ymyzk/tox-gh-actions, which I would recommend, too.
Basically you just run tox on GitHub Actions and let the plugin do the heavy lifting of matching Python environments from tox to GitHub.
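For illustration, a minimal workflow of that shape might look like the sketch below; the workflow and step names are placeholders, and the [gh-actions] mapping from the tox.ini above is what tox-gh-actions uses to pick which envs to run:
# .github/workflows/test.yml (hypothetical file name)
name: Tests
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.7", "3.8", "3.9", "3.10", "pypy-3.7", "pypy-3.8"]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install tox
        run: pip install tox tox-gh-actions
      # tox-gh-actions reads the [gh-actions] table and runs only the
      # envs whose interpreter factor matches this job's Python version
      - name: Run tox
        run: tox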
Disclaimer:
I am one of the tox maintainers and you earn a prize for the most complex factor setup I have ever seen :-)

The issue was never tox or tox's configuration.
The issue was GitHub Actions: when you use tox-environment (or python-version + tox-environment), tox-gh-actions won't parse it correctly, so it never matches.
This is what I removed.
This is what my tox.ini looks like and what my GitHub Actions workflow looks like [and line 47]

Related

poetry: no way to exclude dependencies for production?

I'm publishing a Python package.
It has dependencies a, b, c. It also has pytest as a dependency, defined in group.dev as per Poetry's documentation.
However, I couldn't find a conclusive answer to the following question:
When someone installs my library with:
pip install my_library
is pytest (defined in any group other than the main group, in this case dev) also installed? That would be very undesirable.
You can declare the dev dependencies like this; they will not be installed along with the package.
# rp_poetry/pyproject.toml (Excerpt)
[tool.poetry.dependencies]
python = "^3.9"
requests = "^2.26.0"

[tool.poetry.dev-dependencies]
pytest = "^5.2"
black = "^21.9b0"
Note that Poetry may change the dev-dependencies label in the future; newer Poetry versions already prefer the [tool.poetry.group.dev.dependencies] form.
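With the newer group syntax the question uses, the equivalent looks like the sketch below (version numbers are illustrative). Either way, dependency groups are not part of the published package metadata, so pip install my_library only pulls in the main dependencies and pytest is not installed:
[tool.poetry.dependencies]
python = "^3.9"
requests = "^2.26.0"

[tool.poetry.group.dev.dependencies]
pytest = "^7.0"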

nox: Install different versions of dependency using poetry according to Python version

I am using nox in order to test my code against different Python versions. I have a dependent package that is stored on my local hard drive. I add this package via poetry with the following command:
poetry add ./path/to/package/local_package_py310.whl
My pyproject.toml file then contains the following line in the [tool.poetry.dependencies] section:
local_package = {path = "./path/to/package/local_package_py310.whl"}
This works fine for the regular Python version that I use (py 3.10). However, when using nox to test my package under Python 3.9, I need to install a different version of this package, namely ./path/to/package/local_package_py39.whl.
My noxfile.py looks like this, and the tests for 3.10 do pass:
import nox

@nox.session(python=["3.10", "3.9"])
def tests(session) -> None:
    """Run the test suite."""
    session.run("poetry", "install", external=True)
    session.run("pytest")
However, the tests are failing for 3.9 because in that case my pyproject.toml is incorrect. It should read:
local_package = {path = "./path/to/package/local_package_py39.whl"}
Is it possible to modify the pyproject.toml according to the Python version that nox is using?
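One approach that avoids rewriting pyproject.toml per run is Poetry's multiple-constraints dependency syntax, which selects a different source depending on the Python version. A sketch, assuming your Poetry version supports python markers on path dependencies:
[tool.poetry.dependencies]
local_package = [
    { path = "./path/to/package/local_package_py310.whl", python = ">=3.10" },
    { path = "./path/to/package/local_package_py39.whl", python = ">=3.9,<3.10" },
]
With this in place, poetry install inside each nox session resolves the wheel matching that session's interpreter.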

Using tox and pyproject.toml

I am trying to switch a project from using setup.py to PEP518. I have written the following minimal pyproject.toml:
[build-system]
requires = ["cython", "setuptools", "wheel", "oldest-supported-numpy"]
build-backend = "setuptools.build_meta"
I need some custom installation logic relying on setup.py, so I cannot currently switch to a purely declarative setting.
Notably, my setup.py contains an import numpy which I use to add numpy.get_include() to the includes of an extension. I can build the sdist / wheel using python -m build, which works as intended (providing a build environment by installing the dependencies before calling into setup.py).
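For context, the numpy-dependent part of such a setup.py typically looks roughly like this; the extension and source names are made up for illustration:
# setup.py (sketch; module and file names are hypothetical)
import numpy
from setuptools import Extension, setup

setup(
    ext_modules=[
        Extension(
            "mypkg._native",                     # hypothetical extension name
            sources=["src/_native.c"],           # hypothetical source file
            include_dirs=[numpy.get_include()],  # the import that fails without build isolation
        )
    ],
)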
I also have a test suite which I run using tox. However, when I run tox in my project I see the following error:
GLOB sdist-make: /project/setup.py
ERROR: invocation failed (exit code 1), logfile: /project/.tox/log/GLOB-0.log
...
File "/project/setup.py", ...
ModuleNotFoundError: No module named 'numpy'
So, by default tox does not install the build dependencies before building the sdist to be used for testing later, causing everything to fail.
Therefore, as suggested in the tox example, I added
[tox]
isolated_build = True
[testenv]
commands = pytest
to the top of tox.ini, which should enable the isolated build. However, when I then execute tox, all I get is
___ summary ___
congratulations :)
so nothing is actually built / tested (as opposed to a non-isolated build with numpy installed). Is this the expected behavior? How can I actually build and run tests in an isolated environment?
OK, so as it turns out, isolated builds require an envlist like this to work properly (as opposed to an ordinary build, which defaults to using the current Python environment):
[tox]
isolated_build = True
envlist = py310
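For completeness, a fuller tox.ini along those lines might look like this minimal sketch (pytest as the test dependency is an assumption carried over from the commands line above):
[tox]
isolated_build = True
envlist = py310

[testenv]
deps = pytest
commands = pytest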

How to combine per-env Tox deps with a Pip requirements file?

I'm trying to use Tox to test specific versions of Python and Django, but also include a general Pip requirements file of additional dependencies to use for all cases.
As the Tox docs explain, you do the first like:
deps =
    django15: Django>=1.5,<1.6
    django16: Django>=1.6,<1.7
    py33-mysql: PyMySQL     ; use if both py33 and mysql are in an env name
    py26,py27: urllib3      ; use if any of py26 or py27 is in an env name
    py{26,27}-sqlite: mock  ; mocking sqlite in python 2.x
and you do the second like:
deps = -r{toxinidir}/pip-requirements.txt
       -r{toxinidir}/pip-requirements-test.txt
but how do you combine these?
If I try to define deps multiple times, Tox gives me the error "duplicate name 'deps'", but I don't see a way to combine the dictionary and list notations for deps.
I also tried:
deps =
    -r{toxinidir}/pip-requirements.txt
    -r{toxinidir}/pip-requirements-test.txt
    django15: Django>=1.5,<1.6
    django16: Django>=1.6,<1.7
and although that doesn't give me any parsing error, when I go to run a test, I get the error:
ERROR: py27-django15: could not install deps [-r/usr/local/myproject/pip-requirements.txt, -r/usr/local/myproject/pip-requirements-test.txt, Django>=1.5,<1.6]; v = InvocationError('/usr/local/myproject/.tox/py27-django15/bin/pip install -r/usr/local/myproject/pip-requirements.txt -r/usr/local/myproject/pip-requirements-test.txt Django>=1.5,<1.6 (see /usr/local/myproject/.tox/py27-django15/log/py27-django15-1.log)', 1)
presumably because it's interpreting the requirements file as a literal Python package name.
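For what it's worth, mixing requirement-file references with factor-conditional entries in a single deps list is the form current tox versions support, as in the sketch below; if pip still exits non-zero, the log file named at the end of the error message contains the actual pip failure:
[testenv]
deps =
    -r{toxinidir}/pip-requirements.txt       ; applies to every env
    -r{toxinidir}/pip-requirements-test.txt  ; applies to every env
    django15: Django>=1.5,<1.6               ; only when the env name has django15
    django16: Django>=1.6,<1.7               ; only when the env name has django16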

Buildout and Virtualenv

I am messing around with the combination of buildout and virtualenv to set up an isolated development environment in Python that allows reproducible builds.
There is a recipe for buildout that lets you integrate virtualenv into buildout:
tl.buildout_virtual_python
With this, my buildout.cfg looks like this:
[buildout]
develop = .
parts = script
        virtualpython

[virtualpython]
recipe = tl.buildout_virtual_python
headers = true
executable-name = vp
site-packages = false

[script]
recipe = zc.recipe.egg:scripts
eggs = foo
python = virtualpython
This will deploy two executables into ./bin/:
vp
script
When I execute vp, I get an interactive, isolated Python prompt, as expected (it can't load any packages from the system).
What I would expect now is that if I run
./bin/script
the isolated Python interpreter is used. But it isn't: the script is not isolated the way vp is (meaning I can import libraries from the system level). However, I can run:
./bin/vp ./bin/script
which runs the script in an isolated environment, as I wished. But there must be a way to specify this without chaining commands; otherwise buildout only solves half of the problems I hoped it would :)
Thanks for your help!
Patrick
You don't need virtualenv: buildout already provides an isolated environment, just like virtualenv.
As an example, look at the files buildout generates in the bin directory. They'll have something like:
import sys
sys.path[0:0] = [
    '/some/thing1.egg',
    # and other things
]
So the sys.path gets completely replaced with what buildout wants to have on the path: the same isolation method as virtualenv.
zc.buildout 2.0 and later no longer provides the isolated environment.
But virtualenv 1.9 and later provides complete isolation (including the option not to install setuptools).
Thus the easiest way to get a buildout in a completely controlled environment is to run the following steps (here for Python 2.7):
cd /path/to/buildout
rm ./bin/python
/path/to/virtualenv-2.7 --no-setuptools --no-site-packages --clear .
./bin/python2.7 bootstrap.py
./bin/buildout
Preconditions:
bootstrap.py has to be a recent one matching the buildout version you are using. You'll find the latest at http://downloads.buildout.org/2/
if there are any version pins in your buildout, ensure they do not pin buildout itself or recipes/extensions to versions not compatible with zc.buildout 2 or later.
I had issues running buildout via bootstrap on an Ubuntu server, so since then I have used virtualenv and buildout together: simply create a virtualenv and install buildout in it. This way only virtualenv has to be installed into the system (in theory [1]).
$ virtualenv [options_you_might_need] virtual
$ source virtual/bin/activate
$ pip install zc.buildout
$ buildout -c <buildout.cfg>
Also tell buildout to put its scripts into the virtual/bin/ directory; that way the scripts appear on $PATH.
[buildout]
bin-directory = ${buildout:directory}/virtual/bin
...
[1]: In practice you will probably need to install eggs that require compilation, such as mysql or memcache bindings, at the system level.
I've never used that recipe before, but the first thing I would try is this:
[buildout]
develop = .
parts = script
        virtualpython

[virtualpython]
recipe = tl.buildout_virtual_python
headers = true
executable-name = vp
site-packages = false

[script]
recipe = zc.recipe.egg:scripts
eggs = foo
python = virtualpython
interpreter = vp
If that doesn't work, you can usually open up the scripts (in this case vp and script) in a text editor and see the Python paths they're using. If you're on Windows, there will usually be a file called <script_name>-script.py; in this case, that would be vp-script.py and script-script.py.
