poetry: no way to exclude dependencies for production? - python

I’m publishing a Python package.
It has dependencies a, b, and c. It also has pytest as a dependency, defined in group.dev as per Poetry's documentation.
However, I couldn’t find a conclusive answer for the following question:
When someone installs my library with:
pip install my_library
Is pytest (defined in any group other than the main group, in this case dev) also installed? That would be very undesirable.

You can declare the dev dependencies like this; pytest will not be installed with your package. Reference
# rp_poetry/pyproject.toml (Excerpt)
[tool.poetry.dependencies]
python = "^3.9"
requests = "^2.26.0"

[tool.poetry.dev-dependencies]
pytest = "^5.2"
black = "^21.9b0"
Poetry may change the dev-dependencies label in the future.
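Indeed, for Poetry 1.2+ the documented form is the dependency group the question already uses; a sketch of the equivalent section:

```toml
# pyproject.toml (sketch, Poetry >= 1.2 group syntax)
[tool.poetry.group.dev.dependencies]
pytest = "^5.2"
black = "^21.9b0"
```

Either way, dependencies outside the main group are not written into the published package metadata, so pip install my_library will not install pytest.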

Related

nox: Install different versions of dependency using poetry according to Python version

I am using nox in order to test my code against different Python versions. I have a dependent package that is stored on my local hard drive. I add this package via poetry with the following command:
poetry add ./path/to/package/local_package_py310.whl
My pyproject.toml file then contains the following line in the [tool.poetry.dependencies] section:
local_package = {path = "./path/to/package/local_package_py310.whl"}
This works fine for the regular Python version that I use (py 3.10). However, when using nox to test my package under Python 3.9, I need to install a different version of this package, namely ./path/to/package/local_package_py39.whl.
My noxfile.py looks like this and the tests for 3.10 do pass.
import nox

@nox.session(python=["3.10", "3.9"])
def tests(session) -> None:
    """Run the test suite."""
    session.run("poetry", "install", external=True)
    session.run("pytest")
However, the tests are failing for 3.9 because in this case my pyproject.toml is incorrect. It should read:
local_package = {path = "./path/to/package/local_package_py39.whl"}
Is it possible to modify the pyproject.toml according to the Python version that nox is using?
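One way this might be done without editing the file per run: Poetry supports "multiple constraints dependencies", where a single entry carries several alternatives gated by python markers, so both wheels can be declared at once. A sketch using the paths from the question:

```toml
# pyproject.toml (sketch; both wheels declared with python markers)
[tool.poetry.dependencies]
local_package = [
    { path = "./path/to/package/local_package_py310.whl", python = ">=3.10" },
    { path = "./path/to/package/local_package_py39.whl", python = ">=3.9,<3.10" }
]
```

poetry install then picks whichever entry matches the interpreter that nox activated for the session.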

Can Poetry add package from AWS CodeArtifact

I found multiple answers for how to publish a package to CodeArtifact using Python Poetry, and that is quite simple. But now I try to add the published package with poetry add sample-package and it does not work. Poetry error:
Could not find a matching version of package sample-package
With pip install it works. But not with Poetry.
My pyproject.toml specifies my CodeArtifact repo as the default. No problem with this:
[[tool.poetry.source]]
name = "artifact"
url = "https://test-domain-1234.d.codeartifact.region.amazonaws.com/pypi/test-repo"
default = true
Did anyone figure out how to do it?
I found my mistake. In the package that I publish, I need to specify the repository without /simple at the end. But for the project where I use the package from CodeArtifact, the repository URL needs to end with /simple.
Example: Publish package config looks like:
[[tool.poetry.source]]
name = "artifact"
url = "https://test-domain-1234.d.codeartifact.region.amazonaws.com/pypi/test-repository/"
secondary=true
And the publish command is: poetry publish --build -r artifact
For the project where I use my package sample-lib, the config should be:
[[tool.poetry.source]]
name = "artifact-repo"
url = "https://test-domain-1234.d.codeartifact.region.amazonaws.com/pypi/test-repository/simple"
secondary=true
And then Poetry command is: poetry add sample-lib --source artifact-repo
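Note that CodeArtifact also needs a short-lived authorization token before poetry add can reach the repository; a hedged sketch of that wiring (domain and owner values are placeholders, source name artifact-repo as above):

```shell
# fetch a CodeArtifact auth token via the AWS CLI and register it with Poetry
export CODEARTIFACT_TOKEN=$(aws codeartifact get-authorization-token \
  --domain test-domain --domain-owner 123456789012 \
  --query authorizationToken --output text)
# CodeArtifact uses HTTP basic auth with the literal username "aws"
poetry config http-basic.artifact-repo aws "$CODEARTIFACT_TOKEN"
poetry add sample-lib --source artifact-repo
```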

How do I specify tox + python version specific requirements

Currently I have the following:
[gh-actions]
python =
    3.7: py37
    3.8: py38
    3.9: py39
    3.10: py310
    pypy-3.7: pypy3
    pypy-3.8: pypy3

[tox]
minversion = 1.9
envlist =
    lint
    py{37,38,39,py3}-django22-{sqlite,postgres}
    py{37,38,39,310,py3}-django32-{sqlite,postgres}
    py{38,39,310,py3}-django40-{sqlite,postgres}
    py310-djangomain-{sqlite,postgres}
    docs
    examples
    linkcheck
toxworkdir = {env:TOX_WORKDIR:.tox}

[testenv]
deps =
    Pillow
    SQLAlchemy
    mongoengine
    django22: Django>=2.2,<2.3
    django32: Django>=3.2,<3.3
    django40: Django>=4.0,<4.1
    djangomain: https://github.com/django/django/archive/main.tar.gz
    py{37,38,39,310}-django{22,32,40,main}-postgres: psycopg2-binary
    py{py3}-django{22,32,40,main}-postgres: psycopg2cffi
I need to install a different psycopg2 depending on cpython vs pypy. I've tried all kinds of combinations, and nothing, it all ends in failure. I can't get any of the *-postgres envs to install.
What am I doing wrong?
The issue is that you do not run the correct environments in your GitHub Actions.
For example, in your tox.ini you create an env with the name:
py37-django22-alchemy-mongoengine-postgres
Then you define the requirements as follows:
py{37,38,39,310}-postgres: psycopg2-binary
Which means - install psycopg2-binary when the env name contains the factors py37 + postgres. This matches the above env! So far so good.
But in your GitHub Actions workflow you run:
- python-version: "3.7"
  tox-environment: django22-postgres
... which does not contain the py37 factor - so no match, no installation.
The sqlite tests succeed because sqlite comes along with Python.
I would suggest that you have a look at the Django projects in the jazzband GitHub organization. They all make heavy use of tox factors (the parts separated by dashes) and they also use GitHub Actions - mostly via https://github.com/ymyzk/tox-gh-actions, which I would recommend, too.
Basically you just run tox on GitHub Actions and let the plugin do the heavy lifting of matching Python environments from tox to GitHub.
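To make the matching concrete: with tox-gh-actions the workflow passes only the Python version and runs bare tox; the plugin expands that into the py* factors via the [gh-actions] table. A sketch of such a workflow (file name and action versions are assumptions):

```yaml
# .github/workflows/test.yml (sketch)
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.7", "3.8", "3.9", "3.10"]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install tox tox-gh-actions
      # tox-gh-actions maps python-version onto the py37/py38/... factors
      - run: tox
```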
Disclaimer:
I am one of the tox maintainers and you earn a prize for the most complex factor setup I have ever seen :-)
The issue was never tox or tox's configuration.
The issue was GitHub Actions: when you use tox-environment (or python-version + tox-environment), tox-gh-actions won't parse it correctly, causing it to never match.
This is what I removed.
This is what my tox.ini and my GitHub Actions workflow look like now.

Poetry: Managing PyPI dependencies without version number

I'm trying to use Poetry to manage my Python projects, but some PyPI dependencies, such as this one, don't have a version number.
I thus get errors like this:
$ poetry update
Updating dependencies
Resolving dependencies... (0.5s)
SolverProblemError
Because wworkflow depends on waapi-client-python (^0) which doesn't match any versions, version solving failed.
at /Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/poetry/puzzle/solver.py:241 in _solve
237│ packages = result.packages
238│ except OverrideNeeded as e:
239│ return self.solve_in_compatibility_mode(e.overrides, use_latest=use_latest)
240│ except SolveFailure as e:
→ 241│ raise SolverProblemError(e)
242│
243│ results = dict(
244│ depth_first_search(
245│ PackageNode(self._package, packages), aggregate_package_nodes
I got a similar error when using any as the version value of the dependency in the .toml.
Is it that Poetry does not support such a use case?
Thanks to @Iain Shelvington's comments, I realized I got the package name wrong. I took the name from its GitHub repo, but the PyPI package is named differently.
poetry update works after fixing the name. However, Poetry's inability to distinguish a package-name error from a version-constraint issue is a bit confusing.

Python setuptools: How can I list a private repository under install_requires?

I am creating a setup.py file for a project which depends on private GitHub repositories. The relevant parts of the file look like this:
from setuptools import setup

setup(name='my_project',
      ...,
      install_requires=[
          'public_package',
          'other_public_package',
          'private_repo_1',
          'private_repo_2',
      ],
      dependency_links=[
          'https://github.com/my_account/private_repo_1/master/tarball/',
          'https://github.com/my_account/private_repo_2/master/tarball/',
      ],
      ...,
      )
I am using setuptools instead of distutils because the latter does not support the install_requires and dependency_links arguments per this answer.
The above setup file fails to access the private repos with a 404 error - which is to be expected since GitHub returns a 404 to unauthorized requests for a private repository. However, I can't figure out how to make setuptools authenticate.
Here are some things I've tried:
Use git+ssh:// instead of https:// in dependency_links as I would if installing the repo with pip. This fails because setuptools doesn't recognize this protocol ("unknown url type: git+ssh"), though the distribute documentation says it should. Ditto git+https and git+http.
https://<username>:<password>@github.com/... - still get a 404. (This method doesn't work with curl or wget from the command line either - though curl -u <username> <repo_url> -O <output_file_name> does work.)
Upgrading setuptools (0.9.7) and virtualenv (1.10) to the latest versions. Also tried installing distribute though this overview says it was merged back into setuptools. Either way, no dice.
Currently I just have setup.py print out a warning that the private repos must be downloaded separately. This is obviously less than ideal. I feel like there's something obvious that I'm missing, but can't think what it might be. :)
Duplicate-ish question with no answers here.
I was trying to get this to work for installing with pip, but the above was not working for me. From [1] I understood that the PEP 508 standard should be used; from [2] I retrieved an example which actually does work (at least for my case).
Please note: this is with pip 20.0.2 on Python 3.7.4.
setup(
    name='<package>',
    ...
    install_requires=[
        '<normal_dependency>',
        # Private repository
        '<dependency_name> @ git+ssh://git@github.com/<user>/<repo_name>@<branch>',
        # Public repository
        '<dependency_name> @ git+https://github.com/<user>/<repo_name>@<branch>',
    ],
)
After specifying my package this way installation works fine (also with -e settings and without the need to specify --process-dependency-links).
References
[1] https://github.com/pypa/pip/issues/4187
[2] https://github.com/pypa/pip/issues/5566
Here's what worked for me:
install_requires=[
    'private_package_name==1.1',
],
dependency_links=[
    'git+ssh://git@github.com/username/private_repo.git#egg=private_package_name-1.1',
]
Note that you have to have the version number in the egg name, otherwise it will say it can't find the package.
I couldn't find any good documentation on this, but came across the solution mainly through trial & error. Further, installing from pip & setuptools have some subtle differences; but this way should work for both.
GitHub doesn't (currently, as of August 2016) offer an easy way to get the zip / tarball of private repos, so you need to tell setuptools that you're pointing to a git repo:
from setuptools import setup
import os

# get an OAuth token from https://help.github.com/articles/git-automation-with-oauth-tokens/
github_token = os.environ['GITHUB_TOKEN']
package = 'package'   # name of the private package
version = 'master'    # branch, tag, or commit hash

setup(
    # ...
    install_requires=[package],
    dependency_links=[
        'git+https://{github_token}@github.com/user/{package}.git@{version}#egg={package}-0'
        .format(github_token=github_token, package=package, version=version)
    ],
)
A couple of notes here:
For private repos, you need to authenticate with GitHub; the simplest way I found is to create an oauth token, drop that into your environment, and then include it with the URL
You need to include some version number (here it is 0) at the end of the link, even if there's no package on PyPI. This has to be an actual number, not a word.
You need to preface with git+ to tell setuptools it's to clone the repo, rather than pointing at a zip / tarball
version can be a branch, a tag, or a commit hash
You need to supply --process-dependency-links if installing from pip
I found a (hacky) workaround:
#!/usr/bin/env python
from setuptools import setup
import os
os.system('pip install git+https://github-private.corp.com/user/repo.git@master')

setup(name='original-name',
      ...,
      install_requires=['repo'])
I understand that there are ethical issues with having a system call in a setup script, but I can't think of another way to do this.
Via Tom Hemmes' answer I found this is the only thing that worked for me:
install_requires=[
    '<package> @ https://github.com/<username>/<package>/archive/<branch_name>.zip']
Using an archive URL from GitHub works for me, for public repositories. E.g.
dependency_links = [
    'https://github.com/username/reponame/archive/master.zip#egg=eggname-version',
]
With pip 20.1.1, this works for me in setup.py:
install_requires=[
    "packson3 @ https://tracinsy.ewi.tudelft.nl/pubtrac/Utilities/export/138/packson3/dist/packson3-1.0.0.tar.gz"
],
Edit: This appears to only work with public github repositories, see comments.
dependency_links=[
    'https://github.com/my_account/private_repo_1/tarball/master#egg=private_repo_1',
    'https://github.com/my_account/private_repo_2/tarball/master#egg=private_repo_2',
],
The above syntax seems to work for me with setuptools 1.0. At the moment, at least, the syntax of adding "#egg=project_name-version" to VCS dependencies is documented in the distribute documentation you linked.
This works for our scenario:
package is on github in a private repo
we want to install it into site-packages (not into ./src with -e)
being able to use pip install -r requirements.txt
being able to use pip install -e reposdir (or from github), where the dependencies are only specified in requirements.txt
https://github.com/pypa/pip/issues/3610#issuecomment-356687173
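A hedged sketch of such a requirements.txt, using pip's git+ssh syntax (account and repo names are the question's placeholders):

```text
# requirements.txt (sketch)
public_package
git+ssh://git@github.com/my_account/private_repo_1.git@master#egg=private_repo_1
git+ssh://git@github.com/my_account/private_repo_2.git@master#egg=private_repo_2
```

pip install -r requirements.txt then clones over SSH using your key, with no token stored in the file.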
