Pipfile - why is "name" a required field in source?

I recently updated Python (3.9.13), pip (22.1.2), and pipenv (2022.9.20) while setting up a new environment for a project. I had an existing Pipfile, using AWS CodeArtifact as the source containing both PyPI and our private projects, which had worked in the past:
[[source]]
url = "https://aws:${AUTH}#${CODEARTIFACT_URI}"
verify_ssl = true
[packages]
boto3 = "*"
...
[dev-packages]
pytest = "*"
...
[requires]
python_version = "3.9"
On running pipenv sync --dev, I received a stack trace ending with:
pipenv.vendor.plette.models.base.ValidationError: {'url': 'https://aws:${AUTH}@${CODEARTIFACT_URI}', 'verify_ssl': True}
name: required field
Adding a name to the source section fixes it, but I'm trying to understand why. I was reading the Advanced use of Pipenv documentation. It says "Should you wish to use an alternative default index other than PyPI: simply do not specify PyPI as one of the sources in your Pipfile", which is what we do. I'm curious whether this is a new validation in a newer version of Pipenv, or whether we are doing something incorrectly, since I still see Pipfiles out there with only a url and verify_ssl specified.

This was answered by @matteius over on https://github.com/pypa/pipenv/discussions/5370:
We converted to plette which is more strict, primarily because we have access to make modifications to that library, which we need to do for named package categories (slated for October 🤞). Plette already requires url, verify_ssl and name be supplied for each source entered, hence it is more strict than Pipfile was. It is good practice anyway to name your sources, because while the first source is the default install source for all unspecified packages, to specify any other index would require a name. From the sound of it we need to update the documentation (which is quite true in a number of ways).
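For reference, the fix mentioned above is just adding a name key to the existing source block; a minimal sketch reusing the question's URL variables (the name value itself is an arbitrary label):
[[source]]
url = "https://aws:${AUTH}@${CODEARTIFACT_URI}"
verify_ssl = true
# any unique label works; other sources would then be referenced by their name
name = "codeartifact"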

Related

poetry: no way to exclude dependencies for production?

I’m publishing a Python package.
It has dependencies a, b, c. It also has pytest as a dependency, defined in the group.dev as per poetry’s documentation.
However, I couldn’t find a conclusive answer for the following question:
When someone installs my library with:
pip install my_library
Is pytest (defined in any group other than the main group, in this case dev) also installed? That would be very undesirable.
You can declare the dev dependency like this; it will not install pytest for users of your package. Reference:
# rp_poetry/pyproject.toml (Excerpt)
[tool.poetry.dependencies]
python = "^3.9"
requests = "^2.26.0"
[tool.poetry.dev-dependencies]
pytest = "^5.2"
black = "^21.9b0"
Poetry may change the dev-dependencies label in the future.
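For newer Poetry releases (1.2+), which the question's group.dev wording refers to, the equivalent declaration uses a named dependency group; a minimal sketch reusing the packages from the excerpt above:
# rp_poetry/pyproject.toml (excerpt), Poetry 1.2+ group syntax
[tool.poetry.group.dev.dependencies]
pytest = "^5.2"
black = "^21.9b0"
In either form, dev/group dependencies are not written into the published package metadata, so pip install my_library will not pull in pytest.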

Can Poetry add package from AWS CodeArtifact

I found multiple answers for how to publish a package to CodeArtifact using Python Poetry, and that part is quite simple. But now I try to add the published package with poetry add sample-package and it does not work. Poetry error:
Could not find a matching version of package sample-package
With pip install it works, but not with Poetry.
My pyproject.toml specifies to use my CodeArtifact repo as the default. No problem with this:
[[tool.poetry.source]]
name = "artifact"
url = "https://test-domain-1234.d.codeartifact.region.amazonaws.com/pypi/test-repo"
default = true
Did anyone figure out how to do it?
I found my mistake. In the package that I publish, I need to specify the repository URL without /simple at the end. But for the project where I use the package from CodeArtifact, the repository URL needs to end with /simple.
Example: the publishing package's config looks like:
[[tool.poetry.source]]
name = "artifact"
url = "https://test-domain-1234.d.codeartifact.region.amazonaws.com/pypi/test-repository/"
secondary=true
And the publish command is: poetry publish --build -r artifact
For the project where I use my package sample-lib, the config should be:
[[tool.poetry.source]]
name = "artifact-repo"
url = "https://test-domain-1234.d.codeartifact.region.amazonaws.com/pypi/test-repository/simple"
secondary=true
And then the Poetry command is: poetry add sample-lib --source artifact-repo
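Note that newer Poetry releases (1.5+) deprecate secondary = true in favor of a priority key; a sketch of the equivalent consumer-side source, assuming a recent Poetry version:
[[tool.poetry.source]]
name = "artifact-repo"
url = "https://test-domain-1234.d.codeartifact.region.amazonaws.com/pypi/test-repository/simple"
# "supplemental" roughly replaces the old secondary = true behaviour
priority = "supplemental"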

Poetry: Managing PyPI dependencies without version number

I'm trying to use Poetry to manage my Python projects, but some PyPI dependencies don't have a version number, such as this one.
I thus get errors like this:
$ poetry update
Updating dependencies
Resolving dependencies... (0.5s)
SolverProblemError
Because wworkflow depends on waapi-client-python (^0) which doesn't match any versions, version solving failed.
at /Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/poetry/puzzle/solver.py:241 in _solve
237│ packages = result.packages
238│ except OverrideNeeded as e:
239│ return self.solve_in_compatibility_mode(e.overrides, use_latest=use_latest)
240│ except SolveFailure as e:
→ 241│ raise SolverProblemError(e)
242│
243│ results = dict(
244│ depth_first_search(
245│ PackageNode(self._package, packages), aggregate_package_nodes
I got a similar error when using any as the version value of the dependency in the .toml.
Is it that Poetry does not support such a use case?
Thanks to @Lain Shelvington's comments, I realized I got the package name wrong: I took the name from its GitHub repo, but the PyPI package is named differently.
poetry update works after fixing the name. However, Poetry's inability to distinguish a package-name error from a version-constraint issue is a bit confusing.
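To the original question: Poetry does support dependencies without a pinned version; a wildcard constraint resolves to any published release, as long as the name matches the actual PyPI project. A minimal sketch (some-unversioned-package is a placeholder, not a real project):
[tool.poetry.dependencies]
python = "^3.9"
# "*" accepts any available version; the key must be the PyPI project name, not the GitHub repo name
some-unversioned-package = "*"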

Pipenv not resolving dependencies correctly with two sources listed

I've got a Pipfile with two sources declared: one source is the global, public PyPI, while the other is a small local repository which hosts some private packages, but doesn't mirror PyPI itself. I've got this set up as follows:
[[source]]
url = "http://my.private.repo.example.com/pypi/simple"
verify_ssl = false
name = "private"
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"
This being in place, I use both mirrors to source packages:
[packages]
requests = "*"
some_private_package = {version="*", index="private"}
My issue is that this results in a failure to resolve some dependencies. Let's say that some_private_package depends on Flask -- which is available from public PyPI, but isn't hosted on the private repo; building some_private_package fails because Flask can't be found on the private repo, and no attempts are made to scan PyPI for it.
Is there any way to get Pipenv to search for dependencies on both available sources?
A bit of a late answer to this: apparently the private host doesn't correctly handle a wildcard version specifier, preferring either a bare package name or a valid version specifier instead.
Explicitly pinning all packages seems to be the way to go when working with some of the self-hosted PyPI servers.
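A minimal sketch of what that explicit pinning looks like for the [packages] section above (the version numbers are purely illustrative):
[packages]
requests = "==2.28.1"
some_private_package = {version = "==1.0.0", index = "private"}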

Python setuptools: How can I list a private repository under install_requires?

I am creating a setup.py file for a project which depends on private GitHub repositories. The relevant parts of the file look like this:
from setuptools import setup

setup(name='my_project',
      ...,
      install_requires=[
          'public_package',
          'other_public_package',
          'private_repo_1',
          'private_repo_2',
      ],
      dependency_links=[
          'https://github.com/my_account/private_repo_1/master/tarball/',
          'https://github.com/my_account/private_repo_2/master/tarball/',
      ],
      ...,
      )
I am using setuptools instead of distutils because the latter does not support the install_requires and dependency_links arguments per this answer.
The above setup file fails to access the private repos with a 404 error - which is to be expected since GitHub returns a 404 to unauthorized requests for a private repository. However, I can't figure out how to make setuptools authenticate.
Here are some things I've tried:
Use git+ssh:// instead of https:// in dependency_links as I would if installing the repo with pip. This fails because setuptools doesn't recognize this protocol ("unknown url type: git+ssh"), though the distribute documentation says it should. Ditto git+https and git+http.
https://<username>:<password>@github.com/... - still get a 404. (This method doesn't work with curl or wget from the command line either - though curl -u <username> <repo_url> -O <output_file_name> does work.)
Upgrading setuptools (0.9.7) and virtualenv (1.10) to the latest versions. Also tried installing distribute though this overview says it was merged back into setuptools. Either way, no dice.
Currently I just have setup.py print out a warning that the private repos must be downloaded separately. This is obviously less than ideal. I feel like there's something obvious that I'm missing, but can't think what it might be. :)
Duplicate-ish question with no answers here.
I was trying to get this to work for installing with pip, but the above was not working for me. From [1] I understood that the PEP 508 standard should be used; from [2] I retrieved an example which actually does work (at least for my case).
Please note: this is with pip 20.0.2 on Python 3.7.4.
setup(
    name='<package>',
    ...
    install_requires=[
        '<normal_dependency>',
        # Private repository
        '<dependency_name> @ git+ssh://git@github.com/<user>/<repo_name>@<branch>',
        # Public repository
        '<dependency_name> @ git+https://github.com/<user>/<repo_name>@<branch>',
    ],
)
After specifying my package this way, installation works fine (also with -e settings and without the need to specify --process-dependency-links).
References
[1] https://github.com/pypa/pip/issues/4187
[2] https://github.com/pypa/pip/issues/5566
Here's what worked for me:
install_requires=[
    'private_package_name==1.1',
],
dependency_links=[
    'git+ssh://git@github.com/username/private_repo.git#egg=private_package_name-1.1',
]
Note that you have to have the version number in the egg name, otherwise it will say it can't find the package.
I couldn't find any good documentation on this, but came across the solution mainly through trial and error. Further, installing via pip and via setuptools has some subtle differences, but this way should work for both.
GitHub doesn't (currently, as of August 2016) offer an easy way to get the zip / tarball of private repos, so you need to tell setuptools that you're pointing to a git repo:
from setuptools import setup
import os

# get a token from https://help.github.com/articles/git-automation-with-oauth-tokens/
github_token = os.environ['GITHUB_TOKEN']
package = 'package'   # name of the private repo / package
version = 'master'    # a branch, a tag, or a commit hash

setup(
    # ...
    install_requires=[package],
    dependency_links=[
        'git+https://{github_token}@github.com/user/{package}.git/@{version}#egg={package}-0'
        .format(github_token=github_token, package=package, version=version)
    ],
)
A couple of notes here:
For private repos, you need to authenticate with GitHub; the simplest way I found is to create an oauth token, drop that into your environment, and then include it with the URL
You need to include some version number (here it is 0) at the end of the link, even if there's no package on PyPI. This has to be an actual number, not a word.
You need to preface the URL with git+ to tell setuptools it's to clone the repo, rather than pointing at a zip / tarball
version can be a branch, a tag, or a commit hash
You need to supply --process-dependency-links if installing from pip
I found a (hacky) workaround:
#!/usr/bin/env python
from setuptools import setup
import os
os.system('pip install git+https://github-private.corp.com/user/repo.git@master')
setup(
    name='original-name',
    ...,
    install_requires=['repo'],
)
I understand that there are ethical issues with having a system call in a setup script, but I can't think of another way to do this.
Via Tom Hemmes' answer I found this is the only thing that worked for me:
install_requires=[
    '<package> @ https://github.com/<username>/<package>/archive/<branch_name>.zip']
Using the archive URL from GitHub works for me, for public repositories. E.g.
dependency_links = [
    'https://github.com/username/reponame/archive/master.zip#egg=eggname-version',
]
With pip 20.1.1, this works for me:
install_requires=[
    "packson3@https://tracinsy.ewi.tudelft.nl/pubtrac/Utilities/export/138/packson3/dist/packson3-1.0.0.tar.gz"
],
in setup.py.
Edit: This appears to only work with public github repositories, see comments.
dependency_links=[
    'https://github.com/my_account/private_repo_1/tarball/master#egg=private_repo_1',
    'https://github.com/my_account/private_repo_2/tarball/master#egg=private_repo_2',
],
The above syntax seems to work for me with setuptools 1.0. At the moment, at least, the syntax of adding "#egg=project_name-version" to VCS dependencies is documented in the distribute documentation link you gave.
This works for our scenario:
package is on github in a private repo
we want to install it into site-packages (not into ./src with -e)
being able to use pip install -r requirements.txt
being able to use pip install -e reposdir (or from github), where the dependencies are only specified in requirements.txt
https://github.com/pypa/pip/issues/3610#issuecomment-356687173
