How is "egg=" used in "pip install -e"? - python

Trying to test editable installs out and I'm not sure how to interpret the results.
I intentionally made a typo in the egg= portion but it was still able to locate the egg without any help from me:
root@6be8ee41b6c9:/# pip3 install -e git+https://gitlab.com/jame/clientapp.git
Could not detect requirement name for 'git+https://gitlab.com/jame/clientapp.git', please specify one with #egg=your_package_name
root@6be8ee41b6c9:/# pip3 install -e git+https://gitlab.com/jame/clientapp.git#egg=
Could not detect requirement name for 'git+https://gitlab.com/jame/clientapp.git#egg=', please specify one with #egg=your_package_name
root@6be8ee41b6c9:/# pip3 install -e git+https://gitlab.com/jame/clientapp.git#egg=e
Obtaining e from git+https://gitlab.com/jame/clientapp.git#egg=e
Cloning https://gitlab.com/jame/clientapp.git to /src/e
Running setup.py (path:/src/e/setup.py) egg_info for package e produced metadata for project name clientapp. Fix your #egg=e fragments.
Installing collected packages: clientapp
Found existing installation: ClientApp 0.7
Can't uninstall 'ClientApp'. No files were found to uninstall.
Running setup.py develop for clientapp
Successfully installed clientapp
root@6be8ee41b6c9:/# pip3 freeze
asn1crypto==0.24.0
-e git+https://gitlab.com/jame/clientapp.git@5158712c426ce74613215e61cab8c21c7064105c#egg=ClientApp
cryptography==2.6.1
entrypoints==0.3
keyring==17.1.1
keyrings.alt==3.1.1
pycrypto==2.6.1
PyGObject==3.30.4
pyxdg==0.25
SecretStorage==2.3.1
six==1.12.0
So if I can mess the egg name up that badly, why is it considered an error to either omit the fragment or leave it empty?

Hard to answer; maybe raise this as an issue on pip's bug tracker to get an accurate answer from the developers themselves.
My guess: the egg name matters if the project is a dependency of another project, for example when one wants to install A from PyPI and Z from git, where Z is a dependency of A.
pip install 'A' 'git+https://example.local/Z.git#egg=Z'

egg= is the name that's used when uninstalling unpackaged libraries that are installed from a VCS repository, and the name that's used by the dependency resolver when searching for dependent packages.
If you don't care about those two use cases, they can essentially be set to anything.
"it found the egg via setup.py"
It didn't find the egg via setup.py; pip found the setup.py and set the egg name for the setup.py install to whatever you specified. When you're installing from a VCS, there is no package, so there's no egg name configured; egg= configures the installation as if it had been installed from a package with that egg name.
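For completeness, a corrected invocation would use an egg fragment that matches the project name reported by setup.py (clientapp here). This is only a sketch based on the URL from the question; the @<ref> part is optional:
pip3 install -e git+https://gitlab.com/jame/clientapp.git#egg=clientapp
# optionally pin a branch, tag, or commit with @<ref> before the fragment:
pip3 install -e git+https://gitlab.com/jame/clientapp.git@master#egg=clientapp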

Related

Python Pip: pip install cannot find a version that satisfies a requirement - despite present in pyproject.toml

Python3 Pip error + Poetry Packaging
I am working on a Python library that I am trying to publish to TestPypi. So far, there have been no issues publishing my Poetry builds.
For context, as a beginner, I have been following these websites:
https://python-poetry.org/docs/
https://packaging.python.org/en/latest/tutorials/packaging-projects/
The only issue that has arisen is that the dependencies listed in my pyproject.toml are not accounted for when installing the package with pip install.
I have attempted updating setuptools and pip, but to no avail.
My goal is to have clean dependency installation without the versioning errors.
This is the main solution I have tried.
pyproject.toml
I hid my real names.
[tool.poetry]
name = "package-name"
version = "0.1.0"
description = "<desc>"
authors = ["<myname> <myemail>"]
license = "MIT"
[tool.poetry.dependencies]
python = "^3.10"
beautifulsoup4 = {version = "4.11.1", allow-prereleases = true}
recurring-ical-events = {version = "1.0.2b0", allow-prereleases = true}
requests = {version = "2.28.0", allow-prereleases = true}
rich = {version = "12.4.4", allow-prereleases = true}
[tool.poetry.dev-dependencies]
black = {version = "22.3.0", allow-prereleases = true}
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
As the installer iterates through the dependencies, it returns this error for whichever one is ordered first. (Throughout my monkey-patch-like attempts at fixing this, I was able to change the order of installation by modifying the strictness of each dependency version.)
the error pip returns
ERROR: Could not find a version that satisfies the requirement requests==2.28.1 (from homeworkpy) (from versions: 2.5.4.1)
ERROR: No matching distribution found for requests==2.28.1
I have tried changing the strictness of the versions. (I removed the ^)
Switching to Poetry as a manager was also an attempt; my previous attempts were manual.
I have verified that the builds are corresponding to the correct builds previously published.
For extra info: I am building in a GitHub Codespace running Ubuntu 18.04.1.
Would anyone have any knowledge to spare on an issue like this? I am quite new to packaging and building, and I have had some success in most parts except for dependencies.
Main Error
TL;DR: Pip tries to resolve dependencies against TestPypi, but they are in another index (Pypi). Workarounds are at the end of the answer.
The fact that I am publishing to TestPypi is the reason this has happened. I will explain why what I did made this error appear, and then I will show how you, from the future, may solve this.
Difference between Pypi and TestPypi
Pypi is the Python Package Index. It's a giant index of Python packages one may install from with pip install.
TestPypi is the Python Package Index designated for testing and publishing without touching the real Package Index. It can be useful when learning how to publish a package. The main difference is that it is a completely separate repository. Therefore, what's on TestPypi may not be exactly what's on Pypi.
My research was limited, so if I confused anyone, the main difference is that they are two different Package Indexes. One was made for testing purposes.
I published my package to TestPypi and set my pip install to install from that repository. Not Pypi, but TestPypi.
Why dependency resolution failed
When I defined my project's dependencies, I defined them based on their presence on Pypi. Most dependencies are present on Pypi, not TestPypi. This meant that when I asked for my package from TestPypi, pip only looked at TestPypi, and the install workflow fell into a pattern like this:
0.5. Set the fetching repository to TestPypi and not Pypi.
1. Pull the package from TestPypi.
2. Install and examine dependencies.
3. Find the first dependency (e.g. beautifulsoup4).
4. Pull the dependency from TestPypi.
5. Successfully install beautifulsoup4.
   - This is because beautifulsoup4 is actually present in TestPypi.
6. Move on to another dependency (e.g. rich).
7. Fail to pull from TestPypi.
   - Rich is not present in TestPypi.
8. Return "dependency not found".
Why some dependencies oddly worked
As you see in workflow step 5, the beautifulsoup4 package was found on TestPypi. (Someone had put it up there.)
(screenshot: TestPypi page listing beautifulsoup4)
However, as you see in step 7, Rich is not found on the TestPypi index. This issue occurs because I set my repository to install from TestPypi, since that is where my package was held. This caused pip to use TestPypi for every single dependency as well.
How I got around it.
I got around it by using TestPypi to verify accurate build artifact publishing, and then I jumped to Normal Pypi to test installation and dependency installation.
Workarounds
Install from TestPypi
python3 -m pip install -i https://test.pypi.org/simple/ <package name>
Install from Pypi (by default)
python3 -m pip install <package name>
Install package from TestPypi but dependencies from Pypi
The Python docs explain this very well:
If you want to allow pip to also download packages from PyPI, you can specify --extra-index-url to point to PyPI. This is useful when the package you’re testing has dependencies:
python3 -m pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ your-package
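If you install this way often, the same index configuration can be made persistent instead of repeating it on every command line. A minimal sketch using pip's environment variables (your-package is just a placeholder, as above):
# point pip at TestPypi for the package itself...
export PIP_INDEX_URL=https://test.pypi.org/simple/
# ...but let it fall back to the real Pypi for dependencies
export PIP_EXTRA_INDEX_URL=https://pypi.org/simple/
python3 -m pip install your-package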

Setup.py reinstalling already installed user-written package

I am developing two python packages, pkg_a and pkg_b. pkg_a is a requirement for pkg_b, and so the setup.py for pkg_b looks like this:
from setuptools import setup

inst_reqs = [
    'pkg_a @ git+ssh://git@bitbucket.org/vlad/pkg_a.git',
]

setup(
    name="pkg_b",
    version="0.0.0",
    packages=['pkg_b'],
    install_requires=inst_reqs,
)
Since I am developing both packages simultaneously, pkg_a is already installed in editable mode (pip install -e .).
When pip installing pkg_b, why is the existing installation of pkg_a removed? It looks like pip will systematically clone the specified repo, uninstall the existing pkg_a and reinstall it from the cloned repo:
Successfully built pkg_a
Installing collected packages: pkg_a, pkg_b
Attempting uninstall: pkg_a
Found existing installation: pkg_a 0.0.0
Uninstalling pkg_a-0.0.0:
Successfully uninstalled pkg_a-0.0.0
Running setup.py develop for pkg_b
Successfully installed pkg_a-0.0.0 pkg_b
I'm guessing this has to do with versioning but I don't know how to fix this. Any tips?
That is how VCS dependencies are handled. You should specify a fixed reference (tag, commit id):
'pkg_a @ git+ssh://git@bitbucket.org/vlad/pkg_a.git@da39a3ee5e6b4b0d3255bfef95601890afd80709'
See: https://pip.pypa.io/en/stable/reference/pip_install/#git
If you do not specify a fixed reference (a non-moving tag, or a commit ID), then pip has to clone every time, since the content of the repository might have changed since the last installation.
(To be honest, even after this change, it might still be that pip will re-clone at each installation, I do not remember the exact behavior off the top of my head.)
See similar question: pip install upgrade fail to upgrade private dependency
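If the goal is simply to keep the locally checked-out, editable pkg_a while iterating on pkg_b, another common workaround (my suggestion, not part of the answer above) is to skip dependency resolution entirely when installing pkg_b:
# install pkg_b in editable mode without touching its dependencies,
# leaving the existing editable pkg_a untouched
pip install --no-deps -e .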

python and pip: dependency is never upgraded with PEP508 URL requirements

PEP508 allows specifying a URL for a dependency, in particular a VCS. This is most useful for private packages that are not on pypi. If I have a package whose setup.py looks like:
from setuptools import setup

setup(
    name='foo',
    install_requires=['bar @ git+ssh://git@github.com/me/bar@1.2.3'],
)
Then when I say pip install foo, it will download and install bar from the github repo. But if I later want to install a new version of foo (pip install --upgrade foo), which has an updated bar dependency (e.g. tag 2.3.4), pip says that the dependency is already satisfied.
Is there a way to encode version information or something that will force pip to recognize that the dependency is NOT being met? I know I can give pip the --upgrade-strategy eager option, but that would affect ALL dependencies recursively and is too heavy-handed.
This related question PEP508: why either version requirement or URL but not both? asks about not being able to specify a version, but doesn't answer why pip doesn't get the URL when asked to upgrade.
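One possible workaround for the scenario above (my own suggestion, not something stated in the question; the URL and tag are taken from the question's example) is to force-reinstall just that one dependency at the new ref, leaving the rest of the tree alone:
# reinstall only bar at the new tag, without touching other dependencies
pip install --force-reinstall --no-deps 'git+ssh://git@github.com/me/bar@2.3.4#egg=bar'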

pip install fails to install dependencies [duplicate]

This question already has answers here:
Pip install from pypi works, but from testpypi fails (cannot find requirements)
(2 answers)
TL;DR Even though I've specified dependencies with install_requires in setup.py, the install through pip fails because some dependencies can't be found.
I've developed a package which I intend to distribute via PyPi. I've created a built distribution wheel and uploaded it to testPyPI to see if everything is working with the upload and if the package can be installed from a user perspective.
However, when I try to pip install the package inside a vanilla python 2.7 environment, the installation process fails while installing the dependencies.
My package depends on these packages (which I added to the setup.py file accordingly):
...
install_requires=['numpy','gdal','h5py','beautifulsoup4','requests','tables','progress'],
...
So when I run pip install, everything looks normal for a moment, until I receive this error:
Could not find a version that satisfies the requirement progress (from #NAME#) (from versions: )
No matching distribution found for progress (from #NAME#)
When I remove the progress dependency (I could live without it), same thing happens for pytables:
Could not find a version that satisfies the requirement tables (from #NAME#) (from versions: )
No matching distribution found for tables (from #NAME#)
If I run pip install tables and pip install progress manually beforehand, everything works as expected.
So how can I assure that if someone downloads my package, all missing dependencies are installed with it?
Related bonus question:
Can I include a wheel file in my package (maybe through MANIFEST.in) and install it as dependency if the module is not available? If so, how?
And I think I've found the answer to my question myself.
When installing a package from testPyPI, the dependencies are also installed from there. And it seems, that while there are many packages available, pytables and progress are apparently missing. This caused the installation to fail.
Naturally, manually installing with pip install gets the package from the "normal" PyPi, which of course works. This obviously added to my confusion.
Here's a look at the output from pip install when installing the package from the testPyPi:
Downloading https://test-files.pythonhosted.org/packages/4f/96/b3329750a04fcfc316f15f658daf6d81acc3ac61e3db390abe8954574c18/numpy-1.9.3.tar.gz (4.0MB)
while installing the wheel directly, it looks slightly different:
Downloading https://files.pythonhosted.org/packages/2e/91/504e434d3b95d943caab926f33dee5691768fbb622bc290a0fa6df77e1d8/numpy-1.14.2-cp27-none-win32.whl (9.8MB)
Additionally, running
pip install --index-url https://test.pypi.org/simple/ tables
produces the same error as described in my question.
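The usual fix, as in the earlier TestPyPI answer above, is to keep testPyPI as the index for the package itself while letting pip fall back to the real PyPI for dependencies (shown here as a sketch; <package name> is the package being tested):
pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ <package name>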

How can I make setuptools install a package that's not on PyPI?

I've just started working with setuptools and virtualenv. My package requires the latest python-gearman that is only available from GitHub. The python-gearman version that's on PyPI is an old one. The Github source is setuptools-compatible, i.e. has setup.py, etc. Is there a way to make setuptools download and install the new version instead of looking for it on PyPI and installing the old one?
FYI, the new python-gearman is http://github.com/mtai/python-gearman
The key is to tell easy_install where the package can be downloaded. In this particular case, it can be found at the url http://github.com/mtai/python-gearman/tarball/master. However, that link by itself won't work, because easy_install can't tell just by looking at the URL what it's going to get.
By changing it to http://github.com/mtai/python-gearman/tarball/master#egg=gearman-2.0.0beta instead, easy_install will be able to identify the package name and its version.
The final step is to add the URL to your package's dependency_links, e.g.:
setup(
    ...
    dependency_links = ['http://github.com/mtai/python-gearman/tarball/master#egg=gearman-2.0.0beta']
)
Now, when YOUR package is being installed, easy_install will discover that there is a "gearman 2.0.0beta" available for download from that URL, and happily pick it over the one on PyPI, if you specify "gearman>=2.0.0beta" in your dependencies.
(Normally, the way this sort of thing is done is to include a link on one's PyPI page to the downloadable source; in this case, if the author of the gearman package had included a link like the above, you'd be already set. Typically, people mark the development version with 'myproject-dev' and then people use a requirement of 'myproject>=somever,==dev', so that if there isn't a package of somever or higher, easy_install will try to check out or download the release.)
You'll need to specify --process-dependency-links when using pip. Note that dependency links processing has been deprecated and will be removed in a future release.
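For example, with an older pip that still supported the flag (a sketch only; your-package is a placeholder, and the flag has since been removed from pip, as noted later in this section):
pip install --process-dependency-links your-package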
You can use the pip install protocol+location[@tag][#egg=Dependency] format to install directly from source using pip.
Git
pip install git+https://github.com/username/repo.git
pip install git+https://github.com/username/repo.git@MyTag
pip install git+https://github.com/username/repo.git@MyTag#egg=ProjectName
Mercurial
pip install hg+https://hg.myproject.org/MyProject/
SVN
pip install svn+svn://svn.myproject.org/svn/MyProject
Bzr
pip install bzr+http://bzr.myproject.org/MyProject/trunk
The following protocols are supported: [+git, +svn, +hg, +bzr]
Versions
@tag lets you specify a specific version/tag to check out.
#egg=name lets you specify what the project is as a dependency for others.
The order must always be @tag#egg=name.
Private Repositories
You can also install from private repositories by changing the protocol to SSH (ssh://) and adding an appropriate user (git@):
git+ssh://git@github.com/username/my_private_repo
You can also install from private repositories with a username / password.
git+https://<username>:<password>@github.com/<user>/<repo>.git
GitHub provides the ability to create personal OAuth tokens, which can be cycled:
git+https://<oauth token>:x-oauth-basic@github.com/<user>/<repo>.git
requirements.txt
requirements.txt is used to specify project dependencies:
requirements.txt
package1
package2==1.0.2
package3>=0.0.4
git+https://github.com/username/repo.git
These are not installed automatically with the package and must be installed with the command pip install -r requirements.txt.
Including requirements files
Requirements files can include other requirements files:
requirements-docs.txt
sphinx
-r requirements-dev.txt
requirements-dev.txt
some-dev-tool
-r requirements.txt
requirements.txt
package1
package2==1.0.2
package3>=0.0.4
git+https://github.com/username/repo.git
setup.py
Requirements files can install dependencies specified in setup.py with the following command:
-e .
setup.py can also install from repositories using the same syntax as above, but using the dependency_links value as mentioned in this answer.
References:
https://pip.pypa.io/en/latest/user_guide.html#installing-packages
https://pip.pypa.io/en/latest/reference/pip_install.html
As I just had to do the same thing, I found another way to do this, since pip's --process-dependency-links is scheduled to be removed in pip 19.0 according to this comment.
pip 18.1 includes the following feature
Allow PEP 508 URL requirements to be used as dependencies.
From the description of PEP 508, the syntax for such URL dependencies looks like:
A minimal URL based lookup:
pip @ https://github.com/pypa/pip/archive/1.3.1.zip#sha1=da9234ee9982d4bbb3c72346a6de940a148ea686
So in your setup.py it would look like
setup(
    ...
    install_requires = [
        ...
        'python-gearman @ https://github.com/mtai/python-gearman/archive/master.zip',
        ...
    ]
)
Notice that the link is to an archive file, and it could also point to a specific release or branch of a repository, as described in this answer. Also, see that answer for working with other repository hosts.
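For instance, a pinned variant might look like the following sketch; the tag name 2.0.0beta is illustrative and assumes such a tag exists on the repository:
install_requires = [
    # archive of a specific tag instead of the moving master branch
    'python-gearman @ https://github.com/mtai/python-gearman/archive/refs/tags/2.0.0beta.zip',
]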
To the best of my knowledge, the easiest way to update the dependency is by using pip install -I . when installing your package from its directory.
Vanilla setuptools does not support downloading directly from a git repository but you can use one of the Download Source links from that page, like:
easy_install http://github.com/mtai/python-gearman/tarball/master
