pip installing from unorthodox PEP 503 repo structure - python

Organizations in my company deploy their code in a way that seems to be incompatible with pip install
What I do is deploy to, e.g. https://artifactory.example.com/my-org-pypi-local/my-project/my-project-1.2.3-py3-none-any.whl
If I set PIP_EXTRA_INDEX_URL=https://artifactory.example.com/my-org-pypi-local/, I can do pip install my-project and it will work.
However, other organizations do https://artifactory.example.com/their-org-pypi-local/their-project/1.2.3/their-project-1.2.3-py3-none-any.whl. If I set PIP_EXTRA_INDEX_URL=https://artifactory.example.com/their-org-pypi-local/, pip install their-project will fail because it can't find any valid wheels under their-org-pypi-local/their-project. It refuses to recurse past the extra /1.2.3.
Forgetting why someone would set up their deployment like this, is there a way to get pip to cope with this odd repo structure?
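For what it's worth, installing straight from the wheel's direct URL does work regardless of repo layout, since it bypasses index resolution entirely (at the cost of pinning the version by hand), e.g.:
pip install https://artifactory.example.com/their-org-pypi-local/their-project/1.2.3/their-project-1.2.3-py3-none-any.whl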


I've installed a library using the command
pip install git+git://github.com/mozilla/elasticutils.git
which installs it directly from a Github repository. This works fine and I want to have that dependency in my requirements.txt. I've looked at other tickets like this but that didn't solve my problem. If I put something like
-f git+git://github.com/mozilla/elasticutils.git
elasticutils==0.7.dev
in the requirements.txt file, a pip install -r requirements.txt results in the following output:
Downloading/unpacking elasticutils==0.7.dev (from -r requirements.txt (line 20))
Could not find a version that satisfies the requirement elasticutils==0.7.dev (from -r requirements.txt (line 20)) (from versions: )
No distributions matching the version for elasticutils==0.7.dev (from -r requirements.txt (line 20))
The documentation of the requirements file does not mention links using the git+git protocol specifier, so maybe this is just not supported.
Does anybody have a solution for my problem?
Normally your requirements.txt file would look something like this:
package-one==1.9.4
package-two==3.7.1
package-three==1.0.1
...
To specify a Github repo, you do not need the package-name== convention.
The examples below update package-two using a GitHub repo. The text after @ and before # denotes the specifics of the package.
Specify commit hash (41b95ec in the context of updated requirements.txt):
package-one==1.9.4
git+https://github.com/path/to/package-two@41b95ec#egg=package-two
package-three==1.0.1
Specify branch name (master):
git+https://github.com/path/to/package-two@master#egg=package-two
Specify tag (0.1):
git+https://github.com/path/to/package-two@0.1#egg=package-two
Specify release (3.7.1):
git+https://github.com/path/to/package-two@releases/tag/v3.7.1#egg=package-two
Note that #egg=package-two is not a comment here, it is to explicitly state the package name
This blog post has some more discussion on the topic.
“Editable” packages syntax can be used in requirements.txt to import packages from a variety of VCS (git, hg, bzr, svn):
-e git://github.com/mozilla/elasticutils.git#egg=elasticutils
Also, it is possible to point to a particular commit:
-e git://github.com/mozilla/elasticutils.git@000b14389171a9f0d7d713466b32bc649b0bed8e#egg=elasticutils
requirements.txt allows the following ways of specifying a dependency on a package in a git repository as of pip 7.0: [1]
[-e] git+git://git.myproject.org/SomeProject#egg=SomeProject
[-e] git+https://git.myproject.org/SomeProject#egg=SomeProject
[-e] git+ssh://git.myproject.org/SomeProject#egg=SomeProject
-e git+git@git.myproject.org:SomeProject#egg=SomeProject (deprecated as of Jan 2020)
For Github that means you can do (notice the omitted -e):
git+git://github.com/mozilla/elasticutils.git#egg=elasticutils
Why the extra answer?
I got somewhat confused by the -e flag in the other answers so here's my clarification:
The -e or --editable flag means that the package is installed in <venv path>/src/SomeProject and thus not in the deeply buried <venv path>/lib/pythonX.X/site-packages/SomeProject it would otherwise be placed in. [2]
Documentation
[1] https://pip.readthedocs.org/en/stable/reference/pip_install/#git
[2] https://pip.readthedocs.org/en/stable/reference/pip_install/#vcs-support
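To make the point about -e concrete, a quick check might look like this (using the elasticutils repo from the question; <venv path> as in the footnoted docs):
pip install -e git+https://github.com/mozilla/elasticutils.git#egg=elasticutils
ls <venv path>/src/elasticutils                    # the editable checkout lands here
ls <venv path>/lib/pythonX.X/site-packages/        # a regular, non-editable install would land here instead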
First, install with git+git or git+https, in any way you know. Example of installing kronok's branch of the brabeion project:
pip install -e git+https://github.com/kronok/brabeion.git@12efe6aa06b85ae5ff725d3033e38f624e0a616f#egg=brabeion
Second, use pip freeze > requirements.txt to get the right thing in your requirements.txt. In this case, you will get
-e git+https://github.com/kronok/brabeion.git@12efe6aa06b85ae5ff725d3033e38f624e0a616f#egg=brabeion-master
Third, test the result:
pip uninstall brabeion
pip install -r requirements.txt
Since pip v1.5 (released Jan 1, 2014: CHANGELOG, PR), you may also specify a subdirectory of a git repo to contain your module. The syntax looks like this:
pip install -e "git+https://git.repo/some_repo.git#egg=my_subdir_pkg&subdirectory=my_subdir_pkg" # install a python package from a repo subdirectory (quote the URL so the shell does not interpret the &)
Note: As a pip module author, ideally you'd probably want to publish your module in its own top-level repo if you can. Yet this feature is helpful for some pre-existing repos that contain Python modules in subdirectories. You might be forced to install them this way if they are not published to PyPI either.
None of these answers worked for me. The only thing that worked was:
git+https://github.com/path_to_my_project.git
No "e", no double "git" and no previous installs necessary.
GitHub has archive endpoints (tarball and zip) that in my opinion are preferable to using the git protocol. The advantages are:
You don't have to specify #egg=<project name>
Git doesn't need to be installed in your environment, which is nice for containerized environments
It works much better with pip hashing and caching
The URL structure is easier to remember and more discoverable
You usually want requirements.txt entries to look like this, e.g. without the -e prefix:
https://github.com/org/package/archive/1a58aa586efd4bca37f2cfb9d9348958986aab6c.tar.gz
To install from main branch:
https://github.com/org/package/archive/main.tar.gz
There is also an equivalent .zip endpoint, but it was reported in a comment that always using the .tar.gz endpoint avoids problems with unicode package names.
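As a further illustration of the archive-endpoint pattern, a tag can be pinned the same way (the tag name here is purely illustrative):
https://github.com/org/package/archive/refs/tags/v3.7.1.tar.gz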
It seems like this is also a valid format:
gym-tictactoe @ git+https://github.com/haje01/gym-tictactoe.git@84e22fc28fe192ba0040bdd56a697f63d3d4a3d5
If you do a pip install "git+https://github.com/haje01/gym-tictactoe.git", then look at what got installed by running pip freeze, you will see the package described in this format and can copy and paste into requirements.txt.
I'm finding that it's kind of tricky to get pip3 (v9.0.1, as installed by Ubuntu 18.04's package manager) to actually install the thing I tell it to install. I'm posting this answer to save anyone's time who runs into this problem.
Putting this into a requirements.txt file failed:
git+git://github.com/myname/myrepo.git@my-branch#egg=eggname
By "failed" I mean that while it downloaded the code from Git, it ended up installing the original version of the code, as found on PyPi, instead of the code in the repo on that branch.
However, installing the commit instead of the branch name works:
git+git://github.com/myname/myrepo.git@d27d07c9e862feb939e56d0df19d5733ea7b4f4d#egg=eggname
For private repositories, I found that these two work fine for me:
pip install https://${GITHUB_TOKEN}@github.com/owner/repo/archive/main.tar.gz
Where main.tar.gz refers to the main branch of your repo and can be replaced with other branch names. For more information, and for using the more recent GitHub API, see here:
pip install https://${GITHUB_TOKEN}@api.github.com/repos/owner/repo/tarball/master
If you have git installed and available, then
pip install git+https://${GITHUB_TOKEN}@github.com/owner/repo.git@main
achieves the same, and it also allows for some more flexibility by appending @branch or @tag or @commit-hash. That approach, however, actually clones the repo into a local temp folder, which can take a noticeable amount of time.
You can use the URLs in your requirements.txt, too.
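For example, a requirements.txt for a private repo might look like this (assuming pip 10 or newer, which expands ${GITHUB_TOKEN} from the environment; owner/repo is a placeholder):
package-one==1.9.4
https://${GITHUB_TOKEN}@github.com/owner/repo/archive/main.tar.gz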

Keeping a private python package installed and up to date

A question regarding installation of a Python package from a private git repository.
I have an init.py file that is run whenever a user logs in to my service. This script is responsible for installing required packages, among them a Python package (with a setup.py) from a private repository.
I am looking for ways to:
Install the latest version of the package if not currently installed.
Update the package, if the current installed version is not the latest.
Perform no action, if the latest version of the package already is installed.
I have tried the following:
Using pip install --upgrade git+ssh://..., however this always performs a clean install of the package.
Using pip install git+ssh://..., however this will not update the package if the current version is not the latest.
I am currently looking into ways of doing this manually by:
Git-cloning the repository if it does not exist locally; then,
Calling python setup.py develop to install the package in develop mode; then finally,
Doing a git stash; git pull to discard any changes to the working directory and automatically pull the latest changes.
However, I feel this approach is prone to users messing up.
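Roughly, that manual approach would look something like this (repository URL and checkout path are placeholders, not taken from the question):
if [ ! -d ~/src/mypackage ]; then
    git clone ssh://git@example.com/org/mypackage.git ~/src/mypackage   # 1. clone if not present locally
fi
cd ~/src/mypackage
python setup.py develop        # 2. install the package in develop mode
git stash && git pull          # 3. discard local edits and pull the latest changes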
I'd love if someone could provide some insight into this issue.
Thanks in advance!

pip install from Azure DevOps Python Artifacts feed not working

When I attempt to install a package from our Azure DevOps Artifacts feed, I get the error:
Looking in indexes: https://pypi.org/simple, https://pkgs.dev.azure.com/company/company_Software/_packaging/PyPI/pypi/simple/
ERROR: Could not find a version that satisfies the requirement as-api (from versions: none)
ERROR: No matching distribution found for as-api
As using pip install -vvv potentially produces confidential information, I cannot provide the full log here. Please feel free to ask any specific questions about the log. In the meantime, I can see promising messages like:
Found index url https://pkgs.dev.azure.com/company/company_Software/_packaging/PyPI/pypi/simple/
Getting credentials from keyring for https://pkgs.dev.azure.com/company/company_Software/_packaging/PyPI/pypi/simple/
And some problematic messages?:
Status code 302 not in (200, 203, 300, 301)
Skipping link: not a file: ...
Given no hashes to check 0 links for project 'as-api': discarding no candidates
Reproduction details
virtualenv .venv
.\.venv\Scripts\activate
python -m pip install -U pip
pip install keyring artifacts-keyring
pip install as-api
This link was used to produce a pipeline to publish the package and the suggested way of installing it. My approach is now a mix of option 1 and option 2: a pip.ini file to set the extra index URL, plus the artifacts-keyring package (installing it with --pre makes no difference to the version, so it really doesn't matter). However, I have also tried both options separately; neither spawns a browser for authentication, and both give the same result.
System details:
OS: Windows 10
Python 2.7.17
pip list
Package Version
----------------- ----------
artifacts-keyring 0.2.8rc0
certifi 2019.11.28
chardet 3.0.4
configparser 4.0.2
entrypoints 0.3
idna 2.8
keyring 18.0.1
pip 19.3.1
pywin32-ctypes 0.2.0
requests 2.22.0
setuptools 42.0.2
urllib3 1.25.7
wheel 0.33.6
Folder structure:
test
|-- test.py
|-- .venv
|   |-- pip.ini
|   |-- ... other virtualenv folders and files
pip.ini:
[global]
extra-index-url = https://pkgs.dev.azure.com/company/company_Software/_packaging/PyPI/pypi/simple/
Further analysis
Using a clean laptop, the reproduction steps above actually work. Other computers in the company have the same problem, so something in our setup is conflicting with the authentication.
If we use a pipeline (see this link) to install the as-api package, it works, so I suspect this is an authentication problem, but it's not mentioned in any documentation.
Using https://username:password@... does not give any authentication error, even with a wrong username and password.
Using the correct username but with symbols in the password triggers interactive mode to enter the username and password. However, this gives the error: WARNING: 401 Error, Credentials not correct for https://pkgs.dev.azure.com/company/company_Software/_packaging/PyPI/pypi/simple/as-api/ Note that I am the owner of the Artifacts feed and the team has been added as owner in the permissions tab.
As a workaround:
Looks like you're using option 2 from the document to do the install. I happened to see one similar issue which indicates this error message could have something to do with pip.ini (Windows) or pip.conf (Linux/Mac), so I think you can try another approach to avoid something going wrong with those configurations.
You can run pip install artifacts-keyring --pre and then run
pip install packageName --index-url https://pkgs.dev.azure.com/xxx/xxx/_packaging/xxx/pypi/simple/ -vvv --no-deps
You will be prompted to sign in when running these commands.
After the login succeeds, you will get the package you need, provided it exists in your feed.
Assuming your Azure DevOps Artifacts feed is private and you have a PAT, installing a package from the feed can be done in the following two ways.
If you have access to a terminal (only preferred in a dev environment):
pip install https://<your-feed-name>:<your-PAT-key>@pkgs.dev.azure.com/<your-organization-name>/<your-project-name>/_packaging/<your-feed-name>/pypi/simple/ Your-Package-Name==x.x.x
Note: All the names (e.g. feed, project) must follow the HTTPS URL convention. A simple (and actually correct) way to get the URL is to go to Artifacts --> select your artifact feed --> Connect to feed --> pip --> there you will get the correct URL. Also, use the same feed name in both places in the URL.
Using requirements.txt (ideally used in prod or a CI/CD pipeline) to automate the process:
Mind you, it needs a bit of string/URL manipulation. Add the respective line/URL to your requirements.txt in the following manner:
The URL is mostly similar to the one used in the terminal method above.
In the URL, everything after simple has to change; the modified URL looks like:
https://<your-feed-name>:<your-PAT-key>@pkgs.dev.azure.com/<your-organization-name>/<your-project-name>/_packaging/<your-feed-name>/pypi/download/<yourpackagename>/<package version>/Your-Package-Name.whl
#assuming your package is a .whl file
So simple is changed to download; then comes your package name, with any '-', '_', or capital letters normalized to lower case.
Next comes the version number of the package you want to install and, finally, the name of the wheel (.whl) file.
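For instance, if the package were published to the feed as My_Package version 1.0.0, the requirements.txt line might look like this (feed, organization, project, PAT, and filename are all placeholders):
https://<your-feed-name>:<your-PAT-key>@pkgs.dev.azure.com/<your-organization-name>/<your-project-name>/_packaging/<your-feed-name>/pypi/download/mypackage/1.0.0/my_package-1.0.0-py3-none-any.whl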
My issue was that I had not installed artifacts-keyring. After that I could see VS Code authenticating to the feed and installing the package.
I also needed to upgrade pip (needs to be above 19.2) with the following command:
python -m pip install --upgrade pip
The fix
Do one of the following:
Remove the VSS_NUGET_EXTERNAL_FEED_ENDPOINTS environment variable (not very useful, not recommended).
Add an extra endpoint to the VSS_NUGET_EXTERNAL_FEED_ENDPOINTS environment variable. E.g.,
{"endpointCredentials": [{"endpoint":"https://pkgs.dev.azure.com/company/_packaging/NuGetFeed/nuget/v3/index.json", ...},{"endpoint":"https://pkgs.dev.azure.com/company/company_Software/_packaging/PyPI/pypi/simple/", ...}]}
We have a script which sets up these endpoints, so this turns out to be a simple fix.
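For reference, a sketch of what the full variable value might then look like; the field names (endpoint, username, password) follow the artifacts-credprovider convention, and the usernames and PATs here are placeholders:
{"endpointCredentials": [{"endpoint":"https://pkgs.dev.azure.com/company/_packaging/NuGetFeed/nuget/v3/index.json", "username":"any", "password":"<NuGet feed PAT>"}, {"endpoint":"https://pkgs.dev.azure.com/company/company_Software/_packaging/PyPI/pypi/simple/", "username":"any", "password":"<PyPI feed PAT>"}]}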
The cause
It turns out that if you have used artifacts-credprovider to set up another feed (in our case, a NuGet feed with a different endpoint), the VSS_NUGET_EXTERNAL_FEED_ENDPOINTS environment variable stores only that feed URL under the endpoint key. artifacts-keyring will still read that environment variable even though the needed endpoint isn't listed, which causes an authentication problem. The -vvv log doesn't tell you anything about authentication, and pip won't attempt to authenticate by any other method.
As an update to user 10097045's answer above:
You must add --extra-index-url= in front of the URL with the path in option 1, otherwise pip will fail to find the directory.
Otherwise the answer was super helpful; you just get a 404 without that flag.
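In other words, the option-1 command becomes something like (placeholders as in that answer):
pip install --extra-index-url=https://<your-feed-name>:<your-PAT-key>@pkgs.dev.azure.com/<your-organization-name>/<your-project-name>/_packaging/<your-feed-name>/pypi/simple/ Your-Package-Name==x.x.x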

Proper way to install a dependency via requirements.txt (how to properly install a dependency that was edited)

I am using this dependency on my django site.
https://bitbucket.org/tim_heap/django-bleach
The problem is that it breaks on Django 1.9. There is a fix, but it lives in a fork of the repository by someone else.
This is the repository
https://bitbucket.org/C14L/django-bleach.git
And this is the fix
https://bitbucket.org/C14L/django-bleach/commits/4dd2616f490d5d63bc119b24e07fdf8154f25503
Both have instructions to install it with
pip install django-bleach
But that would install the original package and I would end up with the same error. I would like to install the fixed version. I could edit it manually on my localhost, but that is not proper software engineering, so I would like to know the proper way. And that is not even the main problem; the main problem is how to install it on Heroku, since I do not want to edit the dependency via the CLI on Heroku.
Can you help me with the proper way to install this fix on Heroku and on my localhost from that repository? What do I need to put in requirements.txt so that it installs the correct fix?
You can put repository urls in requirements.txt like so:
git+https://bitbucket.org/C14L/django-bleach.git
or be more specific and specify the branch:
git+https://bitbucket.org/C14L/django-bleach.git@django_1_9_fix
Pip documentation.
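Since the fix linked in the question is a single commit, you could also pin to it explicitly, using the same syntax as the answers above (the #egg name assumes the package is called django-bleach):
git+https://bitbucket.org/C14L/django-bleach.git@4dd2616f490d5d63bc119b24e07fdf8154f25503#egg=django-bleach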

Is it possible to create a fully self-contained Python package?

The question
Ansible is a Python module, installable via pip. It relies on several dependencies, also pip modules. Is it possible to "roll up" all of those dependencies and Ansible itself into some sort of a single package that can be installed offline, without root? It's highly preferable to not need pip for the install, although it will be available for package creation.
Extra background
I'm trying to install Ansible on one of our servers. The server does not have access to the internet, there is no root access. Pip is not installed, but Python is. It is possible to get pip installed there, but might be complicated. The only way to get anything on the server is via an internal tar.gz package sharing solution.
I've tried fiddling around with rpm, saving dependencies, but the absence of root access put an end to that.
Use pip on an internet-connected machine to download all the deps to a local dir with --download and -r requirements.txt, then drop that dir on the disconnected machine with pip installed, and install using --no-index and --find-links=(archive dir).
See https://pip.pypa.io/en/latest/user_guide/#fast-local-installs
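A sketch of that workflow with a current pip, where the old --download option has since been replaced by the pip download command (the directory name is arbitrary):
# on an internet-connected machine with a matching OS/Python version:
pip download -r requirements.txt -d ./offline-pkgs
# copy ./offline-pkgs to the target machine via the tar.gz sharing solution, then:
pip install --no-index --find-links=./offline-pkgs -r requirements.txt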
