So my problem is as follows. I have an online repository where I host my package. I'd like my end user to install it using:
pip install git+ssh://...
The source code is written in C, and I would like users not to have access to it. Is it possible to package a wheel out of pre-existing binary outputs of CMake?
Follow-up question: if I push a wheel to the repo, how do I get pip install to install the wheel? What does my setup.py look like?
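One hedged sketch of an answer to the wheel question: yes, you can ship CMake's build outputs as package data and force a platform-specific wheel tag so pip treats the wheel as binary. All names below (mypkg, _core*.so) are placeholders for your own layout:

```python
# setup.py -- sketch for wheeling prebuilt CMake output.
# "mypkg" and the _core* library names are hypothetical placeholders.
from setuptools import setup
from setuptools.dist import Distribution


class BinaryDistribution(Distribution):
    """Force a platform-specific wheel tag even without ext_modules."""
    def has_ext_modules(self):
        return True


if __name__ == "__main__":
    setup(
        name="mypkg",
        version="1.0.0",
        packages=["mypkg"],
        # Copy the CMake-built shared libraries into the wheel.
        package_data={"mypkg": ["_core*.so", "_core*.pyd"]},
        distclass=BinaryDistribution,
    )
```

With this, `python -m build --wheel` produces a platform-tagged wheel containing only the committed binaries. Note that `pip install git+ssh://...` installs from whatever is in the repo, so if you commit only the built artifacts (no C sources), users receive only the binaries.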
I created a repository, on Artifactory, which includes a zip containing 2 folders.
https://artifactory.healthcareit.net:443/artifactory/pi-generic-local/paymentintegrity-airflow-lib-plugins/paymentintegrity-airflow-libs-plugins-202211031330.zip
Is there a way to download that zip and extract the directories during a pip install? Basically, the developer just runs a pip install and gets the directories where needed.
I'm not looking for something in the [scripts] section, since installing these directories is the only thing currently needed in the Pipfile (it's a weird project). Is this possible?
You can abuse the install command in setup.py to run arbitrary code, such as code that unzips the assets and installs them in the right place. I have seen this done to package C++ binaries for PyPI at a place I worked. See Post-install script with Python setuptools for hints on how to override the install command.
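A sketch of that pattern, with hypothetical names (an assets.zip bundled next to setup.py); the extraction helper is split out so the interesting part is easy to test:

```python
# Sketch: override setuptools' install command to unpack bundled assets.
# "assets.zip" and the target layout are hypothetical placeholders.
import os
import zipfile

from setuptools.command.install import install


def extract_assets(zip_path, dest_dir):
    """Unzip the bundled archive into dest_dir; return extracted names."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
        return zf.namelist()


class PostInstall(install):
    def run(self):
        install.run(self)  # perform the normal install first
        here = os.path.dirname(os.path.abspath(__file__))
        extract_assets(os.path.join(here, "assets.zip"), self.install_lib)


# Wire it up in setup.py:
#   setup(name="mypkg", version="0.1", cmdclass={"install": PostInstall})
```

Keep in mind this runs only for `setup.py install`-style installs; pip's wheel-based installs may skip custom install commands, which is one reason this approach is considered an abuse rather than a supported feature.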
As per my observations, you have created a generic repo and are trying to fetch the packages using pip install. I would recommend creating a PyPI repository in Artifactory and then fetching the packages from it. This helps create the metadata that maintains all the package versions in the repository.
If you have the packages locally, push them to the local PyPI repo; when you resolve them from Artifactory, they will be downloaded automatically by pip install based on your requirements.
If your requirement is to zip up multiple packages, push the archive file to Artifactory, and have Artifactory unzip it and provide the dependencies during pip install, then this is not possible on the Artifactory side; you need a post-install script with setuptools, as mentioned.
Question regarding installation of a Python package from a private git repository.
I have an init.py file that is run whenever a user logs in to my service. This script is responsible for installing required packages, among others a Python package (with a setup.py) from a private repository.
I am looking for ways to:
Install the latest version of the package if not currently installed.
Update the package, if the current installed version is not the latest.
Perform no action, if the latest version of the package already is installed.
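A hedged sketch of that three-way decision (helper names are mine; parse_version is a deliberately naive stand-in for a real parser such as packaging.version.Version, and fetching the latest version string from the private repo is left abstract):

```python
# Sketch of the install/upgrade/no-op decision. How you learn the latest
# version is up to you (e.g. read a VERSION file over ssh or a git tag).
from importlib.metadata import PackageNotFoundError, version


def parse_version(v):
    """Naive 'X.Y.Z' parser -- use packaging.version in real code."""
    return tuple(int(part) for part in v.split("."))


def decide(package, latest):
    """Return 'install', 'upgrade', or 'noop' for the given package."""
    try:
        installed = version(package)
    except PackageNotFoundError:
        return "install"
    if parse_version(installed) < parse_version(latest):
        return "upgrade"
    return "noop"
```

If the answer is "install" or "upgrade", a single `pip install --upgrade git+ssh://...` covers both cases, and the "noop" branch is what avoids the unconditional clean reinstall.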
I have tried the following:
Using pip install --upgrade git+ssh://..., however this always performs a clean install of the package.
Using pip install git+ssh://..., however this will not update the package if the current version is not the latest.
I am currently looking into ways of doing this manually by:
Git-cloning the repository if it does not exist locally; then,
Calling python setup.py develop to install the package in development mode; then finally,
Doing a git stash; git pull to discard any changes to the working directory and automatically pull the latest changes.
However, I feel this approach is prone to users messing up.
I'd love if someone could provide some insight into this issue.
Thanks in advance!
This question already has answers here:
How to install packages offline?
I'm an experienced programmer, but very new to Python. My company requires us to do development on a private network for some of our projects. There is a PyPI index on the private network which can be used to install packages using pip. Recently, while I needed to install a package, the PyPI index went down and stayed down for several hours. Although it did come back up eventually, the situation raises the question: how do I install packages (maybe manually, without pip) in the absence of an index? I've tried to google this, but came up empty. I'm sure there's a way, but I'm probably not searching for the right phrase. Thanks for any help.
You can manually install Python packages if you have read access to the package repositories. Most Python packages have a setup.py file in the root directory, and you can do something like
python setup.py sdist
This creates a subdirectory called dist which contains a compressed archive (.tar.gz or .zip, depending on your platform). You can pass this archive to pip to install the package:
pip3 install some-python-package.tar.gz
I would download the wheel and install that. For this you do need the wheel package installed:
pip install wheel
You can then tell pip to install the project (and it'll download the wheel if available), or the wheel file directly:
pip install project_name # download and install
pip install wheel_file.whl # directly install the wheel
Older releases of the wheel package were also runnable from the command line and could install already-downloaded wheels (recent releases have removed this install subcommand, so prefer pip install):
python -m wheel install wheel_file.whl
There are a few ways you can get around this issue. The two that I know of are:
Use a proxy to get to the standard PyPI. If your company permits it, then you can tunnel your traffic through their proxy and install packages from PyPA's standard locations.
Use a locally hosted index. All you need is a directory structured like https://pypi.org/simple/; you can then point pip at it (e.g. pip install --index-url file:///my/personal/index/path <package>) and packages will be installed from there.
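For the second option, the index layout is specified by PEP 503: one subdirectory per project, each containing an index page that links to the distribution files. A minimal, hedged generator sketch (the helper name and the example filename are mine):

```python
# Sketch: generate a PEP 503 "simple" index page for one project from a
# list of already-downloaded distribution files.
import html


def simple_index_page(project, filenames):
    """Return a PEP 503-style HTML page linking each distribution file."""
    links = "\n".join(
        '<a href="{0}">{0}</a><br/>'.format(html.escape(name))
        for name in filenames
    )
    return "<!DOCTYPE html><html><body>\n{}\n</body></html>".format(links)


page = simple_index_page("requests", ["requests-2.31.0-py3-none-any.whl"])
```

Write that page to <index>/requests/index.html, put the wheel files next to it, and pip's --index-url with a file:// URL can resolve against it. (For a flat directory of wheels with no index pages at all, pip install --no-index --find-links <dir> <package> is a simpler alternative.)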
I'm working on a project and need a little different functionality from the package sklearn. I've forked the repo and pushed my changes. I know that I can install from github via pip:
pip install git+git://github.com/wdonahoe/scikit-learn-fork.git#master
and then I can install the package with setup.py:
python setup.py install
However, I am confused about what to do after this step. Running setup.py creates some .egg-info folders and .egg-link files in .../dist-packages/, but I am unsure what to do with them. Ideally, I'd like to go into my project in .../projects/my_project and say something like
from sklearn-my-version import <stuff>
or switch it out with just
from sklearn import <stuff>
I am also a little confused because a lot of resources on this issue mention using easy_install, which I thought pip replaced.
Try again using just the following (the -e flag lets you pull updates with git, since it installs the package as a checked-out git repo; note that GitHub has since disabled the unauthenticated git:// protocol, so use https, and select the branch with @):
pip install -e git+https://github.com/wdonahoe/scikit-learn-fork.git@master#egg=scikit-learn
more on eggs:
http://mrtopf.de/blog/en/a-small-introduction-to-python-eggs/
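On the "which sklearn will I import" confusion: you can ask Python where a package would be resolved from, which confirms whether the editable fork wins over a stock install. A small hedged sketch (demonstrated with the stdlib json module, since the sklearn fork may not be present here):

```python
# Sketch: report where a package would be imported from, to verify that
# the editable fork (not the stock package) is the one Python will use.
import importlib.util


def where(package):
    """Return the file path a package resolves to, or None if absent."""
    spec = importlib.util.find_spec(package)
    return getattr(spec, "origin", None)


print(where("json"))     # a stdlib location
print(where("sklearn"))  # fork checkout path after `pip install -e` (None if absent)
```

If `where("sklearn")` points into your fork's checkout, a plain `from sklearn import <stuff>` already picks up your changes and no renamed import is needed.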
I understand there is already a question about packaging for pip, but this is more generic: by what mechanism does pip identify packages? To which central server should I add the name so that when someone types
pip install <mypackagename>
pip knows where to look for the package? What should I do to add mine to that name-resolution directory?
Pip pulls from the Python Package Index. It is very easy to submit a package, assuming you have a configured setup.py to build the package.
You'll need to register an account on PyPI, have certain metadata defined in setup.py (license, etc.), and a setup.cfg if you're using a Markdown-formatted README (as on GitHub). Then it's just a shell command to register the package:
Register:
python setup.py register -r pypi
Submit:
python setup.py sdist upload -r pypi
(Note that setup.py register and setup.py upload have since been deprecated; the modern equivalent is to build with python -m build and upload with twine upload dist/*.)
PyPI, the Python Package Index, is Python's crowdsourced package repository.
You will want to start with a tutorial on how to package your code for, then submit to, PyPI. This is one. There is a learning curve, but it is most worthwhile.
It helps to look at packages already on PyPI, then follow the links back to their source code repositories to see all of the files and configurations that were used. For example, my intspan package is hosted at Bitbucket. Many PyPI packages are hosted on either Bitbucket or GitHub, so there are many examples available from which to learn.