I am making a python package that has some dependencies not hosted on PyPI or other indexes. I'd like to include those .whl dependencies in my repo/package, and install them from local files when the user pip installs my package.
I could add code that calls pip directly on those files from within my setup.py, but I'm not sure of a reliable way to do that, especially if the user originally passed special arguments to pip.
Is there a proper way to do this? Or is it not something well supported by pip/setuptools?
I'm following this guide: https://realpython.com/pypi-publish-python-package/
Can I just create my Python package but not publish it to PyPI, and instead install it with pip install and then import my_packeg?
When I try this I get a "No module named my_packeg" error.
The goal is to use this package's code inside X microservices to prevent duplicate code...
From pip's user guide:
pip supports installing from PyPI, version control, local projects, and directly from distribution files.
When looking at the items to be installed, pip checks what type of item each is, in the following order:
Project or archive URL.
Local directory (which must contain a setup.py, or pip will report an error).
Local file (a sdist or wheel format archive, following the naming conventions for those formats).
A requirement, as specified in PEP 440.
For your specific problem, you don't need to upload to PyPI. Solutions:
Build a "wheel" https://pip.pypa.io/en/stable/reference/pip_wheel/ and distribute that file, pip can install it.
Place a zip archive of the source somewhere on your intranet (or shared file system) and call pip install http://intranet.url/mypackage-1.0.4.zip
source: Can I make pip installable package without registering package in pypi?
I'm trying to deploy a new version of a project I've been working on. I zip the package into a .tar.gz file using the standard python setup.py sdist command, and then install the package using python -m pip install x.tar.gz.
All the required files are present in the archive itself (the .tar.gz file), but for some reason pip skips installing one of them: a binary file containing some lookup tables. Moreover, I think this happens only on Linux (I build the package on Windows, and there everything works fine). I would think this might have something to do with using sdist instead of bdist, but with previous versions I've had no issues of this kind, despite using the same method. I'm not sure if there is any architecture dependency at play here. Are all binary files architecture dependent?
Any ideas regarding what might be the cause of this?
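For what it's worth, one common cause of this symptom (the file is present in the .tar.gz but never ends up in site-packages) is that setuptools only installs non-Python files that are declared as package data; MANIFEST.in only controls what goes into the sdist. A minimal sketch of the relevant setup.py piece, assuming the lookup tables sit in a data/ folder inside the package (all names here are placeholders):

    # setup.py -- sketch; the package name and data path are hypothetical
    from setuptools import setup, find_packages

    setup(
        name="mypackage",
        version="1.0.0",
        packages=find_packages(),
        # Declare the binary lookup tables so they are installed alongside
        # the Python code; being present in the sdist archive alone is not
        # enough for them to be copied at install time.
        package_data={"mypackage": ["data/*.bin"]},
    )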
I would like to know if there's a way to package a simple Python project and have it perform installation over the internet, just like when you install a module with pip.
Sure there is. That's how all the third-party packages we all use are distributed.
The official PyPA guide explains how to do it here.
Basically, you need to package your project into a wheel file and upload it to the PyPI repository. To do that, you declare (mainly in setup.py) your package name, version, which sub-packages to include in the wheel, and so on.
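As a rough sketch (all metadata values below are placeholders, not taken from the question), the declaration could look like this:

    # setup.py -- minimal sketch; name, version and dependencies are placeholders
    from setuptools import setup, find_packages

    setup(
        name="myproject",
        version="0.1.0",
        packages=find_packages(),   # picks up every sub-package with an __init__.py
        install_requires=[],        # runtime dependencies, if any
    )

With that in place, pip wheel . (or python setup.py bdist_wheel) produces the .whl file, which can be uploaded to an index or handed around and installed directly with pip.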
If the packages are required only for a particular project, it is straightforward to keep them in the Git repository. You can put them in a directory named wheelhouse, a name that comes from the default output directory that earlier versions of pip wheel created.
If you put the private package foo in the wheelhouse, you can install it as follows:
pip install foo -f wheelhouse
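This also covers the original use case of shipping the .whl files inside the repository: since install_requires cannot point at a directory, a simple approach is to route the dependency through a requirements file that adds the wheelhouse as a source. A sketch, with foo and its version as stand-ins for the private wheel:

    # requirements.txt -- sketch; foo and the version number are placeholders
    --find-links ./wheelhouse
    foo==1.0.4

Users then run pip install -r requirements.txt (followed by pip install . for your own package), and pip can resolve foo from the local wheelhouse without it ever being published to an index.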
I have an installable Python package (mypackage) and it needs to use specific versions of a number of dependencies. At the moment I have some .sh scripts that just pip-install these into an internal package folder (e.g. C:\Python27\Lib\site-packages\mypackage\site-packages). When mypackage executes, it adds this internal folder to the beginning of the Python path so that it overrides any other versions of the required dependencies elsewhere on the Python path.
I know this will only work if the user doesn't import the dependencies prior to importing mypackage but I will document this.
I want to remove the .sh scripts and integrate the above into either the distutils install or the standard pip installation process. What is the best way to do this? I know about install_requires, but it does not seem to allow specifying an install location.
I eventually found the virtualenv tool, which solved the above problem much more elegantly than the solution I'd put in place:
https://docs.python.org/3/tutorial/venv.html
I'm wondering if there's a way to "install" single-file python modules using pip (i.e. just have pip download the specified version of the file and copy it to site-packages).
I have a Django project that uses several third-party modules which aren't proper distributions (django-thumbs and a couple of others), and I want to pip freeze everything so the project can easily be installed elsewhere. I've tried just doing
pip install git+https://github.com/path/to/file.git
(and tried with the -e flag too), but pip complains that there's no setup.py file.
Edit: I should have mentioned that the reason I want to do this is so I can include the required module in a requirements.txt file, to make setting up the project on a new machine or in a new virtualenv easier.
pip requires a valid setup.py to install a Python package. By definition, every Python package has a setup.py... What you are trying to install isn't a package but rather a single-file module... What's wrong with doing something like:
git clone https://github.com/path/to/file.git /path/to/python/install/lib
I don't quite understand the logic behind wanting to install something that isn't a package with a package manager...
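That said, if you maintain your own fork of the single-file module, it only takes a tiny setup.py next to the file to turn it into something pip (and therefore requirements.txt) can handle. A sketch, assuming the module file is named thumbs.py; the project name and file name here are purely illustrative:

    # setup.py -- sketch for wrapping a single-file module; all names are hypothetical
    from setuptools import setup

    setup(
        name="django-thumbs-fork",
        version="0.1.0",
        py_modules=["thumbs"],   # installs thumbs.py itself into site-packages
    )

With that committed alongside the module, the git+https URL from the question becomes installable, and the module can be listed in requirements.txt.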