Super newb question here..
What does pip install actually do?
Assuming the PyPI package is a tarball...
Does it just download the tar.gz, unpack it and run setup.py?
Does it add the downloaded package to the site-packages folder?
I want to create a pip-installable package using pypiserver so my colleagues can download my package in a painless way, but am a little unsure exactly what to include beyond the actual .py scripts.
Any guidance would be appreciated
The tar.gz file is a source archive whereas the .whl file is a built
distribution. Newer pip versions preferentially install built
distributions, but will fall back to source archives if needed. You
should always upload a source archive and provide built archives for
the platforms your project is compatible with. In this case, our
example package is compatible with Python on any platform so only one
built distribution is needed.
See: https://packaging.python.org/tutorials/packaging-projects/
You would typically not create the source archive and wheel manually, but use setuptools and the wheel package to do so for you.
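For example, a minimal sketch, assuming a setup.py already exists in the project root:
pip install setuptools wheel
python setup.py sdist bdist_wheel
This produces both a .tar.gz source archive and a .whl built distribution in the dist/ directory.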
Nowadays, many packages are distributed as wheels. When installing such a wheel, pip will:
[...] unpack the archive in your current site packages directory and install any console scripts contained in the wheel.
See: https://wheel.readthedocs.io/en/stable/user_guide.html#installing-wheels
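For example, you can install directly from a local wheel file (the filename below is hypothetical):
pip install some_package-1.0-py3-none-any.whl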
I created a repository on Artifactory which includes a zip containing two folders.
https://artifactory.healthcareit.net:443/artifactory/pi-generic-local/paymentintegrity-airflow-lib-plugins/paymentintegrity-airflow-libs-plugins-202211031330.zip
Is there a way to download that zip and extract the directories during a pip install? Basically, the developer just runs a pip install and gets the directories where needed.
I'm not looking for something in the [scripts] section, since installing these directories would be the only thing currently needed in the Pipfile (it's a weird project). Is this possible?
You can abuse the install command in setup.py to run arbitrary code, such as code that unzips the assets and installs them in the right place. I have seen this done to package C++ binaries via PyPI at a place I worked. See Post-install script with Python setuptools for hints on how to override the install command.
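A minimal sketch of that pattern, assuming setuptools; the archive name, target directory, and package name below are all hypothetical:

from setuptools import setup
from setuptools.command.install import install

class PostInstallCommand(install):
    """Run the normal install, then unpack bundled assets."""

    def run(self):
        install.run(self)  # run the standard install first
        import os
        import zipfile
        # Hypothetical post-install step: unzip an asset archive shipped
        # with the package into a target directory.
        with zipfile.ZipFile("assets.zip") as zf:
            zf.extractall(os.path.expanduser("~/target-dirs"))

setup(
    name="my-package",  # placeholder name
    version="0.1.0",
    cmdclass={"install": PostInstallCommand},
)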
From what I can see, you have created a generic repo and are trying to fetch the packages using pip install. I would recommend creating a PyPI repository in Artifactory and fetching the packages from there. This will help in creating the metadata that maintains all the package versions in the repository.
If you have the packages locally, push them to the local PyPI repo; when you resolve them from Artifactory, pip install will download them automatically based on your requirements.
If your requirement is to zip up multiple packages, push the archive file to Artifactory, and have Artifactory unzip it and serve the dependencies during pip install, then this is not possible from the Artifactory side; you need a post-install script with Python setuptools, as mentioned above.
I would like to know if there's a way to package a simple Python project and have it perform installation over the internet, just like when you install a module with pip.
Sure there is. That is how all the third-party packages we are using got distributed.
The PyPA explains how to do it in the official packaging tutorial.
Basically you need to package your project into a wheel file and upload it to the PyPI repository. To do this you need to declare (mainly in setup.py) your package name, version, which sub-packages you want packed into the wheel, etc.
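For instance, a minimal setup.py might look like this (the name and version below are placeholders):

from setuptools import setup, find_packages

setup(
    name="my-project",         # the name users will pip install
    version="0.1.0",
    packages=find_packages(),  # pick up all sub-packages automatically
)

After that, build the distributions and upload them with twine:
python setup.py sdist bdist_wheel
twine upload dist/*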
If your packages are required only for a particular project, it is straightforward to keep them in the Git repository. You can put them in a directory named wheelhouse, a name that comes from the previous default directory created by pip wheel.
If you put the private package foo in the wheelhouse, you can install it as follows:
pip install foo -f wheelhouse
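To populate the wheelhouse in the first place, pip wheel can build the wheel files for you (foo is again the placeholder package name):
pip wheel --wheel-dir wheelhouse foo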
I know that wheels are the binary versions of modules uploaded to PyPI.
With pip install:
On Windows: I get wheels downloaded and installed.
On Ubuntu: I should get the source distribution of the package, BUT in some cases I get wheels.
On Fedora: tricky, I have to install with dnf.
I tried to add wheels to my package as well, but I am only able to upload wheels for Windows.
Why do some packages provide wheels for the Linux platform?
Is this okay? Providing binaries instead of the source?
Why can't I provide wheels?
Note: I know a bit about Fedora RPM packages. What I am interested in now is wheels on Ubuntu.
Why do some packages provide wheels for the Linux platform?
Why shouldn't they, as long as source distributions are available as well? :)
Your question is not clear. If you meant
Why do some packages provide platform-specific wheels for the Linux platform instead of platform-independent ones?
then take a look at this question and its answers. If not, please clarify your question.
On Ubuntu: I should get the source distribution of the package, BUT in some cases I get wheels.
Try using:
pip install --no-binary :all: somepackage
This should make pip download a source distribution if it exists on PyPI. I don't know why there are no source packages for PyQt5 on PyPI; probably because they are not installable with pip and need a whole toolchain for compilation.
Is this okay? Providing binaries instead of the source?
It's okay as long as you provide both binaries and the source. I suggest you do so.
Why can't I provide wheels?
Try python setup.py bdist_wheel. You need to install the wheel package (available on PyPI) to make it work. If your package supports both Python 2 and 3 and contains no C extensions, append the --universal option to make a "universal wheel".
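Alternatively, the flag can be set once in setup.cfg so you don't have to pass it on every build:
[bdist_wheel]
universal = 1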
To make a source distribution instead, replace bdist_wheel with sdist. It will create an archive in the dist directory.
sdist creates the archive of the default format for the current platform. The default format is a gzip’ed tar file (.tar.gz) on Unix, and ZIP file on Windows.
You can specify as many formats as you like using the --formats option, for example:
python setup.py sdist --formats=gztar,zip
to create a gzipped tarball and a zip file
(Quote from https://docs.python.org/3/distutils/sourcedist.html)
More info about packaging and wheels is available here: https://packaging.python.org/distributing/#packaging-your-project
I'm trying to use SVN to manage my Python project.
I installed many external libs (the path is like "C:\Python27\Lib\site-packages") on Computer A, then I uploaded the project to the SVN server.
Then I used Computer B, which just has Python (v2.7) installed, and checked out from the SVN server.
Here comes the problem: there are no external libs on Computer B. Is there any solution to this problem? I don't want to install the external libs on Computer B again!
Thanks in advance!
The normal Python way to deal with this is to use pip and requirements files. virtualenv, which lets you have multiple sets of installed packages, is also commonly used.
For example, if you have a project which depends on any version of itsdangerous and any version of Werkzeug at or above 0.9, you could have this requirements file:
Werkzeug>=0.9
itsdangerous
You would usually store that in a file named requirements.txt. You would then install the packages like this:
pip install -r requirements.txt
pip will find all of the needed packages that are not already installed and install them.
You could actually copy the package source code from site-packages to your project folder; your project folder normally has a higher priority than site-packages.
Then you just need to check the library in to your SVN.
I recently began learning Python, and I am a bit confused about how packages are distributed and installed.
I understand that the official way of installing packages is distutils: you download the source tarball, unpack it, and run python setup.py install; the module then automagically installs itself.
I also know about setuptools, which comes with the easy_install helper script. It uses eggs for distribution, and from what I understand, it is built on top of distutils and does the same thing as above, plus it takes care of any required dependencies, all fetched from PyPI.
Then there is also pip, and I'm still not sure how it differs from the others.
Finally, as I am on a Windows machine, a lot of packages also offer binary builds through a Windows installer, especially the ones that require compiling C/Fortran code, which would otherwise be a nightmare to compile manually on Windows (assuming you have an MSVC or MinGW/Cygwin dev environment with all necessary libraries set up; nonetheless, try to build numpy or scipy yourself and you will understand!).
So can someone help me make sense of all this, and explain the differences and pros/cons of each method? I'd like to know how each keeps track of packages (Windows registry, config files, ...). In particular, how would you manage all your third-party libraries (be able to list installed packages, disable/uninstall, etc.)?
I use pip, and not on Windows, so I can't provide a comparison with the Windows-installer option, just some information about pip:
Pip is built on top of setuptools, and requires it to be installed.
Pip is a replacement (improvement) for setuptools' easy_install. It does everything easy_install does, plus a lot more (make sure all desired distributions can be downloaded before actually installing any of them to avoid broken installs, list installed distributions and versions, uninstall, search PyPI, install from a requirements file listing multiple distributions and versions...).
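For example, a few of those capabilities look like this on the command line (somepackage is a placeholder):
pip freeze
pip uninstall somepackage
pip install -r requirements.txt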
Pip currently does not support installing any form of precompiled or binary distributions, so any distributions with extensions requiring compilation can only be installed if you have the appropriate compiler available. Supporting installation from Windows binary installers is on the roadmap, but it's not clear when it will happen.
Until recently, pip's Windows support was flaky and untested. Thanks to a lot of work from Dave Abrahams, pip trunk now passes all its tests on Windows (and there's a continuous integration server helping us ensure it stays that way), but a release has not yet been made including that work. So more reliable Windows support should be coming with the next release.
All the standard Python package installation mechanisms store all metadata about installed distributions in a file or files next to the actual installed package(s). Distutils uses a distribution_name-X.X-pyX.X.egg-info file, pip uses a similarly-named directory with multiple metadata files in it. Easy_install puts all the installed Python code for a distribution inside its own zipfile or directory, and places an EGG-INFO directory inside that directory with metadata in it. If you import a Python package from the interactive prompt, check the value of package.__file__; you should find the metadata for that package's distribution nearby.
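For example, from the interactive prompt (the path shown is illustrative and will differ on your system):
>>> import pip
>>> pip.__file__
'/usr/lib/python2.7/site-packages/pip/__init__.py'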
Info about installed distributions is only stored in any kind of global registry by OS-specific packaging tools such as Windows installers, Apt, or RPM. The standard Python packaging tools don't modify or pay attention to these listings.
Pip (or, in my opinion, any Python packaging tool) is best used with virtualenv, which allows you to create isolated per-project Python mini-environments into which you can install packages without affecting your overall system. Every new virtualenv automatically comes with pip installed in it.
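A typical workflow looks like this (myenv and somepackage are arbitrary names):
virtualenv myenv
source myenv/bin/activate    # on Windows: myenv\Scripts\activate
pip install somepackage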
A couple other projects you may want to be aware of as well (yes, there's more!):
distribute is a fork of setuptools which has some additional bugfixes and features.
distutils2 is intended to be the "next generation" of Python packaging. It is (hopefully) adopting the best features of distutils/setuptools/distribute/pip. It is being developed independently and is not ready for use yet, but eventually should replace distutils in the Python standard library and become the de facto Python packaging solution.
Hope all that helped clarify something! Good luck.
I use Windows and Python. It is somewhat frustrating, because pip doesn't always work to install things. Python is moving to pip, so I still use it. Pip is nice, because you can uninstall items and use
pip freeze > requirements.txt
pip install -r requirements.txt
Another reason I like pip is for virtual environments like venv with Python 3.4. I have found venv a lot easier to use on Windows than virtualenv.
If you cannot install a package, you have to find the binary for it: http://www.lfd.uci.edu/~gohlke/pythonlibs/
I have found these binaries to be very useful.
Pip is adopting something called a wheel for binary installations.
pip install wheel
wheel convert path\to\binary.exe
pip install converted_wheel.whl
You will also have to do this for any required libraries that do not install on their own and are needed by that package.