The pip tool makes installing Python modules a breeze -- for the most part. But when a module requires external C libraries that are not found in standard locations, it can cause problems. Case in point: I was trying to install the gmpy2 Python module, which needs access to the gmp, mpfr, and mpc libraries. On the system I'm using (a Linux HPC cluster running RHEL 6.9), the system-wide libraries are very old. The HPC system admin provides more up-to-date libraries, but not in /usr/lib or /usr/local/lib. My question is: can we still use pip to build the binary parts of the Python module? How do I specify the custom include and library locations? I was forced to fall back to the python setup.py approach: first running the build_ext subcommand, then invoking the install subcommand to finish the installation. This works, but it is rather messy.
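For reference, this is the fallback I mean -- a minimal sketch, assuming (hypothetically) that the admin-provided headers and libraries live under /opt/local:

python setup.py build_ext --include-dirs=/opt/local/include --library-dirs=/opt/local/lib
python setup.py install

It works, but it bypasses pip entirely, which is exactly what I would like to avoid.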
Related
VTK library cannot be installed via pip.
It can, though, be compiled and installed from source.
My Python project depends on VTK.
I want it to install VTK automatically when calling pip install . from the root directory of the project.
In this case, the setup.py file should be able to:
download VTK sources of needed version from GitHub
call cmake in order to prepare build
compile sources and create Python bindings
install the needed files into the currently used site-packages (e.g., it should not be installed into /usr/local/lib/python3/site-packages if I use virtualenv, pipenv or pyenv)
Is it possible?
If yes, how can I do this?
In principle, you can include any executable code in the setup file. However, nowhere in the setuptools documentation could I find information that would solve the problem here.
Also, the installation procedure for vtk is a bit complex, which is why Kitware uses CMake in the first place.
So, the short answer would be "no" or "don't do that".
Further, the problems you will encounter:
Users will expect a transparent install. But driving the cmake build instructions for vtk from setup.py in a cross-platform way will prevent users from setting the customization they need (path to vtk, path to the Python interpreter, platform-specific C flags).
The install process will be harder to debug. Users will come to you for VTK build problems.
Kitware themselves do not publish vtk on PyPI. This suggests that achieving this goal is too time-intensive, too fragile to maintain, or simply impossible.
If you wish to see a popular Python project that relies on vtk, there is mayavi. The installation instructions request to install vtk beforehand.
It looks like VTK now publishes its official bindings on PyPI, so I can use it in my setup.py file simply by appending it to the install_requires list.
This works well in my project; no compilation is needed anymore.
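For illustration, here is a minimal sketch of what that looks like in a setup.py (the project name is a placeholder; check PyPI for the VTK release you actually need):

from setuptools import setup

setup(
    name='my_project',        # hypothetical project name
    version='0.1',
    packages=['my_project'],
    install_requires=[
        'vtk',                # the official VTK wheels published on PyPI
    ],
)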
There is, though, a caveat in the Mayavi documentation:
The latest VTK wheels are available on all the major platforms (Windows, MacOS, and Linux), but only for 64 bit machines. Python 3.x is fully supported on all these operating systems and Python 2.7.x on MacOS and Linux. If you are out of luck, and your platform is not supported, then you will need to install VTK yourself using your particular distribution, as discussed in the General Build and Installation instructions.
I am packaging a Python application that depends on several C libraries through GObject introspection. I would like to make sure that, at the very least, the Python module for GLib is installed (that is the gi module, packaged as python-gi in Debian; I am not talking about the deprecated PyGObject module). Adding it as a regular dependency makes the install fail, since it is not on PyPI.
How should I declare this? I looked at the setuptools docs and nothing I see quite does the trick.
Thanks.
Related question:
Bundling GTK3+ with py2exe
You cannot specify non-Python dependencies using setuptools (AFAIK, that is ...).
The install_requires keyword to setuptools.setup can specify Python-style dependencies only; it targets the Python packaging infrastructure. Python-style installer programs (pip, easy_install or python setup.py install) will resolve such dependencies using strategies that find and resolve Python-style packages only. One of those strategies is using a package index like PyPI.
If you want to create a package that has Debian-style dependencies, which are resolved by Debian-style installers using Debian package repos, you have to create a Debian package. There are tools that support creating Debian packages from Python projects, for example easydeb and stdeb. However, most people recommend going the extra mile and explicitly creating a Debian package.
In the general case, packaging for and distributing Python projects via PyPI should be the way to go. It's platform and distro independent, and plays nicely with Python-specific installers like pip and tools like virtualenv or buildout. Having a dependency on PyGI would involve documenting that fact for users, as e.g. the pydbus package does in its README:
It’s based on PyGI, the Python GObject Introspection bindings, which is the recommended way to use GLib from Python. Unfortunately, PyGI is not packaged on pypi, so you need to install it from your distribution’s repository (usually called python-gi, python-gobject or pygobject3).
Your project could also be defensive when importing PyGI, presenting users with a digestible error message like "please sudo apt-get install python-gi" when the import fails.
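A minimal sketch of such a defensive import might look like this (the exact message is of course up to you):

import sys

try:
    import gi  # PyGI, provided by python-gi / python-gobject / pygobject3
except ImportError:
    sys.exit("This application requires PyGI. Please install it from your "
             "distribution's repository, e.g. 'sudo apt-get install python-gi'.")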
Can someone point out a simple example of making a Python package using distutils that depends on other Python modules being installed, e.g. numpy (a particular version) and scipy? I found very simple examples online, but could not find one that depends on a known package. I want the easiest setup in which such dependencies are installed for the user when they install the package I am defining with setup.py install. Thanks.
Distutils by itself does not install dependencies. You need to use an add-on to Distutils, like pip or plain Distribute/setuptools.
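As a sketch of how that looks with setuptools (the package name and version numbers below are placeholders), the dependencies go into install_requires:

from setuptools import setup

setup(
    name='mypackage',            # hypothetical package name
    version='0.1',
    packages=['mypackage'],
    install_requires=[
        'numpy>=1.6',            # assumed minimum version, adjust as needed
        'scipy',
    ],
)

With setuptools available, running python setup.py install (or pip install .) will then try to fetch numpy and scipy from PyPI if they are not already installed.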
I have read the documentation but I don't understand.
Why do I have to use distutils to install Python modules?
Why can't I just save the modules in the Python path?
You don't have to use distutils. You can install modules manually, just like you can compile a C++ library manually (compile every implementation file, then link the .obj files) or install an application manually (compile it, put it into its own directory, add a shortcut for launching). It just gets tedious and error-prone, as does every repetitive task done manually.
Moreover, the manual steps I listed for the examples are pretty optimistic - often, you want to do more. For example, PyQt adds the .ui-to-.py-compiler to the path so you can invoke it via command line.
So you end up with a stack of work that could be automated. This alone is a good argument.
Also, the devs would have to write installation instructions. With distutils etc., you only have to specify what your project consists of (and fancy extras only if you need them) - for example, you don't need to tell it to put everything in a new folder in site-packages, because it already knows that.
So in the end, it's easier for developers and for users.
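To make "specify what your project consists of" concrete, a minimal distutils setup.py can be as short as this (the names are placeholders):

from distutils.core import setup

setup(
    name='mymodule',            # hypothetical distribution name
    version='1.0',
    py_modules=['mymodule'],    # installs the single file mymodule.py
)

Running python setup.py install then copies mymodule.py into site-packages for whichever Python interpreter runs it.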
What Python modules? To install a Python package, if it exists on PyPI you should do:
pip install <name_of_package>
If not, you should download the .tar.gz (or whatever archive it comes in), check that it contains a setup.py, and run it like this:
python setup.py install
Or, if you want to install it in development mode (so you can change the package and see the result without installing it again):
python setup.py develop
This (the setup.py) is the usual way to distribute a Python package, and this setup.py is the one that calls distutils.
To summarize: distutils is a Python package that helps a developer create a Python package installer, which will build and install a given package just by running the command python setup.py install.
So, basically, what distutils does (listing only the important stuff):
it searches for the package's dependencies (and installs them automatically);
it copies the package modules into site-packages, or just creates a symlink if it's in develop mode;
you can create an egg of your package;
it can also run tests over your package;
you can use it to upload your package to PyPI.
If you want more detail, see http://docs.python.org/library/distutils.html
You don't have to use distutils to get your own modules working on your own machine; saving them in your python path is sufficient.
When you decide to publish your modules for other people to use, distutils provides a standard way for them to install your modules on their machines. (The "dist" in "distutils" means distribution, as in distributing your software to others.)
I recently began learning Python, and I am a bit confused about how packages are distributed and installed.
I understand that the official way of installing packages is distutils: you download the source tarball, unpack it, and run python setup.py install; the module will then automagically install itself.
I also know about setuptools, which comes with the easy_install helper script. It uses eggs for distribution and, from what I understand, is built on top of distutils and does the same thing as above, plus it takes care of any required dependencies, all fetched from PyPI.
Then there is also pip, and I'm still not sure how it differs from the others.
Finally, as I am on a Windows machine, a lot of packages also offer binary builds through a Windows installer, especially the ones that require compiling C/Fortran code, which would otherwise be a nightmare to compile manually on Windows (this assumes you have an MSVC or MinGW/Cygwin dev environment with all the necessary libraries set up... just try to build numpy or scipy yourself and you will understand!).
So can someone help me make sense of all this and explain the differences and pros/cons of each method? I'd like to know how each keeps track of packages (Windows Registry, config files, ...). In particular, how would you manage all your third-party libraries (list installed packages, disable/uninstall them, etc.)?
I use pip, and not on Windows, so I can't provide comparison with the Windows-installer option, just some information about pip:
Pip is built on top of setuptools, and requires it to be installed.
Pip is a replacement (improvement) for setuptools' easy_install. It does everything easy_install does, plus a lot more (make sure all desired distributions can be downloaded before actually installing any of them to avoid broken installs, list installed distributions and versions, uninstall, search PyPI, install from a requirements file listing multiple distributions and versions...).
Pip currently does not support installing any form of precompiled or binary distributions, so any distributions with extensions requiring compilation can only be installed if you have the appropriate compiler available. Supporting installation from Windows binary installers is on the roadmap, but it's not clear when it will happen.
Until recently, pip's Windows support was flaky and untested. Thanks to a lot of work from Dave Abrahams, pip trunk now passes all its tests on Windows (and there's a continuous integration server helping us ensure it stays that way), but a release has not yet been made including that work. So more reliable Windows support should be coming with the next release.
All the standard Python package installation mechanisms store all metadata about installed distributions in a file or files next to the actual installed package(s). Distutils uses a distribution_name-X.X-pyX.X.egg-info file, pip uses a similarly-named directory with multiple metadata files in it. Easy_install puts all the installed Python code for a distribution inside its own zipfile or directory, and places an EGG-INFO directory inside that directory with metadata in it. If you import a Python package from the interactive prompt, check the value of package.__file__; you should find the metadata for that package's distribution nearby.
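For example, at the interactive prompt (the package name and path here are purely illustrative):

>>> import somepackage                 # any installed third-party package
>>> somepackage.__file__
'/.../site-packages/somepackage/__init__.py'

The .egg-info file or directory holding that distribution's metadata sits in the same site-packages directory.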
Info about installed distributions is only stored in any kind of global registry by OS-specific packaging tools such as Windows installers, Apt, or RPM. The standard Python packaging tools don't modify or pay attention to these listings.
Pip (or, in my opinion, any Python packaging tool) is best used with virtualenv, which allows you to create isolated per-project Python mini-environments into which you can install packages without affecting your overall system. Every new virtualenv automatically comes with pip installed in it.
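A typical workflow, assuming virtualenv is already installed (the environment name is arbitrary):

virtualenv myenv               # create an isolated environment in ./myenv
source myenv/bin/activate      # on Windows: myenv\Scripts\activate
pip install somepackage        # installs into myenv only, not system-wide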
A couple other projects you may want to be aware of as well (yes, there's more!):
distribute is a fork of setuptools which has some additional bugfixes and features.
distutils2 is intended to be the "next generation" of Python packaging. It is (hopefully) adopting the best features of distutils/setuptools/distribute/pip. It is being developed independently and is not ready for use yet, but eventually should replace distutils in the Python standard library and become the de facto Python packaging solution.
Hope all that helped clarify something! Good luck.
I use Windows and Python. It is somewhat frustrating, because pip doesn't always manage to install things. Python is moving to pip, so I still use it. Pip is nice because you can uninstall packages and use:
pip freeze > requirements.txt
pip install -r requirements.txt
Another reason I like pip is virtual environments, like venv with Python 3.4. I have found venv a lot easier to use on Windows than virtualenv.
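For reference, a typical venv workflow on Windows (the environment name is arbitrary):

python -m venv myenv           # create the environment (Python 3.4+)
myenv\Scripts\activate         # activate it
pip install somepackage        # installs into the environment only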
If you cannot install a package, you have to find a binary build of it. http://www.lfd.uci.edu/~gohlke/pythonlibs/
I have found these binaries to be very useful.
Pip is moving towards something called a wheel for binary installations. You can convert an existing binary installer into a wheel and then install it:
pip install wheel
wheel convert path\to\binary.exe
pip install converted_wheel.whl
You will also have to do this for any required libraries that fail to install on their own and are needed by that package.