I need to build a Python module from source. It is only my second build, and I'm a bit confused about how built packages interact with binaries installed through the package manager.
Do I need to uninstall the binary first?
If I don't need to, will my build overwrite the installed version, or will both be available?
If it will not overwrite it, how can I import the built version into Python?
Thank you all!
P.S.: In case it matters, I'm on Fedora 24 and the package is matplotlib, which is installed through a setup.py.
I strongly recommend using virtualenv and building your package inside it. Is it really necessary to install via setup.py? If not, consider using pip to install your package inside the virtualenv.
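For example, a minimal sketch of that workflow (the environment and source paths below are just placeholders) could look like:

# create and activate an isolated environment; nothing here touches the system Python
virtualenv ~/venvs/mpl-dev
source ~/venvs/mpl-dev/bin/activate

# build and install your checkout inside the virtualenv only
cd /path/to/matplotlib-source
pip install .

# the interpreter in this environment now imports the built copy;
# the Fedora RPM in the system site-packages is left untouched
python -c "import matplotlib; print(matplotlib.__version__, matplotlib.__file__)"

Outside the virtualenv, import matplotlib still picks up the system copy as before.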
This weekend I've been reading up on conda and the python packaging user guide because I have a simple pure python project that depends on numpy. It seemed to me that distributing/installing this project via conda was better than pip due to this dependency.
One thing I'm still not clear on: conda installs a Python package from a recipe via its build.sh, but it seems like build.sh just ends up calling python setup.py install for most Python packages.
So even if I want to distribute/install my python package with conda, I still end up depending on setuptools (or distutils) for the actual installation, correct? I was unable to find a conda utility analogous to setuptools; am I missing something?
FWIW, I posted this question on the conda issue tracker.
Thanks!
Typically you will still be using distutils (or setuptools if the library requires it) to install things, yes. It is not technically required; build.sh can be anything. If you wanted to, you could just copy the code into site-packages. Using setup.py install is recommended, though: libraries will already have a working setup.py, it installs metadata that pip can read, and it compiles any extension modules and installs any data files.
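For a pure-Python package, build.sh is usually just a thin wrapper around setup.py. A minimal sketch (assuming the usual conda recipe layout, where conda-build exports $PYTHON as the build environment's interpreter):

#!/bin/bash
# build.sh in the recipe directory; conda-build runs it inside the build environment
$PYTHON setup.py install

Dependencies and other metadata live in meta.yaml; build.sh only has to get the files installed into the build prefix.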
I've had a quick look around, but because terminology like "dependencies" and "packages" is used in different ways, it's quite tricky to pin down an answer.
I'm building a mixed-language project (Fortran, some C, and Python), and the Fortran code calls a Python script that depends on the networkx package from PyPI. Normally I just have networkx installed anyway, so rebuilding isn't a problem for me.
However, for distribution, I want the best way to:
Install pip or equivalent, if it is not installed.
Possibly install virtualenv and create a virtual environment, if appropriate.
Download and install networkx using the --user option with pip.
Is there a standard way? Or should I just use CMake dependencies with custom commands that install pip etc.?
It depends. For a "manual" install, you should definitely detect whether all the tools required to build are installed and issue an error if they aren't, then use execute_process() to run pip and whatever else you need.
On the other hand, if you are going to produce a real package for a particular Linux distribution, you just pack your binaries and declare (via the corresponding syntax of the package format, e.g. *.rpm or *.deb) that your package depends on some other packages. That way you can be sure they will be installed with (or even before) your package.
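For the "manual" route, the commands such a custom command or target might run could be roughly the following (this is only a sketch, not something CMake generates for you):

# fail early with a clear message if pip is missing
command -v pip >/dev/null 2>&1 || { echo "pip is required but was not found" >&2; exit 1; }

# install networkx into the user's site-packages; no root access needed
pip install --user networkx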
I have a strange problem with the coexistence of a Debian package and a pip package. For example, I have python-requests (deb version 0.8.2) installed. When I then install requests with pip (version 2.2.1), the system still uses the deb version instead of the newer pip version. Can anyone help resolve this? Thank you in advance.
When it comes to installing Python packages via system packages versus pip, you have to define a clear plan.
Personally, I follow these rules:
Install only a minimal set of Python packages via system packages
Here I include, for example, supervisord, as long as I am not on a system that is too old.
Do not install pip or virtualenv via system packages.
Especially with pip, over the last year there have been many situations where the system packages lagged far behind what was really needed.
Use virtualenv and prefer to install packages (with pip) there
This keeps your system-wide Python rather clean. It takes a moment to get used to, but it is rather easy to follow, especially if you use virtualenvwrapper, which helps a lot during development.
Prepare the conditions for quick installation of compiled packages
Some packages require compilation, and this often fails because of missing dependencies.
Such packages include, e.g., lxml, pyzmq, and pyyaml.
Figure out which ones you are going to use, prepare the required system packages, and make sure you are able to install them into a virtualenv.
Fine-tune the speed of installing compiled packages
There is a great package format (usable by pip) called wheel. It lets a package that normally needs compilation (like lxml) be installed on the same platform in a fraction of a second, compared to minutes of compilation. See my answer on SO on this topic; a rough sketch of the workflow follows below.
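As a sketch (the wheelhouse directory is just an example path; swap in whichever packages you actually need):

# compile once and cache the results as wheels in a local directory
pip install wheel
pip wheel --wheel-dir=$HOME/wheelhouse lxml pyzmq pyyaml

# later, in any virtualenv on the same platform, install from the cached
# wheels in a fraction of a second instead of recompiling
pip install --no-index --find-links=$HOME/wheelhouse lxml pyzmq pyyaml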
I've installed A LOT of python packages for Python 2.6. Now I would like to upgrade Python to 2.7. Is there a proper or systematic way to update all the installed packages?
In my system, all the packages are installed at
/usr/lib64/python2.6/site-packages/ and
/usr/lib/python2.6/site-packages/
One obvious way is to install Python 2.7, download all the package sources or egg files, and re-install them one by one. However, some useful packages like numpy and scipy are notoriously hard to install, especially from source; I expect I'd spend several hours hunting down packages and solving installation problems here and there.
Does anyone have suggestions on how to systematically update the installed packages?
First, you should never, ever install Python packages into the system library folder with easy_install using sudo, on any operating system.
http://jamiecurle.co.uk/blog/installing-pip-virtualenv-and-virtualenvwrapper-on-os-x/#comment-573429347
The correct approach is to make your installation procedure repeatable. There are two commonly used solutions in the Python world; both automatically download the correct versions of Python packages from http://pypi.python.org.
PIP
pip and a requirements.txt (http://www.pip-installer.org/en/latest/requirements.html) inside a virtualenv (http://pypi.python.org/pypi/virtualenv)
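For the 2.6 to 2.7 upgrade above, the idea is roughly the following (the paths and environment name are just examples):

# with the Python 2.6 pip on your PATH, record what is installed
pip freeze > requirements.txt

# create a fresh environment based on Python 2.7 and replay the list
virtualenv --python=python2.7 ~/envs/py27
source ~/envs/py27/bin/activate
pip install -r requirements.txt

Packages with C extensions (such as numpy and scipy) are rebuilt against 2.7 during the pip install step, so their build dependencies still need to be available.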
Buildout
Buildout, example from Plone CMS https://github.com/plone/Installers-UnifiedInstaller/blob/master/base_skeleton/versions.cfg
Buildout can also do configure / make / make install style installations for packages that need native libraries. For example, there is a solution for libxml2 + lxml:
http://pypi.python.org/pypi/z3c.recipe.staticlxml/
(Note: buildout does not need virtualenv as it does its own isolation from system Python)
I recently began learning Python, and I am a bit confused about how packages are distributed and installed.
I understand that the official way of installing packages is distutils: you download the source tarball, unpack it, and run python setup.py install, and the module automagically installs itself.
I also know about setuptools, which comes with the easy_install helper script. It uses eggs for distribution and, from what I understand, is built on top of distutils; it does the same thing as above, plus it takes care of any required dependencies, all fetched from PyPI.
Then there is also pip, and I'm still not sure how it differs from the others.
Finally, as I am on a Windows machine, a lot of packages also offer binary builds through a Windows installer, especially the ones that require compiling C/Fortran code, which would otherwise be a nightmare to compile manually on Windows (this assumes you have an MSVC or MinGW/Cygwin dev environment with all the necessary libraries set up; nonetheless, try to build numpy or scipy yourself and you will understand!).
So can someone help me make sense of all this and explain the differences and pros/cons of each method? I'd also like to know how each keeps track of packages (Windows Registry, config files, ...). In particular, how would you manage all your third-party libraries (list installed packages, disable/uninstall, etc.)?
I use pip, and not on Windows, so I can't provide comparison with the Windows-installer option, just some information about pip:
Pip is built on top of setuptools, and requires it to be installed.
Pip is a replacement (improvement) for setuptools' easy_install. It does everything easy_install does, plus a lot more (make sure all desired distributions can be downloaded before actually installing any of them to avoid broken installs, list installed distributions and versions, uninstall, search PyPI, install from a requirements file listing multiple distributions and versions...); a few example invocations are shown below.
Pip currently does not support installing any form of precompiled or binary distributions, so any distributions with extensions requiring compilation can only be installed if you have the appropriate compiler available. Supporting installation from Windows binary installers is on the roadmap, but it's not clear when it will happen.
Until recently, pip's Windows support was flaky and untested. Thanks to a lot of work from Dave Abrahams, pip trunk now passes all its tests on Windows (and there's a continuous integration server helping us ensure it stays that way), but a release has not yet been made including that work. So more reliable Windows support should be coming with the next release.
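To make those capabilities concrete, a few illustrative invocations (somepackage is just a placeholder name):

pip install somepackage            # resolve, download, then install
pip freeze                         # list installed distributions and versions
pip uninstall somepackage          # remove it again
pip search somepackage             # search PyPI
pip install -r requirements.txt    # install a pinned set of distributions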
All the standard Python package installation mechanisms store all metadata about installed distributions in a file or files next to the actual installed package(s). Distutils uses a distribution_name-X.X-pyX.X.egg-info file, pip uses a similarly-named directory with multiple metadata files in it. Easy_install puts all the installed Python code for a distribution inside its own zipfile or directory, and places an EGG-INFO directory inside that directory with metadata in it. If you import a Python package from the interactive prompt, check the value of package.__file__; you should find the metadata for that package's distribution nearby.
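For example (requests is used purely as an illustration; any installed distribution works the same way):

python -c "import requests; print(requests.__file__)"
# the .egg-info metadata for that distribution lives alongside the package,
# in the same site-packages directory the printed path points into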
Info about installed distributions is only stored in any kind of global registry by OS-specific packaging tools such as Windows installers, Apt, or RPM. The standard Python packaging tools don't modify or pay attention to these listings.
Pip (or, in my opinion, any Python packaging tool) is best used with virtualenv, which allows you to create isolated per-project Python mini-environments into which you can install packages without affecting your overall system. Every new virtualenv automatically comes with pip installed in it.
A couple other projects you may want to be aware of as well (yes, there's more!):
distribute is a fork of setuptools which has some additional bugfixes and features.
distutils2 is intended to be the "next generation" of Python packaging. It is (hopefully) adopting the best features of distutils/setuptools/distribute/pip. It is being developed independently and is not ready for use yet, but eventually should replace distutils in the Python standard library and become the de facto Python packaging solution.
Hope all that helped clarify something! Good luck.
I use Windows and Python. It is somewhat frustrating, because pip doesn't always manage to install things. Python is moving toward pip, so I still use it. Pip is nice because you can uninstall packages and use:
pip freeze > requirements.txt
pip install -r requirements.txt
Another reason I like pip is for virtual environments, like venv with Python 3.4. I have found venv a lot easier to use on Windows than virtualenv.
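A minimal venv workflow on Windows (the environment name is arbitrary, and this assumes the py launcher is installed) looks something like this:

py -3.4 -m venv myenv              # create the environment (includes pip)
myenv\Scripts\activate             # activate it (from cmd.exe)
pip install -r requirements.txt    # packages now go into myenv only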
If you cannot install a package, you have to find a binary for it, e.g. at http://www.lfd.uci.edu/~gohlke/pythonlibs/.
I have found these binaries to be very useful.
Pip is moving toward a binary format called a wheel; you can convert a binary installer into a wheel and install it with pip:
pip install wheel
wheel convert path\to\binary.exe
pip install converted_wheel.whl
You will also have to do this for any dependencies of that package that fail to install.