Macports does not recognize pip-installed packages - python

Until today, I have been using the macports version of python27 and installing python packages through macports. Today, I needed some packages which were not available through macports; I learned about pip and found them there. After installing these packages through pip, however, I realized that neither pip nor macports could see what had been installed by the other. So, for consistency, I decided to uninstall all macports packages, install python27 and py27-pip through macports and then proceed to install all of my python packages through pip.
This worked fine, but since macports does not know about my pip-installed python packages, I ran into trouble when installing something else which depends on python (e.g., inkscape): macports tried to install its own version of, e.g. py27-numpy (already installed by pip) and then failed installation because it "already exists and does not belong to a registered port."
Is there a consistent way to use pip and to get macports to recognize that the python packages it might need for something else are already installed?

The solution is: don't use MacPorts for installing Python packages.
MacPorts is a general package manager and it registers installed packages in its own database.
pip is a package manager for Python, so if you want to install a Python package, use the appropriate package management tool. pip doesn't keep its own database of what is installed - it just checks Python's path to see whether the package is there (and that's what you want).
Sooner or later you'll use virtualenv anyway, and you'll need pip to install packages in there too, so it's better to use it everywhere.
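A quick, hedged sanity check of what a given interpreter actually sees (the MacPorts path below is an assumption for a typical setup; adjust it to yours):
/opt/local/bin/python2.7 -c "import numpy; print(numpy.__file__)"   # where does this interpreter find numpy?
/opt/local/bin/python2.7 -m site                                     # show the full module search path
If the printed location is inside that interpreter's site-packages, pip and the interpreter already agree; MacPorts' registry is simply never consulted.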

Related

pip uninstall fails with "owned by OS" - even under sudo

I'm working on a DevOps project for a client who's using Python. Though I never used it professionally, I know a few things, such as using virtualenv and pip - though not in great detail.
When I looked at the staging box, which I am trying to prepare for running a functional test suite, I saw chaos. Tons of packages installed globally, and those installed inside a virtualenv not matching the requirements.txt of the project. OK, thought I, there's a lot of cleaning up. Starting with global packages.
However, I ran into a problem at once:
➜ ~ pip uninstall PyYAML
Not uninstalling PyYAML at /usr/lib/python2.7/dist-packages, owned by OS
OK, someone must've done a 'sudo pip install PyYAML'. I think I know how to fix it:
➜ ~ sudo pip uninstall PyYAML
Not uninstalling PyYAML at /usr/lib/python2.7/dist-packages, owned by OS
Uh, apparently I don't.
A search revealed some similar conflicts caused by users installing packages bypassing pip, but I'm not convinced - why would pip even know about them, if that was the case? Unless the "other" way is placing them in the same location pip would use - but if that's the case, why would it fail to uninstall under sudo?
pip refuses to uninstall these packages because the Debian developers patched it to behave this way. This allows you to use both pip and apt simultaneously. The "original" pip program doesn't have this behavior.
Update: my answer is relevant only to old versions of pip. Recent versions of pip are configured to modify only files that reside in its own "home directory" - that is, /usr/local/lib/python3.* on Debian. With recent tools you will get errors like these when you try to remove a package installed by apt:
For pip 9.0.1-2.3~ubuntu1 (installed from Ubuntu repository):
Not uninstalling pyyaml at /usr/lib/python3/dist-packages, outside environment /usr
For pip 10.0.1 (original, installed from pypi.org):
Cannot uninstall 'PyYAML'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
The point is not that pip cannot uninstall the package because you don't have enough permissions; it refuses because the package was not installed through pip, so it doesn't want to uninstall it.
dist-packages is where packages installed by the OS package manager reside; as they are handled by another package manager (e.g. apt on Ubuntu/Debian, pacman on Arch, rpm/yum on CentOS, ... ) pip won't touch them (but still has to know about them as they are installed packages, so they can be used to satisfy dependencies of pip-installed packages).
You should probably also avoid touching them except through the correct package manager, and even then they may have been installed automatically to satisfy the dependencies of some program, so you may not be able to remove them without breaking it. This can usually be checked quite easily, although the exact way depends on the Linux distribution you are using.
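On Debian/Ubuntu, a hedged sketch of that check (python-yaml / python3-yaml are the usual packages providing PyYAML there, and the path is assumed; verify on your own system):
dpkg -S /usr/lib/python3/dist-packages/yaml/__init__.py   # which apt package owns this file?
apt-cache rdepends python3-yaml                           # what depends on it and might break?
sudo apt-get remove python3-yaml                          # if you really must, remove it with apt, not pip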

How to make 'pip install' not uninstall other versions?

I am managing several modules on an HPC, and want to install some requirements for a tool using pip.
I won't use virtualenv because they don't work well with our module system. I want to install module-local versions of packages and will set PYTHONPATH correctly when the module is loaded, and this has worked just fine when the packages I am installing are not also installed in the default python environment.
What I do not want to do is uninstall the default python's versions of packages while I am installing module-local versions.
For example, one package requires numpy==1.6, and the default version installed with the python I am using is 1.8.0. When I
pip install --install-option="--prefix=$RE_PYTHON" numpy==1.6
where RE_PYTHON points to the top of the module-local site-packages directory, numpy==1.6 installs fine, then pip goes ahead and starts uninstalling 1.8.0 from the tree of the python I am using (why it wants to uninstall a newer version is beyond me but I want to avoid this even when I am doing a local install of e.g. numpy==1.10.1).
How can I prevent pip from doing that? It is really annoying and I have not been able to find a solution that doesn't involve virtualenv.
You have to explicitly tell pip to ignore the currently installed package by specifying the -I option (or --ignore-installed). So you should use:
PYTHONUSERBASE=$RE_PYTHON pip install -I --user numpy==1.6
This is mentioned in this answer by Ian Bicking.
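A hedged sketch of how to confirm where that install lands and make it importable (it assumes the module system sets PYTHONPATH as described in the question; printing the path beats guessing it):
PYTHONUSERBASE=$RE_PYTHON python -m site --user-site                  # show the user site-packages dir
export PYTHONPATH=$(PYTHONUSERBASE=$RE_PYTHON python -m site --user-site):$PYTHONPATH
python -c "import numpy; print(numpy.__version__)"                    # should now report 1.6.x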

"command-not-found==0.2.44" in pip's freeze

The output of pip freeze on my machine has included the following odd line:
command-not-found==0.2.44
When trying to install requirements on a different machine, I got the obvious No distributions at all found for command-not-found==0.2.44. Is this a pip bug? Or is there any real python package of that name, one which does not exist in pypi?
Indeed, as mentioned in the follow-up comments, Ubuntu has a Python package, installed via dpkg/apt, called "python-commandnotfound":
$ apt-cache search command-not-found
command-not-found - Suggest installation of packages in interactive bash sessions
command-not-found-data - Set of data files for command-not-found.
python-commandnotfound - Python 2 bindings for command-not-found.
python3-commandnotfound - Python 3 bindings for command-not-found.
As this is provided via apt, and not available in the pypi repo, you won't be able to install it via pip, but pip will see that it is installed. For the purposes of showing installed packages, pip doesn't care if a package is installed via apt, easy_install, pip, manually, etc.
In short, if you actually need it on another host (which I assume you don't) you'll need to apt-get install python-commandnotfound.
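If the stray entry keeps ending up in your requirements files, one hedged workaround is to filter it out when freezing (the grep pattern is just an example):
pip freeze | grep -v '^command-not-found==' > requirements.txt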

Why use pip over easy_install?

A tweet reads:
Don't use easy_install, unless you
like stabbing yourself in the face.
Use pip.
Why use pip over easy_install? Doesn't the fault lie mostly with PyPI and package authors? If an author uploads a crap source tarball (e.g., missing files, no setup.py) to PyPI, then both pip and easy_install will fail. Other than cosmetic differences, why do Python people (like in the above tweet) seem to strongly favor pip over easy_install?
(Let's assume that we're talking about easy_install from the Distribute package, that is maintained by the community)
From Ian Bicking's own introduction to pip:
pip was originally written to improve on easy_install in the following ways
All packages are downloaded before installation. Partially-completed installation doesn’t occur as a result.
Care is taken to present useful output on the console.
The reasons for actions are kept track of. For instance, if a package is being installed, pip keeps track of why that package was required.
Error messages should be useful.
The code is relatively concise and cohesive, making it easier to use programmatically.
Packages don’t have to be installed as egg archives, they can be installed flat (while keeping the egg metadata).
Native support for other version control systems (Git, Mercurial and Bazaar)
Uninstallation of packages.
Simple to define fixed sets of requirements and reliably reproduce a set of packages.
Many of the answers here are out of date for 2015 (although the initially accepted one from Daniel Roseman is not). Here's the current state of things:
Binary packages are now distributed as wheels (.whl files)—not just on PyPI, but in third-party repositories like Christoph Gohlke's Extension Packages for Windows. pip can handle wheels; easy_install cannot.
Virtual environments (which come built-in with 3.4, or can be added to 2.6+/3.1+ with virtualenv) have become a very important and prominent tool (and recommended in the official docs); they include pip out of the box, but don't even work properly with easy_install.
The distribute package that included easy_install is no longer maintained. Its improvements over setuptools got merged back into setuptools. Trying to install distribute will just install setuptools instead.
easy_install itself is only quasi-maintained.
All of the cases where pip used to be inferior to easy_install—installing from an unpacked source tree, from a DVCS repo, etc.—are long-gone; you can pip install ., pip install git+https://.
pip comes with the official Python 2.7 and 3.4+ packages from python.org, and a pip bootstrap is included by default if you build from source.
The various incomplete bits of documentation on installing, using, and building packages have been replaced by the Python Packaging User Guide. Python's own documentation on Installing Python Modules now defers to this user guide, and explicitly calls out pip as "the preferred installer program".
Other new features have been added to pip over the years that will never be in easy_install. For example, pip makes it easy to clone your site-packages by building a requirements file and then installing it with a single command on each side, or to convert your requirements file into a local repo to use for in-house development (see the sketch at the end of this answer). And so on.
The only good reason that I know of to use easy_install in 2015 is the special case of using Apple's pre-installed Python versions with OS X 10.5-10.8. Since 10.5, Apple has included easy_install, but as of 10.10 they still don't include pip. With 10.9+, you should still just use get-pip.py, but for 10.5-10.8, this has some problems, so it's easier to sudo easy_install pip. (In general, easy_install pip is a bad idea; it's only for OS X 10.5-10.8 that you want to do this.) Also, 10.5-10.8 include readline in a way that easy_install knows how to kludge around but pip doesn't, so you also want to sudo easy_install readline if you want to upgrade that.
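A hedged sketch of the requirements-file / local-repo workflow mentioned above (directory names are placeholders; on older pip the download step was spelled pip install --download instead):
pip freeze > requirements.txt                                               # on the machine you want to clone
pip download -d ./local-packages -r requirements.txt                        # fetch everything into a local directory
pip install --no-index --find-links=./local-packages -r requirements.txt    # install from that directory, no network needed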
Another—as of yet unmentioned—reason for favoring pip is because it is the new hotness and will continue to be used in the future.
The infographic below, from the Current State of Packaging section in The Hitchhiker's Guide to Packaging v1.0, shows that setuptools/easy_install will go away in the future.
Here's another infographic from distribute's documentation showing that setuptools and easy_install will be replaced by the new hotness, distribute and pip. While pip is still the new hotness, distribute merged with setuptools in 2013 with the release of setuptools v0.7.
Two reasons, though there may be more:
pip provides an uninstall command
if an installation fails in the middle, pip will leave you in a clean state.
REQUIREMENTS files.
Seriously, I use this in conjunction with virtualenv every day.
QUICK DEPENDENCY MANAGEMENT TUTORIAL, FOLKS
Requirements files allow you to create a snapshot of all packages that have been installed through pip. By encapsulating those packages in a virtual environment, you can have your codebase work off a very specific set of packages and share that codebase with others.
From Heroku's documentation https://devcenter.heroku.com/articles/python
You create a virtual environment, and set your shell to use it. (bash/*nix instructions)
virtualenv env
source env/bin/activate
Now all python scripts run with this shell will use this environment's packages and configuration. Now you can install a package locally to this environment without needing to install it globally on your machine.
pip install flask
Now you can dump the info about which packages are installed with
pip freeze > requirements.txt
If you check that file into version control, then when someone else gets your code, they can set up their own virtual environment and install all the dependencies with:
pip install -r requirements.txt
Any time you can automate tedium like this is awesome.
pip won't install binary packages and isn't well tested on Windows.
As Windows doesn't come with a compiler by default, pip often can't be used there. easy_install can install binary packages for Windows.
UPDATE: setuptools has absorbed distribute as opposed to the other way around, as some thought. setuptools is up-to-date with the latest distutils changes and the wheel format. Hence, easy_install and pip are more or less on equal footing now.
Source: http://pythonhosted.org/setuptools/merge-faq.html#why-setuptools-and-not-distribute-or-another-name
As an addition to fuzzyman's reply:
pip won't install binary packages and isn't well tested on Windows.
As Windows doesn't come with a compiler by default pip often can't be
used there. easy_install can install binary packages for Windows.
Here is a trick on Windows:
you can use easy_install <package> to install binary packages to avoid building a binary
you can use pip uninstall <package> even if you used easy_install.
This is just a work-around that works for me on Windows.
Actually I always use pip if no binaries are involved.
See the current pip documentation: http://www.pip-installer.org/en/latest/other-tools.html#pip-compared-to-easy-install
I will ask on the mailing list what is planned for that.
Here is the latest update:
The new supported way to install binaries is going to be wheel!
It is not yet in the standard, but almost. Current version is still an alpha: 1.0.0a1
https://pypi.python.org/pypi/wheel
http://wheel.readthedocs.org/en/latest/
I will test wheel by creating an OS X installer for PySide using wheel instead of eggs. Will get back and report about this.
cheers - Chris
A quick update:
The transition to wheel is almost over. Most packages now support wheel.
I promised to build wheels for PySide, and I did that last summer. Works great!
HINT:
A few developers have so far failed to support the wheel format, simply because they forgot to replace distutils with setuptools.
Often it is easy to convert such a package by changing that single import in setup.py from distutils.core to setuptools.
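A hedged sketch of that conversion plus a quick check (the sed pattern assumes the common "from distutils.core import setup" line; inspect setup.py first and keep the .bak backup):
sed -i.bak 's/from distutils.core import setup/from setuptools import setup/' setup.py
pip install wheel
python setup.py bdist_wheel   # should now drop a .whl into dist/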
I just met one special case where I had to use easy_install instead of pip, or else pull the source code directly.
For the package GitPython, the version available to pip is too old (0.1.7), while the one from easy_install is the latest (0.3.2.rc1).
I'm using Python 2.7.8. I'm not sure about the underlying mechanism of easy_install and pip, but at least for some packages the versions they see can differ, and sometimes easy_install is the one with the newer version.
easy_install GitPython

What is the most compatible way to install python modules on a Mac?

I'm starting to learn python and loving it. I work on a Mac mainly as well as Linux. I'm finding that on Linux (Ubuntu 9.04 mostly) when I install a python module using apt-get it works fine. I can import it with no trouble.
On the Mac, I'm used to using Macports to install all the Unixy stuff. However, I'm finding that most of the python modules I install with it are not being seen by python. I've spent some time playing around with PATH settings and using python_select. Nothing has really worked, and at this point I'm not really understanding what's going on; I'm just poking around.
I get the impression that Macports isn't universally loved for managing python modules. I'd like to start fresh using a more "accepted" (if that's the right word) approach.
So, I was wondering, what is the method that Mac python developers use to manage their modules?
Bonus questions:
Do you use Apple's python, or some other version?
Do you compile everything from source or is there a package manager that works well (Fink?).
The most popular way to manage python packages (if you're not using your system package manager) is to use setuptools and easy_install. It is probably already installed on your system. Use it like this:
easy_install django
easy_install uses the Python Package Index which is an amazing resource for python developers. Have a look around to see what packages are available.
A better option is pip, which is gaining traction, as it attempts to fix a lot of the problems associated with easy_install. pip uses the same package repository as easy_install; it just works better. Really, the only time you need to use easy_install is for this command:
easy_install pip
After that, use:
pip install django
At some point you will probably want to learn a bit about virtualenv. If you do a lot of python development on projects with conflicting package requirements, virtualenv is a godsend. It will allow you to have completely different versions of various packages, and switch between them easily depending on your needs.
Regarding which python to use, sticking with Apple's python will give you the least headaches, but if you need a newer version (Leopard ships 2.5.1, I believe), I would go with the MacPorts python 2.6.
Your question is already three years old and there are some details not covered in other answers:
Most people I know use Homebrew or MacPorts. I prefer MacPorts because of its clean cut between the default Mac OS X environment and my development setup: just move your /opt folder out of the way and you can test your packages against a normal user Python environment.
MacPorts is only portable within the Mac, but with easy_install or pip you will learn how to set up your environment on any platform (Win/Mac/Linux/BSD...). Furthermore, they will always be more up to date and have more packages.
I personally let MacPorts handle my Python modules to keep everything updated. Like any other high-level package manager (e.g., apt-get), it is much better for the heavy lifting of modules with lots of binary dependencies. There is no way I would build my Qt bindings (PySide) with easy_install or pip; Qt is huge and takes a long time to compile. As soon as you want a Python package that needs a library also used by non-Python programs, try to avoid easy_install or pip.
At some point you will find that there are some packages missing from MacPorts. I do not believe that MacPorts will ever give you the whole CheeseShop. For example, recently I needed the Elixir module, but MacPorts only offers py25-elixir and py26-elixir, no py27 version. In cases like these you can do:
pip-2.7 install --user elixir
(make sure you always type pip-(version))
That will build an extra Python library in your home dir. Yes, Python will work with more than one library location: one controlled by MacPorts and a user-local one for everything missing from MacPorts.
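To see exactly where those --user installs end up, a hedged check (use whichever pip-(version)/python pair you are working with):
python2.7 -m site --user-site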
Now notice that I favor pip over easy_install. There are good reasons to avoid setuptools and easy_install; here is a good explanation, and I try to keep away from them. One very useful feature of pip is that it gives you a list of all the modules (along with their versions) that you installed with MacPorts, easy_install and pip itself:
pip-2.7 freeze
If you already started using easy_install, don't worry: pip recognizes everything already done by easy_install and can even upgrade the packages installed with it.
If you are a developer, keep an eye on virtualenv for controlling different setups and combinations of module versions. Other answers mention it already; what is not mentioned so far is the Tox module, a tool for testing that your package installs correctly under different Python versions.
Although I usually do not have version conflicts, I like to use virtualenv to set up a clean environment and get a clear view of my package's dependencies. That way I never forget any dependencies in my setup.py.
If you go for MacPorts, be aware that selecting between multiple versions of the same package is no longer done Debian-style with an extra python_select package (it is still there for compatibility). Now you have the select command to choose which Python version will be used (you can even select the Apple-installed ones):
$ port select python
Available versions for python:
none
python25-apple
python26-apple
python27 (active)
python27-apple
python32
$ port select python python32
Add tox on top of that and your programs should be really portable.
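A minimal hedged sketch of adding tox (it assumes a tox.ini already exists in your project root; the envlist in that file decides which interpreters get tested):
pip-2.7 install --user tox
tox   # runs the environments listed in your tox.ini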
Please see Python OS X development environment. The best way is to use MacPorts. Download and install MacPorts, then install Python via MacPorts by typing the following commands in the Terminal:
sudo port install python26 python_select
sudo port select --set python python26
OR
sudo port install python30 python_select
sudo port select --set python python30
Use the first set of commands to install Python 2.6 and the second set to install Python 3.0. Then use:
sudo port install py26-packagename
OR
sudo port install py30-packagename
In the above commands, replace packagename with the name of the package, for example:
sudo port install py26-setuptools
These commands will automatically install the package (and its dependencies) for the given Python version.
For a full list of available packages for Python, type:
port list | grep py26-
OR
port list | grep py30-
Which command you use depends on which version of Python you chose to install.
I use MacPorts to install Python and any third-party modules tracked by MacPorts into /opt/local, and I install any manually installed modules (those not in the MacPorts repository) into /usr/local, and this has never caused any problems. I think you may be confused as to the use of certain MacPorts scripts and environment variables.
MacPorts python_select is used to select the "current" version of Python, but it has nothing to do with modules. This allows you to, e.g., install both Python 2.5 and Python 2.6 using MacPorts, and switch between installs.
The $PATH environment variable does not affect which Python modules are loaded. $PYTHONPATH is what you are looking for. $PYTHONPATH should point to directories containing Python modules you want to load. In my case, my $PYTHONPATH variable contains /usr/local/lib/python26/site-packages. If you use MacPorts' Python, it sets up the other proper directories for you, so you only need to add additional paths to $PYTHONPATH. But again, $PATH isn't used at all when Python searches for modules you have installed.
$PATH is used to find executables, so if you install MacPorts' Python, make sure /opt/local/bin is in your $PATH.
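A hedged illustration of that split, using the paths mentioned above (adjust them to your own layout):
export PATH=/opt/local/bin:$PATH                                       # lets the shell find the MacPorts python executable
export PYTHONPATH=/usr/local/lib/python26/site-packages:$PYTHONPATH    # lets that python find manually installed modules
python -c "import sys; print(sys.path)"                                # verify what the interpreter will actually search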
There's nothing wrong with using a MacPorts Python installation. If you are installing python modules from MacPorts but then not seeing them, that likely means you are not invoking the MacPorts python you installed them into. In a terminal shell, you can use absolute paths to invoke the various Pythons that may be installed. For example:
$ /usr/bin/python2.5 # Apple-supplied 2.5 (Leopard)
$ /opt/local/bin/python2.5 # MacPorts 2.5
$ /opt/local/bin/python2.6 # MacPorts 2.6
$ /usr/local/bin/python2.6 # python.org (MacPython) 2.6
$ /usr/local/bin/python3.1 # python.org (MacPython) 3.1
Getting the right python by default requires setting your shell $PATH so that the right executable is found first. Another solution is to define shell aliases for the various pythons.
A python.org (MacPython) installation is fine, too, as others have suggested. easy_install can help but, again, because each Python instance may have its own easy_install command, make sure you are invoking the right easy_install.
If you use Python from MacPorts, it has its own easy_install located at /opt/local/bin/easy_install-2.6 (for py26, that is). It's not the same as simply calling easy_install directly, even if you used python_select to change your default python command.
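A hedged way to see every python and easy_install your shell can reach, so you know which one a bare command would actually run:
which -a python easy_install easy_install-2.6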
Have you looked into easy_install at all? It won't synchronize with your MacPorts packages or anything like that, but it will automatically download the latest package and all necessary dependencies, e.g.
easy_install nose
for the nose unit testing package, or
easy_install trac
for the trac bug tracker.
There's a bit more information on their EasyInstall page too.
For MacPython installations, I found an effective solution to fixing the problem with setuptools (easy_install) in this blog post:
http://droidism.com/getting-running-with-django-and-macpython-26-on-leopard
One handy tip is finding out which version of python is active in the terminal:
which python
When you install modules with MacPorts, they do not go into Apple's version of Python. Instead, those modules are installed into whichever MacPorts version of Python is selected.
You can change which version of Python is used by default using a MacPorts port called python_select; instructions here.
Also, there's easy_install, which will use python to install python modules.
You may already have pip3 pre-installed, so just try it!
Regarding which Python version to use: Mac OS usually ships an old version of Python, and it's a good idea to upgrade to a newer one. You can download a .dmg from http://www.python.org/download/. If you do that, remember to update your PATH. You can find the exact commands here: http://farmdev.com/thoughts/66/python-3-0-on-mac-os-x-alongside-2-6-2-5-etc-/
I use easy_install with Apple's Python, and it works like a charm.
Directly install one of the fink packages (Django 1.6 as of 2013-Nov)
fink install django-py27
fink install django-py33
Or create yourself a virtualenv:
fink install virtualenv-py27
virtualenv django-env
source django-env/bin/activate
pip install django
deactivate # when you are done
Or use fink's django plus any other pip-installed packages in a virtualenv
fink install django-py27
fink install virtualenv-py27
virtualenv django-env --system-site-packages
source django-env/bin/activate
# django already installed
pip install django-analytical # or anything else you might want
deactivate # back to your normally scheduled programming
