If, in my Travis CI build, I want to uninstall a package locally and install the PyPI version, how do I modify the .travis.yml script to automatically say "yes" to the pip uninstall packagename command?
As it stands right now, all I have is:
- pip uninstall packagename #remove local version
- pip install packagename #install the PyPI version
Looking at pip help uninstall, you will see that it takes a --yes flag.
You could also consider using pip install -U packagename instead.
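For example, the two lines in your .travis.yml would become something like this (a sketch that keeps your placeholder package name):
- pip uninstall --yes packagename #remove local version
- pip install packagename #install the PyPI version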
Related
I'm confused by the intended pip usage. Pip comes installed with Python, which is great, but I get the following warnings when new versions come out:
WARNING: You are using pip version 21.1.1; however, version 21.1.3 is available.
You should consider upgrading via the '/usr/local/opt/python@3.8/bin/python3.8 -m pip install --upgrade pip' command.
I follow the instructions and run the command it gives, but it uninstalls my existing pip and then fails to install the new version:
Installing collected packages: pip
Attempting uninstall: pip
Found existing installation: pip 21.1.1
Uninstalling pip-21.1.1:
ERROR: Could not install packages due to an OSError: Cannot move the non-empty directory '/usr/local/lib/python3.8/site-packages/pip-21.1.1.dist-info/': Lacking write permission to '/usr/local/lib/python3.8/site-packages/pip-21.1.1.dist-info/'.
The pip command is now unrecognized, and the official documentation for upgrading pip suggests running:
python -m pip install -U pip
which gives the same permission error.
I Googled this error and found that the community strongly advises against using sudo, based on these questions (this and this). They also advised pip3 install --upgrade pip --user, which gave the same error. The common consensus is to only install pip packages inside virtual environments, but I'm hesitant to leave pip completely uninstalled.
So I got pip to install using sudo, but it's unclear whether I've inadvertently affected system-wide installations (or will affect future ones), or how I'd check for that.
I don't understand why installing pip inside /usr/local/ requires sudo, or whether I should be using pip exclusively inside virtual environments and never outside them.
pip can be installed with sudo into a folder that you don't have permission to write to, and yet still install packages outside that folder (that is, into a folder you do have write permission for). However, it is recommended that you don't install pip into a root-owned folder, and instead install it into your home directory.
The command to install pip as root is
sudo apt-get install python3-pip
It should then prompt you for your password. I recommend using sudo whenever you install something system-wide.
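If you'd rather keep pip in your home directory, as suggested above, a user-level upgrade is one option (a sketch; depending on how your interpreter was installed, this places the files under ~/.local instead of /usr/local):
python3 -m pip install --user --upgrade pip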
I'm trying to install Google Assistant on my Raspberry Pi, but I keep getting an error: pip is a package and cannot be directly executed
Instead of
pip [...]
Try doing
python -m pip [...]
Can't really help more without more info.
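For example, retrying the install from the question would look something like this (the exact package name, google-assistant-library, is taken from the answer below and may differ for your setup):
python -m pip install google-assistant-library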
I think your version of pip is old. You need to upgrade it first, like this:
pip install -U pip
You may need to upgrade setuptools too:
pip install -U setuptools
Since google-assistant-library is available as a wheel, you need to install wheel too:
pip install wheel
I don't know if you can do that on a Raspberry Pi, but I recommend using a virtualenv. That way, you have a fresh, isolated Python executable and a recent version of pip.
virtualenv your_proj
source your_proj/bin/activate
pip install wheel
pip install google-assistant-library
For newer versions, i.e. using pip3:
pip3 install -U <<package name>>
I had the same problem.
I think it was an outcome of a failed
> .\python.exe -m pip install --upgrade pip
due to some environment misconfiguration.
So it first removed the existing version 10.0.1, and then the installation of the new version 22.3.1 failed, leaving me with no pip.
Following the official documentation, I ran
> .\python.exe -m ensurepip --upgrade
which restored the original pip 10.0.1.
Then I fixed the environment problem, and then again
> .\python.exe -m pip install --upgrade pip
I now have pip 22.3.1.
Our sysadmin has installed a package, so I can remove my local copy. I'd like to say
pip uninstall --user <package>
but pip uninstall does not support --user. (At least pip 1.5.4 on Linux doesn't.)
Is there an easy way to do this by hand, i.e., delete the directory that contains the package?
This was a known bug in pip.
Ref: https://github.com/pypa/pip/issues/2094
Since pip uninstall does not have a --user option (unlike pip install), the question was whether there even exists a way to uninstall a package installed with pip install --user.
It has since been closed with a note:
The packages mentioned in the ticket started working after they offered Wheel-based packages.
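If you do end up removing the files by hand, as the question considers, pip show can at least tell you where the package lives (hypothetical package name):
pip show somepackage
Its output includes a Location: line pointing at the site-packages (or ~/.local/lib/...) directory that holds the files.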
I have found that upgrading the package first lets you uninstall a package that was installed with the --user option. In my case it was elevate:
I have installed with command:
pip3 install --user elevate
When I try to uninstall, I receive the skip message:
Skipping elevate as it is not installed.
After many unsuccessful commands, I found that I needed to upgrade the package first with:
pip3 install --user --upgrade elevate
Then I was able to successfully uninstall the elevate package:
pip3 uninstall elevate
I've seen it documented that you can install a Github hosting Python package using pip via:
sudo pip install -e git+git://github.com/myuser/myproject.git#egg=myproject
However, this appears to install the package to the current working directory, which is almost never where it should be.
How do you instruct pip to install it into the standard Python package directory (e.g. on Ubuntu this is /usr/local/lib/python2.6/dist-packages)?
The -e flag tells pip to install it as "editable", i.e. keep the source around. Drop the -e flag and it should do about what you expect.
sudo pip install git+git://github.com/myuser/myproject.git#egg=myproject
If that doesn't work try using https instead of git.
sudo pip install git+https://github.com/myuser/myproject.git#egg=myproject
For Python 3, make sure you have python3-pip installed (and of course git installed):
The syntax just changed to:
sudo pip3 install git+git://github.com/someuser/someproject.git
I'm trying to uninstall all django packages in my superuser environment to ensure that all my webapp dependencies are installed to my virtualenv.
sudo su
sudo pip freeze | grep -E '^django-' | xargs pip -q uninstall
But pip wants to confirm every package uninstall, and there doesn't seem to be a -y option for pip. Is there a better way to uninstall a batch of python modules? Is rm -rf .../site-packages/ a proper way to go? Is there an easy_install alternative?
Alternatively, would it be better to force pip to install all dependencies to the virtualenv rather than relying on the system python modules to meet those dependencies, e.g. pip install --upgrade, but forcing even equally old versions to be installed to override any system modules? I tried activating my virtualenv and then pip install --upgrade -r requirements.txt and that does seem to install the dependencies, even those existing in my system path, but I can't be sure if that's because my system modules were old. And man pip doesn't seem to guarantee this behavior (i.e. installing the same version of a package that already exists in the system site-packages).
Starting with pip version 7.1.2 you can run pip uninstall -y <python package(s)>:
pip uninstall -y package1 package2 package3
or from file
pip uninstall -y -r requirements.txt
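Applied to the original question, the freeze pipeline just needs -y on the uninstall end (a sketch reusing the asker's grep pattern):
pip freeze | grep -E '^django-' | xargs pip uninstall -y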
Pip does NOT include a --yes option (as of pip version 1.3.1).
WORKAROUND: pipe yes to it!
$ sudo ls # enter pw so not prompted again
$ /usr/bin/yes | sudo pip uninstall pymongo
If you want to uninstall every package from requirements.txt,
pip uninstall -y -r requirements.txt
On www.saturncloud.io, in Jupyter notebooks, one can use it like this:
!yes | pip uninstall tensorflow
!yes | pip uninstall gast
!yes | pip uninstall tensorflow-probability
Alternatively, would it be better to force pip to install all dependencies to the virtualenv rather than relying on the system python modules to meet those dependencies,
Yes. Don't mess too much with the built-in system-installed packages. Many of the system packages, particularly in OS X (and even in Debian and its derivatives), depend too much on them.
pip install --upgrade, but forcing even equally old versions to be installed to override any system modules.
It should not be a big deal if a few packages installed within the venv duplicate ones already in the system installation, particularly if they are of a different version. That's the whole point of virtualenv.
I tried activating my virtualenv and then pip install --upgrade -r requirements.txt and that does seem to install the dependencies, even those existing in my system path, but I can't be sure if that's because my system modules were old. And man pip doesn't seem to guarantee this behavior (i.e. installing the same version of a package that already exists in the system site-packages).
No, it doesn't reinstall packages that are already there in the main installation unless you created the virtualenv with the --no-site-packages flag, or the required and present versions are different.
Lakshman Prasad was right: pip install --upgrade and/or virtualenv --no-site-packages is the way to go. Uninstalling the system-wide python modules is bad.
The --upgrade option to pip does install required modules in the virtual env, even if they already exist in the system environment, and even if the required version or latest available version is the same as the system version.
pip install --upgrade
And, using the --no-site-packages option when creating the virtual environment ensures that missing dependencies can't possibly be masked by the presence of those modules in the system path. This helps expose problems during migration of a module from one package to another, e.g. pinax.apps.groups -> django-groups, especially when the problem is with load templatetags statements in django, which search all available modules for templatetags directories and the tag definitions within.
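Putting those two pieces together, the workflow described above might look like this (a sketch; webapp-env is a placeholder name, and --no-site-packages applies to the older virtualenv releases discussed here, where it was not yet the default):
virtualenv --no-site-packages webapp-env
source webapp-env/bin/activate
pip install --upgrade -r requirements.txt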
pip install -U xxxx
can bypass the confirmation prompt.