I know this topic has been beaten to death, but I have not been able to find a solution to my problem on SO or elsewhere, so I suspect there may be a bug somewhere in my system.
I am on an older RHEL 6 platform with Python 3.4. I am developing a Qt application that will run on this platform. I've installed all of the relevant libraries via yum (e.g. qt-devel, pyqt4-devel, etc.) and now want to install my application package as an "editable" package using pip install -e mypkg. I also have a couple of dependency requirements that are not available via yum and must be installed via pip.
What I would like to do is create a virtualenv that "inherits" the system packages installed via yum but allows me to pip install my own packages into a virtualenv directory in my home directory.
From my Googling, it looks like the best way to do this is to create a virtualenv with the system's site-packages directory:
$ python3 -m venv --system-site-packages ~/venv
However, when I try to install a package to this virtualenv's site-packages directory, it attempts to install it under /usr/lib and I get a Permission denied error.
So it appears that the --system-site-packages option makes my virtualenv share the system's site-packages directory completely, instead of using it as a "base" on which further packages can be layered.
This answer states that using pip install -I should do what I want, but that does not appear to be the case:
(venv) $ pip3 install -I bitstring
...
error: could not create '/usr/lib/python3.4/site-packages/bitstring.py': Permission denied
Create the virtual environment without the --system-site-packages switch. After the environment is created, go to the folder it was created in. It should contain a file pyvenv.cfg. Edit this file; it has (among other settings) the line
include-system-site-packages = false
Change this line to:
include-system-site-packages = true
Activate the environment. Module installations will now go to the virtual environment and the system site packages are visible too.
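The manual edit described above can also be scripted; a minimal sketch (enable_system_site_packages is a hypothetical helper name, and the venv path is whatever your environment directory is):

```python
# Sketch: flip include-system-site-packages to true in an existing
# venv's pyvenv.cfg, leaving all other settings untouched.
from pathlib import Path

def enable_system_site_packages(venv_dir):
    cfg = Path(venv_dir) / "pyvenv.cfg"
    out = []
    for line in cfg.read_text().splitlines():
        key = line.split("=")[0].strip()
        if key == "include-system-site-packages":
            out.append("include-system-site-packages = true")
        else:
            out.append(line)
    cfg.write_text("\n".join(out) + "\n")
```

The change takes effect the next time the environment's interpreter starts.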
With Python 3.8, --system-site-packages seems to work as expected:
python3 -m venv --system-site-packages myProject
cat myProject/pyvenv.cfg
home = /usr/bin
include-system-site-packages = true
version = 3.8.5
After installing astroid, isort, and wrapt, I got:
pip list -v
Package Version Location Installer
---------------------- -------------------- ------------------------------------------------------- ---------
apturl 0.5.2 /usr/lib/python3/dist-packages
astroid 2.4.2 /home/to/no/MR/auto-gen/lib/python3.8/site-packages pip
isort 5.6.4 /home/to/no/MR/auto-gen/lib/python3.8/site-packages pip
jedi 0.15.2 /usr/lib/python3/dist-packages
keyring 18.0.1 /usr/lib/python3/dist-packages
wrapt 1.12.1 /home/to/no/MR/auto-gen/lib/python3.8/site-packages pip
Already-installed "system" packages are taken from /usr/lib/python3/dist-packages, while locally (venv-)installed packages come from /home/to/no/MR/auto-gen/lib/python3.8/site-packages.
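One way to confirm which tree a given import actually resolves from, system dist-packages or the venv's site-packages, is importlib; the module name below is just a stdlib example, substitute your own package:

```python
# Show the file a module would be loaded from; the path reveals whether
# it comes from the system tree or the venv's site-packages.
import importlib.util

spec = importlib.util.find_spec("json")
print(spec.origin)
```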
On macOS I've installed pipx with brew:
brew install pipx
then pipx install black
$ pipx list
venvs are in /Users/mc/.local/pipx/venvs
apps are exposed on your $PATH at /Users/mc/.local/bin
package black 22.12.0, installed using Python 3.11.1
- black
- blackd
However, I keep getting a missing-dependency error:
$ /Users/mc/.local/bin/blackd
Traceback (most recent call last):
File "/Users/mc/.local/bin/blackd", line 5, in <module>
from blackd import patched_main
File "/Users/mc/.local/pipx/venvs/black/lib/python3.11/site-packages/blackd/__init__.py", line 14, in <module>
raise ImportError(
ImportError: aiohttp dependency is not installed: No module named 'aiohttp'. Please re-install black with the '[d]' extra install to obtain aiohttp_cors: `pip install black[d]`
How can I fix it?
Why is pipx not resolving this dependency while installing black?
Why does it use some Python 3.11.1 (I have no idea where it is installed) when my system Python is 3.9.6?
$ python3 --version
Python 3.9.6
EDIT
I did as advised by the answer below from @KarlKnechtel:
$ brew install python@3.10
==> Auto-updated Homebrew!
Updated 1 tap (romkatv/powerlevel10k).
You have 2 outdated formulae installed.
You can upgrade them with brew upgrade
or list them with brew outdated.
==> Fetching python@3.10
==> Downloading https://ghcr.io/v2/homebrew/core/python/3.10/manifests/3.10.9
######################################################################## 100.0%
==> Downloading https://ghcr.io/v2/homebrew/core/python/3.10/blobs/sha256:a9b28161cec6e1a027f1eab7576af7
==> Downloading from https://pkg-containers.githubusercontent.com/ghcr1/blobs/sha256:a9b28161cec6e1a027f
######################################################################## 100.0%
==> Pouring python@3.10--3.10.9.arm64_monterey.bottle.tar.gz
==> /opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10 -m ensurepip
==> /opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10 -m pip install -v --no-deps --no-index --upgr
==> Caveats
Python has been installed as
/opt/homebrew/bin/python3
Unversioned symlinks `python`, `python-config`, `pip` etc. pointing to
`python3`, `python3-config`, `pip3` etc., respectively, have been installed into
/opt/homebrew/opt/python@3.10/libexec/bin
You can install Python packages with
pip3 install <package>
They will install into the site-package directory
/opt/homebrew/lib/python3.10/site-packages
tkinter is no longer included with this formula, but it is available separately:
brew install python-tk@3.10
See: https://docs.brew.sh/Homebrew-and-Python
==> Summary
🍺 /opt/homebrew/Cellar/python@3.10/3.10.9: 3,110 files, 57.1MB
==> Running `brew cleanup python@3.10`...
Disable this behaviour by setting HOMEBREW_NO_INSTALL_CLEANUP.
Hide these hints with HOMEBREW_NO_ENV_HINTS (see `man brew`).
so I got:
$ python3 --version
Python 3.10.9
$brew list python python3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/2to3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/2to3-3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/idle3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/idle3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/pip3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/pip3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/pydoc3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/pydoc3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3-config
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10-config
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/wheel3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/wheel3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/Frameworks/Python.framework/ (3055 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/IDLE 3.app/Contents/ (8 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/lib/pkgconfig/ (4 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/libexec/bin/ (6 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/libexec/wheel-0.38.4-py3-none-any.whl
/opt/homebrew/Cellar/python@3.10/3.10.9/Python Launcher 3.app/Contents/ (16 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/share/man/ (2 files)
but when I install black, it still uses Python 3.11:
$ pipx install black[d]
zsh: no matches found: black[d]
$ pipx install black
installed package black 22.12.0, installed using Python 3.11.1
These apps are now globally available
- black
- blackd
done! ✨ 🌟 ✨
I solved it as in edit 1, but then you have to run:
pipx install "black[d]" --force
but after that I got an error:
dyld[56881]: Library not loaded: '/opt/homebrew/Cellar/python@3.9/3.9.13_1/Frameworks/Python.framework/Versions/3.9/Python'
Referenced from: '/Users/mc/Library/Application Support/pypoetry/venv/bin/python'
Reason: tried: '/opt/homebrew/Cellar/python@3.9/3.9.13_1/Frameworks/Python.framework/Versions/3.9/Python' (no such file), '/Library/Frameworks/Python.framework/Versions/3.9/Python' (no such file), '/System/Library/Frameworks/Python.framework/Versions/3.9/Python' (no such file)
so I had to:
curl -sSL https://install.python-poetry.org | python3 - --uninstall
curl -sSL https://install.python-poetry.org | python3 -
And now:
poetry env use /opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10
as Python 3.10 is required for aiohttp, as mentioned by @KarlKnechtel.
When you use pipx rather than pip to install a package, it will:
create a new virtual environment for the package
use pip to install the package into that virtual environment
do some additional setup work, so that "entry points" for the package (commands created by the package for direct use at the command line, such as blackd) work without having to activate that specific virtual environment first.
That is its explicit purpose (i.e., the reason not to just use pip), and that is why a different Python is being used rather than the system Python.
To use the blackd entry point defined by Black (which starts up a server), the aiohttp optional dependency is required. This must be specified at installation. Per the error message, with pip, this looks like pip install black[d].
However, from the Black release notes (since version 22.8.0; current version 22.12.0):
Python 3.11 is now supported, except for blackd as aiohttp does not support 3.11 as of publishing (#3234)
So, pipx will need to use an older version to create the virtual environment.
From the pipx documentation:
The default python executable used to install a package is
typically the python used to execute pipx and can be overridden
by setting the environment variable PIPX_DEFAULT_PYTHON.
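A simplified sketch of that selection rule (pipx_default_python here is a hypothetical helper, not pipx's actual code, but the precedence matches the documentation quoted above):

```python
import os
import sys

def pipx_default_python():
    # PIPX_DEFAULT_PYTHON overrides; otherwise the interpreter that
    # runs pipx itself is used for the new virtual environments.
    return os.environ.get("PIPX_DEFAULT_PYTHON", sys.executable)
```

Setting PIPX_DEFAULT_PYTHON to a 3.10 interpreter before running pipx install would therefore change which Python the per-package venv is built from.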
Presumably, brew install pipx installed pipx into some 3.11 installation that is not the system Python. This sort of thing happens because you are not meant to install anything into the system Python. As the name suggests, your (operating) system depends on it, and a rogue package could seriously harm your computer in any number of ways. (Hence, pip is not available in it by default.)
You can:
use brew to install 3.10 or earlier, then install pipx into that, then proceed with pipx install "black[d]".
use venv to create a virtual environment based on your system Python (3.9); this venv should include pip automatically by default (downloaded and installed by venv). Activate this virtual environment, use pip to install pipx in it, then proceed with pipx install "black[d]".
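The first step of the venv route can also be driven from Python itself via the stdlib venv module (equivalent to `python3 -m venv env`). A minimal sketch; with_pip=False keeps it quick here, while with_pip=True matches the behaviour described above (pip downloaded and installed by venv):

```python
# Create a virtual environment programmatically with the stdlib venv
# module, then confirm its marker file exists.
import os
import tempfile
import venv

target = os.path.join(tempfile.mkdtemp(), "env")
venv.EnvBuilder(with_pip=False).create(target)
print(os.path.exists(os.path.join(target, "pyvenv.cfg")))  # True
```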
xlwt installs successfully, with pip or apt-get or easy_install, into the following path:
/usr/local/lib/python2.7/dist-packages
However, when I try to import it, I get an error: no module named xlwt.
All the relevant answers on this site say to pip install, but xlwt is already installed. As a workaround, I added the dist-packages directory to my Python path. Why is Python not searching in dist-packages?
I'm using Ubuntu 16.04 and Python 2.7.
For Python and Django related projects it is always preferable to use a virtualenv:
sudo pip install virtualenv
To create virtualenv:
virtualenv my_env
To activate virtual env:
source my_env/bin/activate
pip install xlwt
Ref: http://pythoncentral.io/how-to-install-virtualenv-python/
To set up Eclipse to look in the virtualenv (for import errors in Eclipse with modules installed in dist-packages on Ubuntu, e.g. xlwt):
Create the virtualenv as answered by Abijith Mg, but add the --system-site-packages flag:
virtualenv my_env --system-site-packages
source my_env/bin/activate
pip install xlwt
(This way you can set up Eclipse once and only need to add new modules to the virtualenv.)
Go to the Window menu in Eclipse
Preferences
PyDev on the left
Interpreters
Python Interpreter
New Folder
select your virtualenv path, pointing to site-packages
Apply
Can we create a virtualenv from an existing virtualenv in order to inherit the installed libraries?
In detail:
I first create a "reference" virtualenv and add libraries (with pinned versions):
virtualenv ref
source ref/bin/activate
pip install -U pip==8.1.1 # <- I want to pin the version number
pip install -U wheel==0.29.0 # <- I want to pin the version number
Then:
virtualenv -p ref/bin/python myapp
source myapp/bin/activate
pip list
I get:
pip (1.4.1)
setuptools (0.9.8)
wsgiref (0.1.2)
How can I get my installed libraries?
Similar question
I saw a similar question: Can a virtualenv inherit from another?
But I want an isolated virtualenv that doesn't use the reference virtualenv except for library installation. So adding the reference's directories to the Python path of the currently active virtualenv is not the solution.
Why do that?
Well, we have an integration server which builds the applications (for releases and continuous integration), and we want to keep control over library versions and make the builds faster.
Create a relocatable virtualenv
I think I could use a relocatable virtualenv, that way:
create the ref virtualenv
make it relocatable: virtualenv --relocatable ref
For "myapp":
copy ref to myapp
What do you think of this solution? Is it reliable for a distributable release?
You can solve your problem by using .pth files. Basically you do this:
virtualenv -p ref/bin/python myapp
realpath ref/lib/python3.6/site-packages > myapp/lib/python3.6/site-packages/base_venv.pth
After doing this and activating myapp, running pip list should show all the packages from ref as well. Note that any package installed in myapp will shadow the corresponding package from ref.
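The two commands above can also be expressed in Python; a minimal sketch (write_base_pth is a hypothetical helper, and the site-packages paths are whatever your two virtualenvs actually use):

```python
# Sketch: write a base_venv.pth into one venv's site-packages that
# points at another venv's site-packages. Each line of a .pth file is
# appended to sys.path when the interpreter starts.
from pathlib import Path

def write_base_pth(ref_site_packages, app_site_packages):
    pth = Path(app_site_packages) / "base_venv.pth"
    pth.write_text(str(Path(ref_site_packages).resolve()) + "\n")
    return pth
```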
You can freeze the list of packages from one env:
(ref) user@host:~/dir$ pip freeze > ref-packages.txt
Then install them in the other:
(use) user@host:~/dir$ pip install -r ref-packages.txt
When you create the second virtualenv, you have to add the --system-site-packages flag:
virtualenv -p ref/bin/python myapp --system-site-packages
The pip version 1.4.1 was bundled with an old version of virtualenv, for example the one shipped with Ubuntu 14.04. You should remove that from your system and install the most recent version of virtualenv:
pip install virtualenv
This might require root permissions (sudo).
Then upgrade pip inside the virtual env with pip install -U pip, or recreate the env.
I think your problem can be solved differently, using PYTHONPATH. First we create the ref virtualenv and install all needed packages there:
$ virtualenv ref
$ source ref/bin/activate
$ pip install pep8
$ pip list
> pep8 (1.7.0)
> pip (8.1.2)
> setuptools (26.1.1)
> wheel (0.29.0)
Then we create the second virtualenv, named use:
$ virtualenv use
$ source use/bin/activate
$ pip list
> pip (8.1.2)
> setuptools (26.1.1)
> wheel (0.29.0)
And now we can set PYTHONPATH in this env to include ref's directories:
$ export PYTHONPATH=$PYTHONPATH:/home/path_to/ref/lib/python2.7/site-packages:/home/path_to/ref/local/lib/python2.7/site-packages
$ pip list
> pep8 (1.7.0)
> pip (8.1.2)
> setuptools (26.1.1)
> wheel (0.29.0)
As you can see, this way you just reference the packages installed in ref's environment. Also note that we add these folders at the end, so they have lower priority.
NOTE: these are not all the folders that can appear in PYTHONPATH. I included these two because they are the main ones, but if you have problems you can add others too; just look up the needed paths with this method:
how to print contents of PYTHONPATH
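To see what actually ends up on the module search path, including the entries added via PYTHONPATH above, you can print it from inside the interpreter:

```python
# Print the effective module search path; directories from $PYTHONPATH
# appear near the front, after the script's own directory.
import sys

for entry in sys.path:
    print(entry)
```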
I use Python's pip to install packages. Now I want to install scipy, which is already installed on the system, but it is an old version and it lives in a part of the system where I don't have write access. If I try
pip install scipy
pip rightfully tells me that the package is already installed. If I do
pip install scipy --upgrade
pip tries to upgrade the package but I don't have the access rights to do that.
How can I tell pip to install the package local to my user and to ignore the other scipy package?
I think the best way to avoid overriding packages is to use a virtual environment. Python has its own virtual environment support, which you can install with:
Python 2.7
> sudo apt-get install python-virtualenv
Python 3
> sudo apt-get install virtualenv
With modern Python versions, virtualenv is usually included. Once installed, you can generate a virtual environment by typing:
> virtualenv venv
This creates a folder named venv in the current directory (you can name it whatever you want); the libraries will be installed into this folder.
So, it's time to activate the virtual environment
> source venv/bin/activate
You can verify that the environment has been activated by checking that the prompt changes. If so, all packages installed using pip will be installed locally.
(venv)> pip install scipy
You can check this website for more info.
Don't forget that you may eventually have to clear your $PYTHONPATH variable, so that it does not pick up other packages.
I am a bloody beginner in Python and Django. To set up an environment on my Windows machine, I performed the following steps.
Install Python 3.4
Use pip to install virtualenv
Create a project folder and set up a virtualenv there
Download Django 1.7b1 release from the official site
Extract the archive in my downloads folder
Install it into my virtualenv
For the last step, I used pip from my virtualenv.
[project]\scripts\pip.exe install -e [downloads]\Django-1.7b1
From the global python interpreter I can't import django, as expected. When using the python executable from the virtualenv, it works. But the import only succeeds as long as I have the Django source in my downloads folder. Instead, I would like to include it into my virtualenv.
Can I make pip automatically copy the Django source into my project folder?
Install Django via pip inside the virtualenv. I'm running Linux, but you should be able to run the same commands on Windows.
If you need a version that's not on PyPI, download the package and install it into the virtualenv's site-packages folder.
My site-packages folder for the project is in ~/venvs/project/lib/python2.7/site-packages.
To install there:
pip install downloads/Django-1.7b1.tar.gz -t ~/venvs/project/lib/python2.7/site-packages
Django will install into the site-packages folder and is then importable from within the virtualenv. Downloads/Django-1.7b1 is no longer needed.
Below is an example where I'm installing Django 1.7b1 from a local archive to the site-packages-folder of my virtualenv:
(project)msvalkon@Lunkwill:/tmp$ pip install /tmp/Django-1.7b1.tar.gz -t ~/venvs/project/lib/python2.7/site-packages/
Unpacking ./Django-1.7b1.tar.gz
Running setup.py egg_info for package from file:///tmp/Django-1.7b1.tar.gz
-- SNIP --
Successfully installed Django
Cleaning up...
(project)msvalkon@Lunkwill:/tmp$ python -c "import django;print django.get_version()"
1.7b1
(project)msvalkon@Lunkwill:/tmp$ deactivate
# I've got a really old version installed globally, but you can see
# that the installation worked.
msvalkon@Lunkwill:/tmp$ python -c "import django;print django.get_version()"
1.5.1
After this you should find the following output from pip freeze while the virtualenv is activated:
(project)msvalkon@Lunkwill:/tmp$ pip freeze
Django==1.7b1
argparse==1.2.1
wsgiref==0.1.2