On macOS I've installed pipx with brew:
brew install pipx
then pipx install black
$ pipx list
venvs are in /Users/mc/.local/pipx/venvs
apps are exposed on your $PATH at /Users/mc/.local/bin
package black 22.12.0, installed using Python 3.11.1
- black
- blackd
However, I keep getting a missing dependency error:
$ /Users/mc/.local/bin/blackd
Traceback (most recent call last):
File "/Users/mc/.local/bin/blackd", line 5, in <module>
from blackd import patched_main
File "/Users/mc/.local/pipx/venvs/black/lib/python3.11/site-packages/blackd/__init__.py", line 14, in <module>
raise ImportError(
ImportError: aiohttp dependency is not installed: No module named 'aiohttp'. Please re-install black with the '[d]' extra install to obtain aiohttp_cors: `pip install black[d]`
How can I fix it?
Why isn't pipx resolving this dependency when installing black?
Why does it use some Python 3.11.1 (no idea where that is installed) when my system Python is 3.9.6?
$ python3 --version
Python 3.9.6
EDIT
I did as advised by the answer below from #KarlKnechtel:
$ brew install python@3.10
==> Auto-updated Homebrew!
Updated 1 tap (romkatv/powerlevel10k).
You have 2 outdated formulae installed.
You can upgrade them with brew upgrade
or list them with brew outdated.
==> Fetching python@3.10
==> Downloading https://ghcr.io/v2/homebrew/core/python/3.10/manifests/3.10.9
######################################################################## 100.0%
==> Downloading https://ghcr.io/v2/homebrew/core/python/3.10/blobs/sha256:a9b28161cec6e1a027f1eab7576af7
==> Downloading from https://pkg-containers.githubusercontent.com/ghcr1/blobs/sha256:a9b28161cec6e1a027f
######################################################################## 100.0%
==> Pouring python@3.10--3.10.9.arm64_monterey.bottle.tar.gz
==> /opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10 -m ensurepip
==> /opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10 -m pip install -v --no-deps --no-index --upgr
==> Caveats
Python has been installed as
/opt/homebrew/bin/python3
Unversioned symlinks `python`, `python-config`, `pip` etc. pointing to
`python3`, `python3-config`, `pip3` etc., respectively, have been installed into
/opt/homebrew/opt/python@3.10/libexec/bin
You can install Python packages with
pip3 install <package>
They will install into the site-package directory
/opt/homebrew/lib/python3.10/site-packages
tkinter is no longer included with this formula, but it is available separately:
brew install python-tk@3.10
See: https://docs.brew.sh/Homebrew-and-Python
==> Summary
🍺 /opt/homebrew/Cellar/python@3.10/3.10.9: 3,110 files, 57.1MB
==> Running `brew cleanup python@3.10`...
Disable this behaviour by setting HOMEBREW_NO_INSTALL_CLEANUP.
Hide these hints with HOMEBREW_NO_ENV_HINTS (see `man brew`).
so I got:
$ python3 --version
Python 3.10.9
$ brew list python python3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/2to3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/2to3-3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/idle3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/idle3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/pip3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/pip3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/pydoc3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/pydoc3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3-config
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10-config
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/wheel3
/opt/homebrew/Cellar/python@3.10/3.10.9/bin/wheel3.10
/opt/homebrew/Cellar/python@3.10/3.10.9/Frameworks/Python.framework/ (3055 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/IDLE 3.app/Contents/ (8 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/lib/pkgconfig/ (4 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/libexec/bin/ (6 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/libexec/wheel-0.38.4-py3-none-any.whl
/opt/homebrew/Cellar/python@3.10/3.10.9/Python Launcher 3.app/Contents/ (16 files)
/opt/homebrew/Cellar/python@3.10/3.10.9/share/man/ (2 files)
but still, when I install black, it installs using Python 3.11:
$ pipx install black[d]
zsh: no matches found: black[d]
$ pipx install black
installed package black 22.12.0, installed using Python 3.11.1
These apps are now globally available
- black
- blackd
done! ✨ 🌟 ✨
I solved it as in edit 1, but later, according to this, you have to run:
pipx install "black[d]" --force
but after that I got an error:
dyld[56881]: Library not loaded: '/opt/homebrew/Cellar/python@3.9/3.9.13_1/Frameworks/Python.framework/Versions/3.9/Python'
Referenced from: '/Users/mc/Library/Application Support/pypoetry/venv/bin/python'
Reason: tried: '/opt/homebrew/Cellar/python@3.9/3.9.13_1/Frameworks/Python.framework/Versions/3.9/Python' (no such file), '/Library/Frameworks/Python.framework/Versions/3.9/Python' (no such file), '/System/Library/Frameworks/Python.framework/Versions/3.9/Python' (no such file)
so I had to:
curl -sSL https://install.python-poetry.org | python3 - --uninstall
curl -sSL https://install.python-poetry.org | python3 -
And now:
poetry env use /opt/homebrew/Cellar/python@3.10/3.10.9/bin/python3.10
as 3.10 is needed because aiohttp did not yet support 3.11, as mentioned by #KarlKnechtel
When you use pipx rather than pip to install a package, it will:
create a new virtual environment for the package
use pip to install the package into that virtual environment
do some additional setup work, so that "entry points" for the package (commands created by the package for use directly at the command line, such as blackd) will work without having to activate the specific virtual environment first.
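Roughly speaking, this is an illustrative sketch of the equivalent manual steps, using the default pipx paths shown above (not pipx's actual implementation):
python3 -m venv ~/.local/pipx/venvs/black                       # 1. dedicated virtual environment
~/.local/pipx/venvs/black/bin/pip install black                 # 2. pip install into that environment
ln -s ~/.local/pipx/venvs/black/bin/black ~/.local/bin/black    # 3. expose the entry point on $PATH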
That is its explicit purpose (i.e., the reason not to just use pip), and it is also why a different Python is being used rather than the system Python.
To use the blackd entry point defined by Black (which starts up a server), the aiohttp optional dependency is required. This must be specified at installation. Per the error message, with pip, this looks like pip install black[d].
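The pipx command is analogous; just remember to quote the extra in zsh so the square brackets are not treated as a glob pattern (this is what caused the zsh: no matches found error in the edit above):
pipx install "black[d]"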
However, from the Black release notes (since version 22.8.0; current version 22.12.0):
Python 3.11 is now supported, except for blackd as aiohttp does not support 3.11 as of publishing (#3234)
So, pipx will need to use an older Python version to create the virtual environment.
From the pipx documentation:
The default python executable used to install a package is
typically the python used to execute pipx and can be overridden
by setting the environment variable PIPX_DEFAULT_PYTHON.
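For example, assuming Homebrew provides a Python 3.10 interpreter at /opt/homebrew/bin/python3.10 (the exact path may differ on your machine), either of the following should make pipx build the venv with that interpreter:
PIPX_DEFAULT_PYTHON=/opt/homebrew/bin/python3.10 pipx install "black[d]"
pipx install --python /opt/homebrew/bin/python3.10 "black[d]"    # per-install override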
Presumably, brew install pipx installed pipx into some 3.11 install that is not the system Python. This sort of thing happens because you are not intended to install anything into the system Python. As the name suggests, your (operating) system depends on it, and a rogue package could seriously harm your computer in any number of ways. (Hence why pip is not available in it by default.)
You can:
use brew to install 3.10 or earlier, then install pipx to that, then proceed with pipx install black[d] (see the sketch after this list).
use venv to create a virtual environment based off your system Python (3.9); this venv should include pip automatically by default (downloaded and installed by venv). Activate this virtual environment, then use pip to install pipx in it, then proceed with pipx install black[d].
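A sketch of the first option, assuming an Apple Silicon Homebrew prefix of /opt/homebrew (adjust the paths if your prefix differs):
brew install python@3.10
/opt/homebrew/opt/python@3.10/bin/python3.10 -m pip install pipx         # install pipx under the 3.10 interpreter
/opt/homebrew/opt/python@3.10/bin/python3.10 -m pipx install "black[d]"  # pipx now builds the venv with 3.10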
Related
I know this topic has been beaten to death, but I have not been able to find a solution to the problem I'm having on SO or elsewhere, so I suspect that there may be a bug somewhere in my system.
I am on an older RHEL 6 platform with Python 3.4. I am developing an application that will run on this platform that uses Qt. I've installed all of the relevant libraries via yum (e.g. qt-devel, pyqt4-devel, etc.) and now want to install my application package as an "editable" package using pip install -e mypkg. I also have a couple of dependency requirements that are not on yum and must be installed via pip.
What I would like to do is create a virtualenv that "inherits" the system packages installed via yum but allows me to pip install my own packages into a virtualenv directory in my home directory.
From my Googling it looks like the best way to do this is to create a virtual env with the system's site packages directory:
$ python3 -m venv --system-site-packages ~/venv
However, when I try to install a package to this virtualenv's site-packages directory, it attempts to install it under /usr/lib and I get a Permission denied error.
So it appears that the --system-site-packages option makes my virtualenv completely share the site-packages directory from my system instead of using it as a "base", where further packages can be layered on top.
This answer states that using pip install -I should do what I want, but that does not appear to be the case:
(venv) $ pip3 install -I bitstring
...
error: could not create '/usr/lib/python3.4/site-packages/bitstring.py': Permission denied
Create the virtual environment without the --system-site-packages switch. After the environment is created, go to the folder it was created in. It should contain a file pyvenv.cfg. Edit this file. It has (among other text) a line
include-system-site-packages = false
Change this line to:
include-system-site-packages = true
Activate the environment. Module installations will now go to the virtual environment and the system site packages are visible too.
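A minimal sketch of those steps, assuming the environment lives at ~/venv (just an example path):
python3 -m venv ~/venv
sed -i 's/include-system-site-packages = false/include-system-site-packages = true/' ~/venv/pyvenv.cfg   # or edit pyvenv.cfg by hand
source ~/venv/bin/activate
pip install bitstring   # installs into ~/venv; the yum-installed packages remain importable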
With Python 3.8, it seems --system-site-packages works as expected:
python3 -m venv --system-site-packages myProject
cat myProject/pyvenv.cfg
home = /usr/bin
include-system-site-packages = true
version = 3.8.5
After installing astroid, isort, and wrapt, I got:
pip list -v
Package Version Location Installer
---------------------- -------------------- ------------------------------------------------------- ---------
apturl 0.5.2 /usr/lib/python3/dist-packages
astroid 2.4.2 /home/to/no/MR/auto-gen/lib/python3.8/site-packages pip
isort 5.6.4 /home/to/no/MR/auto-gen/lib/python3.8/site-packages pip
jedi 0.15.2 /usr/lib/python3/dist-packages
keyring 18.0.1 /usr/lib/python3/dist-packages
wrapt 1.12.1 /home/to/no/MR/auto-gen/lib/python3.8/site-packages pip
Already installed 'system' packages are taken from /usr/lib/python3/dist-packages, while locally (venv) installed packages come from /home/to/no/MR/auto-gen/lib/python3.8/site-packages.
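If you want to confirm which directories a given interpreter actually searches, a generic check (run with the interpreter in question, inside or outside the venv) is:
python -m site    # prints sys.path and the active site-packages directories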
It seems the COCO PythonAPI only supports Python 2, but people do use it in Python 3 environments.
I tried the usual methods to install it, like
python3 setup.py build_ext --inplace
python3 setup.py install
But python3 setup.py install will fail because coco.py and cocoeval.py contain Python 2 print statements.
Update: solved by updating the COCO PythonAPI project. I'm leaving this question up for people facing the same issue.
Try the following steps:
Use git clone to clone the folder into your drive. In this case, it should be git clone https://github.com/cocodataset/cocoapi.git
Use terminal to enter the directory, or open a terminal inside the directory
Type in 2to3 . -w. Note that you might have to install a package to get 2to3. It is an elegant tool to convert code from Python 2 to Python 3; this command converts all .py files from Python 2-compatible to Python 3-compatible
Use terminal to navigate to the setup folder
Type in python3 setup.py install
This should help you install COCO, or any package intended for Python 2, and run it with Python 3. Cheers!
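Collected into one sequence, assuming 2to3 is on your PATH and noting that setup.py lives in the PythonAPI subdirectory of the repository:
git clone https://github.com/cocodataset/cocoapi.git
cd cocoapi
2to3 . -w                  # rewrite all .py files in place for Python 3
cd PythonAPI
python3 setup.py install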
I completed it with a single step:
pip install "git+https://github.com/philferriere/cocoapi.git#egg=pycocotools&subdirectory=PythonAPI"
Note: before that you need to have the Visual C++ 2015 Build Tools installed and on your PATH
Install
Instead of the official version (which has issues with Python 3), use an alternative one. Install it on your local machine, globally (i.e., outside any virtual environment). You can do this by:
pip install git+https://github.com/philferriere/cocoapi.git#subdirectory=PythonAPI
Check if it is installed globally:
pip freeze | grep "pycocotools"
You should see something like pycocotools==2.0.0 in your output.
Now, inside your virtual-env (conda or whatever), first install numpy and cython (and maybe setuptools if it's not installed) using pip, and then:
pip install pycocotools
Verify
Inside your project, import (for example) from pycocotools import mask as mask and then print(mask.__author__). This should print out the author's name, which is tsungyi.
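Equivalently, straight from the shell of the activated virtual-env:
python -c "from pycocotools import mask; print(mask.__author__)"    # expected output: tsungyi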
Where Is It?
The installed package, like any other packages that are locally installed inside a virtual-env using pip, will go to External Libraries of your project, under site-packages. That means it is now part of your virtual-env and not part of your project. So, other users who may want to use your code, must repeat this installation on their virtual-env as well.
Troubleshooting:
The main source of confusion is that either you did not install the required packages before installing cocoapi, or you did install them but for a different Python version. When you want to check whether something is installed, you may check with, for instance, python3.6 and see that it exists, while you are actually running all your commands with python3.7. So suppose you are using python3.7. You need to make sure that:
python -V gives you python3.7 and NOT some other version, and pip -V gives you pip 19.2.3 from /home/<USER>/.local/lib/python3.7/site-packages/pip (python3.7), which actually matches your default Python version. If this is not the case, you can change your default python using sudo update-alternatives --config python and following the one-step instruction.
All the required packages are installed using the right python or pip version. You can check this using pip and pip3 to spot any differences that may cause an issue:
pip freeze | grep "<SUBSTRING-NAME-OF-PACKAGE>" or pip show <PACKAGE-NAME> for more recent versions of pip.
To install the required packages, after you made sure about (1), you need to run:
sudo apt install python-setuptools python3.7-dev python3-wheel build-essential and pip install numpy cython matplotlib
Environment:
The above steps were tested on Ubuntu 18.04, Python 3.6.8, pip 19.0.3.
If you are struggling to build pycocotools on Ubuntu 20.04 with Python 3.7,
try this:
sudo apt-get install -y python3.7-dev
python3.7 -m pip install "pycocotools>=2.0.1"   # quote the requirement so >= is not treated as shell redirection
There are alternative versions of the cocoapi that you can download and use too (I'm using python 3.5). Here's a solution that you might want to try out: How to download and use object detection datasets (e.g. coco or pascal)
Here's how I did it successfully (the reason was the gcc version):
Install the dependencies: cython (pip install cython) and opencv (pip install opencv-python)
Check the gcc version with this command: gcc --version
If gcc is not installed, your output will be like this:
Command 'gcc' not found, but can be installed with:
sudo apt install gcc
Type the commands below to install gcc:
sudo apt update
sudo apt install build-essential
sudo apt-get install manpages-dev
Now check the gcc version again (step 2).
If you get output like the one below:
gcc (Ubuntu 9.3.0-17ubuntu1~20.04) 9.3.0
Copyright (C) 2019 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Now run the command to install pycocotools:
pip install "git+https://github.com/philferriere/cocoapi.git#egg=pycocotools&subdirectory=PythonAPI"
Finally, check that the installation was successful:
Successfully installed pycocotools-2.0
I downloaded the newest Cython release from https://pypi.python.org/pypi/Cython/#downloads. I'm working in Python 3.5.1 on a Mac so I downloaded
Cython-0.26.1-cp35-cp35m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
I unzipped it and entered the /Cython directory, but there is no setup.py in the directory. When I try to run python3 setup.py install anyway I get the following error:
/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/Resources/Python.app/Contents/MacOS/Python: can't open file 'setup.py': [Errno 2] No such file or directory
It doesn't seem to be looking in the /Cython directory I am in, but there is also no setup.py in that directory.
Not sure what's going on, I can't seem to find anyone else having this issue.
I've installed python3 using Homebrew.
The file you downloaded is a wheel file that should be installed using pip. The wheel file does not include the setup.py script, which is required to build the package but is not used when installing wheels. First check that you are using the correct pip command (you need the one for Python 3.5); this is usually the pip3.5 or pip3 command:
$ pip3 -V
pip 9.0.1 from /Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages (python 3.5)
To install cython from the downloaded wheel, issue
$ pip3 install path/to/downloaded.whl
Or simply issue
$ pip3 install cython
as pip will download the latest package wheel for you. Since you installed Python 3.5 via Homebrew, you probably have to switch to the user you use to install packages with Homebrew, or the installation with pip will fail.
Note that, although suggested in the comments, it is not advised to install the cython package via brew install:
$ brew info cython
...
==> Caveats
This formula is keg-only, which means it was not symlinked into
/usr/local,
because this formula is mainly used internally by other formulae.
Users are advised to use `pip` to install cython.
I use pyenv and pyenv-virtualenv for managing Python virtual environments.
I have a project working in a Python 3.4 virtual environment.
So all installed packages (pandas, numpy, etc.) are not the newest versions.
What I want to do is upgrade the Python version from 3.4 to 3.6 and also upgrade the other packages to newer versions.
How can I do this easily?
Here is how you can switch to 3.9.0 for a given virtual environment venv-name:
pip freeze > requirements-lock.txt
pyenv virtualenv-delete venv-name
pyenv virtualenv 3.9.0 venv-name
pip install -r requirements-lock.txt
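If the new environment is not activated automatically (for example via a .python-version file), activate it with pyenv-virtualenv before reinstalling the requirements:
pyenv activate venv-name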
Once everything works correctly you can safely remove the temporary requirements lock file:
rm requirements-lock.txt
Note that using pip freeze > requirements.txt is usually not a good idea as this file is often used to handle your package requirements (not necessarily pip freeze output). It's better to use a different (temporary) file just to be sure.
Use pip freeze > requirements.txt to save a list of installed packages.
Create a new venv with python 3.6.
Install the saved packages with pip install -r requirements.txt. When pip finds a universal wheel in its cache, it installs the package from the cache. Other packages will be downloaded, cached, built, and installed.
OP asked to upgrade the packages alongside Python. No other answers address the upgrade of packages. Lock files are not the answer here.
Save your packages to a requirements file without the version.
pip freeze | cut -d"=" -f1 > requirements-to-upgrade.txt
Delete your environment, create a new one with the upgraded Python version, then install the requirements file.
pyenv virtualenv-delete venv-name
pyenv virtualenv 3.6.8 venv-name
pip install -r requirements-to-upgrade.txt
The dependency resolver in pip should try to find the latest package. This assumes you have the upgraded Python version installed (e.g., pyenv install 3.6.8).
If you use anaconda, just type
conda install python==$pythonversion$
Is it possible to force virtualenv to use the latest setuptools and pip available from pypi? Essentially, I'm looking for the opposite of the --never-download flag.
Currently, when I make a new virtualenv, it uses the local (old) versions that come bundled with virtualenv.
$ v.mk testvenv
New python executable in testvenv/bin/python
Installing setuptools............done.
Installing pip...............done.
$ pip show setuptools
---
Name: setuptools
Version: 0.6c11
Location: /Users/cwilson/.virtualenvs/testvenv/lib/python2.7/site-packages/setuptools-0.6c11-py2.7.egg
Requires:
$ pip search setuptools
[...]
setuptools - Easily download, build, install, upgrade, and
uninstall Python packages
INSTALLED: 0.6c11
LATEST: 0.7.2
[...]
It's not supported for security reasons.
Using virtualenv.py as an isolated script (i.e. without an associated
virtualenv_support directory) is no longer supported for security
reasons and will fail with an error. Along with this, --never-download
is now always pinned to True, and is only being maintained in the
short term for backward compatibility (Pull #412).
I can't use the --extra-search-dir option either because it's currently broken https://github.com/pypa/virtualenv/issues/327
Looks like the only option is to simply wait for the virtualenv maintainers to update the bundled packages?
You can upgrade pip after installing your virtualenv by using pip install -U pip.
I'm sure you could write a bootstrap-script to automate this step.
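For example, a minimal wrapper along these lines (the environment path is passed as an argument; this is just a sketch):
#!/bin/bash
# create the env, then immediately refresh the bundled installers from PyPI
virtualenv "$1"
"$1/bin/pip" install --upgrade pip setuptools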
I needed the latest setuptools library, and the --extra-search-dir flag wasn't working for me (even though it's been fixed apparently).
However, making a virtualenv without setuptools and then installing directly from PyPi worked great.
E.g. to set up a virtualenv called test:
virtualenv --no-setuptools test
source test/bin/activate
wget https://bootstrap.pypa.io/ez_setup.py -O - | python
easy_install pip
Testing with
python -c 'import setuptools; print setuptools.__version__'
shows the right version.
I ran into the same problem, and I fixed it by upgrading setuptools.
If env is your virtual env, run the following:
$ env/bin/pip install --upgrade setuptools
Building on ematsen's excellent answer, I made a bash script that works with virtualenvwrapper:
#!/bin/bash
source `which virtualenvwrapper.sh`
mkvirtualenv --no-setuptools $1
wget https://bootstrap.pypa.io/ez_setup.py -O - | python
rm setuptools-*.zip
easy_install pip
# for python version < 2.7.9
# https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
pip install urllib3[secure]