I want to install a package with this command: pip install git+https://github.com/BioSystemsUM/mewpy.git
It collects the package, but at the end it shows:
Installing collected packages: ruamel.yaml, pathos, matplotlib, boolean.py, jmetalpy, cobamp, mewpy
Attempting uninstall: ruamel.yaml
Found existing installation: ruamel-yaml 0.15.46
ERROR: Cannot uninstall 'ruamel-yaml'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
I couldn't find a way to solve this issue and install this package. Any suggestions are very much appreciated.
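A commonly suggested workaround for this kind of error (a sketch, not verified against this exact setup) is pip's --ignore-installed flag, which skips the failing uninstall step and leaves the old distutils-installed ruamel-yaml files in place:
pip install --ignore-installed ruamel.yaml
pip install git+https://github.com/BioSystemsUM/mewpy.git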
I am using a Mac. I installed igraph in Python with pip install igraph and it was working well, except for plotting the graphs.
I searched online and installed cairo with brew install cairo. Ever since, I have been getting the following error any time I simply import igraph:
OSError: no library called "cairo-2" was found
no library called "cairo" was found
no library called "libcairo-2" was found
cannot load library 'libcairo.so.2': dlopen(libcairo.so.2, 0x0002): tried: '/Users/<username>/opt/anaconda3/lib/libcairo.so.2' (no such file)
and the error message continues with several folders where it tried to look for cairo.
It seems the installation of cairo was not successful. So I tried installing cairo using pip install pycairo, but I cannot install it:
ERROR: Could not build wheels for pycairo which use PEP 517 and cannot be installed directly
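A sketch of two commonly suggested directions, assuming Homebrew and the Anaconda Python visible in the error message: the pycairo build needs pkg-config to locate the cairo headers, or you can let conda install a prebuilt pycairo instead.
brew install cairo pkg-config
pip install pycairo
or, with Anaconda:
conda install -c conda-forge pycairo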
I was trying to run scanpy's neighbors function with my gene expression dataset:
import scanpy as sc
sc.pp.pca(adata)
sc.pp.neighbors(adata)
and got this error:
C:\Users\User\anaconda3\lib\site-packages\numba\core\cpu.py:77: UserWarning: Numba extension module 'sparse._numba_extension' failed to load due to 'ContextualVersionConflict((llvmlite 0.33.0+1.g022ab0f (c:\users\User\anaconda3\lib\site-packages), Requirement.parse('llvmlite<0.38,>=0.37.0rc1'), {'numba'}))'.
numba.core.entrypoints.init_all()
I then tried to upgrade my llvmlite (version 0.33) due to this error, using
pip install llvmlite --upgrade
and got another error:
ERROR: numba 0.50.1 has requirement llvmlite<0.34,>=0.33.0.dev0, but you'll have llvmlite 0.37.0 which is incompatible.
ERROR: Cannot uninstall 'llvmlite'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
Python doesn't let me upgrade llvmlite to the newer version, and therefore I cannot use scanpy's functions. I tried updating numba, I tried reinstalling everything, but nothing worked.
What should be done in order to solve this problem?
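Two directions that are commonly suggested for this situation (a sketch, untested on this exact setup): use --ignore-installed to skip the failing uninstall while upgrading numba and llvmlite together so their version constraints stay consistent, or let conda manage both since this is an Anaconda install.
pip install --ignore-installed --upgrade numba llvmlite
or:
conda update numba llvmlite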
I'm working on a Raspberry Pi 4 with Python 3, and I want to install librosa (pip3 install librosa).
I previously installed LLVM version 7.0.1.
Following the compatibility table on https://pypi.org/project/llvmlite/, I installed llvmlite:
$ LLVM_CONFIG=/usr/bin/llvm-config pip3 install llvmlite==0.32.0
That blocks during the installation of librosa or numba:
Building wheel for llvmlite (setup.py) ... error
If anyone has advice on how to solve this, I'd appreciate it.
I got mine running by installing llvmlite==0.31.0, numba==0.48.0, librosa==0.6.3, colorama==0.3.9.
Command:
# specify a valid dependency tree with pi compatibility
LLVM_CONFIG=/usr/bin/llvm-config pip3 install llvmlite==0.31.0 numba==0.48.0 colorama==0.3.9 librosa==0.6.3
Note: colorama is used only to show colored output in the console. If the package causes you issues, try removing it, but otherwise keep it or you will get errors while importing the module.
Thanks to Michael S.!
But actually the command should be:
LLVM_CONFIG=/usr/bin/llvm-config pip3 install llvmlite==0.31.0 numba==0.48.0 colorama==0.3.9 librosa==0.6.3
that is, with == rather than = in llvmlite==0.31.0.
easy_install -U TurboJSON is failing with the error below:
user@ubuntu-dev:~$ sudo easy_install -U TurboJSON
Searching for TurboJSON
Reading https://pypi.python.org/simple/TurboJSON/
Best match: TurboJson 1.3.2
Downloading TurboJson-1.3.2-py2.7.egg#md5=8708fcb8979c661104c9b444e5428484
Processing TurboJson-1.3.2-py2.7.egg
Moving TurboJson-1.3.2-py2.7.egg to /usr/local/lib/python2.7/dist-packages
Adding TurboJson 1.3.2 to easy-install.pth file
Installed /usr/local/lib/python2.7/dist-packages/TurboJson-1.3.2-py2.7.egg
Processing dependencies for TurboJSON
Searching for simplejson>=1.9.1
Reading https://pypi.python.org/simple/simplejson/
Best match: simplejson 3.8.1
Downloading simplejson-3.8.1.tar.gz#md5=b8441f1053edd9dc335ded8c7f98a974
Processing simplejson-3.8.1.tar.gz
Writing /tmp/easy_install-4VcmRi/simplejson-3.8.1/setup.cfg
Running simplejson-3.8.1/setup.py -q bdist_egg --dist-dir /tmp/easy_install-4VcmRi/simplejson-3.8.1/egg-dist-tmp-K89rCq
zip_safe flag not set; analyzing archive contents...
simplejson.tests.__init__: module references __file__
Adding simplejson 3.8.1 to easy-install.pth file
Installed /usr/local/lib/python2.7/dist-packages/simplejson-3.8.1-py2.7-linux-x86_64.egg
Searching for PEAK-Rules>=0.5a1.dev-r2600
Reading https://pypi.python.org/simple/PEAK-Rules/
No local packages or download links found for PEAK-Rules>=0.5a1.dev-r2600
error: Could not find suitable distribution for Requirement.parse('PEAK-Rules>=0.5a1.dev-r2600')
user@ubuntu-dev:~$
https://pypi.python.org/simple/peak-rules/ seems to be broken. It does not list any packages, which is very strange. Earlier, it provided PEAK-Rules 0.5a1.dev-r2713.
From what I can tell, the PyPI page for PEAK-Rules does not have any packages available for installation, as you suspected in your question.
The solution is to install the PEAK-Rules dependency yourself, then install TurboJSON afterwards.
First, run:
easy_install http://www.turbogears.org/2.1/downloads/current/PEAK-Rules-0.5a1.dev-r2686.tar.gz
This should install PEAK-Rules 0.5a1.dev-r2686 successfully, which will satisfy the TurboJSON requirement of PEAK-Rules>=0.5a1.dev-r2600.
Now, if you run (sudo) easy_install -U TurboJSON, the installation should work as intended.
I think it doesn't make a difference here, but I'm using Python 2.7.
So the general part of my question is the following: I use a separate virtualenv for each of my projects. I don't have administrator access and I don't want to mess with system-installed packages anyway. Naturally, I want to use wheels to speed up package upgrades and installations across the virtualenvs. How can I build a wheel whose dependencies are only met within a specific virtualenv?
Specifically, issuing
pip wheel -w $WHEELHOUSE scipy
fails with
Building wheels for collected packages: scipy
Running setup.py bdist_wheel for scipy
Destination directory: /home/moritz/.pip/wheelhouse
Complete output from command /home/moritz/.virtualenvs/base/bin/python -c "import setuptools;__file__='/home/moritz/.virtualenvs/base/build/scipy/setup.py';exec(compile(open(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" bdist_wheel -d /home/moritz/.pip/wheelhouse:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/moritz/.virtualenvs/base/build/scipy/setup.py", line 237, in <module>
setup_package()
File "/home/moritz/.virtualenvs/base/build/scipy/setup.py", line 225, in setup_package
from numpy.distutils.core import setup
ImportError: No module named numpy.distutils.core
----------------------------------------
Failed building wheel for scipy
Failed to build scipy
Cleaning up...
because numpy is not globally present. Building the wheel works when a virtualenv with numpy installed is active, but it seems like a terrible idea to have the wheel depend on a specific virtualenv's version of numpy.
pandas, which also depends on numpy, appears to install its own components of numpy, but I'm not sure that's the best solution.
I could install numpy with --user and use that to build the scipy wheel, as sketched below. Are there better options?
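For reference, a sketch of that --user variant (untested):
pip install --user numpy
pip wheel -w $WHEELHOUSE scipy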
Problem description
You have a Python package (like scipy) which depends on other packages (like numpy), but its setup.py does not declare that requirement/dependency.
Building a wheel for such a package will succeed if the current environment provides the needed package(s).
If the required packages are not available, building the wheel will fail.
Note: the ideal solution is to correct the broken setup.py by declaring the required package there, as sketched below. But this is often not feasible, so we have to work around it.
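The declaration would look something like this in the package's setup.py (a hypothetical fragment; a real setup.py such as scipy's is far more involved):
from setuptools import setup

# Hypothetical fragment: declare numpy both as a build-time and a run-time
# dependency so that building a wheel can succeed in a clean environment.
setup(
    name="mypackage",            # stands in for the package with the broken setup.py
    version="1.0",
    setup_requires=["numpy"],    # needed already at build time
    install_requires=["numpy"],  # needed at run time
)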
Solution: Install required packages first
The procedure (for installing scipy, which requires numpy) has two steps:
build the wheels
use the wheels to install the package you need
Populate the wheelhouse with the wheels you need
This has to be done only once; the wheels can then be reused many times.
Make sure pip is properly configured so that installation from wheels is allowed and the wheelhouse directory is set up, overlapping with download-cache and find-links, as in the following example pip.conf:
[global]
download-cache = /home/javl/.pip/cache
find-links = /home/javl/.pip/packages
[install]
use-wheel = yes
[wheel]
wheel-dir = /home/javl/.pip/packages
Install all required system libraries for the packages which have to be compiled (see the example below).
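For instance, on a Debian/Ubuntu system the numpy/scipy builds typically need something like this (package names are distribution-specific and may vary):
$ sudo apt-get install build-essential python-dev gfortran libblas-dev liblapack-dev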
Build a wheel for the required package (numpy):
$ pip wheel numpy
Set up a virtualenv (needed only once), activate it, and install numpy there:
$ pip install numpy
As the wheel is already built, this will be quick.
Build a wheel for scipy (still inside the virtualenv):
$ pip wheel scipy
By now, your wheelhouse is populated with the wheels you need.
You can remove the temporary virtualenv; it is not needed any more.
Installing into a fresh virtualenv
I am assuming you have created a fresh virtualenv, activated it, and wish to have scipy installed there.
Installing scipy directly from the new scipy wheel would still fail on missing numpy. We overcome this by installing numpy first:
$ pip install numpy
And then finish with scipy:
$ pip install scipy
I guess this could be done in one call (but I did not test it):
$ pip install numpy scipy
Repeatedly installing a proven version of scipy
It is likely that at some point a new release of scipy or numpy will come out, and pip will attempt to install the latest version, for which there is no wheel in your wheelhouse.
If you can live with the versions you have used so far, create a requirements.txt stating the versions of numpy and scipy you want and install from it, as shown below.
This ensures the needed packages are present before they are really used.
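A minimal sketch (the pinned versions here are illustrative, not recommendations):
$ cat requirements.txt
numpy==1.8.1
scipy==0.14.0
$ pip install -r requirements.txt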