I have recently moved to Python 3.3 from Python 3.2. I installed Numpy 1.7.0 and Scipy 0.11.0, and I am running all of this on Scientific Linux 6.4.
But when I run:
from scipy import integrate
I get this error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.3/site-packages/scipy/integrate/__init__.py", line 50, in <module>
from .quadrature import *
File "/usr/local/lib/python3.3/site-packages/scipy/integrate/quadrature.py", line 5, in <module>
from scipy.special.orthogonal import p_roots
File "/usr/local/lib/python3.3/site-packages/scipy/special/__init__.py", line 532, in <module>
from .lambertw import lambertw
File "lambertw.pyx", line 24, in init scipy.special.lambertw (scipy/special/lambertw.c:1588)
ValueError: level must be >= 0
So I installed Scipy 0.12.0rc1, but the problem remains. Could you please help me fix this issue?
Thank you very much in advance.
The answer is that Scipy 0.11.0 is not compatible with Python 3.3.
You need to wait for 0.12.0, download the release candidate 0.12.0rc1, or recompile using the Cython fix mentioned in the comments above.
Note that this bug is already fixed in 0.12.0rc1, so if you still see the error you most likely made a mistake installing it: there is no file called lambertw.c in 0.12.0rc1, yet your traceback still references one.
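Once the release candidate is installed, a quick sanity check (a minimal sketch, assuming a clean 0.12.0rc1 install in the same interpreter) is to confirm the version before retrying the import:
import scipy
print(scipy.__version__)      # should report 0.12.0rc1, not 0.11.0
from scipy import integrate   # should now import without the lambertw error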
I'm new to Python and trying to understand how to import the pandas module. I installed it through PyCharm, then ran the basic script shown below, and I get errors that I'm not sure how to interpret.
import pandas
x = input("Enter your age")
print(x)
I receive this error message.
Traceback (most recent call last):
File "C:/Users/leeb/PycharmProjects/HelloWorld/app.py", line 1, in <module>
import pandas
File "C:\Users\leeb\PycharmProjects\HelloWorld\venv\lib\site-packages\pandas\__init__.py", line 11, in <module>
__import__(dependency)
File "C:\Users\leeb\PycharmProjects\HelloWorld\venv\lib\site-packages\numpy\__init__.py", line 150, in <module>
from . import random
File "C:\Users\leeb\PycharmProjects\HelloWorld\venv\lib\site-packages\numpy\random\__init__.py", line 181, in <module>
from . import _pickle
File "C:\Users\leeb\PycharmProjects\HelloWorld\venv\lib\site-packages\numpy\random\_pickle.py", line 1, in <module>
from .mtrand import RandomState
File "type.pxd", line 9, in init numpy.random.mtrand
ValueError: builtins.type size changed, may indicate binary incompatibility. Expected 440 from C header, got 432 from PyObject
This error tends to happen when you have an older version of Numpy installed.
You should upgrade it, as follows:
pip install numpy --upgrade
If that doesn't work, try to use a specific version of numpy, as follows:
pip uninstall numpy
pip install numpy==1.15.1
Or, if you're using Anaconda, try:
conda update numpy
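Whichever route you take, it's worth confirming that the interpreter PyCharm runs is actually picking up the upgraded copy; a minimal check (assuming you run it inside the same project venv) is:
import numpy
print(numpy.__version__)   # should show the upgraded version, e.g. 1.15.1 or later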
I'm going to preface this by saying that I'm relatively new to Python and so please forgive me if I have difficulty understanding something.
I've recently been trying to install OpenCV on my computer following the "Installing OpenCV from prebuilt binaries" instructions found here:
http://docs.opencv.org/3.1.0/d5/de5/tutorial_py_setup_in_windows.html#gsc.tab=0
I initially tried to install OpenCV by itself, as I already had Python 3.5 and a working version of numpy. However, my attempts to import cv2 failed, and I eventually decided to uninstall Python and follow all the steps listed on the website. Now I get this error when I try to import numpy:
Traceback (most recent call last):
File "C:/PythonProgramming/SLIC/src/SLICAlgorithm.py", line 1, in <module>
import numpy
File "C:\Python27\lib\site-packages\numpy\__init__.py", line 180, in <module>
from . import add_newdocs
File "C:\Python27\lib\site-packages\numpy\add_newdocs.py", line 13, in <module>
from numpy.lib import add_newdoc
File "C:\Python27\lib\site-packages\numpy\lib\__init__.py", line 4, in <module>
from type_check import *
File "C:\Python27\lib\site-packages\numpy\lib\type_check.py", line 8, in <module>
import numpy.core.numeric as _nx
File "C:\Python27\lib\site-packages\numpy\core\__init__.py", line 20, in <module>
import function_base
File "C:\Python27\lib\site-packages\numpy\core\function_base.py", line 6, in <module>
from .numeric import result_type, NaN, shares_memory, MAY_SHARE_BOUNDS, TooHardError
ImportError: cannot import name shares_memory
Process finished with exit code 1
Other people online seem to have no problem with these instructions, or at least not with numpy. What does this error mean, and what can I do to fix it? I have a 64-bit version of Windows 10 and am working in PyCharm. All help you can provide would be much appreciated.
I had the same problem; the following fix worked for me:
pip install -I numpy --force-reinstall
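To confirm the forced reinstall is the copy Python actually loads, you can check the version and the import path (a quick sketch; the exact path will depend on your install):
import numpy
print(numpy.__version__)
print(numpy.__file__)   # should point into the site-packages directory pip just reinstalled into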
I got "numpy.dtype has the wrong size, try recompiling" in both pycharm and terminal when compiling Sci-kit learning. I've upgraded all packages(numpy, scikit to the latest), nothing works.Python version is 2.7. Please help. Appreciate!
checking for nltk
Traceback (most recent call last):
File "startup.py", line 6, in <module>
import nltk
File "/Library/Python/2.7/site-packages/nltk/__init__.py", line 128, in <module>
from nltk.chunk import *
File "/Library/Python/2.7/site-packages/nltk/chunk/__init__.py", line 157, in <module>
from nltk.chunk.api import ChunkParserI
File "/Library/Python/2.7/site-packages/nltk/chunk/api.py", line 13, in <module>
from nltk.parse import ParserI
File "/Library/Python/2.7/site-packages/nltk/parse/__init__.py", line 79, in <module>
from nltk.parse.transitionparser import TransitionParser
File "/Library/Python/2.7/site-packages/nltk/parse/transitionparser.py", line 21, in <module>
from sklearn.datasets import load_svmlight_file
File "/Library/Python/2.7/site-packages/sklearn/__init__.py", line 57, in <module>
from .base import clone
File "/Library/Python/2.7/site-packages/sklearn/base.py", line 11, in <module>
from .utils.fixes import signature
File "/Library/Python/2.7/site-packages/sklearn/utils/__init__.py", line 10, in <module>
from .murmurhash import murmurhash3_32
File "numpy.pxd", line 155, in init sklearn.utils.murmurhash (sklearn/utils/murmurhash.c:5029)
ValueError: numpy.dtype has the wrong size, try recompiling
The error "numpy.dtype has the wrong size, try recompiling" means that sklearn was compiled against a numpy more recent than the numpy version sklearn is now trying to import. To fix this, you need to make sure that sklearn is compiled against the version of numpy that it is now importing, or an earlier version. See ValueError: numpy.dtype has the wrong size, try recompiling for a detailed explanation.
I guess from your paths that you are using the OSX system Python (the one that ships with OSX, at /usr/bin/python). Apple has modified this Python in a way that makes it pick up its own version of numpy rather than any version you install with pip etc.; see https://github.com/MacPython/wiki/wiki/Which-Python#system-python-and-extra-python-packages . I strongly recommend you switch to a Python.org or Homebrew Python to make it easier to work with packages that depend on numpy.
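A quick way to see which Python and which numpy you are actually running (a minimal check; the paths shown in the comments are typical for OSX, not guaranteed) is:
import sys
import numpy
print(sys.executable)    # /usr/bin/python indicates the OSX system Python
print(numpy.__file__)    # a path under /System/Library/... indicates Apple's bundled numpy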
The problem occurs when you are using incompatible versions. Check the versions using:
pip freeze
or, for a specific module:
pip freeze | grep Module_Name
I fixed my problem by updating all packages:
pip install -U scikit-learn numpy scipy pandas matplotlib
As of today (30/11/2016), these versions are compatible:
matplotlib==1.5.2
nltk==3.2.1
numpy==1.11.2
pandas==0.19.1
scikit-learn==0.18.1
scipy==0.18.1
textblob==0.11.1
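If you want to reproduce that exact combination rather than taking whatever is latest, you can pin the versions from the list above explicitly (a sketch; adjust the pins to whatever combination is current for you):
pip install matplotlib==1.5.2 nltk==3.2.1 numpy==1.11.2 pandas==0.19.1 scikit-learn==0.18.1 scipy==0.18.1 textblob==0.11.1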
I am getting this error on this line:
from sklearn.ensemble import RandomForestClassifier
The error log is:
Traceback (most recent call last):
File "C:\workspace\KaggleDigits\KaggleDigits.py", line 5, in <module>
from sklearn.ensemble import RandomForestClassifier
File "C:\Python27\lib\site-packages\sklearn\ensemble\__init__.py", line 7, in <module>
from .forest import RandomForestClassifier
File "C:\Python27\lib\site-packages\sklearn\ensemble\forest.py", line 47, in <module>
from ..feature_selection.selector_mixin import SelectorMixin
File "C:\Python27\lib\site-packages\sklearn\feature_selection\__init__.py", line 7, in <module>
from .univariate_selection import chi2
File "C:\Python27\lib\site-packages\sklearn\feature_selection\univariate_selection.py", line 13, in <module>
from scipy import stats
File "C:\Python27\lib\site-packages\scipy\stats\__init__.py", line 320, in <module>
from .stats import *
File "C:\Python27\lib\site-packages\scipy\stats\stats.py", line 241, in <module>
import scipy.special as special
File "C:\Python27\lib\site-packages\scipy\special\__init__.py", line 529, in <module>
from ._ufuncs import *
ImportError: DLL load failed: The specified module could not be found.
After installing:
Python 2.7.4 for Windows x86-64
scipy-0.12.0.win-amd64-py2.7.exe (from here)
numpy-unoptimized-1.7.1.win-amd64-py2.7.exe (from here)
scikit-learn-0.13.1.win-amd64-py2.7.exe (from here)
Does anybody know why this is happening and how to solve it?
As Christoph Gohlke mentions on his download page, the scikit-learn installer from his website requires Numpy-MKL, so I made a mistake by using the unoptimized Numpy build.
His Numpy-MKL build is statically linked against Intel's MKL, so no additional download is needed (there is no need to install Intel's MKL separately).
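To check which build of numpy you ended up with, numpy can report the BLAS/LAPACK configuration it was compiled against (a quick check; the MKL build will list mkl libraries, the unoptimized one will not):
import numpy
numpy.show_config()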
This is a little late, but for those like me: download these from the official Microsoft website.
After that, restart your interpreter/console and it should work.
This problem happened to me when I used scipy 0.12. After I changed to scipy 0.11, the problem was gone.
I used to have scipy.fftpack available on 32-bit Python 2.7, but now that I upgraded to 64-bit Python and got SciPy from here, I noticed it doesn't seem to include FFTPack.
Where can I download it?
Oh, and the error is:
>>> import scipy.fftpack
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Program Files\Python 2.7\lib\site-packages\scipy\fftpack\__init__.py", line 95, in <module>
from basic import *
File "C:\Program Files\Python 2.7\lib\site-packages\scipy\fftpack\basic.py", line 11, in <module>
import _fftpack
ImportError: DLL load failed: The specified module could not be found.
>>>
Never mind, ProcMon helped me finally figure it out.
It turns out that that version does include FFTPack, but it does not include libmmd.dll, which seems to be part of Intel's Math Kernel Library.
If you have that DLL available on your PATH, then it should indeed work.
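Before reaching for ProcMon, you can also check directly whether the DLL is resolvable from your current PATH (Windows-only, and assuming the name is exactly libmmd.dll):
import ctypes
try:
    ctypes.WinDLL("libmmd.dll")   # raises an OSError if the loader cannot find it on PATH
    print("libmmd.dll found")
except OSError:
    print("libmmd.dll not found on PATH")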