I am running code in Google Colab and get this initial error:
/usr/local/lib/python3.7/dist-packages/utils_nlp/eval/rouge/rouge_ext.py in <module>()
23 import collections
24
---> 25 from indicnlp.tokenize import sentence_tokenize, indic_tokenize
26 from ...language_utils.hi.hindi_stemmer import hi_stem
27 from rouge import Rouge
ModuleNotFoundError: No module named 'indicnlp.tokenize'
How do I install (for instance) the tokenize module from indicnlp?
I tried
!pip install indicnlp.tokenize
which apparently doesn't crack it. How do I specify which package pip should install from?
I also tried
!pip install indicnlp
from indicnlp import tokenize
which doesn't do it either. I then get the error
---> 30 from indicnlp import tokenize
31 from utils_nlp import eval
32 from utils_nlp.eval import rouge
ImportError: cannot import name 'tokenize' from 'indicnlp' (/usr/local/lib/python3.7/dist-packages/indicnlp/__init__.py)
Of course if I just do
!pip install tokenize
it doesn't know which tokenize to install.
It looks like you just pip installed the wrong library. On PyPI I found another project called indic_nlp_library (GitHub repo) that seems to have the packages you're looking for. I can get
!pip install indic_nlp_library
from indicnlp.tokenize import sentence_tokenize, indic_tokenize
to work.
It looks like the indicnlp name on PyPI was taken by another project.
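In general, the distribution name (what you pip install) can differ from the import name (what you import), which is exactly what happened here. A minimal stdlib-only sketch for checking which import names are actually available in your environment:

```python
import importlib.util

def can_import(module_name: str) -> bool:
    """True if the import system can locate `module_name`."""
    return importlib.util.find_spec(module_name) is not None

# The distribution is called indic_nlp_library on PyPI,
# but the import name it provides is indicnlp.
print(can_import("json"))      # stdlib module, always present
print(can_import("indicnlp"))  # True only after `pip install indic_nlp_library`
```

This avoids guessing pip names: install a candidate distribution, then probe the import name you actually need before running the rest of your notebook.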
Related
I am trying to import a Python package from GitHub while working in Google Colab.
The repository is at the following url https://github.com/microsoft/nlp-recipes/tree/master/utils_nlp.
So I use the following code
!pip install --upgrade
!pip install -q git+git://github.com/microsoft/nlp-recipes/tree/master/utils_nlp
from utils_nlp import *
I tried as well
!pip install -q git+https://github.com/microsoft/nlp-recipes/tree/master/utils_nlp
I saw other (working) examples where the url ends in .git :
!pip install -q git+https://github.com/huggingface/transformers.git
so I tried in turn
!pip install -q git+https://github.com/microsoft/nlp-recipes/tree/master/utils_nlp.git
But I also noticed that https://github.com/microsoft/nlp-recipes/tree/master/utils_nlp.git leads to an error page, while https://github.com/huggingface/transformers.git redirects to https://github.com/huggingface/transformers, which surprises me.
How do I install the Python package here?
EDIT
I am using as suggested
pip install git+https://github.com/microsoft/nlp-recipes.git
then
from utils_nlp import *
works, but it doesn't successfully import subpackages; it fails when I do
from utils_nlp.models.transformers.abstractive_summarization_bertsum \
import BertSumAbs, BertSumAbsProcessor
whereas utils_nlp does contain a folder models which in turn contains transformers
The error stack is then the following
/usr/local/lib/python3.7/dist-packages/utils_nlp/models/transformers/abstractive_summarization_bertsum.py in <module>()
15 from torch.utils.data.distributed import DistributedSampler
16 from tqdm import tqdm
---> 17 from transformers import AutoTokenizer, BertModel
18
19 from utils_nlp.common.pytorch_utils import (
ModuleNotFoundError: No module named 'transformers'
So, strangely, the code in utils_nlp.models.transformers.abstractive_summarization_bertsum doesn't resolve its dependency on transformers.
This is the correct way to install it:
pip install git+https://github.com/microsoft/nlp-recipes.git
You can't install a single module; only a package can be installed. After the installation you can go on with the rest of your code:
from utils_nlp import *
...
As you can read in the setup guide (which you should definitely read when installing a package):
The pip installation does not install any of the necessary package dependencies, it is expected that conda will be used as shown above to setup the environment for the utilities being used.
This explains your error: the transformers package is not installed automatically; you need to install it yourself (simply use pip install transformers). This is also confirmed in the setup.py file: there are no "install_requires" dependencies.
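Since setup.py declares no install_requires, a small stdlib-only helper (hypothetical, just for illustration; the dependency list is taken from the traceback above) can report which dependencies still need installing before you import utils_nlp:

```python
import importlib.util

def missing_deps(*names):
    """Return the subset of `names` that the import system cannot locate."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Dependencies utils_nlp uses but does not declare; extend as needed.
needed = missing_deps("transformers", "torch", "tqdm")
if needed:
    print("Install first: pip install " + " ".join(needed))
```

Running this in a fresh Colab cell tells you up front which pip installs are still missing, instead of discovering them one ModuleNotFoundError at a time.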
I am installing google cloud datastore to build a conversational chatbot, using the command
!pip install google-cloud-datastore
This gives me the following error:
google-cloud-bigquery 1.9.0 has requirement google-cloud-core<0.30dev,>=0.29.0, but you'll have google-cloud-core 1.4.3 which is incompatible.
but then goes on to install google-cloud-core-1.4.3 and google-cloud-datastore-1.15.3.
Then executing
from google.cloud import datastore
gives the following error:
/usr/local/envs/py3env/lib/python3.5/site-packages/google/cloud/_http.py in ()
20 import warnings
21
---> 22 from six.moves import collections_abc
23 from six.moves.urllib.parse import urlencode
ImportError: cannot import name 'collections_abc'
The google-cloud-bigquery==1.9.0 package seems to be old; I have google-cloud-bigquery==1.28.0. You can try updating your Cloud SDK components first and then installing google-cloud-datastore again. You can update the components with:
gcloud components update
It is possible that you will be required to execute the equivalent apt-get update command.
The second error, with collections_abc, seems to be a compatibility issue: Python 2 is deprecated, so please use Python 3 (pip3). If the issue with six persists, try installing a different version, for instance pip3 install six==1.15.0, as recommended in this GitHub thread.
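When debugging version-mismatch errors like this, a quick environment report helps confirm which interpreter and which six you actually have (a minimal sketch; six.moves.collections_abc is only provided by reasonably recent six releases):

```python
import sys

# Report the interpreter version first: the traceback above shows Python 3.5,
# which is itself end-of-life and worth upgrading.
print("Python", sys.version.split()[0])

# six may or may not be installed; report its version if it is.
try:
    import six
    print("six", six.__version__)
except ModuleNotFoundError:
    print("six is not installed")
```

If the reported six is older than the one pip3 install six==1.15.0 would give you, the collections_abc import error is expected.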
I am trying to import the auto_arima function from pmdarima, but am encountering problems and was unable to do so.
The error message is as follow:
C:\Anaconda2\envs\ipykernel_py3\lib\multiprocessing\connection.py in <module>
19 import itertools
20
---> 21 import _multiprocessing
22
23 from . import util
ImportError: DLL load failed while importing _multiprocessing: The specified module could not be found.
I installed pmdarima successfully using the command
conda install -c saravji pmdarima
But I was unable to import the auto_arima function from the pmdarima package. I have tried upgrading numpy, as mentioned in other posts, but that still didn't solve the problem.
Does anyone have any idea about this type of error? Thanks so much!
I had this same problem, and the reason is probably that you're using Python 3.8 and pmdarima isn't available there yet, but you can try this while installing in Jupyter.
First cell (installation):
! pip install pmdarima
import warnings
warnings.filterwarnings('ignore')
Second cell (importing auto_arima):
from pmdarima import auto_arima
I'm calling Tide from the pytides module with Python 3.7, which is installed using pip. Here is my Python code:
from pytides.tide import Tide
I ran into the following problem:
ModuleNotFoundError: No module named 'tide'
What should I do to solve this problem?
This is my link to the pytides installation package, and you can see the full code there.
I think you should use
from pytides import Tide
It should work; correct me if it doesn't.
Edit 1:
sudo apt-get install liblapack-dev libatlas-base-dev gfortran
export LAPACK=/usr/lib/liblapack.so
export ATLAS=/usr/lib/libatlas.so
export BLAS=/usr/lib/libblas.so
pip install numpy
pip install scipy
pip install pytides
The problem is in \site-packages\pytides\__init__.py
You need to rewrite:
from . import tide
from . import astro
from . import constituent
from . import nodal_corrections
Same problem in \site-packages\pytides\constituent.py
You need to change
from . import nodal_corrections as nc
and you need to add:
from functools import reduce
And again in \site-packages\pytides\tide.py
You need to change:
from .astro import astro
from . import constituent
pytides is outdated; use pytides2, the newer version. Another thing to keep in mind: if your interpreter is Miniconda/Anaconda, you can't obtain it from their repository. You must create a Python 3.7 (and no later version) interpreter in PyCharm using python3 (a real or virtual environment, either is fine).
I am trying the MNIST dataset; however, the code
import time
import mdp
import mnistdigits
results in the following error:
ModuleNotFoundError: No module named 'mnistdigits'
How can I install this module using pip?
python-mnist 0.3 showed up when I googled "install mnist using pip"
pip install python-mnist