I'm trying to use the Japanese model from spaCy. Running this:
import spacy
import ja_core_news_sm
nlp = spacy.load("ja_core_news_sm")
gives me
ModuleNotFoundError: No module named 'sudachidict'
and
OSError: symbolic link privilege not held
I reinstalled spacy and sudachipy==0.4.5 (as suggested in the spaCy docs) from an administrator cmd prompt, but it didn't help.
How can I use this Japanese model?
Thanks
Try uninstalling sudachidict_core and reinstalling it in admin mode so that it can create the symlink from sudachidict_core to sudachidict.
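A minimal sketch of that sequence (run cmd as administrator so the symlink can be created; package names as in this answer):

```shell
pip uninstall -y sudachidict_core
# Reinstalling lets the package re-create the sudachidict symlink
pip install sudachidict_core
```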
I'm doing a university project using a morphological analyzer for text (Russian language). Google Colab can't find the module named pymorphy2:
from pymorphy2 import MorphAnalyzer
Are there solutions for such problems?
Did you try to pip install it?
Write this and execute the cell:
!pip install pymorphy2
I have mxnet==1.4.0 and gluonnlp==0.9.1 installed using pip.
However, when I run the code import gluonnlp as nlp, it yields the following error:
ModuleNotFoundError: No module named 'mxnet.contrib.amp'
So I try to manually import the missing module using
from mxnet.contrib import amp
import gluonnlp as nlp
which also yields an error
ImportError: cannot import name 'amp' from 'mxnet.contrib' (/usr/local/lib/python3.7/dist-packages/mxnet/contrib/__init__.py)
I've been running the code on Colab. Is there a possible workaround for this issue?
Please advise.
I have not used these libraries, but in this GitHub issue they say:
AMP was introduced in MXNet 1.5. Could you try that version (or newer)?
So I think that the problem is there.
Cheers!
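A sketch of that upgrade, assuming a plain pip environment (the exact version pin is up to you; gluonnlp is re-pinned to the version from the question):

```shell
pip uninstall -y mxnet
pip install "mxnet>=1.5.0" gluonnlp==0.9.1
```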
In Google Colab I am trying to import BucketIterator using:
from allennlp.data.iterators import BucketIterator
But it keeps raising the same error:
ModuleNotFoundError: No module named 'allennlp.data.iterators'
After installing allennlp, these imports:
from allennlp.data.token_indexers import TokenIndexer, SingleIdTokenIndexer
from allennlp.data.tokenizers.character_tokenizer import CharacterTokenizer
from allennlp.data.vocabulary import Vocabulary
from allennlp.modules.seq2vec_encoders import PytorchSeq2VecWrapper
work fine. Is there a way to solve this issue?
I was facing the same issue.
It won't work, since iterators have been removed from allennlp. allennlp depends on the torchtext library (part of the PyTorch ecosystem), and since torchtext removed iterators in its newer versions, allennlp did too.
You can either install a legacy version of allennlp, or use torchtext.data.BucketIterator() directly with a legacy torchtext:
pip install torchtext==0.5.0 --user
When I write this in a Python script:
import tensorflow as tf
from datasets import dataset_utils
slim = tf.contrib.slim
I get this error:
from datasets import dataset_utils
ImportError: No module named datasets
I found this solution: "How can jupyter access a new tensorflow module installed in the right path?" I did the same, and I have the datasets package at anaconda/lib/python2.7/site-packages/, but I am still getting the same error.
pip install datasets
I solved it this way.
You can find the folder's location on your device and append it to sys.path.
import sys
sys.path.append(r"D:\Python35\models\slim\datasets"); import dataset_utils
You'll need to do the same with 'nets' and 'preprocessing'
sys.path.append(r"D:\Python35\models\slim\nets"); import vgg
sys.path.append(r"D:\Python35\models\slim\preprocessing"); import vgg_preprocessing
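As a generic illustration of what those sys.path.append calls do (the directory and module names below are invented for the demo, standing in for the slim paths above):

```python
import importlib
import pathlib
import sys
import tempfile

# Create a throwaway directory containing a module, playing the role of
# D:\Python35\models\slim\datasets in the answer above.
extra_dir = tempfile.mkdtemp()
pathlib.Path(extra_dir, "dataset_utils_demo.py").write_text("ANSWER = 42\n")

sys.path.append(extra_dir)          # make the directory importable
demo = importlib.import_module("dataset_utils_demo")
print(demo.ANSWER)                  # 42
```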
Datasets is present in https://github.com/tensorflow/models/tree/master/slim/datasets
Since 'models' is not installable from pip (at the time of writing), its modules are not on Python's import path by default, so we either copy them or add them to the path manually.
Here is how I setup env before running the code:
# git clone or wget
wget https://github.com/tensorflow/models/archive/master.zip -O models.zip
unzip models.zip
# add it to Python PATH
export PYTHONPATH=$PYTHONPATH:$PWD/models-master/slim
# now we are good to call `python mytensorflow.py`
It's using the datasets package in the TF-slim image models library, which is in:
git clone https://github.com/tensorflow/models/
Having done that, though, in order to import the module as shown in the example on the slim image page, empty __init__.py files have to be added to the models and models/slim directories.
go to https://github.com/nschaetti/EchoTorch/releases and download the latest release
install the latest release from the downloaded file (202006291 is the latest version at the moment):
pip install ./EchoTorch-202006291.zip
test it out using narma10_esn.py (other examples may have some issues)
You may still need to install some more Python packages not listed in the requirements file, but it works once you do.
I'm having trouble using the Python spaCy library. It seems to be installed correctly but at
from spacy.en import English
I get the following import error:
Traceback (most recent call last):
File "spacy.py", line 1, in <module>
from spacy.en import English
File "/home/user/CmdData/spacy.py", line 1, in <module>
from spacy.en import English
ImportError: No module named en
I'm not very familiar with Python but that's the standard import I saw online, and the library is installed:
$ pip list | grep spacy
spacy (0.99)
EDIT
I tested renaming the file, but that's not the problem. I also get the same error when doing:
$ python -m spacy.en.download --force all
/usr/bin/python: No module named en
(The command is supposed to download some models)
For Windows, open cmd with admin rights. Then run:
python -m spacy download en
You should see output stating:
You can now load the model via spacy.load('en')
You are facing this error because you named your own file spacy.py. Rename your file, and everything should work.
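A small stdlib-only demonstration of that shadowing mechanism (the file name statistics.py here is just a stand-in for spacy.py: any local file whose name matches a real module will win, because the script's own directory is searched first):

```python
import importlib
import pathlib
import sys
import tempfile

tmp = tempfile.mkdtemp()
# A local file with the same name as a real module...
pathlib.Path(tmp, "statistics.py").write_text("mean = 'shadowed!'\n")

sys.path.insert(0, tmp)              # front of sys.path, like the script's own dir
sys.modules.pop("statistics", None)  # forget any cached import
shadow = importlib.import_module("statistics")
print(shadow.mean)                   # the local file wins over the stdlib module

# Clean up so later imports see the real module again.
sys.path.remove(tmp)
sys.modules.pop("statistics", None)
```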
I had the same issue, and the problem was the folder where the module 'en' was stored (spacy/lang/en).
Typing:
from spacy.lang.en import English
fixed the issue.
This post was helpful in figuring this out.
It is possible that the version of Python at /usr/bin/python is not the one that has spacy installed. If so, navigating to the directory where your 'normal' version of Python is before running
python -m spacy.en.download
should fix the problem. (For example, I installed spacy using Anaconda and had to navigate to C:\Anaconda2\ first.)
spaCy has various models depending on the language of your choice (it even contains a multi-language model), so you can have a look at this link to get a better idea of which might suit your needs.
You could also find the correct installation command here. For example, for small version model for English Language:
python -m spacy download en_core_web_sm
Hope it helps!
This works!
import spacy
import en_core_web_sm
nlp = en_core_web_sm.load()