I am trying to run the Stable Diffusion example from this link: https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_diffusion.ipynb#scrollTo=zTDA0fvtuOIQ
!pip install diffusers==0.3.0
!pip install transformers scipy ftfy
!pip install "ipywidgets>=7,<8"
!pip install transformers
from google.colab import output
output.enable_custom_widget_manager()
from huggingface_hub import notebook_login
notebook_login()
I installed the required dependencies, restarted the runtime, and then ran the following code:
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4",
    revision="fp16",
    torch_dtype=torch.float16,
    use_auth_token=True,
)
But it spits out the following error:
ImportError: cannot import name 'CLIPFeatureExtractor' from 'transformers' (/usr/local/lib/python3.7/dist-packages/transformers/__init__.py)
How should I fix the problem?
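As a hedged diagnostic (not part of the notebook), it can help to check which transformers build the runtime actually loaded, since CLIPFeatureExtractor only ships with the CLIP support added in newer transformers releases and an older preinstalled copy can linger until the runtime is restarted:
# Diagnostic sketch: check the transformers version and location the runtime loaded.
import transformers
print(transformers.__version__, transformers.__file__)
# If the next line still fails, upgrade with `!pip install -U transformers`
# and restart the runtime before re-running the pipeline cell.
from transformers import CLIPFeatureExtractor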
I followed the instructions in the GitHub repo, which say that to use TensorFlow Federated with Colab we need to install version 0.20.0, but I get this error when I try to run the tutorials.
!pip install --quiet tensorflow-federated==0.20.0 # The latest version of tensorflow-federated is not working with the colab python version
!pip install --quiet --upgrade nest-asyncio
from __future__ import absolute_import, division, print_function
import collections
from six.moves import range
import numpy as np
import tensorflow as tf
# from tensorflow import compat
from tensorflow_federated import python as tff
np.random.seed(0)
tf.compat.v1.enable_v2_behavior()
tff.federated_computation(lambda: 'Hello, World!')()
Error:
module 'tensorflow_federated.python' has no attribute 'federated_computation'
I don't understand what the problem is. How can I install it on Google Colab? I can't find any resource for this problem.
Are you sure you need to import this
from tensorflow_federated import python as tff
instead of
import tensorflow_federated as tff
According to the TensorFlow docs, federated_computation sits directly under tensorflow_federated.
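Putting that together, a minimal sketch of the suggested fix, reusing the hello-world call from the question:
# Import the top-level package directly, as suggested above.
import tensorflow_federated as tff

# federated_computation is an attribute of tensorflow_federated itself,
# so this hello-world computation should now resolve.
tff.federated_computation(lambda: 'Hello, World!')()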
I am trying to import a Python package from GitHub. I am working in Google Colab.
The repository is at the following URL: https://github.com/microsoft/nlp-recipes/tree/master/utils_nlp.
So I use the following code
!pip install --upgrade
!pip install -q git+git://github.com/microsoft/nlp-recipes/tree/master/utils_nlp
from utils_nlp import *
I also tried
!pip install -q git+https://github.com/microsoft/nlp-recipes/tree/master/utils_nlp
I saw other (working) examples where the URL ends in .git:
!pip install -q git+https://github.com/huggingface/transformers.git
so I tried in turn
!pip install -q git+https://github.com/microsoft/nlp-recipes/tree/master/utils_nlp.git
But I also noticed that https://github.com/microsoft/nlp-recipes/tree/master/utils_nlp.git leads to an error page, while https://github.com/huggingface/transformers.git redirects to https://github.com/huggingface/transformers, which surprises me.
How do we load the Python package here?
EDIT
As suggested, I am using
pip install git+https://github.com/microsoft/nlp-recipes.git
then
from utils_nlp import *
works, but it doesn't successfully import subfolders; it fails when I do
from utils_nlp.models.transformers.abstractive_summarization_bertsum \
import BertSumAbs, BertSumAbsProcessor
whereas utils_nlp does contain a folder models, which in turn contains transformers.
The error stack is then the following:
/usr/local/lib/python3.7/dist-packages/utils_nlp/models/transformers/abstractive_summarization_bertsum.py in <module>()
15 from torch.utils.data.distributed import DistributedSampler
16 from tqdm import tqdm
---> 17 from transformers import AutoTokenizer, BertModel
18
19 from utils_nlp.common.pytorch_utils import (
ModuleNotFoundError: No module named 'transformers'
So, strangely, the code in utils_nlp.models.transformers.abstractive_summarization_bertsum doesn't resolve the dependency on transformers.
This is the correct way to install it:
pip install git+https://github.com/microsoft/nlp-recipes.git
You can't install a module; only a package can be installed. After the installation you can go on with the rest of your code:
from utils_nlp import *
...
As you can read in the setup guide (which you should definitely read when installing a package):
The pip installation does not install any of the necessary package dependencies, it is expected that conda will be used as shown above to setup the environment for the utilities being used.
This explains your error: the transformers package is not installed automatically, so you need to install it on your own (simply use pip install transformers). This is also confirmed in the setup.py file: there are no "install_requires" dependencies.
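Putting the pieces together, a single Colab cell might look like the sketch below (the install lines are the ones given above, and the import path is the one from the question):
# Install the package itself, then its transformers dependency by hand,
# since the pip install does not pull in any dependencies.
!pip install -q git+https://github.com/microsoft/nlp-recipes.git
!pip install -q transformers
# The deep import from the question should now resolve.
from utils_nlp.models.transformers.abstractive_summarization_bertsum import (
    BertSumAbs,
    BertSumAbsProcessor,
)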
I'm trying to import detectron2.data on Google Colab. I connected Colab to my Drive, and after that:
!pip install pyyaml
!pip install detectron2 -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/torch1.7/index.html
It worked without any error.
I have been trying this:
import numpy as np
import os, json, cv2, random
from google.colab.patches import cv2_imshow
import detectron2
from detectron2.data import MetadataCatalog, DatasetCatalog
from detectron2.utils.visualizer import Visualizer
from detectron2 import model_zoo
But the output is an error (shown only as a screenshot in the original post).
How can I fix that?
I fixed it like this:
!pip install pyyaml==5.1
import torch, torchvision
print(torch.__version__, torch.cuda.is_available())
!gcc --version
import torch
assert torch.__version__.startswith("1.8")
!pip install detectron2 -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/torch1.8/index.html
# exit(0) # After installation, you need to "restart runtime" in Colab. This line can also restart runtime
from: https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5#scrollTo=ZyAvNCJMmvFF
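As a quick sanity check after the runtime restart, a small sketch (the imports are the ones from the question):
# Confirm detectron2 is importable and report its version.
import detectron2
print(detectron2.__version__)
from detectron2.data import MetadataCatalog, DatasetCatalog
from detectron2.utils.visualizer import Visualizer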
I have installed the flair library via the following command:
!pip install flair
but when I try to import it, it generates an error like "ModuleNotFoundError: No module named 'flair'".
Code:
import torch
import numpy as np
from flair.data import Sentence
from flair.embeddings import TransformerDocumentEmbeddings
Install it via the following command. Make sure you use the --user option, otherwise you will get a permission error on Windows 10.
!pip install --user flair
After installing flair, you have to restart the kernel in the Jupyter notebook.
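After the restart, the imports from the question should resolve. A small hedged usage sketch (bert-base-uncased is just an illustrative model choice, not something prescribed above, and it will be downloaded on first use):
from flair.data import Sentence
from flair.embeddings import TransformerDocumentEmbeddings

# Embed one sentence with a transformer-based document embedding.
sentence = Sentence("Hello, flair!")
embedding = TransformerDocumentEmbeddings("bert-base-uncased")  # illustrative model choice
embedding.embed(sentence)
print(sentence.embedding.shape)  # one document-level vector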
I'm trying to import osmnx on Google Colab. It installed successfully using !pip install osmnx, but when I try to import it in Colab with import osmnx I get this error:
AttributeError: /usr/bin/python3: undefined symbol: Error_GetLastErrorNum
Does anyone know how to fix this error?
You need to install libspatialindex-dev first.
!apt install libspatialindex-dev
!pip install osmnx
Then you can import it:
import osmnx
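A quick way to confirm the fix, as a small sketch (ox is just the conventional alias):
# Print the installed osmnx version to verify the import succeeds.
import osmnx as ox
print(ox.__version__)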