I am working on a Python program in Colab.
I need to import another file here. The file is saved under the name "base_positioner.ipynb" in Google Drive.
I have gone through multiple resources to see how to do this import and I have done the following:
from google.colab import drive
drive.mount('/content/gdrive')
%cd /content/gdrive/My Drive
On running !ls, I see 'base_positioner.ipynb' in the list, but running import base_positioner still throws a ModuleNotFoundError.
I had also tried the following but with no success in importing the desired file:
sys.path.append('/content/gdrive/My Drive/Colab Notebooks')
What else should I try?
This can happen if you haven't properly mounted your Drive in Colab, or if your file layout in Drive differs from the layout your code expects. Are you running the import command without first running the following code?
from google.colab import drive
drive.mount('/content/gdrive')
%cd /content/gdrive/My Drive
If so, the import won't work: mounting is a prerequisite, and the cells must be run in order. You can also try restarting the Colab runtime, which often fixes strange errors.
Update:
As you mentioned, the import error happens because a plain import base_positioner requires the file to be in .py format; it cannot load an .ipynb notebook directly.
To import a file with the .ipynb extension, follow this process:
If you want to import A.ipynb in B.ipynb write
import import_ipynb
import A
The import_ipynb module can be installed via pip:
pip install import_ipynb
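For context, here is a rough, simplified sketch of roughly what import_ipynb does under the hood (the sample notebook and the helper name are invented for illustration): a .ipynb file is just JSON, so its code cells can be read out and executed into a fresh module object.

```python
import json
import sys
import tempfile
import types

def load_notebook_as_module(path, name):
    """Simplified sketch of import_ipynb's idea: read the notebook's
    JSON, pull out its code cells, and exec them into a module object
    registered in sys.modules."""
    with open(path, encoding='utf-8') as f:
        nb = json.load(f)
    mod = types.ModuleType(name)
    mod.__file__ = path
    sys.modules[name] = mod
    for cell in nb['cells']:
        if cell['cell_type'] == 'code':
            src = ''.join(cell['source'])
            # Drop IPython magics and shell lines, which exec() can't run
            code = '\n'.join(line for line in src.splitlines()
                             if not line.lstrip().startswith(('%', '!')))
            exec(code, mod.__dict__)
    return mod

# Demo with a throwaway notebook containing a single code cell
nb = {"cells": [{"cell_type": "code",
                 "source": ["def helper():\n", "    return 42\n"]}],
      "metadata": {}, "nbformat": 4, "nbformat_minor": 5}
with tempfile.NamedTemporaryFile('w', suffix='.ipynb', delete=False) as f:
    json.dump(nb, f)
m = load_notebook_as_module(f.name, 'base_positioner_demo')
print(m.helper())  # 42
```

The real package handles far more (magics, display output, the notebook's namespace), so prefer import_ipynb itself; this only shows why importing a notebook needs extra machinery at all.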
Related
I'm trying to run this Python code in Google Colab and I keep getting an error that the utils module is not installed or does not exist, even though I've run !pip install utils and the issue persists.
It runs without issues on my own computer, but I can't actually use it there due to my PC's limited resources.
Does anyone have a solution for this?
Traceback (most recent call last):
File "/content/GNN-for-text-classification/preprocess/build_graph.py", line 15, in <module>
from utils.utils import loadWord2Vec, clean_str
ModuleNotFoundError: No module named 'utils.utils'
I am assuming you are using this GNN-for-Text-Classification.
Now, you have probably cloned the repository in your local system and you're running the .py files from there.
But a new notebook in Colab will not have the files that you cloned/downloaded. So, when you run
!pip install utils
the utils package from PyPI gets installed, which doesn't contain the functions you require.
What you need is actually the utils module from GNN-for-Text-Classification, for which you'll need to clone and cd into the folder in Colab itself. Just run the following in your Colab Notebook:
!git clone https://github.com/zshicode/GNN-for-text-classification.git
%cd GNN-for-text-classification/
%ls
This will clone the repo, cd into the folder, and list its contents, where you will find the utils module you need.
Now you can import stuff like loadWord2Vec, clean_str without any errors.
Note that this cloning is not permanent since a new Colab instance will not keep the changes from the old one.
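To see why PyPI's utils and the repo's utils are different things, here is a small self-contained sketch (the stub file contents are stand-ins; the real clean_str lives in the cloned repo): whichever utils package is found first on sys.path is the one Python imports.

```python
import os
import sys
import tempfile

# Recreate, in a scratch directory, the layout the traceback implies:
# a local package utils/ containing a module utils.py. The function
# body here is a stand-in for the repo's real implementation.
repo = tempfile.mkdtemp()
pkg = os.path.join(repo, 'utils')
os.makedirs(pkg)
open(os.path.join(pkg, '__init__.py'), 'w').close()
with open(os.path.join(pkg, 'utils.py'), 'w') as f:
    f.write('def clean_str(s):\n    return s.strip().lower()\n')

# This is what cd-ing into the cloned repo achieves implicitly: the
# repo root ends up on sys.path, so `utils` resolves to the local
# folder rather than to anything pip installed from PyPI.
sys.path.insert(0, repo)

from utils.utils import clean_str
print(clean_str('  Hello '))  # -> 'hello'
```

The pip-installed utils would sit further down sys.path (in site-packages), which is why installing it can never supply the repo's functions.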
I'm having issues importing certain modules into GoogleColab after cloning them from a Github repo.
Does anyone have any idea what the problem is and how to solve it?
After connecting my GoogleDrive with
from google.colab import drive
drive.mount('/content/drive')
and cloning the github repo
! git clone https://github.com/naver/dope.git
it appears in my colab data structure.
[Screenshot: Colab file structure]
However, when running the actual code, I cannot import the Github modules.
import sys, os
import argparse
import os.path as osp
from PIL import Image
import cv2
import numpy as np
import matplotlib.pyplot as plt
import torch
from torchvision.transforms import ToTensor
#_thisdir = osp.realpath(osp.dirname(__file__))
from dope.model import dope_resnet50, num_joints
import dope.postprocess as postprocess
import dope.visu as visu
def dope_test(imagename, modelname, postprocessing='ppi'):
    if postprocessing=='ppi':
        sys.path.append('/content/lcrnet-v2-improved-ppi')  # _thisdir+'/lcrnet-v2-improved-ppi/')
        try:
            from lcr_net_ppi_improved import LCRNet_PPI_improved
        except ModuleNotFoundError:
            raise Exception('To use the pose proposals integration (ppi) as postprocessing, please follow the readme instruction by cloning our modified version of LCRNet_v2.0 here. Alternatively, you can use --postprocess nms without any installation, with a slight decrease of performance.')
It says
Import "dope.model" could not be resolved(reportMissingImports)
Simply git clone-ing a library is not enough for Python to recognize it; you need to add its location to PYTHONPATH, which tells the Python interpreter where to search for modules.
Let's say you have cloned the dope module under /content directory (as the attached picture suggests).
In this case, add /content to sys.path before importing dope-related stuff.
import sys
sys.path.append('/content')
Of course, this will also make Python search the other directories that reside in /content, such as lcrnet-v2-improved-ppi and models. To prevent this, create a directory that specifically stores Python modules, and move dope inside it.
As a side note: there is a %env magic command in Colab (and in the underlying Jupyter Notebook and IPython) that lets you modify environment variables. But there is a famous trap: editing PYTHONPATH with %env has no effect on the Python interpreter that is already running, so you will still get an ImportError!
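Both points can be shown in a minimal, runnable sketch (the module name and constant below are invented for the demo):

```python
import os
import sys
import tempfile

# Build a scratch directory with a module in it (stand-in for /content).
content = tempfile.mkdtemp()
with open(os.path.join(content, 'dope_demo.py'), 'w') as f:
    f.write('NUM_JOINTS = 13\n')

# Setting PYTHONPATH now is too late: the running interpreter read it
# at startup and never looks at it again.
os.environ['PYTHONPATH'] = content
try:
    import dope_demo          # still fails
except ModuleNotFoundError:
    pass

# Appending to sys.path, by contrast, takes effect immediately.
sys.path.append(content)
import dope_demo
print(dope_demo.NUM_JOINTS)   # 13
```

PYTHONPATH only matters for interpreters started *after* it is set (e.g. a subsequent !python invocation); inside a running notebook, sys.path is the knob to turn.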
I am working on some coding work with google colab ipython notebook. From book1, I need to call some functions that are defined in book2.
book1 and book2 are both located in the same folder of the same Google Drive.
book1:
from google.colab import drive
drive.mount('/content/drive', force_remount=True)
from kora import drive as kora_drive
kora_drive.link_nbs()
import book2 as b2
b2.function1()
book2:
def function1():
    print('function1 called')
When I import book2 from book1, function1() can be called successfully.
But if I change some code in book2, for example by adding a new function:
def function2():
    print('function2 called')
After making the changes in book2, I clicked "save" and also refreshed the drive and the folder and the book2 and book1. But, when I tried to call function2 from book1, I got error:
AttributeError: module 'book2' has no attribute 'function2'
I have tried remounting the drive and re-importing book2, and also tried the following two posts, but none of them work.
https://stackoverflow.com/questions/53358250/google-colaboratory-how-to-refresh-google-drive
https://stackoverflow.com/questions/59020008/how-to-import-functions-of-a-jupyter-notebook-into-another-jupyter-notebook-in-g
Could anyone point out what I have missed here ?
thanks
UPDATE
In book1, I have tried
!pip install import-ipynb
import import_ipynb
import sys
sys.path.insert(1, r'/content/drive/MyDrive/Colab Notebooks/book2.ipynb')
I have also tried
sys.path.insert(1, r'/content/drive/MyDrive/Colab Notebooks/')
But, the most recent updates of "book2" still cannot be found from book1.ipynb.
Just to make sure: your functions are located in .py files, not .ipynb files, correct?
If they are in .py files, you can use one of the two methods below.
You can change the directory to where the functions are located, e.g. %cd "mnt/My Drive/Pyfunctions". Make sure you know where the Python files you are trying to import are located.
You can also achieve this using the sys library:
import sys
sys.path.insert(1, r'/mnt/My Drive/pyfunctions')
It seems that you successfully loaded the module the first time around and then made a change to it. The problem is not Google Drive itself: once a module has been imported, Python caches it in sys.modules, so importing it again returns the already-loaded copy instead of re-reading the file. Changing the file will not affect the module that is already loaded.
This is what it looked like after I added the function and tried to reload the module.
The simplest way to make the change register is to click Runtime / Restart runtime (or Ctrl+M followed by . , the full stop) and re-import the module.
I have successfully done this in my colab runtime.
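That said, for plain .py modules a full restart is often avoidable: importlib.reload() re-executes an already-imported module in place. A sketch under the assumption that book2 is a .py file in a mounted folder (reload is less reliable for notebooks loaded via import_ipynb; the file contents below are stand-ins for the question's functions):

```python
import importlib
import os
import sys
import tempfile

folder = tempfile.mkdtemp()          # stand-in for the Drive folder
path = os.path.join(folder, 'book2.py')
with open(path, 'w') as f:
    f.write("def function1():\n    return 'function1 called'\n")

sys.path.insert(0, folder)
import book2
assert hasattr(book2, 'function1')

# Edit the file on disk, as if book2 had been updated in Drive.
with open(path, 'a') as f:
    f.write("def function2():\n    return 'function2 called'\n")

# A bare `import book2` would hit the sys.modules cache and miss the
# change; reload() re-reads and re-executes the source.
book2 = importlib.reload(book2)
print(book2.function2())  # function2 called
```

With Drive there can be an extra wrinkle: the mounted filesystem caches file contents, so a freshly saved change may take a short while to become visible to reload().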
I am trying to run my project using the gpus but I can't get it to work.
I have ran the following commands:
from google.colab import drive
drive.mount('/content/gdrive')
%cd gdrive/MyDrive/project_folder
import sys
sys.path.append('/content/gdrive/MyDrive/project_folder')
I then try to run my main script from project_folder by using
! python property_prediction/predict.py
In the first line of predict.py I import a module from the folder 'project_folder' but that gives this error in colab:
File "property_prediction/predict.py", line 17, in <module>
from GP.kernels import Shortest_Path
ModuleNotFoundError: No module named 'GP'
Why is it not finding the folder GP which contains my kernels script?
Try to replace your running command with:
!python -m property_prediction.predict
Or better:
from property_prediction.predict import predict # or whatever your main function is called
predict()
NB: This is of course assuming that you have a module named GP in the folder project_folder.
If none of this works, you might be interested in reading this or other articles about imports in Python (this is most likely not a problem with Google Colab).
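The reason -m helps: running python property_prediction/predict.py puts property_prediction/ itself at sys.path[0], so the sibling package GP is invisible, while python -m property_prediction.predict puts the current directory first instead. A self-contained reconstruction of the question's layout, with stub file contents standing in for the real code:

```python
import os
import subprocess
import sys
import tempfile

root = tempfile.mkdtemp()            # stand-in for project_folder
for pkg in ('GP', 'property_prediction'):
    os.makedirs(os.path.join(root, pkg))
    open(os.path.join(root, pkg, '__init__.py'), 'w').close()
with open(os.path.join(root, 'GP', 'kernels.py'), 'w') as f:
    f.write('Shortest_Path = object()\n')   # stub for the real kernel
with open(os.path.join(root, 'property_prediction', 'predict.py'), 'w') as f:
    f.write('from GP.kernels import Shortest_Path\nprint("ok")\n')

# Run as a script: sys.path[0] is property_prediction/, GP is invisible.
r1 = subprocess.run([sys.executable, 'property_prediction/predict.py'],
                    cwd=root, capture_output=True, text=True)
# Run as a module: sys.path[0] is the cwd, so GP resolves.
r2 = subprocess.run([sys.executable, '-m', 'property_prediction.predict'],
                    cwd=root, capture_output=True, text=True)
print(r1.returncode, r2.returncode)  # non-zero, then 0
```

The same logic explains why !python property_prediction/predict.py fails in Colab even after sys.path.append in the notebook: the ! line starts a fresh interpreter that never sees the notebook's sys.path.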
I imported and mounted the data in Google Drive with this code:
import pandas as pd
from google.colab import drive
drive.mount('/content/gdrive', force_remount=True)
And then I call a function defined in this file:
from data.generator import DataGenerator
I get an error about this module name: data is not the name of an installed module, it is the name of a folder in my project directory. Hope someone can solve this problem for me.
You have to run the Python script from the right folder. So before running it, add a line like:
%cd "/content/gdrive/MyDrive/Machine Learning/005 Handwriting Recognition/handwritten-text-recognition/src"
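Why this works: from data.generator import DataGenerator is resolved against sys.path, and in a notebook the current directory is typically on sys.path, so cd-ing into the folder that contains data/ makes the package importable. A stand-in sketch (the stub class is invented; in a plain script the current directory is not on sys.path automatically, so it is added explicitly below):

```python
import os
import sys
import tempfile

# Stub layout standing in for the project's src/ folder.
src = tempfile.mkdtemp()
os.makedirs(os.path.join(src, 'data'))
open(os.path.join(src, 'data', '__init__.py'), 'w').close()
with open(os.path.join(src, 'data', 'generator.py'), 'w') as f:
    f.write('class DataGenerator:\n    pass\n')

# %cd in a notebook is os.chdir under the hood. The '' entry on
# sys.path means "the current directory", which is what lets the
# import below follow the chdir.
os.chdir(src)
if '' not in sys.path:
    sys.path.insert(0, '')

from data.generator import DataGenerator
print(DataGenerator)
```

Note that the cd must happen before the import; changing directory after data has already failed to import (and been cached as a failure is not, but the successful module is) will not retroactively fix a prior ModuleNotFoundError in the same cell.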