I am trying to run my program on Google Colab, where my code makes use of .py files written separately.
On my local system I have all the files inside one folder and it works using import xyz, but when I tried using the same folder in Google Drive it gives an import error.
Now in Google Colab (Nov '18) you can upload your Python files easily:
Navigate to Files (tab on your left panel).
Click on UPLOAD and upload your Python folder or .py files.
Use the Colab notebook to access the files.
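For example, if the uploaded file is xyz.py (the module name from the original question), a plain import then works in a notebook cell, since Colab's working directory is on the import path:
import xyz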
If you have just 2-3 files, you can try the solution I gave in another question here:
Importing .py files in Google Colab
But if you have something like 5-10 files, I would suggest you put your library on GitHub, then !git clone it to Google Colab. Another solution is to zip all your library files, then modify the first solution by unzipping with !unzip mylib.zip.
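A rough sketch of both options (the repo URL and archive name here are placeholders, not from the original answer):
# Option 1: clone your library from GitHub (hypothetical user/repo)
!git clone https://github.com/<user>/mylib.git
# Option 2: upload mylib.zip via the Files panel, then unzip it
!unzip mylib.zip
# if the files land in a subfolder, put it on the import path
import sys
sys.path.append('/content/mylib')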
If those library files are not in a folder structure, just a few files in the same folder, you can upload and save them, then import them. Upload them with:
def upload_files():
    from google.colab import files
    uploaded = files.upload()
    # write each uploaded file to the local filesystem
    for name, data in uploaded.items():
        with open(name, 'wb') as f:
            f.write(data)
    return list(uploaded.keys())
For example, say you have a module like this:
simple.py
def helloworld():
    print("hello")
Click the arrow on the left panel => choose the Files tab => upload simple.py.
Then, in the notebook, use code like this:
import simple
simple.helloworld()
=> hello
Something I've used when I have multiple Python scripts and want to import them automatically through code is to set them up as a package and clone it from the repo.
First set up the scripts in a repo with a setup.py and __init__.py files (obviously).
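A minimal setup.py for such a repo could look something like this sketch (package name and version are placeholders):
# setup.py -- minimal packaging sketch; names are hypothetical
from setuptools import setup, find_packages

setup(
    name='mypackage',
    version='0.1.0',
    packages=find_packages(),
)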
Then add this to the top of your notebook:
!rm -rf <repo-name> # in case you need to refresh after pushing changes
!git clone https://github.com/<user>/<repo-name>.git
Then install the package:
!pip install ./<repo-name>
Now conveniently import functions or whatever:
from <app-name>.<module> import <function>
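For instance, with a hypothetical repo mytools containing a module utils with a function clean_text, the steps above become:
!rm -rf mytools
!git clone https://github.com/someuser/mytools.git
!pip install ./mytools
from mytools.utils import clean_text  # hypothetical module and function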
I found this the easiest way:
from google.colab import drive
drive.mount('/content/drive')
%cd /content/drive/MyDrive/directory-location
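After the %cd, plain imports resolve against that directory, so (assuming a file my_module.py lives there) the following works:
import my_module  # found because the working directory is on sys.path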
I have a Python machine learning project in a folder. I want to change the whole project into notebooks in .ipynb format. Actually, I want it in Google Colab.
You can do the following to achieve your goal in the easiest way:
Compress your folder to a zip file.
Open Google Colab and create a new notebook.
On the left side there will be a menu; click the upload button in the upper left corner of that menu and upload the folder.
Use !unzip /path-to-file to unzip the whole folder.
P.S.: you can copy the zipped folder's path by right-clicking on it; that will give you a few options, including 'Copy path'.
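As a sketch, assuming the uploaded archive ended up at /content/project.zip (a hypothetical name):
# unzip into its own folder and change into it
!unzip /content/project.zip -d /content/project
%cd /content/project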
Use p2j to convert Python source code to Jupyter Notebook.
From the command line, run
pip install p2j
then go to the directory where your file is located (for example, cd Downloads if the file is in the Downloads directory),
then run
p2j myscript.py
This will create a myscript.ipynb file.
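To convert a whole project folder rather than a single script, a loop like this sketch should work (it just invokes p2j once per file):
import glob
import subprocess

# convert every .py file in the current directory to a notebook
for path in glob.glob('*.py'):
    subprocess.run(['p2j', path], check=True)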
So I have been trying to make this file compatible with Google Colab, but I'm not able to find any way to do it.
(Error screenshot omitted.) EfficientDet-DeepSORT-Tracker is the main folder of this entire package; the screenshot came from one of the files placed alongside backbone.py.
How do I fix the fact that the file isn't able to detect backbone.py?
EDIT for more context: I shared the errors I found when trying to run waymo_open_dataset.py, which isn't able to detect the other .py files alongside it.
According to this past question, you can import filename for a file named filename.py. So in the main.py file you are trying to run in Colab, import the required files directly.
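If plain imports still fail, adding the project folder to sys.path first is a common workaround (the path below assumes the folder name from the question sits directly under /content):
import sys
sys.path.append('/content/EfficientDet-DeepSORT-Tracker')
import backbone  # should now be found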
First off, apologies if my question is worded naively, as I am new to Python and Colab development. I have mounted my Drive onto my Colab notebook and inserted a path to a directory labeled "backend". It is structured as follows:
backend/
    solutions/
        __init__.py
        (other files)
    (other files)
I am trying to import my solutions directory as a package into my main.py script, which I have as a code block at the end of my notebook. I am attempting to "import solutions" into main.py, but it is telling me that "import solutions cannot be resolved" even though it recognizes its path. Does anyone have any ideas of what is happening here? The code and import works as expected on my local machine. I even attempted to make solutions into a package by including a README and a simple setup.py file, and while I was able to install it, it gave me a SystemExit error which I suspect originated from my setup.py.
EDIT: new setup that copies the "solutions" directory to /content. Still getting the same error.
What have you tried so far?
Have you tried the following?
# Mount your google drive in google colab
from google.colab import drive
drive.mount('/content/gdrive')
# Insert the directory
import sys
sys.path.insert(0,'/content/gdrive/My Drive/Colab Notebooks')
From here, be sure you have granted access to your "My Drive" (a security pop-up will appear).
Then check:
!ls gdrive/MyDrive
On the left panel you should now see the copied folder ("test-colab" is a folder I created just as an example).
Then you copy/import the folder you want into your content:
Adapt the code below with your own paths
!cp -av '/content/gdrive/MyDrive/test-colab' '/content/'
Then check your working directory before trying to import a .py file:
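A minimal check, assuming the "solutions" package ended up under the copied folder (adapt the paths to your own layout):
import os, sys
print(os.getcwd())                       # confirm where the notebook is running
sys.path.append('/content/test-colab')   # the folder copied above
import solutions                         # the package from the question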
I have downloaded large image training data as a zip from this Kaggle link:
https://www.kaggle.com/c/yelp-restaurant-photo-classification/data
How do I efficiently achieve the following?
Create a project folder in Google Colaboratory
Upload zip file to project folder
unzip the files
Thanks
EDIT: I tried the code below, but it's crashing for my large zip file. Is there a better/more efficient way to do this where I can just specify the location of the file on my local drive?
from google.colab import files
uploaded = files.upload()
for fn in uploaded.keys():
    print('User uploaded file "{name}" with length {length} bytes'.format(
        name=fn, length=len(uploaded[fn])))
!pip install kaggle
api_token = {"username": "USERNAME", "key": "API_KEY"}
import json
import zipfile
import os
# write the API token where the kaggle CLI expects it
with open('/content/.kaggle/kaggle.json', 'w') as file:
    json.dump(api_token, file)
!chmod 600 /content/.kaggle/kaggle.json
!kaggle config set -n path -v /content
!kaggle competitions download -c jigsaw-toxic-comment-classification-challenge
os.chdir('/content/competitions/jigsaw-toxic-comment-classification-challenge')
# extract every downloaded archive in the competition folder
for file in os.listdir():
    zip_ref = zipfile.ZipFile(file, 'r')
    zip_ref.extractall()
    zip_ref.close()
There is a minor change on line 9, without which I was encountering an error.
source: https://gist.github.com/jayspeidell/d10b84b8d3da52df723beacc5b15cb27
(I couldn't add this as a comment because of the reputation requirement.)
You may refer to these threads:
Import data into Google Colaboratory
Load local data files to Colaboratory
Also check out the I/O example notebook. For example, for access to .xls files, you'll want to upload the file to Google Sheets; then you can use the gspread recipes in the same I/O example notebook.
You may need to use kaggle-cli module to help with the download.
It’s discussed in this fast.ai thread.
I just wrote this script that downloads and extracts data from the Kaggle API to a Colab notebook. You just need to paste in your username, API key, and competition name.
https://gist.github.com/jayspeidell/d10b84b8d3da52df723beacc5b15cb27
The manual upload function in Colab is kind of buggy now, and it's better to download files via wget or an API service anyway, because you start with a fresh VM each time you open the notebook. This way the data will download automatically.
Another option is to upload the data to Dropbox (if it can fit) and get a download link. Then in the notebook do:
!wget <link> -O new-name && ls
Suppose a Jupyter notebook and a Python file my_module.py are copied into the same directory in Google Drive. How do I import my_module from the notebook when it is run with Google Colaboratory?
When the notebook is run locally, import my_module just works.
At the bottom of the welcome notebook at colab.research.google.com there's a link to an example notebook titled "Loading and saving data: local files, Drive, Sheets, Google Cloud Storage" (https://colab.research.google.com/notebook#fileId=/v2/external/notebooks/io.ipynb). This gives recipes for how to copy files from Google Drive to the colaboratory runtime which you can use to copy your module code so it will be visible to the runtime's import machinery.
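A minimal sketch of that recipe, assuming my_module.py sits at the top level of My Drive (adjust the path to match your Drive layout):
from google.colab import drive
drive.mount('/content/drive')

import shutil
# copy the module next to the notebook runtime so the import machinery sees it
shutil.copy('/content/drive/MyDrive/my_module.py', '/content/')

import my_module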
I had the same problem before I fixed it. There are a few things to check:
1. Use the right Python version; Google Colaboratory has two Python versions, 2.7 and 3.6.1.
2. Add a blank __init__.py to the directory, for example:
dir/
    __init__.py
    my_module.py
    yournotebook.ipynb
3. Remove all __pycache__ / *.pyc files in dir/:
%cd dir
!rm -f *.pyc