Google Colab - FileNotFoundError - python

I am trying to train YOLOv3 with a custom dataset on Google Colab. I uploaded my folders, weights, etc. When I run my train.py, I get a path error. I run the code like this:
!python3 "drive/TrainYourOwnYolo/2_Training/Train_YOLO.py"
The error says:
content/TrainYourOwnYolo/2_Training/dataset/img20.jpg is not found.
As I understand it, on Colab all my folders are under the drive folder. I don't understand why YOLO is trying to find my dataset under the content folder. Do you have any idea?

It seems you have uploaded your data to /drive/TrainYourOwnYolo/, and not to /content/TrainYourOwnYolo/, where your script is looking.
The /content folder is Colab's local working directory, used when you don't save things to Google Drive. But you have mounted your Google Drive under /drive, so your notebook unsurprisingly fails to find the files under /content.
You should change the file paths in your Train_YOLO.py script to replace references to /content with /drive.
If this is not possible, you can find the /content folder in the file browser on the left of your Colab notebook; by right-clicking on it, you'll see an option to upload files there.
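As a quick sanity check before editing the script, you can mount Drive and test both candidate locations from a Colab cell. This is only a sketch; the paths below are guesses based on the error message, and your folder layout may differ.
# Check where the dataset actually lives before fixing paths in Train_YOLO.py.
# Both candidate paths are assumptions; adjust them to your own layout.
import os
from google.colab import drive

drive.mount('/drive')  # mount point used in this question; /content/drive is also common

candidates = [
    '/content/TrainYourOwnYolo/2_Training/dataset',         # where the script is looking
    '/drive/My Drive/TrainYourOwnYolo/2_Training/dataset',   # where the data may have been uploaded
]
for path in candidates:
    print(path, '->', 'exists' if os.path.isdir(path) else 'missing')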

Related

How to upload files to Colab from Github without using Google Drive?

I have started to use Colab. Now I want the notebook to automatically fetch a couple of smaller files from my GitHub repository.
My idea is to load these files directly into the workspace of the Colab virtual machine, so that Google Drive is not needed. This should also make it easier to share the notebook with others.
My Colab notebook code is as follows:
%%bash
git clone https://github.com/my_repository/folder1
%load folder1/file1.py
run -i file1.py
%load folder1/file2.zip
The first two commands work fine, but the last two give error messages.
The error message when I try to run file1.py is:
ERROR: root:File 'file1.py' not found.
And the error message when I try to load file2.zip is:
File "<string>", line unknown
SyntaxError: invalid or missing encoding declaration for 'folder1/file2.zip'
(file2.zip contains both a text file and an executable for a Linux environment.)
How to solve this?
Note 1: If I check the directory after the second command with !ls, I see folder1, and !ls folder1 shows the contents of that folder. So it looks OK so far.
Note 2: If I mount my Google Drive and upload the folder there, I can get it all to work. But I want to avoid using Google Drive, since in my eyes that complicates sharing the notebook.
Note 3: As far as I can see, the zip file contains a binary described as ELF 64-bit LSB shared object, x86-64, version 1 (SYSV).
I think I found a solution; the code should be:
# Clone the repo with a shell command, then use IPython magics (not %%bash),
# so that %cd and %run affect the notebook itself.
!git clone https://github.com/my_repository/folder1
%cd folder1
%run -i file1.py
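For file2.zip, %load will not work, because it tries to read the archive as Python source. Below is a sketch using the standard zipfile module instead (run it after the %cd folder1 above, so the path is relative to folder1; the destination folder name is arbitrary):
# Extract the binary archive instead of %load-ing it.
import zipfile

with zipfile.ZipFile('file2.zip') as archive:
    archive.extractall('file2_contents')

# The Linux executable inside will need the execute bit before it can run,
# e.g. !chmod +x file2_contents/<binary_name> (replace with the real name).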

Loading HuggingFace tokenizer from Dropbox (or other cloud storage)

I have a classification model, and I have nearly finished turning it into a Streamlit app.
I have the embeddings and the model on Dropbox. I have successfully imported the embeddings, since they are a single file.
However, AutoTokenizer.from_pretrained() takes the path of a folder containing several files, rather than a particular file. The folder contains these files:
config.json
special_tokens_map.json
tokenizer_config.json
tokenizer.json
When using the tool locally, I would direct the function to the folder and it would work.
However, I am unable to point it at the folder on Dropbox, and as far as I can see I can only download individual files from Dropbox into Python, not a whole folder.
Is there a way of creating a temp folder in Python, or downloading all the files individually and then running AutoTokenizer.from_pretrained() on them?
To get around this, I uploaded the model to the Hugging Face Hub so I could load it from there, i.e.
tokenizer = AutoTokenizer.from_pretrained("ScoutEU/MyModel")
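For reference, the temp-folder approach from the question is also workable. Below is a minimal sketch, assuming each tokenizer file has a direct-download Dropbox link (a shared link with ?dl=1 so Dropbox serves the raw file); the URLs shown are placeholders.
# Download the tokenizer files into a temporary folder, then load from it.
import tempfile
from pathlib import Path

import requests
from transformers import AutoTokenizer

# Placeholder share links; use your own Dropbox links with ?dl=1.
FILES = {
    'config.json': 'https://www.dropbox.com/s/xxxx/config.json?dl=1',
    'special_tokens_map.json': 'https://www.dropbox.com/s/xxxx/special_tokens_map.json?dl=1',
    'tokenizer_config.json': 'https://www.dropbox.com/s/xxxx/tokenizer_config.json?dl=1',
    'tokenizer.json': 'https://www.dropbox.com/s/xxxx/tokenizer.json?dl=1',
}

with tempfile.TemporaryDirectory() as tmp_dir:
    for name, url in FILES.items():
        response = requests.get(url, timeout=60)
        response.raise_for_status()
        (Path(tmp_dir) / name).write_bytes(response.content)

    # from_pretrained() accepts a local directory path.
    tokenizer = AutoTokenizer.from_pretrained(tmp_dir)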

How to fetch image dataset from Google Drive to Colab?

I have a very weird problem. I have searched across the internet and read the documentation, but I am not able to figure out how to do this. I want to train a classifier using Colab, and for that I have an image dataset of dogs on my local machine.
So I packed that folder of images into a zip file and uploaded it to Drive. Then from Colab I mounted the drive and tried to unzip the files there. So far so good. But I've realised that after some time, some of the extracted files get deleted. The thing is, those files aren't in Colab storage but on Drive, and I don't know why they are getting deleted after a while, roughly an hour.
So far I've used the following commands to do the extraction:
from google.colab import drive
drive.mount('/content/drive')
from zipfile import ZipFile
filename = 'Stanford Dogs Dataset.zip'
with ZipFile(filename, 'r') as zip:
    zip.extractall()
    print('Done')
and I also tried this:
!unzip filename -d destination
I'm not sure where I'm going wrong. I also don't know why the extracted files, though extracted into a subfolder within Drive, also start showing up in the main root directory. And no, I am not talking about the Recent section: when I check their location, they point to the root of the Drive. It's all very confusing.
First you mount Google Drive:
from google.colab import drive
drive.mount('/gdrive')
Then you can copy from your drive using !cp:
!cp '/gdrive/My Drive/my_file' 'my_file'
Then you can work as on your PC: unzip, and so on.
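If the problem is Drive syncing while you extract, one common pattern is to copy the zip to the Colab VM's local disk and extract it there instead of into Drive. A sketch using the zip name from the question (adjust the paths to your own layout):
# Copy the archive from Drive to the VM's local disk, then extract locally.
# Extracting into /content avoids writing thousands of small files back to
# Drive, which is slow and can look like files are disappearing while Drive syncs.
import shutil
from zipfile import ZipFile
from google.colab import drive

drive.mount('/gdrive')

src = '/gdrive/My Drive/Stanford Dogs Dataset.zip'  # adjust to where the zip lives
dst = '/content/Stanford Dogs Dataset.zip'

shutil.copy(src, dst)
with ZipFile(dst, 'r') as archive:
    archive.extractall('/content/dataset')
print('Done')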

Google Colab - How to run a Jupyter notebook file that is in the 'Files' tab (i.e. /content/) of my Colab environment

In Google Colab, there is a pane on the left that can be opened to show Table of contents, Code snippets, and Files.
In the Files pane there is an upload button, and I can upload a notebook file to this Files area. But once the notebook file is uploaded, there is no option to run it as a notebook. The menu option File -> Open notebook doesn't show the Colab /content/ files as an option for starting a notebook.
Is there a way to do this? Or can it be added in future releases?
The reason for this request is that I'd like to git-clone a repo with multiple notebook files into the /content (Files) area of Colab, and then be able to easily switch between the notebooks, much like the native Jupyter notebook interface, which shows a directory with potentially multiple notebooks that can be started.
I've tried right-clicking on the notebook file in Files, but there is no option to start the notebook. I've also tried File -> Open notebook..., but the files in /content aren't shown in any of the tabs.
The desired result is to be able to start .ipynb files (i.e. Jupyter notebooks) directly from the 'Files' or /content/ section of Google Colab.
You can run other notebooks in your current notebook like this:
# if the file was on the google drive
%run /content/gdrive/My\ Drive/Colab\ Notebooks/DenseVideoArchitecture.ipynb
# simply replace the path in your case
%run /content/DenseVideoArchitecture.ipynb
But what you are asking for is switching between different notebooks in the same environment, which might not be possible in Colab.
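Alternatively, if the goal is just to execute notebooks that were uploaded to /content, one option is to convert them to plain Python scripts with nbconvert and run those. The notebook name below is only a placeholder:
# Convert an uploaded notebook to a .py script, then run it in this notebook.
!jupyter nbconvert --to script /content/MyNotebook.ipynb
%run /content/MyNotebook.py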
I'm not sure exactly what you need, but I hope the code below helps:
from google.colab import drive
drive.mount('/content/gdrive')
%cd "/content/gdrive/My Drive/Colab Notebooks"
Mount your Google Drive and you then have access to it like a local drive. In this code, the first two lines mount the drive, and %cd then changes into a folder in Google Drive (here "Colab Notebooks"), where you can run whatever you want. Note that %cd is needed rather than !cd, since !cd runs in a subshell and does not change the notebook's working directory.

Making an existing directory of Python programs available to Colaboratory

I have a number of directories of Python programs that I would like to execute on Colaboratory. Is there a way to do that, other than loading and saving the files one by one? If it helps, the directories are all in my own Google Drive, so all I would need (I think) is a way to cd to a given directory. I tried !cd .., which presumably should go to my top Google Drive directory, but it doesn't seem to work.
I just copied a directory into Google Drive\Colab Notebooks using the file explorer, but Colab refused to cd into that directory.
You'll need to mount your Google Drive before files contained therein will be available in Colab.
A recipe for how to mount Drive is available in this answer:
https://stackoverflow.com/a/47744465/8841057
After you mount the drive, don't use !cd; use %cd instead. A !cd runs in a separate subshell, so it does not change the notebook's working directory.
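Putting this together, a minimal cell for running programs from a Drive directory might look like this (the folder and script names are placeholders):
# Mount Drive, change into the directory of Python programs, and run one.
from google.colab import drive
drive.mount('/content/gdrive')

# %cd (unlike !cd) changes the notebook's working directory persistently.
%cd "/content/gdrive/My Drive/my_python_programs"
%run my_script.py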
