First off, apologies if my question is worded naively, as I am new to Python and Colab development. I have mounted my Drive onto my Colab notebook and inserted a path to a directory labeled "backend". It is structured as follows:
backend/
    solutions/
        __init__.py
        (other files)
    (other files)
I am trying to import my solutions directory as a package into my main.py script, which is a code cell at the end of my notebook. When I attempt "import solutions" in main.py, I am told that "import solutions cannot be resolved", even though the path is recognized. Does anyone have any idea what is happening here? The code and import work as expected on my local machine. I even attempted to turn solutions into an installable package by adding a README and a simple setup.py, and while I was able to install it, it raised a SystemExit error, which I suspect originated in my setup.py.
---EDIT---
New setup that copies the "solutions" directory to /content. Still getting the same error.
What have you tried so far?
Have you tried the following?
# Mount your google drive in google colab
from google.colab import drive
drive.mount('/content/gdrive')
# Insert the directory
import sys
sys.path.insert(0,'/content/gdrive/My Drive/Colab Notebooks')
From here, be sure you have granted access to your "My Drive" (security pop-up).
Then check:
!ls gdrive/MyDrive
In the file browser on the left panel you should now see your Drive contents ("test-colab" here is just an example folder I created to show you).
Then copy the folder you want into /content; adapt the code below to your own paths:
!cp -av '/content/gdrive/MyDrive/test-colab' '/content/'
Then check the working directory before trying to import a .py:
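The path check and a first import can be rehearsed with nothing but the standard library. This sketch builds a throwaway stand-in for the copied solutions folder (the directory name and the GREETING attribute are invented for illustration):

```python
import os
import sys
import tempfile

# Throwaway stand-in for the copied "solutions" folder (hypothetical).
workdir = tempfile.mkdtemp()
pkg_dir = os.path.join(workdir, "solutions")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("GREETING = 'hello from solutions'\n")

# Equivalent of the sys.path.insert(0, '/content/...') call above.
sys.path.insert(0, workdir)
print("solutions" in os.listdir(workdir))  # confirm the folder is visible

import solutions
print(solutions.GREETING)
```

The key point is that the entry on sys.path must be the folder *containing* solutions, not the solutions folder itself.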
So I have been trying to make this file compatible with Google Colab, but I'm not able to find any way to do it.
EfficientDet-DeepSORT-Tracker is the main folder of this entire package
This picture is from one of the files placed alongside backbone.py
How can I fix the fact that the file isn't able to detect backbone.py?
EDIT for more context: I shared the errors I found when trying to run waymo_open_dataset.py, which isn't able to detect the other .py files alongside it.
According to this past question, you can write import 'filename' for a file named filename.py. So in the main.py file you are trying to run in Colab, import the required files by their module names.
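As a sketch of that idea, assuming the helper files sit in one folder: once that folder is on sys.path, a bare import backbone resolves backbone.py (the file contents and the detect function below are invented for the demo):

```python
import os
import sys
import tempfile

# Invented helper file standing in for backbone.py next to the main script.
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "backbone.py"), "w") as f:
    f.write("def detect():\n    return 'ok'\n")

# With the folder on sys.path, the bare module name resolves the .py file.
sys.path.insert(0, workdir)
import backbone
print(backbone.detect())
```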
I have started to use Colab. Now I want the notebook to automatically fetch a couple of smaller files from my GitHub repository.
My idea is to upload these files directly to the workspace of the Colab virtual machine, making Google Drive unnecessary. This strategy should also make sharing the notebook with others easier.
My Colab notebook code is as follows:
%%bash
git clone https://github.com/my_repository/folder1
%load folder1/file1.py
run -i file1.py
%load folder1/file2.zip
The first two commands work fine, but the last two give error messages.
The error message when I try to run file1.py is:
ERROR: root:File 'file1.py' not found.
And the error message when I try to load file2.zip is:
File "<string>", line unknown
SyntaxError: invalid or missing encoding declaration for 'folder1/file2.zip'
(The file2.zip contains both some text file and an executable file for linux environment)
How to solve this?
Note1. If I check the directory after the second command with !ls, I see I have folder1, and when I do !ls folder1 I see the contents of that folder. So it looks ok so far.
Note2. If I mount my Google Drive and upload the folder there, I can get it all to work. But I want to avoid using Google Drive, since that complicates sharing the notebook, in my eyes.
Note3. From what I can see, the zip file contains a binary described as ELF 64-bit LSB shared object, x86-64, version 1 (SYSV).
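Since %load treats its argument as Python source, a zip archive has to be extracted rather than loaded. A minimal sketch with the standard-library zipfile module (the file names are stand-ins; in the notebook file2.zip would already exist inside the cloned folder1):

```python
import os
import tempfile
import zipfile

workdir = tempfile.mkdtemp()
zip_path = os.path.join(workdir, "file2.zip")

# Build a tiny stand-in archive for the demo.
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("readme.txt", "some text")

# Extract instead of %load-ing: %load expects Python source, not binary data.
out_dir = os.path.join(workdir, "file2_contents")
with zipfile.ZipFile(zip_path) as zf:
    zf.extractall(out_dir)

print(os.listdir(out_dir))
```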
I found a solution, I think; the code should be:
%%bash
git clone https://github.com/my_repository/folder1
%cd folder1
run -i file1.py
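For what it's worth, the %cd plus run pair can also be expressed in plain Python with os.chdir and runpy, which avoids mixing cell magics and line magics in one cell; the folder, file, and variable names below are made up:

```python
import os
import runpy
import tempfile

# Stand-in for the cloned folder1/file1.py (names are invented).
workdir = tempfile.mkdtemp()
folder1 = os.path.join(workdir, "folder1")
os.makedirs(folder1)
with open(os.path.join(folder1, "file1.py"), "w") as f:
    f.write("answer = 42\n")

# Equivalent of '%cd folder1' followed by 'run file1.py':
os.chdir(folder1)
globs = runpy.run_path("file1.py")
print(globs["answer"])
```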
I am trying to run my program on Google Colab, where my code makes use of .py files written separately.
On my normal system I have all the files inside one folder and it works using import xyz, but when I tried using the same folder in Google Drive it gave an import error.
Now in Google Colab (as of Nov '18) you can upload your Python files easily:
Navigate to Files (tab on your left panel)
Click on UPLOAD and upload your Python folder or .py files
Use the Colab notebook to access the files.
If you have just 2-3 files, you can try the solution I gave in another question here.
Importing .py files in Google Colab
But if you have something like 5-10 files, I would suggest you put your library on GitHub, then !git clone it into Google Colab. Another solution is to zip all your library files, then modify the first solution by unzipping with !unzip mylib.zip
If those library files are not in a folder structure, but are just a few files in the same folder, you can upload and save them, then import them. Upload them with:
def upload_files():
    from google.colab import files
    uploaded = files.upload()
    for k, v in uploaded.items():
        open(k, 'wb').write(v)
    return list(uploaded.keys())
For example you have a module like this
simple.py
def helloworld():
    print("hello")
Click arrow on left panel => Choose File tab => Upload simple.py
In notebook code like this
import simple
simple.helloworld()
=> hello
Something I've used when I have multiple Python scripts and want to import them automatically through code is to set them up as a package and clone it from the repo.
First set up the scripts in a repo with a setup.py and __init__.py files (obviously).
Then add this to the top of your notebook:
!rm -rf <repo-name> # in case you need to refresh after pushing changes
!git clone https://github.com/<user>/<repo-name>.git
Then install the package:
!pip install ./<repo-name>
Now conveniently import functions or whatever:
from <app-name>.<module> import <function>
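One detail that can trip this up: pip install ./<repo-name> only installs directories that setuptools recognizes as packages, i.e. folders containing an __init__.py that find_packages can discover. A quick sketch, using a hypothetical mytools package:

```python
import os
import tempfile

from setuptools import find_packages

# Hypothetical repo checkout containing one package, "mytools".
repo = tempfile.mkdtemp()
pkg = os.path.join(repo, "mytools")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()

# find_packages is what a typical setup.py calls via packages=find_packages();
# without the __init__.py it would return an empty list.
print(find_packages(where=repo))
```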
I found this easiest way
from google.colab import drive
drive.mount('/content/drive')
%cd /content/drive/MyDrive/directory-location
I manually removed /Library/Python/2.7/site-packages/google through terminal (rm), however it still seems I can import the package in python 2.7.
I am able to run import google but when I print google.__path__ it displays ['/Library/Python/2.7/site-packages/google'] even though that directory no longer exists because I deleted it.
I initially deleted this package because it was giving me import errors when trying to run google's app engine api, so I need to have import google be unlinked to this directory.
Any help would be greatly appreciated!
Try starting Python in verbose mode; this shows where packages are imported from. Since the output can overflow the screen, write it to a text file:
python -v 2>&1 | tee out.txt
>>> import google
>>> exit()
Open out.txt and see where the google package is being imported from.
As suggested earlier, import issues can be avoided by using virtualenv.
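Another way to see where an import comes from, without wading through the verbose log, is importlib.util.find_spec. The sketch below uses the stdlib json module as a stand-in for google:

```python
import importlib.util
import sys

# Ask the import machinery where a module would be loaded from,
# without importing it first; 'json' stands in for 'google' here.
spec = importlib.util.find_spec("json")
print(spec.origin)

# For a module that is already imported, a stale entry in sys.modules can
# keep 'import foo' working even after its files were deleted on disk.
import json
print(sys.modules["json"].__file__)
```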
I have installed hadoop with this tutorial, hbase with this one, and hbase.thrift with this one
Now I have a given Python script whose job is to create some HBase tables. When I run the .py file it gives me the error:
Traceback (most recent call last):
File "./createTables.py", line 9, in <module>
from hbase import Hbase
ImportError: No module named hbase
This question seemed to have the same trouble: How can I import hbase in python?
I tried the solution given there. I ran
thrift --gen py Hbase.thrift
in the /usr/lib/hbase-0.94.2/src/main/resources/org/apache/hadoop/hbase/thrift folder, where Hbase.thrift was located. It created the subfolder gen-py, as described in the tutorial linked in the similar question above.
Now, if I understand the "Simply take that command and copy it to your default module folder (or in the folder where you run your program and it should work)." part of the given solution right, I go to the folder where my .py file is located (say /home/kumo/Downloads/createTables.py) and run
thrift --gen py /usr/lib/hbase-0.94.2/src/main/resources/org/apache/hadoop/hbase/thrift/Hbase.thrift
...? But nothing happens with that. Copying the Hbase.thrift file into the Downloads folder next to the .py file gives only
[FAILURE:arguments:1] Could not open input file with realpath: ./Hbase.thrift
so obviously that doesn't help either.
I also tried adding
import sys
sys.path.append('/usr/lib/hbase-0.94.2/src/main/resources/org/apache/hadoop/hbase/thrift/gen-py')
but that gave the same initial missing-module error again.
I also tried step 5.c of the thrift tutorial, adding the Python path in .bashrc:
export PYTHONPATH=$PYTHONPATH:/usr/lib/hbase-0.94.2/src/main/resources/org/apache/hadoop/hbase/thrift/gen-py
but that did not really work either.
I tried the same with the path /usr/local/hadoop/src/contrib/thriftfs/gen-py, as that is another gen-py folder that popped up somewhere, both as a sys.path append and as a PYTHONPATH export, but it still gives me the same error.
I am still new to all of this, so I just followed the tutorials step by step. I have no clue what I may have missed or wasn't in the tutorials to begin with.
Thanks for any help!
Not sure what exactly the problem is in your case, but I'm quite happy with HappyBase; you might want to give it a try.
The solution seems to be a symlink in the Python packages folder pointing to the generated gen-py folder. I moved the unzipped hbase folder to its own "software" folder in home, then created the link:
cd /usr/local/lib/python2.7/dist-packages/
ln -s /home/kumo/software/hbase-0.94.2/src/main/resources/org/apache/hadoop/hbase/thrift/gen-py/hbase/
and that fixed the error. Not sure what happens with multiple projects, though.
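As an aside, when juggling sys.path entries like the gen-py folders above, a small guard that verifies the directory actually contains the expected package can save a round of "No module named ..." surprises. A sketch (the paths are examples, not the real ones):

```python
import os
import sys

def add_package_dir(path, package):
    """Append path to sys.path only if it exists and holds the package."""
    pkg = os.path.join(path, package)
    ok = os.path.isdir(pkg) and os.path.isfile(os.path.join(pkg, "__init__.py"))
    if ok:
        sys.path.append(path)
    return ok

# With a made-up gen-py path this reports the problem up front instead of
# failing later at import time with 'No module named hbase'.
print(add_package_dir("/no/such/gen-py", "hbase"))
```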