Download colab notebook from itself - python

Is there a way to download the Google Colab notebook from within one of its own cells?
The issue comes up when co-working on the same notebook and one person accidentally overwrites another's output. I want to save the .ipynb file automatically after running my cells.
I'm aware of the way to download a file that was created by the notebook:
from google.colab import files
files.download('example_file.csv')
But I couldn't find where the notebook's own .ipynb file is placed in the filesystem; it doesn't appear anywhere under the /content directory.
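If the notebook itself is stored in Google Drive (the usual case for Colab), one possible workaround is to mount Drive from a cell and copy the current .ipynb into /content, where files.download can reach it. This is only a sketch; the file name my_notebook.ipynb and the "Colab Notebooks" location are assumptions to adjust:
import shutil
from google.colab import drive, files
# Mount Google Drive; the notebook file itself lives in Drive, not under /content
drive.mount('/content/gdrive')
# Hypothetical path -- adjust to wherever your notebook is stored in Drive
src = '/content/gdrive/My Drive/Colab Notebooks/my_notebook.ipynb'
shutil.copy(src, '/content/my_notebook.ipynb')
# Download the copy through the browser, as with any other generated file
files.download('/content/my_notebook.ipynb')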

Related

Automatically download a file from a notebook

I have a Kaggle notebook that creates a file in the working directory after a long run. I have a direct link to it: r = FileLink(r'wallet.csv'). How do I automatically download it to the C drive on my PC?
I think I can use zip, but I don't understand how.
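For what it's worth, a minimal sketch of the zip idea mentioned above, assuming the output file is wallet.csv in the notebook's working directory: bundle it into an archive and render a FileLink so it can be downloaded from the browser.
import zipfile
from IPython.display import FileLink
# Bundle the output into wallet.zip in the working directory
with zipfile.ZipFile('wallet.zip', 'w', zipfile.ZIP_DEFLATED) as zf:
    zf.write('wallet.csv')
# Display a clickable download link in the notebook output
FileLink('wallet.zip')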

Jupyter Notebook/Lab set current directory to ipynb file's

Desired behaviour
We have an existing workflow in vanilla Jupyter Notebook/Lab where we use relative paths to store outputs of some notebooks. Example:
/home/user/notebooks/notebook1.ipynb
/home/user/notebooks/notebook1_output.log
/home/user/notebooks/project1/project.ipynb
/home/user/notebooks/project1/project_output.log
In both notebooks, we produce the output by simply writing to ./output.log or so.
Problem
However, we are now trying Google Dataproc with the Jupyter optional component, and the current directory is always / regardless of which notebook the code is run from. This applies to both the Notebook and Lab interfaces.
What I've tried
Disabling c.FileContentsManager.root_dir='/' in /etc/jupyter/jupyter_notebook_config.py causes the current directory to be set to wherever I started jupyter notebook from, but it is always that initial starting folder instead of following the .ipynb notebook files.
Any idea on how to restore the "dynamic" current directory behaviour?
Even if it's not possible, I'd like to understand how Dataproc even makes Jupyter behave differently.
Details
Dataproc Image 2.0-debian10
Notebook Server 6.2.0
Jupyterlab 3.0.18
No, it is not possible to reliably get the current directory of your .ipynb file. Jupyter runs on the local filesystem of your cluster's master node and always uses the default system path for its kernel.
Even outside Dataproc it is not possible to consistently get the path of a Jupyter notebook. You can check out this thread on the topic.
You have to specify the directory path explicitly if you want your log file saved in a particular location.
Note that the GCS folder in your Lab refers to your cluster's Google Cloud Storage bucket. You can create an .ipynb in GCS, but when you execute it, it runs on the local system, so you will not be able to save log files directly to GCS.
EDIT:
It's not only Dataproc that makes Jupyter behave this way; you will see the same behaviour with Google Colab notebooks.
The reason is that your code always executes in the kernel, no matter where the notebook file is, and in theory multiple notebooks could connect to the same kernel. Thus you can't have multiple working directories for a single kernel.
As I mentioned earlier, by default when you start a notebook the current working directory is set to the path of the notebook.
Link to the main thread -> https://github.com/ipython/ipython/issues/10123
A general solution for most use cases seems to be the one described in this comment on the GitHub issue: https://github.com/ipython/ipython/issues/10123#issuecomment-354889020
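In the same spirit (not necessarily the exact recipe from the linked comment), a simple per-notebook workaround is to change the kernel's working directory at the top of each notebook so that relative output paths resolve next to the .ipynb. A sketch, where the path is an assumption to set per notebook:
import os
# Hypothetical notebook location -- set this once per notebook
os.chdir('/home/user/notebooks/project1')
# Relative paths now resolve next to the notebook, as in vanilla Jupyter
with open('./project_output.log', 'a') as f:
    f.write('run finished\n')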

How to config automatic sync Jupyter notebook .ipynb and .py files in VSCode e.g. by using Jupytext

I have written some Jupyter notebooks using the original Jupyter notebook web interface. All the notebooks are synced nicely that way.
But now I would like to edit my notebooks in VSCode, and I cannot get a notebook file to sync with its Python script.
I tried this using jupytext:
created a file named jupytext in the folder ~/.config
put the following into this file:
# Always pair ipynb notebooks to py:percent files
default_jupytext_formats = "ipynb,py:percent"
But it has no effect!
(Update) Could this be achieved, as a first solution, using VSCode Tasks (I haven't used tasks yet)?
Maybe it is possible to run a task with the jupytext command whenever the notebook file is opened/saved/modified?
Currently, VSCode does not support such a function. Jupyter support in VSCode is provided by the Python extension, which lets us convert between .ipynb files and .py files in VSCode:
.ipynb files to .py files : Export as python script.
.py files to .ipynb files : Right click, "Export Current Python File as Jupyter Notebook"
I have submitted the feature request you described, and we look forward to this feature being implemented. GitHub link: How to synchronize the jupyter file and python file of VSCode.
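In the meantime, one possible workaround is to run Jupytext yourself, for example from a VSCode task or a small script, to regenerate the paired file. A minimal sketch using the jupytext Python API, with notebook.ipynb as a placeholder filename:
import jupytext
# Read the notebook and write a paired py:percent script next to it.
# Re-run this after editing the .ipynb to refresh the .py version.
nb = jupytext.read('notebook.ipynb')   # placeholder filename
jupytext.write(nb, 'notebook.py', fmt='py:percent')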

File not found when running a file in the JupyterLab console

Every time I try to run a file in the JupyterLab console I get the following message:
ERROR:root:File 'thing.py' not found.
In this case, my file is called thing.py and I try to run it with the trivial run thing.py command in the console. The code runs and gives me correct results when executed directly in the console, but I wanted to have it saved, so I put it in a JupyterLab text file and changed the extension from .txt to .py. But I get the aforementioned message regardless of which file I try to run. I am new to JupyterLab and admit that I might have missed something important. Any help is much appreciated.
If you're running JupyterLab you should be able to:
create a new file and paste in your commands
rename that file to "thing.py"
then open a console in the same JupyterLab instance and run that file. Notice that you can see "thing.py" in the file explorer on the left.
Alternatively, you can use the %load magic command in a notebook to dynamically load the code into a notebook's cell.
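For example, once thing.py is in the kernel's working directory, either of these should work from the console or a notebook cell (filename taken from the question):
# Execute the script in the current namespace
%run thing.py
# Or pull the file's source into the current cell for editing
%load thing.py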
You might want to understand exactly what a JupyterLab file is, and what it is not. Jupyter notebooks have the extension .ipynb.
So, Jupyter notebooks are not saved or formatted as Python scripts. There are no Jupyter Notebook or Jupyter Lab files ending with the .py extension. That means Jupyter will not recognize a file with a .py, .txt, .R, or similar extension. Jupyter opens, reads, and saves files having the .ipynb extension.
Jupyter notebooks are an open document format based on JSON.
Jupyter can export to a number of different formats. Under the File tab is the Export feature; the last time I looked there were about 20 different export formats, but there isn't a Python or .py export format. A Jupyter file can also be downloaded: also under the File tab is the Download feature, which downloads a standard text-formatted JSON file. JSON files are mostly unreadable unless you've spent years coding JSON.
So there's not much purpose in downloading the Jupyter file unless you are working on a remote server and cannot save your work there. It makes much more sense to save and copy the Jupyter file in its native format, i.e. with the .ipynb extension, and then just open and use that file on another PC.
Hopefully this clarifies why Jupyter won't open any .py or .txt files.

Google CoLab - How to run a jupyter notebook file that is in the 'Files' tab (i.e. /content/) of my CoLab environment

In Google CoLab on the left is a pane that can be opened that shows Table of Contents, Code snippets, and Files.
In the Files pane there is an upload button, and I can upload a notebook file to this Files area. But once the notebook file is uploaded, there is no option to run it as a notebook. The menu option File -> Open notebook doesn't show the Colab /content/ files as an option for starting a notebook.
Is there a way to do this? Or can it be added in future releases?
The reason for this request is I'd like to git-clone a repo with multiple notebook files into the /content (or Files) area of CoLab. And then be able to easily switch between the notebooks, much like the native Jupyter notebook interface that shows a directory with potentially multiple notebooks that can be started.
I've tried right-clicking on the notebook file in Files, but there is no option to start the notebook. I've tried File -> Open notebook..., but the files from the Files pane aren't shown as an option in any of the tabs.
The desired result is that I can start .ipynb files (i.e. Jupyter notebooks) directly from the 'Files' or /content/ section of Google Colab.
You can run other notebooks in your current notebook like this:
# if the file was on the google drive
%run /content/gdrive/My\ Drive/Colab\ Notebooks/DenseVideoArchitecture.ipynb
# simply replace the path in your case
%run /content/DenseVideoArchitecture.ipynb
But what you are asking for is to switch between different notebooks in the same environment, which might not be possible in Colab.
I couldn't understand exactly what you need, but I hope the code below helps:
from google.colab import drive
drive.mount('/content/gdrive')
# use %cd (not !cd) so the directory change persists in the notebook's kernel
%cd "/content/gdrive/My Drive/Colab Notebooks"
You should mount your Google Drive so you can access it like a local drive. In this code, the first two lines mount Drive, and then I change into a folder in Google Drive, for example "Colab Notebooks"; from there you can run whatever you want.
