I work in Jupyter notebooks with VSCode as my editor. I have noticed that my checkpoint files are not being saved in the .ipynb_checkpoints folder as they should be.
To illustrate, I made a notebook called test.ipynb, and after starting to work on it (added a few lines of code), the following notebooks were autogenerated in the same directory (not in the .ipynb_checkpoints folder):
test-d636f411-dc52-42c3-8fd5-3ff98a144a25.ipynb
test-804fd168-59a6-40f4-bdb1-fcf1b2fe899f.ipynb
test-1dbb5747-9833-40cb-a382-4a42cae0008a.ipynb
This happens with all my notebooks and, as a result, crowds my working space. I keep manually deleting these redundant files (which by the way have no code in them), but I think there may be something wrong with my settings.
Has anyone experienced this before and how did you solve it? Thanks in advance!
I can't reproduce your problem, but it could be an issue with Jupyter Notebook itself.
Some people have run into this kind of problem; you can refer to this page for more details.
Could you try installing another version of Jupyter Notebook to see whether that helps?
Desired behaviour
We have an existing workflow in vanilla Jupyter Notebook/Lab where we use relative paths to store outputs of some notebooks. Example:
/home/user/notebooks/notebook1.ipynb
/home/user/notebooks/notebook1_output.log
/home/user/notebooks/project1/project.ipynb
/home/user/notebooks/project1/project_output.log
In both notebooks, we produce the output by simply writing to a relative path such as ./output.log.
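For instance, something along these lines inside project.ipynb (a sketch of the workflow described above, not the actual notebook code):

# Run from /home/user/notebooks/project1/project.ipynb in vanilla Jupyter,
# where the kernel's working directory is the notebook's directory, so this
# writes project_output.log next to the notebook.
with open("./project_output.log", "a") as f:
    f.write("output line\n")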
Problem
However, we are now trying Google Dataproc with the Jupyter optional component, and the current directory is always / regardless of which notebook the code is run from. This applies to both the Notebook and Lab interfaces.
What I've tried
Disabling c.FileContentsManager.root_dir='/' in /etc/jupyter/jupyter_notebook_config.py causes the current directory to be set to wherever I started jupyter notebook from, but it is always that initial starting folder rather than following the location of the .ipynb notebook files.
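For reference, this is roughly the setting being toggled, per the description above (a sketch of /etc/jupyter/jupyter_notebook_config.py, not the actual file Dataproc ships):

c = get_config()  # provided by Jupyter when it loads this config file

# Dataproc ships this set to '/', pinning the working directory to the
# filesystem root; commenting it out falls back to the directory the
# server was started from.
# c.FileContentsManager.root_dir = '/'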
Any idea on how to restore the "dynamic" current directory behaviour?
Even if it's not possible, I'd like to understand how Dataproc even makes Jupyter behave differently.
Details
Dataproc Image 2.0-debian10
Notebook Server 6.2.0
Jupyterlab 3.0.18
No, it is not possible to always get the directory your .ipynb file is in. Jupyter is running on the local filesystem of the master node of your cluster, and the kernel will always use the default system path.
Outside Dataproc, too, it is not possible to consistently get the path of a Jupyter notebook; you can check out this thread on the topic.
You have to specify the full directory path for your log file if you want it saved in a particular location.
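For example, something like this in the notebook (the path is just the hypothetical one from the question; adjust it to the master node's local filesystem):

import os

# Absolute path on the master node's local filesystem. A relative path
# would resolve against '/', the kernel's working directory on Dataproc.
log_path = "/home/user/notebooks/project1/project_output.log"

os.makedirs(os.path.dirname(log_path), exist_ok=True)
with open(log_path, "a") as f:
    f.write("notebook output line\n")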
Note that the GCS folder in your Lab interface refers to the Google Cloud Storage bucket of your cluster. You can create an .ipynb file in GCS, but when you execute it, it runs on the local system, so you will not be able to save log files to GCS directly.
EDIT:
It's not only Dataproc that makes Jupyter behave this way; you will see the same behaviour with Google Colab notebooks.
The reason is that your code always executes in the kernel, no matter where the notebook file lives, and in theory multiple notebooks can connect to the same kernel, so a single kernel can't have multiple working directories.
As mentioned earlier, by default when you start a notebook, the current working directory is set to the directory of the notebook.
Link to the main thread -> https://github.com/ipython/ipython/issues/10123
A general solution for most use cases seems to be the one described in this comment on the GitHub issue: https://github.com/ipython/ipython/issues/10123#issuecomment-354889020
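As a rough illustration of leaning on that default (a sketch only, not necessarily the exact approach from the linked comment): capture the working directory once at the top of the notebook and build output paths from it.

from pathlib import Path

# Under the default behaviour described above, the kernel's working
# directory at startup is the notebook's own directory. On setups like
# Dataproc, where the cwd is pinned to '/', this simply returns that
# fixed directory instead.
NOTEBOOK_DIR = Path.cwd()

with open(NOTEBOOK_DIR / "project_output.log", "a") as f:
    f.write("output line\n")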
I think two weeks ago, or right at the end of February, VSC updated as normal. Except, in my experience, it broke the Jupyter Notebook extension, kind of. I can still open notebooks and play with them, and they still connect to the kernel and run, but I can't save any new notebooks, which is frustrating because I need to make new notebooks for my classes.
First, I'll say that I have tried to update and reinstall jupyter. By that, I mean:
pip install --upgrade notebook
pip install --upgrade jupyterlab
pip install --upgrade jupyter
Then I uninstalled VSC, and reinstalled to see if that would fix it, but the problem remains.
Here are detailed pictures of what my issue is.
I open a new file and go to select a language.
Inside the languages, Jupyter isn't even there.
Instead I go down to open the command palette, because that's where you have to create new Jupyter notebooks anyway.
Here you can see it says create new notebook.
It does, and it works as it's supposed to.
Then I go to save the notebook, and this is totally new. It's saving it as JSON, but with the .ipynb extension. Additionally, the directory I'm saving into has other notebooks in it; they don't show up with this new .ipynb.json.
It doesn't matter whether I leave the extension alone, erase it, or force it to be .ipynb, .json, or .ipynb.json; I get this same error, and I'm not sure why.
I can't seem to get past this problem, and I'm not sure how to fix it, let alone what caused it. Maybe somebody has had this before and worked around it, or currently has this problem?
I cannot reproduce your problem. Could you try disabling all the extensions and then enabling only the Python-related ones? If it still does not work, could you try installing an older version of VSCode? You can refer to this page.
Change "save as type" to "All files (.)" if *.ipynb isn't a choice outright to make sure you get *.ipynb as the extension, not *.ipynb.json.
I installed spyder with pip (not with anaconda).
Whenever I save a .py file, Spyder also generates a temp file in the same directory containing a copy of the file's source; each file starting with "tmp" contains the source of a different version of main.py. In the past, when I installed Spyder with Anaconda, this never occurred.
Is there a way to deactivate this behaviour, or at least to force Spyder to save these temp files somewhere else?
Not sure if you are still troubled by this issue. Are you using Dropbox or a similar service?
I ran into the same problem, and I found the reason on GitHub; see #13041.
If you work outside Dropbox, you won't see any tmp files. This works for me, and I hope it helps you.
Ok, this must be silly but I really can't find a good way to do it.
When I open a Python session with Jupyter Notebook or JupyterLab, I can't access my scripts in other folders, which are normally included in PYTHONPATH.
If I run the Jupyter Notebook from PyCharm, it actually works, and I can see that I have the correct PYTHONPATH by doing
import os
os.environ['PYTHONPATH']
but if I do the same in a session started from the CLI, I get a KeyError because PYTHONPATH is not found.
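For context, a quick way to compare what each kernel actually sees (an illustrative check, not from the original post; os.environ only shows PYTHONPATH if the kernel inherited it from the process that launched it):

import os
import sys

# The variable may simply be absent from the kernel's environment, which is
# what produces the KeyError with os.environ['PYTHONPATH'].
print(os.environ.get("PYTHONPATH", "<not set>"))

# What actually controls imports is sys.path, which PYTHONPATH feeds into
# when the interpreter starts.
print(sys.path)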
This must have been solved a thousand times, how come I can't find anything about it?
I have a custom Jupyter kernel which runs IPython using a custom IPython profile which uses a matplotlib stylesheet.
I know to run this successfully normally I would put:
The matplotlib stylesheet in ~/.config/matplotlib/stylelib/
The IPython profile in ~/.ipython/
The kernel json in ~/.jupyter/kernels/my_kernel/
But I am doing this as part of a larger program which runs in a virtualenv, and if I put the files in the locations above then any notebook server running on the computer will be able to see the custom kernels, even if it is running outside the venv. I don't want this because I don't want my program to interfere with other notebooks on the computer.
I think what I need to do is put these files somewhere equivalent inside the venv, but I can't figure out where they should go. Does anyone know where they would go? Or is this just a thing IPython/Jupyter can't/won't do?
It's probably worth mentioning that in the case of the stylesheet for example I don't want to just put it in the working directory of my program (which is one option matplotlib offers).
You can put kernelspecs in VIRTUAL_ENV/share/jupyter/kernels/ and they will be made available if the notebook server is running in that env. In general, <sys.prefix>/share/jupyter/kernels is included in the path to look for kernelspecs.
To see the various locations Jupyter will look in, check the output of jupyter --paths:
$ jupyter --paths
config:
/Users/you/.jupyter
/Users/you/env/etc/jupyter
/usr/local/etc/jupyter
/etc/jupyter
data:
/Users/you/Library/Jupyter
/Users/you/env/share/jupyter
/usr/local/share/jupyter
/usr/share/jupyter
runtime:
/Users/you/Library/Jupyter/runtime
Kernelspecs are considered data files, and will be found in any of those directories listed under data:, in a kernels subdirectory, e.g. /usr/local/share/jupyter/kernels.
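As an illustration, here is a minimal sketch of installing such a kernelspec from inside the venv (the kernel name my_kernel and display name are taken from the question's layout; the kernel.json fields assume a standard IPython kernel):

import json
import os
import sys

# sys.prefix points at the virtualenv when this is run inside it, so the
# kernelspec ends up in <venv>/share/jupyter/kernels/my_kernel/ and is only
# visible to notebook servers running in that environment.
kernel_dir = os.path.join(sys.prefix, "share", "jupyter", "kernels", "my_kernel")
os.makedirs(kernel_dir, exist_ok=True)

kernel_spec = {
    "argv": [sys.executable, "-m", "ipykernel_launcher", "-f", "{connection_file}"],
    "display_name": "My Kernel (venv)",
    "language": "python",
}

with open(os.path.join(kernel_dir, "kernel.json"), "w") as f:
    json.dump(kernel_spec, f, indent=2)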