In earlier versions of IPython it was possible to load a specific profile by running
ipython notebook --profile=my_profile
so that I could include things like autoreload in my_profile.
Since switching to IPython 4.0.1 (Jupyter, actually) I'm getting the
[W 09:21:32.868 NotebookApp] Unrecognized alias: '--profile=my_profile', it will probably have no effect.
warning, and the profile is not loaded. Have you come across a workaround?
In Jupyter create a kernel for each profile.
Find the kernels directory under the Jupyter data directory, {jupyter data dir}/kernels (on my Mac this is $HOME/Library/Jupyter/kernels):
$ mkdir profile-x-kernel
$ cat << EOF > profile-x-kernel/kernel.json
{
"display_name": "Profile X (Python 3)",
"language": "python3",
"env": { },
"argv": [ "python3", "-m", "ipykernel", "--profile", "profile_x", "-f", "{connection_file}" ]
}
EOF
Then you can select the kernel from the kernel menu in Jupyter, without having to restart the whole notebook server.
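To confirm the kernel was registered, jupyter kernelspec list should now show the new directory alongside the default kernel; on my Mac the output looks roughly like this:
$ jupyter kernelspec list
Available kernels:
  profile-x-kernel    $HOME/Library/Jupyter/kernels/profile-x-kernel
  python3             /usr/local/share/jupyter/kernels/python3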
Related
I recently set up a fresh EC2 instance for development running Amazon Linux 2. To run the current version of Prefect (https://orion-docs.prefect.io/) I had to install an up-to-date version of SQLite3, which I compiled from source. I then set the LD_LIBRARY_PATH environment variable to "/usr/local/lib", and installed Python 3.10.5 with the LDFLAGS and CPPFLAGS compiler arguments pointing at that folder as well, so that the new SQLite libraries are found by Python. All good so far: when running the Jupyter notebook server or the prefect orion server from the terminal, everything works fine. But if I want to use the integrated Jupyter environment in VS Code, I run into the issue that the kernel does not start:
Failed to start the Kernel.
ImportError: /home/mickelj/.pyenv/versions/3.10.5/lib/python3.10/lib-dynload/_sqlite3.cpython-310-x86_64-linux-gnu.so: undefined symbol: sqlite3_trace_v2.
This leads me to believe that the system SQLite library is being used, as this is the same error I get when I unset the LD_LIBRARY_PATH env variable. However, when calling
ldd /home/mickelj/.pyenv/versions/3.10.5/lib/python3.10/lib-dynload/_sqlite3.cpython-310-x86_64-linux-gnu.so
I get the following:
linux-vdso.so.1 (0x00007ffcde9c8000)
libsqlite3.so.0 => /usr/local/lib/libsqlite3.so.0 (0x00007f96a3339000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f96a311b000)
libc.so.6 => /lib64/libc.so.6 (0x00007f96a2d6e000)
libz.so.1 => /lib64/libz.so.1 (0x00007f96a2b59000)
libm.so.6 => /lib64/libm.so.6 (0x00007f96a2819000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f96a2615000)
/lib64/ld-linux-x86-64.so.2 (0x00007f96a3870000)
Here the new sqlite3 library is correctly referenced. If I unset the LD_LIBRARY_PATH variable, the second line changes to:
libsqlite3.so.0 => /lib64/libsqlite3.so.0 (0x00007f9dce52e000)
So my guess is that the VS Code Jupyter integration does not pick up environment variables. My question is: is there a way to specify them (in particular LD_LIBRARY_PATH) globally for VS Code, for the built-in Jupyter server at runtime, or anywhere else, to fix this?
The Jupyter extension has recently been fixing .env-related problems.
You can try installing VS Code Insiders together with the pre-release version of the Jupyter extension.
Using ipykernel to create a custom kernel spec with an env variable solved this for me.
Steps:
Create a kernelspec with your environment.
conda activate myenv # activate the env in question, using conda as an example
# pip install ipykernel # in case you don't have one
python -m ipykernel install --user --name myenv_ldconf
Edit the kernelspec file and add the env variable to the JSON object:
nano ~/.local/share/jupyter/kernels/myenv_ldconf/kernel.json
You will see something like this:
{
"argv": [
"/home/alice/miniconda3/envs/myenv/bin/python",
"-m",
"ipykernel_launcher",
"-f",
"{connection_file}"
],
"display_name": "myenv_ldconf",
"language": "python",
"metadata": {
"debugger": true
}
}
After adding the env variable:
{
"argv": [
"/home/alice/miniconda3/envs/myenv/bin/python",
"-m",
"ipykernel_launcher",
"-f",
"{connection_file}"
],
"display_name": "myenv_ldconf",
"language": "python",
"env": {"LD_LIBRARY_PATH": "/home/alice/miniconda3/envs/myenv/lib"},
"metadata": {
"debugger": true
}
}
Ref: How to set env variable in Jupyter notebook
Change the kernel in VS Code to myenv_ldconf.
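To check that the kernel really picked up the variable (and, for the SQLite problem above, that the right library is now loaded), you can run a quick sanity check in a notebook cell; the expected path is simply the one from my kernel.json above:
import os, sqlite3
print(os.environ.get("LD_LIBRARY_PATH"))  # expect /home/alice/miniconda3/envs/myenv/lib
print(sqlite3.sqlite_version)             # should be the newly compiled SQLite, not the system one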
With my conda environment activated I set an environment variable with the command
conda env config vars set ROOT_DIRECTORY=$PWD
Now, if I run echo $ROOT_DIRECTORY the output shows /home/augusto/myproject.
How can I get that variable inside Jupyter Notebook? I tried with the command below, but the output shows None.
import os
print(os.getenv('ROOT_DIRECTORY'))
By the way, I am sure that Jupyter Notebook is using the correct kernel. Running the above code inside a .py file works correctly, i.e. the output shows /home/augusto/myproject.
Find the kernel's directory:
jupyter kernelspec list
cd path_to_kernel_dir
sudo nano kernel.json
Add an "env" section to the JSON.
The JSON should look something like this:
{
"argv": [
"/home/jupyter-admin/.conda/envs/tf/bin/python3",
"-m",
"ipykernel_launcher",
"-f",
"{connection_file}"
],
"env": {"LD_LIBRARY_PATH":"/home/jupyter-admin/.conda/envs/tf/lib/"},
"display_name": "tf_gpu",
"language": "python",
"metadata": {
"debugger": true
}
}
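The likely reason the variable showed up as None in the first place is that the kernel is launched straight from the kernelspec's argv, without going through conda activate, so variables set with conda env config vars are never applied. For the ROOT_DIRECTORY case from the question, the entry would be "env": {"ROOT_DIRECTORY": "/home/augusto/myproject"}; after restarting the kernel, the original check should then print the path:
import os
print(os.getenv('ROOT_DIRECTORY'))  # /home/augusto/myproject once the kernel has been restarted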
I have added the conda environment to my Jupyter notebook, but the Python version is still 3.8 (Fig 1). What I would like to do is use an environment containing Python 3.7 in a Jupyter notebook, without starting from a command prompt and running two separate Jupyter notebooks (Fig 2). Is it possible to have just one Jupyter notebook with two separate environments and two different Python versions?
Fig1
Fig2
Please read this page of the docs for ipykernel.
In your environment you want to install only ipykernel (not the full Jupyter), and use one of the ipykernel install --name ... commands to register the kernel with Jupyter.
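For example (the env and kernel names here are just placeholders), registering a second conda environment as its own kernel could look like this:
$ conda activate py37env
$ pip install ipykernel   # only ipykernel, not the full Jupyter
$ python -m ipykernel install --user --name py37env --display-name "Python 3.7 (py37env)"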
If that does not work, use jupyter kernelspec list to see which kernels Jupyter sees and where. A kernelspec is, at minimum, a kernel.json file in the right place that tells Jupyter how to find and start a kernel.
For example, I have the following:
$ cat ~/miniconda3/share/jupyter/kernels/python3/kernel.json
{
"argv": [
"~/miniconda3/bin/python",
"-m",
"ipykernel_launcher",
"-f",
"{connection_file}"
],
"display_name": "Python 3 (ipykernel)",
"language": "python",
"metadata": {
"debugger": true
}
}
I can use the documentation above, or create the following by hand:
$ cat ~/miniconda3/share/jupyter/kernels/python3.6/kernel.json
{
"argv": [
"~/miniconda3/envs/mypython36env/bin/python",
"-m",
"ipykernel_launcher",
"-f",
"{connection_file}"
],
"display_name": "Python 3.6 !!",
"language": "python",
"metadata": {
"debugger": true
}
}
and assuming I have the corresponding Python 3.6 env, I'll get two kernels, one of them being Python 3.6.
I tried installing jupyter notebook without using anaconda and ran into some issues, specifically the red 'Kernel Error' that kept showing up.
However, through this question I was partly able to identify the issue: the default Python path in the kernel.json file in C:\Users\Ashish\AppData\Roaming\jupyter\kernels\python3 was for Anaconda, so I added my own Python path, found with where python.
Running jupyter notebook from cmd and opening a .ipynb file causes a popup to appear: "Could not find a kernel matching Python 3. Please select a kernel", which shows an empty drop-down list.
My updated kernel.json file:
{
"argv": [
"C:\Users\Ashish\AppData\Local\Programs\Python\Python38\python.exe",
"-m",
"ipykernel_launcher",
"-f",
"{connection_file}"
],
"display_name": "Python 3",
"language": "python"
}
Never mind, running jupyter kernelspec list showed an error:
json.decoder.JSONDecodeError: Invalid \escape: line 3 column 6 (char 18)
Fixed it by using C:\\Users\\Ashish\\AppData\\Local\\Programs\\Python\\Python38\\python.exe in the kernel.json file, since backslashes must be escaped in JSON.
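A quick way to catch this kind of escaping mistake is to run the file through a JSON parser before pointing Jupyter at it, for example:
python -m json.tool "C:\Users\Ashish\AppData\Roaming\jupyter\kernels\python3\kernel.json"
This reports the same Invalid \escape error if the backslashes are not doubled.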
When defining my own custom IPython/Jupyter kernels, is there a way to set a kernel-specific default directory directly in the kernel.json file?
{
"argv": [ "python2", "-m", "IPython.kernel",
"-f", "{connection_file}"],
"display_name": "tmp_kernel",
"language": "python"
}
My goal is to create a kernel that by default creates its notebooks in my /tmp directory.
At the moment I am starting ipython in multiple terminals, which is rather inconvenient.