Reconnecting to a remotely running kernel with JupyterLab - python

I am working on a remote server with JupyterLab and have one job running. However, the connection dropped and now I'm trying to reconnect to the same running kernel. I honestly read through many examples and the Jupyter docs, but I couldn't find a solution. My previous run was outputting intermediate results, and I am wondering whether I can reconnect to the running kernel and continue seeing the output.
I normally connect via ssh:
ssh -L 8000:localhost:8080 user@123.45.678.9
...
then I run
jupyter notebook --no-browser --port=8080
and in the browser on my local machine I simply open 'localhost:8000' and it works nicely.
I tried to repeat those steps, but I can't reconnect to the existing running kernel and continue seeing the output.
Any suggestions please?

Now I understand your problem: you are not keeping the server running. Instead, you launch it manually every time.
Basically, the idea is that you need to keep it running, for example with nohup jupyter notebook --no-browser --port=8080 & or with systemd, so that when you lose the connection, the Jupyter server is still running.
Then you can just reconnect to the server with ssh -L 8000:localhost:8080 user@123.45.678.9 and open localhost:8000. You will find that everything is just as you left it.
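For the systemd route, a minimal user unit might look like the sketch below. The file location, unit name, and jupyter binary path are assumptions; adjust them for your machine:

```ini
# ~/.config/systemd/user/jupyter.service  (hypothetical location and name)
[Unit]
Description=Jupyter notebook server

[Service]
ExecStart=/usr/bin/jupyter notebook --no-browser --port=8080
Restart=on-failure

[Install]
WantedBy=default.target
```

Enable it with systemctl --user enable --now jupyter.service. A tmux or screen session is another common alternative to nohup.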

Related

Error when opening Jupyter Notebook from PuTTY

I'm trying to open Jupyter Notebook from PuTTY. I have a server where Python and Jupyter are installed. I followed all the steps from this post, Remote access Jupyter notebook from Windows, but it doesn't work. I get the error: This site can't be reached.
Any idea?
Thx
edit: I added a photo with the ps output from PuTTY and the error from the browser. It says:
This site can't be reached
The good old logout/reboot mechanisms work in this case too!
You can try closing the Jupyter tab, logging out of the remote server, reconnecting, and trying again. It has worked a few times for me today.
If that doesn't work, there might be stale Jupyter notebook processes on the remote server. Find them, kill them, then log out, log back in, and try again.
If that still does not work, try running your code through IPython to check whether Jupyter is working at all, even if its user interface isn't loading in your browser.
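To hunt down stale notebook processes from the login shell, something along these lines should work (the exact process name varies by install, so treat "jupyter-notebook" as an assumption):

```shell
jupyter notebook list        # list servers Jupyter itself knows about
pgrep -af jupyter            # show matching processes with their PIDs
pkill -f jupyter-notebook    # kill the stale ones
```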

How to run jupyter notebooks on a remote server with job submissions?

I am trying to access some data from a simulation that I have run on a supercomputer that I have access to. I want to process it using a jupyter notebook, but don't want to download the data. Therefore, I want to run the jupyter notebook on the remote server and somehow access it from my local directory.
I am aware of the past solutions using port forwarding, but this does not work in my case (I've tried it!)
I think the reason for this is that I'm not actually running the jupyter notebook on the remote server. The remote server (say me#remoteserver) is just the node where I login. I then qsub a job submission script which runs on a different node.
Is there a way to access jupyter notebooks that I run using this job submission script?
Sharing more about how you tried it with qsub might make it easier to find a solution. I use Slurm on my remote machine, but I guess the steps should be the same.
You can first request an interactive compute node with
qsub -I -q shared -l nodes=1:ppn=1,walltime=2:00:00
Then, once the resource is allocated, start the notebook on that node:
jupyter notebook --no-browser --port="port number" --ip=$(/bin/hostname)
Make sure to replace the port number; the --ip flag binds the server to the compute node's hostname so it is reachable from outside that node.
Copy the generated URL into your browser to access the notebook.
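Since the notebook now runs on a compute node rather than the login node, reaching it from your laptop takes a two-hop tunnel. A sketch, where "computenode" stands for the allocated node's hostname and 8888 for the port you passed to jupyter (both placeholders):

```shell
# Single command using a jump host (OpenSSH 7.3+):
ssh -L 8888:localhost:8888 -J me@remoteserver me@computenode

# Or tunnel through the login node directly:
ssh -L 8888:computenode:8888 me@remoteserver
```

Then open localhost:8888 in your local browser.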

Jupyter notebook error: The port 8888 is already in use, trying another port

I just installed Anaconda on my Windows 10 machine, but when I try to run Jupyter Notebook or JupyterLab from the icon, nothing happens.
Tried to run Jupyter from the Anaconda Prompt and got the error: The port 8888 is already in use, trying another port.
Tried several different ports and the error is the same.
Tried to kill the task by PID, but when I run Jupyter again from the prompt it hangs forever and nothing happens.
Tried to change the browser via manual config and everything stays the same.
Tried to uninstall Anaconda and install it again several times and boom: same errors.
I also checked proxy settings, but everything related to this is disabled on my PC.
Checked firewall and antivirus, and everything seems normal there too.
Any other recommendation is welcome :)
As mentioned here, you may just switch to another port, for example:
jupyter notebook --port 8889

nohup jupyter notebook on remote server - closed browser, is the output still there?

I am running a Jupyter notebook on AWS EC2 via
nohup jupyter notebook
Unfortunately my connection dropped, but when I reopen the notebook, it says:
kernel connected
I have a few print() commands in the script to see how the program is progressing. However, these have stopped (this is by design of the Jupyter notebook and cannot be changed yet). I wonder: is there a way to check the progress of my program now? Is it even still running and giving me output?
I have assigned the results to a variable, and at the end of the long-running loop there is a command to export to CSV...
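One workaround for future runs (a sketch, not the asker's exact setup): write progress to a log file from inside the loop instead of relying on print(), so you can watch the file over a plain SSH session even after the browser connection drops. The file name progress.log and the toy computation are assumptions:

```python
# Sketch: log loop progress to a file so it survives a dropped browser
# connection. "progress.log" and the squaring step are placeholders.
def long_job(n_steps, log_path="progress.log"):
    results = []
    with open(log_path, "a") as log:
        for i in range(n_steps):
            results.append(i * i)  # stand-in for the real work
            log.write(f"step {i + 1}/{n_steps} done\n")
            log.flush()  # make each line visible to `tail -f` immediately
    return results

print(long_job(3))  # → [0, 1, 4]
```

You can then run tail -f progress.log in a separate SSH session to watch the job's progress.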

How to use iPython notebook with ngrok

(Maybe better ask on superuser?)
The IPython notebook works fine if I use an SSH tunnel.
Using ngrok, the IPython notebook loads but I get an error about MathJax not loading.
I can enter code into cells, but if I try to execute it I get no result, even though the kernel seems to be running. Basically nothing works. I have no idea if I am doing something wrong or if this just won't work.
I am starting ngrok like this
./ngrok -authtoken myauthtoken 5023
and ipython notebook like
ipython notebook --no-browser --port=5023
Then connect to the iPython session at
https://mysubdomain.ngrok.com
Author of ngrok here: IPython notebooks and any other websocket connections now work properly as of ngrok 0.22, which is available at https://ngrok.com/download
