Remotely accessed Jupyter notebook is working slowly - python

A Jupyter notebook accessed remotely from a Linux machine is working very slowly, and the Ambari server hosts page shows that the components are down. Why?
The notebook itself works: it can be connected to from that machine via PuTTY, using the --no-browser command.

Related

How to Host a Jupyter Notebook as a website from terminal-only Linux

Given a terminal-only installation of Ubuntu, is it possible to host a Jupyter notebook so that its GUI can be accessed over HTTP(S) from another device?
If not, then with the Jupyter notebook running on the GUI of a Linux box, how can access be made public so that the notebook on that Linux box can be used as a private cloud service?
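A minimal sketch of an answer, assuming the classic Jupyter Notebook CLI: the standard --no-browser, --ip, and --port flags are enough to serve a notebook headlessly and bind it to all interfaces, so another device on the network can browse to http://<server-ip>:8888 (for HTTPS you would additionally pass --certfile and --keyfile).

```python
# Hypothetical headless-hosting command, built as a string so the pieces are
# explicit. Run the printed command on the terminal-only server itself.
port = 8888  # placeholder port; pick any free one
host_cmd = f"jupyter notebook --no-browser --ip=0.0.0.0 --port={port}"
print(host_cmd)
```

Binding to 0.0.0.0 exposes the notebook to the whole network, so token or password authentication (which the notebook server enables by default) matters here.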

connect to local runtime via google colab with gpu

My goal is to connect to a Google Colab GPU from my local machine via Jupyter notebook.
I have done these things from the documentation:
pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
Then I open Jupyter with this command:
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0
After that I connect to Colab with my URL from the terminal, and then I get this in my terminal:
[I 18:12:04.374 NotebookApp] 302 GET /?token=HERE IS MY TOKEN (MY IP) 0.000000ms
I don't actually know what the 302 GET means.
Finally, if I use os.getcwd() in Colab, it shows my LOCAL directory from my PC,
and print(torch.cuda.is_available()) returns False.
So I've connected to Colab from my machine, but my goal is the reverse: to use the GPU on my local machine. Maybe I'm doing something wrong.
I think you've got it wrong.
Colab is only a GUI that runs in the web browser and on a normal Google web server (with some access to Google Drive), not on a special Google server with a GPU.
Colab can connect to a hosted runtime, meaning a Google server (hardware) with a GPU; then you can directly access files on that server and run code on hardware with a GPU.
Or it can connect to a local runtime, meaning your local computer (hardware) without a GPU; then you can directly access local files, but you can run code only on your local hardware.
You don't have access to both runtimes (both sets of hardware) at the same time.
This is how I see it:
Connect to Google Server with GPU/TPU:
Connect to Local Computer without GPU:

Access Jupyter terminal from iTerm

When I start a jupyter notebook, I am able to start a terminal prompt running on the host machine with "New > Terminal".
Is it possible to connect to this terminal with, for instance, iTerm, instead of using the web interface?
How does jupyter connect to the remote terminal?
Note: I am using a remote Jupyter notebook with several forwarded ports. I am not able to directly open a terminal on that machine.

What is the use of Jupyter Notebook cluster

Can you tell me what the use of a Jupyter cluster is? I created a Jupyter cluster and established its connection, but I'm still confused about how to use the cluster effectively.
Thank you
With a Jupyter Notebook cluster, you can run the notebook on the local machine and connect to the notebook on the cluster by setting the appropriate port number. Example:
Go to the server using ssh username@ip_address.
Set up the port number for running the notebook. On the remote terminal, run jupyter notebook --no-browser --port=7800
On your local terminal, run ssh -N -f -L localhost:8001:localhost:7800 username@ip_address
Open a web browser on the local machine and go to http://localhost:8001/
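The tunnel step above can also be sketched from Python, which makes the structure of the -L argument explicit. username@ip_address and both port numbers are placeholders from the answer; substitute your own.

```python
import subprocess

# Forward local port 8001 to port 7800 on the remote server, mirroring the
# ssh command in the steps above. -N: no remote command; -f: background.
local_port, remote_port = 8001, 7800
tunnel_cmd = [
    "ssh", "-N", "-f",
    "-L", f"localhost:{local_port}:localhost:{remote_port}",
    "username@ip_address",  # placeholder: your user and server
]
print(" ".join(tunnel_cmd))
# subprocess.run(tunnel_cmd, check=True)  # uncomment to actually open the tunnel
```

After the tunnel is up, http://localhost:8001/ on the local machine reaches the notebook listening on port 7800 of the server.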

How to link local [py]spark to a local jupyter notebook on Windows

I have just started learning spark and have been using R & Python on Jupyter notebook in my company.
Both Spark and Jupyter are installed locally on my computer and function perfectly fine individually.
Instead of creating a .py script for pyspark in cmd every single time, could I possibly connect it to my Jupyter notebook and run the scripts there live? I have seen many posts on how to achieve this on Linux and Mac, but sadly I will have to stick with Windows 7 in this case.
Thanks!
Will
You could use the Sandbox from Hortonworks (http://hortonworks.com/downloads/#sandbox) and run your code in Apache Zeppelin.
No setup necessary: install VirtualBox and run the sandbox, then access Zeppelin and Ambari via your host (Windows) browser and you are good to go to run your %pyspark code. Zeppelin has a look and feel similar to Jupyter.
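An alternative that keeps everything local, offered as an assumption on my part rather than part of the Sandbox answer: point the PySpark driver at Jupyter before launching it. With these two environment variables set (and SPARK_HOME already pointing at your local Spark install), running `pyspark` from cmd opens a notebook whose kernels have `spark` and `sc` predefined.

```python
import os

# Tell the pyspark launcher to use Jupyter as the driver's Python frontend.
os.environ["PYSPARK_DRIVER_PYTHON"] = "jupyter"
os.environ["PYSPARK_DRIVER_PYTHON_OPTS"] = "notebook"
print(os.environ["PYSPARK_DRIVER_PYTHON"],
      os.environ["PYSPARK_DRIVER_PYTHON_OPTS"])
```

On Windows the same effect comes from `set PYSPARK_DRIVER_PYTHON=jupyter` and `set PYSPARK_DRIVER_PYTHON_OPTS=notebook` in cmd before running `pyspark`.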
