Does dash work in Google Cloud Datalab? - python

Is it possible to run Dash apps within Google Cloud Datalab?
It says it is running on a specific port, but every time I try to open that port, the connection is refused. The same app works fine when I run it from a regular Jupyter notebook.
What could be the source of the problem?
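For reference, a minimal sketch of the kind of app in question (assuming a Dash 2.x install; port 8050 is just an example). The point is that the server has to listen on 0.0.0.0 rather than only on the kernel's loopback, and Datalab typically runs the notebook inside a Docker container, so the chosen port also has to be exposed/forwarded to your browser or the connection is refused exactly as described:

# Minimal Dash app; only reachable from the browser if the chosen port
# is exposed/forwarded out of the Datalab container.
import dash
from dash import html

app = dash.Dash(__name__)
app.layout = html.Div("Hello from Dash")

if __name__ == "__main__":
    # listen on all interfaces, not just localhost inside the container
    app.run_server(host="0.0.0.0", port=8050, debug=False)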

Related

Cannot make a Jupyter notebook remote server

I am trying to host a Jupyter notebook server on the AWS cloud. I have followed the steps below, yet I cannot seem to access the notebook from my local browser. Can someone please tell me what is wrong?
https://docs.aws.amazon.com/dlami/latest/devguide/setup-jupyter.html
https://towardsdatascience.com/installing-pytorch-on-apple-m1-chip-with-gpu-acceleration-3351dc44d67c
https://docs.anaconda.com/anaconda/user-guide/tasks/remote-jupyter-notebook/
I have followed the tutorial exactly, yet I just cannot seem to get past the security step for some reason.
I wonder what this means; I cannot seem to fix it.
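In case it helps to compare against a known-good setup: for the classic notebook server, the relevant settings live in ~/.jupyter/jupyter_notebook_config.py. A hedged sketch is below; newer Jupyter releases use c.ServerApp.* instead of c.NotebookApp.*, and the port is a placeholder:

# ~/.jupyter/jupyter_notebook_config.py (classic notebook server)
c = get_config()  # provided by Jupyter when it loads this file

c.NotebookApp.ip = '0.0.0.0'        # listen on all interfaces, not only localhost
c.NotebookApp.port = 8888           # must also be open in the instance's security group
c.NotebookApp.open_browser = False
# paste a hash generated with:  python -c "from notebook.auth import passwd; print(passwd())"
c.NotebookApp.password = 'sha1:...'

Alternatively, the Anaconda guide linked above tunnels the port over SSH instead of opening it publicly, in which case none of these settings are strictly needed.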

Connect Google credits to Google Colab

It's the first time I am trying to use Google credits, so I apologize if it's a basic question. I am trying to see how to connect Google credits to Google Colab following this guide: https://medium.com/#senthilnathangautham/colab-gcp-compute-how-to-link-them-together-98747e8d940e (you can open it by creating a new incognito window).
I am stuck at step 3 because I can't see any SSH option in my Google Cloud console. Also, are the numbers after -L fixed? If not, how can I find them?
gcloud compute ssh colab-backend --zone=europe-west4-a -- -L 8081:localhost:8081
EDIT: I am trying to run the above line of code in the Google Cloud SDK Shell, but I get this error.
Also, I can't type jupyter notebook in the terminal. If I run the above code in a Python 3 Jupyter notebook I get this strange error.
Actually, I had made a very silly, basic mistake. First of all, your server should be Linux, and then you run the following in the Windows command prompt:
gcloud compute ssh colab-backend --zone=europe-west4-a -- -L 8080:localhost:8081
(change the zone and the ports to match your project), then type
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0 --no-browser
and copy-paste the link into 'Connect to a local runtime' in your Google Colab.

connect to local runtime via google colab with gpu

My goal is to connect to the Google Colab GPU from my local machine via a Jupyter notebook.
I have done the following, per the documentation:
pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
Then I open Jupyter with this command:
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0
After that I connect to Colab with the URL from my terminal, and I get this in my terminal:
[I 18:12:04.374 NotebookApp] 302 GET /?token=HERE IS MY TOKEN (MY IP) 0.000000ms
Actually, I don't know what the 302 GET means.
Finally, if I run os.getcwd() in Colab it shows me my LOCAL directory from my PC,
and print(torch.cuda.is_available()) is False.
So I've connected to Colab from my machine, but the goal is the other way around: to get a GPU on my local machine. Maybe I'm doing something wrong.
I think you've got it wrong.
Colab is only a GUI which runs in your web browser and on a normal Google web server (with some access to Google Drive), not on a special Google server with a GPU.
Colab can connect to a hosted runtime, which means a Google server (hardware) with a GPU; then you can directly access files on that server and run code on hardware with a GPU.
Or it can connect to a local runtime, which means your local computer (hardware) without a GPU; then you can directly access local files and run code only on your local hardware.
You don't have access to both runtimes (both machines) at the same time.
This is how I see it (diagrams omitted): connect to a Google server with GPU/TPU, or connect to your local computer without GPU.
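A quick way to confirm which machine a notebook cell is actually executing on is to print a few things from inside the notebook; this is only a sketch and assumes torch is installed in whichever runtime you are connected to:

# Run this in a notebook cell to see which runtime is executing the code.
import os
import socket

print("Hostname:", socket.gethostname())   # your PC's name => local runtime
print("Working dir:", os.getcwd())         # a path from your PC => local runtime

try:
    import torch
    # True only if the machine executing the cell actually has a usable GPU
    print("CUDA available:", torch.cuda.is_available())
except ImportError:
    print("torch is not installed in this runtime")

With a local runtime you will see your own hostname, local paths, and False for CUDA (as in the question); with a hosted GPU runtime you would see Google's machine instead.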

VFS connection does not exist

I'm following the AWS tutorial Build a Modern Web Application - [Python].
I'm at Module 2B: Deploy a Service with AWS Fargate, Step B: Test the Service Locally.
I run my Docker image successfully with:
docker run -p 8080:8080 xxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/mythicalmysfits/service:latest
When I preview the website in AWS Cloud9, I get the following error:
Oops, VFS connection does not exist
I've tried the following:
created a new Docker image in a different region
checked the Flask app routing (all good)
double-checked my account ID
checked the AWS documentation
After all of this I still can't figure out what is causing the error. Am I missing something?
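One way to narrow this down, before touching the browser preview at all: with the container running, talk to it directly from the Cloud9 terminal using only the standard library. The route "/" below is an assumption; use whichever route your Flask app actually defines. If this prints a response, the container and the app are fine and the problem is on the preview/browser side (incognito mode, ad blockers):

# Sanity check from the Cloud9 terminal, bypassing the browser preview.
import urllib.request

url = "http://localhost:8080/"  # assumes the Flask app serves something at /
try:
    with urllib.request.urlopen(url, timeout=5) as resp:
        print(resp.status, resp.read()[:200])
except Exception as exc:
    print("Container did not respond:", exc)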
Do not run Cloud9 in a browser in Incognito mode.
Remove ad blockers.
Check the docker run output for errors.
I found the solution, which was to open the console in Chrome and do the docker run there.
Here's what it looked like for me:
I found the solution!
- Ad Blockers!
As soon as I disabled them, it worked.
My solution was to use the same (normal) browser, not a private/incognito window.

How can I connect VS Code to a GPU instance on Google Cloud Platform?

I'm on a Windows 10 machine. I have a GPU instance running on Google Cloud Platform to train deep learning models.
Historically, I have run Jupyter notebooks on the cloud server without problems, but recently I began preferring to run Python notebooks in VS Code instead of the server-based Jupyter notebooks. I'd like to train my VS Code notebooks on my GPUs, but I don't have access to my Google instances from VS Code; I can only run locally on my CPU.
Normally, to run a typical model, I spin up my instance from the cloud.google.com Compute Engine interface. I use the Ubuntu installation on the Windows Subsystem for Linux and connect like this:
gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080
So far I have tried installing the Cloud Code extension in VS Code, but as I go through the tutorials I always get stuck at some point. One recurring problem is that gcloud won't work anywhere EXCEPT my Ubuntu terminal; I'd like it to work in the terminal inside VS Code.
Alternatively, I'd like to run the code . command from my Ubuntu command line so I can open VS Code from there, but that doesn't work either. I've googled a few solutions, but they lead back to the same problems: neither gcloud nor code . works.
Edit: I just tried the Google Cloud SDK installer from https://cloud.google.com/sdk/docs/quickstart-windows
and then I tried running gcloud compute ssh from PowerShell inside VS Code. This is the new error I got:
(base) PS C:\Users\user\Documents\dev\project\python> gcloud compute ssh --zone=$ZONE jupyter@$INSTANCE_NAME -- -L 8080:localhost:8080
WARNING: The PuTTY PPK SSH key file for gcloud does not exist.
WARNING: The public SSH key file for gcloud does not exist.
WARNING: The private SSH key file for gcloud does not exist.
WARNING: You do not have an SSH key for gcloud.
WARNING: SSH keygen will be executed to generate a key.
ERROR: (gcloud.compute.ssh) could not parse resource []
It still runs from Ubuntu using WSL; I logged in fine. I guess I just don't know enough about how they're separated, what's shared, what's missing, and how to get all my command lines using the same configuration.
It seems as if your SSH key paths are configured correctly for your Ubuntu terminal but not for the VS Code one. If your account is not configured to use OS Login, with which Compute Engine stores the generated key with your user account, local SSH keys are needed. SSH keys are specific to each instance you want to access, and here is where you can find them. Once you have found them, you can specify their path with the --ssh-key-file flag.
Another option is to use OS Login, as mentioned above.
Here is another thread with a similar problem to yours.
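For reference, a small sketch to check whether the key files those warnings refer to actually exist on the Windows side. The paths below are the usual gcloud defaults (google_compute_engine plus the PuTTY .ppk variant in ~/.ssh); treat them as assumptions if your setup differs:

# Check for the SSH key files gcloud's warnings mention.
from pathlib import Path

ssh_dir = Path.home() / ".ssh"
for name in ("google_compute_engine", "google_compute_engine.pub", "google_compute_engine.ppk"):
    key = ssh_dir / name
    print(f"{key}: {'found' if key.exists() else 'missing'}")

If the keys only exist inside WSL (under the Linux home directory), the Windows-side gcloud cannot see them, which would explain why the Ubuntu terminal works while PowerShell does not.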
