Run an ML program locally using the Colab GPU - python

I have a Lenovo computer, but it has no GPU installed, so when I run a machine learning program written in Python, it runs on my local CPU. I know that Colab provides a GPU for free, but to use it I have to copy the contents of all the Python files from my ML program into a Colab notebook, which is not very convenient. Is there any way to run my ML program from my computer directly on the Colab GPU, without using the Colab notebook itself?
EDIT
Be aware that I don't want to work from a Jupyter notebook. I would like to work in Visual Studio Code and run the code on the Colab GPU directly instead of on my CPU.

It is possible. Check out this article:
https://amitness.com/vscode-on-colab/
and
https://github.com/abhi1thakur/colabcode
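If you go the colabcode route, its README shows usage roughly like the following, run inside a Colab notebook; the port and password below are placeholders you choose yourself:

# In a Colab notebook cell (sketch based on the colabcode README; port and password are placeholders)
!pip install colabcode
from colabcode import ColabCode
ColabCode(port=10000, password="your-password")  # prints a URL serving VS Code (code-server) backed by the Colab runtime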

Not that I know of. Colab's GPU and notebooks run on Google's computers, while your local Jupyter notebook runs on your computer alone and essentially can't hand work off to Google's machines. This is not a physical limitation or anything; it's just that no one has integrated them.
What you can do, though, to make the transfers quick, is create a git repo for all of your files, commit them to GitHub, and then pull them down in a Colab notebook. It's relatively quick, syncs well, and serves as a backup.
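As a rough sketch of that workflow, inside a Colab notebook (with the GPU runtime enabled) you would run something like the following; the repo URL, requirements.txt, and train.py are placeholders for whatever your project actually contains:

# In a Colab notebook cell; repo URL and script name are placeholders
!git clone https://github.com/<your-username>/<your-repo>.git
%cd <your-repo>
!pip install -r requirements.txt   # if the repo has one
!python train.py                   # hypothetical entry point of the ML program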

Be aware that I don't want to work from a Jupyter notebook. I would like to work in Visual Studio Code and run the code on the Colab GPU directly instead of on my CPU
Nope, not possible.
Update reason:
Colab itself is a Jupyter notebook; you can't detach the machine's resources, link them to your PC, and use other software with them.
If this were possible, people would already be abusing it to mine crypto, run heavy-load programs, and so on.
Colab is a free product Google offers to introduce you to their cloud compute services, which means Colab has its own limitations:
"Colab resources are not guaranteed and not unlimited, and the usage limits sometimes fluctuate. " -Colab FAQ
If you are a fan of Colab, you might want to try the Pro version for just $10/month.

Did you check out colab-ssh? You SSH into Colab from VS Code and can leverage the GPU the same way you would in Colab.
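For reference, the colab-ssh README describes usage along these lines; the exact function name may have changed, so double-check the project docs, and the password is a placeholder:

# In a Colab notebook cell (sketch based on the colab-ssh README; verify against the project docs)
!pip install colab_ssh --upgrade
from colab_ssh import launch_ssh_cloudflared
launch_ssh_cloudflared(password="your-password")
# It prints connection details to add to your local SSH config,
# after which VS Code's Remote-SSH extension can attach to the Colab machine.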

Related

Running Jupyter Notebook in GCP on a Schedule

What is the best way to migrate a Jupyter notebook into Google Cloud Platform?
Requirements
I don't want to make a lot of changes to the notebook to get it to run
I want it to be schedulable, preferably through the UI
I want it to be able to run an .ipynb file, not a .py file
In AWS, SageMaker seems like the no-brainer solution for this. I want the GCP tool that comes closest to this specific task without a lot of extras.
I've tried the following:
Cloud Functions: seems best for running Python scripts rather than a notebook, and requires you to run a main.py file by default
Dataproc: it seems you can add a notebook to a running instance, but it cannot be scheduled
Dataflow: sort of seemed like overkill; it didn't feel like the best tool and appeared better suited to Apache-based tooling
I feel like this question should be easier. I found this article on the subject:
How to Deploy and Schedule Jupyter Notebook on Google Cloud Platform
He doesn't actually do what the title says: he moves a lot of GCP code into a main.py that creates an instance, and then has the instance execute the notebook.
Feel free to correct my perspective on any of this.
I use Vertex AI Workbench to run notebooks on GCP. It provides two variants:
Managed Notebooks
User-managed Notebooks
User-managed Notebooks creates compute instances in the background; they come with pre-built packages such as JupyterLab, Python, etc. and allow customisation. I mainly use them for developing Dataflow pipelines.
As for the other requirement, scheduling: Managed Notebooks supports this feature; refer to this documentation (I am yet to try Managed Notebooks):
Use the executor to run a notebook file as a one-time execution or on a schedule. Choose the specific environment and hardware that you want your execution to run on. Your notebook's code will run on Vertex AI custom training, which can make it easier to do distributed training, optimize hyperparameters, or schedule continuous training jobs. See Run notebook files with the executor.
You can use parameters in your execution to make specific changes to each run. For example, you might specify a different dataset to use, change the learning rate on your model, or change the version of the model.
You can also set a notebook to run on a recurring schedule. Even while your instance is shut down, Vertex AI Workbench will run your notebook file and save the results for you to look at and share with others.

Is there a way to use Google Colab Form feature in a local Jupyter Notebook?

I have just installed a Jupyter Notebook in a local conda environment, and I've imported an .ipynb file from Google Colab to run it locally.
I wonder if there's any extension that enables the Form feature from Google Colab to work in a regular Jupyter notebook, using the same syntax, like this below:
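(Colab declares form fields with #@param comment annotations; the values below are just illustrative placeholders.)

#@title Example Colab form
text_input = 'hello'  #@param {type:"string"}
threshold = 0.5       #@param {type:"slider", min:0, max:1, step:0.1}
option = 'first'      #@param ["first", "second", "third"]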
PS: I'm aware of the Jupyter Widgets (ipywidgets) package/extension; however, its syntax is different from Colab forms. Ideally I would want to keep the same syntax so I could use the same file back and forth between Colab and my local environment easily.
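For comparison, a minimal ipywidgets sketch of the same form, assuming the standard widgets API, looks roughly like this; the different, programmatic syntax is exactly the incompatibility described above:

# Rough ipywidgets equivalent of the form above (different syntax from Colab's #@param)
import ipywidgets as widgets
from IPython.display import display

text_input = widgets.Text(value='hello', description='text_input')
threshold = widgets.FloatSlider(value=0.5, min=0, max=1, step=0.1, description='threshold')
option = widgets.Dropdown(options=['first', 'second', 'third'], description='option')
display(text_input, threshold, option)
# Read the current values with text_input.value, threshold.value, option.value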
Disclaimer: this is not a strict solution to the problem presented by the OP, but it is a viable alternative.
For months I have been looking for a way to have certain Google Colab functionality locally, but to no avail. In particular, to date I haven't found a way to:
Run Google Colab locally (offline).
Have the functionalities of Google Colab forms locally (offline) using the same syntax.
However, since one of the main motivations for using Jupyter locally is to use the computational resources of the local machine, Google Colab offers a very simple way to take advantage of all the features of the platform from a local Jupyter environment:
Upload your notebook to Google Drive and open it with Google Colab.
Click the arrow next to the "Connect" button and then click "Connect to a local runtime".
Follow the instructions.
Then you can use the same file back and forth between Google Colab and your local environment easily.
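Concretely, the instructions in step 3 boil down to running Colab's documented local-runtime commands on your machine and pasting the resulting backend URL into the Colab dialog. At the time of writing they look like this, but check the Colab "Local runtimes" page for the current version:

pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8888 --NotebookApp.port_retries=0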

How to get a stronger CPU and more RAM on Google Colab?

I use Google Colab to test data structures like chained hash maps, probing hash maps, AVL trees, red-black trees, and splay trees (written in Python), and I store very large datasets (key-value pairs) in these data structures to time some operations. The scale is like a small Wikipedia, so running these Python scripts uses a lot of memory (RAM). Google Colab offers approximately 12 GB of RAM, but that is not enough for me; these scripts need about 20-30 GB of RAM, so when I run them in Colab, an exception is often raised saying the program went over the 12 GB upper bound, and the runtime restarts. On the other hand, I have some Python scripts that run recursive algorithms, which, as everyone knows, use the CPU very heavily (as well as RAM). When I run these algorithms with 20000+ levels of recursion, Colab often fails and restarts. I know that Colab provides two cores of an Intel Xeon CPU, but how do I get more CPU cores from Google?
You cannot upgrade the GPU or CPU, but you can increase the RAM from 12 GB to 25 GB just by crashing the session with any non-terminating loop, for example:
l = []
while 1:
    l.append('nothing')
There is no way to request more CPU/RAM from Google Colaboratory at this point, sorry.
Google Colab Pro recently launched for $9.99 a month (Feb. 2020). Users in the US can get higher resource limits and more frequent access to better resources.
Q&A from the signup page is below:
What kinds of GPUs are available in Colab Pro?
With Colab Pro you get priority access to our fastest GPUs. For example, you may get access to T4 and P100 GPUs at times when non-subscribers get K80s. You also get priority access to TPUs. There are still usage limits in Colab Pro, though, and the types of GPUs and TPUs available in Colab Pro may vary over time.
In the free version of Colab there is very limited access to faster GPUs, and usage limits are much lower than they are in Colab Pro.
How long can notebooks run in Colab Pro?
With Colab Pro your notebooks can stay connected for up to 24 hours, and idle timeouts are relatively lenient. Durations are not guaranteed, though, and idle timeouts may sometimes vary.
In the free version of Colab notebooks can run for at most 12 hours, and idle timeouts are much stricter than in Colab Pro.
How much memory is available in Colab Pro?
With Colab Pro you get priority access to high-memory VMs. These VMs generally have double the memory of standard Colab VMs, and twice as many CPUs. You will be able to access a notebook setting to enable high-memory VMs once you are subscribed. Additionally, you may sometimes be automatically assigned a high-memory VM when Colab detects that you are likely to need it. Resources are not guaranteed, though, and there are usage limits for high memory VMs.
In the free version of Colab the high-memory preference is not available, and users are rarely automatically assigned high memory VMs.
For a paid, high-capability solution, you may want to try Google Cloud Datalab instead

Tensorflow code on google cloud platform

I am facing a strange issue. I have working code for a very simple neural network. I am running it on my laptop; kind of slow, but OK. I then created a 24-core (Linux) instance on Google Cloud and ran the same code. It seems to take almost the same time. I expected it to be a lot faster. Any idea why this could be the case? I am using a standard, vanilla pip installation of CPU TensorFlow. Nothing fancy.
Would appreciate any ideas...
Best, Umberto
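One thing worth checking in this situation is whether the TensorFlow session is actually allowed to use all 24 cores. With the TF 1.x API that was current at the time, a minimal sketch (the thread counts are illustrative) looks like this; note that a tiny network may still not speed up, since per-step overhead dominates when the matrices are small:

# Sketch for TF 1.x: explicitly allow the session to use all available cores
import tensorflow as tf

config = tf.ConfigProto(
    intra_op_parallelism_threads=24,  # threads within a single op, e.g. a matmul
    inter_op_parallelism_threads=24,  # threads across independent ops
)
with tf.Session(config=config) as sess:
    # build and run the graph here
    pass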

Set up spark using an external virtual machine

I am not as huge a computer person as many others on here; I majored in math, with MATLAB as my main computer knowledge. I recently got involved with Apache Spark through the excellent edX course offered by Berkeley.
The method they used for setting up Spark was provided in a great step-by-step guide. It involved downloading Oracle VM VirtualBox with an Ubuntu 32-bit VM, then, through the use of Vagrant (again, I'm not hugely computer-y, so I'm not 100% sure how this worked or what it is), connecting this to an IPython notebook. This gave me access to Spark over the internet and let me code in Python with PySpark, which is exactly what I want to do.
Everything was going very well until the second lab exercise, when it became apparent that my Windows laptop has insufficient free memory (just 3 GB, and it's four years old): it continually froze and crashed when trying to work with large datasets.
It is apparently not possible to run a VM inside a VM, so I have spent most of today looking for alternative ways of setting up Spark, to no avail; the guides are all aimed at someone with more computer knowledge than I have.
My (likely naive) idea now is to rent an external machine that I can interface with through my Windows laptop exactly as before, but with the virtual machine operating outside the memory of my laptop, i.e. in the cloud (using Ubuntu, Windows, etc.). Essentially I want to move the Oracle VM VirtualBox to an outside source to rid my computer of the memory burden and to use the IPython notebook as before.
How can I set up a virtual machine to use for the computational side of Spark in Ipython notebook?
Or is there an alternate method that would be simple to follow?
Don't run VMs. Instead:
Download the latest Spark version. (1.4.1 at the moment.)
Extract the archive.
Run bin/pyspark.cmd.
It's not an IPython Notebook, but you can run Python code against a local Spark instance.
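For example, once the PySpark shell is up, a SparkContext is already available as sc and you can run something like:

# Inside the bin/pyspark shell; `sc` is the SparkContext the shell creates for you
data = sc.parallelize(range(1000))    # a small RDD distributed over local cores
squares = data.map(lambda x: x * x)
print(squares.take(5))                # [0, 1, 4, 9, 16]
print(squares.sum())                  # 332833500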
If you want a beefier instance, do the same on a beefy remote machine. For example, an EC2 m4.2xlarge is $0.50 per hour with 8 cores and 30 GB of RAM.
