Does anybody know how to integrate MNE-Python with Google Cloud?
We are doing basic MEG analysis on our data in Python, and we want to import the data into Google Cloud and use the open-source library MNE-Python there.
I have tried running sudo apt install mne-python in the shell.
To use MNE-Python on Google Cloud, you need a Python environment with the required dependencies installed:
1. Create a Google Cloud account and set up a project.
2. Create a virtual machine (VM) instance and install Python along with any required dependencies on it.
3. Install MNE-Python with conda or pip.
4. Upload your MEG data to the VM instance, or store it in Google Cloud Storage.
5. Write a Python script that uses MNE-Python to carry out your MEG analysis on that data (a minimal sketch follows this list).
6. Run the script on the VM instance.
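Here is a minimal sketch of such a script, assuming the MEG recording is a FIF file stored in Cloud Storage; the bucket name, object path, and filter settings are placeholders, not values from your setup:

# pip install mne google-cloud-storage
import mne
from google.cloud import storage

# Download the raw MEG file from Cloud Storage (bucket and object names are examples)
client = storage.Client()
bucket = client.bucket("my-meg-bucket")
bucket.blob("subject01/raw_meg.fif").download_to_filename("raw_meg.fif")

# Load the recording and run a basic analysis with MNE-Python
raw = mne.io.read_raw_fif("raw_meg.fif", preload=True)
raw.filter(l_freq=1.0, h_freq=40.0)  # band-pass filter
print(raw.info)  # inspect channels, sampling rate, etc.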
Related
I need to automate downloading Azure Databricks notebooks to my local machine using Python. Please let me know if there is a way to do this.
Yes, there is an API endpoint to export a notebook.
Refer to the documentation: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/workspace#--export
Here's how to make API requests with Python: Making a request to a RESTful API using python
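As a rough sketch of that export call with the requests library (the workspace URL, token, and notebook path below are placeholders):

import base64
import requests

# Placeholder values: use your workspace URL, personal access token, and notebook path
DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapiXXXXXXXXXXXXXXXX"
NOTEBOOK_PATH = "/Users/me@example.com/my_notebook"

resp = requests.get(
    f"{DATABRICKS_HOST}/api/2.0/workspace/export",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"path": NOTEBOOK_PATH, "format": "SOURCE"},
)
resp.raise_for_status()

# The notebook content comes back base64-encoded in the JSON response
with open("my_notebook.py", "wb") as f:
    f.write(base64.b64decode(resp.json()["content"]))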
Situation
I have an existing Python app in Google Colab that calls the Twitter API and sends the response to Cloud Storage.
I'm trying to automate the Twitter API call in GCP, and am wondering how to install the requests library for the API call and os for authentication.
I tried doing the following library installs in a Cloud Function:
import requests
import os
Result
That produced the following error message:
Deployment failure: Function failed on loading user code.
Do I need to install those libraries in a Cloud Function? I'm trying to understand this within the context of my Colab Python app, but I'm not clear whether the library installs are necessary.
Thank you for any input.
When you create your Cloud Function source code, there are two files:
main.py
requirements.txt
Add packages to requirements.txt as shown below:
#Function dependencies, for example:
requests==2.20.0
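With requests listed there, main.py can import both modules directly; os is part of the Python standard library and never needs to be installed. A minimal sketch of such a function (the entry-point name, environment variable, and endpoint are illustrative placeholders):

import os
import requests

def call_twitter(request):
    # Hypothetical example: read a bearer token from an environment variable
    token = os.environ["TWITTER_BEARER_TOKEN"]
    resp = requests.get(
        "https://api.twitter.com/2/tweets/search/recent",
        headers={"Authorization": f"Bearer {token}"},
        params={"query": "python"},
    )
    return resp.text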
Creating a new Python environment for your project might help, and it is a good starting point for any project. It is easy to create:
## for unix-based systems
## create a python environment
python3 -m venv venv
## activate your environment
## in linux-based systems
. ./venv/bin/activate
If you are using Google Colab, add "!" before these commands and they should work fine.
Is it possible to run a Robot Framework test suite using an Azure Databricks notebook?
I have a set of Robot Framework test suites that use the Database library, Operating System library, etc.
On my local machine, I install Python, pip install all the necessary libraries, and then run my Robot code like
python -m robot filename.robot
I want to do the same using Azure Databricks notebooks. Is it possible?
Databricks supports 4 default languages: Python, Scala, SQL, and R.
I was unable to find any documentation that shows the use of Robot Framework on Databricks.
However, you can try running the same commands on Azure Databricks that you ran on your local machine; a sketch is shown below.
Databricks is essentially a cloud infrastructure provider for running your Spark workloads, with some add-on capabilities.
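As an untested sketch (the suite path is a placeholder), you could install Robot Framework for the notebook session and then invoke it through its Python API in a Python cell:

# In a Databricks notebook cell, install the libraries for the current session:
#   %pip install robotframework robotframework-databaselibrary

# Then run a suite programmatically from a Python cell
import robot

# Placeholder path: the .robot file must be reachable from the cluster, e.g. on DBFS
rc = robot.run("/dbfs/tests/filename.robot", outputdir="/dbfs/tests/results")
print(rc)  # 0 means all tests passed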
I'm implementing continuous integration and continuous delivery for a large enterprise data warehouse project.
All the code resides in a Google Cloud Source Repository, and I'm able to set up a Google Cloud Build trigger so that every time files of a specific type (Python scripts) are pushed to the master branch, a Google Cloud Build run starts.
The Python scripts don't make up an app. They contain an ODBC connection string and a script to extract data from a source and store it as a CSV file. The Python scripts are to be executed on a Google Compute Engine VM instance with Airflow installed.
So the deployment of the Python scripts is as simple as can be: the .py files only have to be copied from the Google Cloud repository to a specific folder on the Google VM instance. There is not really a traditional build to run, as the Python files are separate from each other and not part of an application.
I thought this would be really easy, but I have now spent several days trying to figure this out with no luck.
Google Cloud Platform provides several Cloud Builders, but as far as I can see none of them can do this simple task. Using gcloud also does not work: it can copy files, but only from the local PC to the VM, not from the source repository to the VM.
What I'm looking for is a YAML or JSON build config file to copy those Python files from the source repository to the Google Compute Engine VM instance.
Hoping for some help here.
The files/folders in the Google Cloud repository aren't directly accessible (it's like a bare git repository); you first need to clone the repo, then copy the desired files/folders from the cloned repo to their destinations.
It might be possible to use a standard Fetching dependencies build step to clone the repo, but I'm not 100% certain of it in your case, since you're not actually doing a build:
steps:
- name: gcr.io/cloud-builders/git
args: ['clone', 'https://github.com/GoogleCloudPlatform/cloud-builders']
If not you may need one (or more) custom build steps. From Creating Custom Build Steps:
A custom build step is a container image that the Cloud Build worker
VM pulls and runs with your source volume-mounted to /workspace.
Your custom build step can execute any script or binary inside the
container; as such, it can do anything a container can do.
Custom build steps are useful for:
Downloading source code or packages from external locations
...
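Once the repository contents are available in /workspace (for example via the git clone step above, or via the build trigger's automatic source fetch), a rough, untested sketch of a cloudbuild.yaml that copies the .py files to the instance with gcloud compute scp could look like this; the zone, instance name, source folder, and destination path are placeholders, and it assumes the Cloud Build service account is allowed to SSH into the VM:

steps:
- name: gcr.io/cloud-builders/gcloud
  args:
  - compute
  - scp
  - --zone=europe-west1-b            # placeholder zone
  - --recurse
  - ./scripts                        # placeholder folder holding the .py files in /workspace
  - airflow-vm:/home/airflow/scripts # placeholder instance name and destination folder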
I want to automate deploying an OVA image on vSphere with Python.
I looked at some packages, viz. PySphere and psphere, but didn't find a direct method to do so.
Is there any library I'm missing, or is there any other way to deploy OVA/OVF files/templates on vSphere with Python?
Please help!
I have the same situation here and found that there is a vSphere Automation API/SDK for Python (a GitHub clone is also available).
All you need to do is extract the SDK and download deploy_ovf_template.py (from the SDK samples or from the GitHub clone). This template works with OVF, but since you want to work with OVA you'll need to do the extra work of extracting the OVA first (you'll get OVF and VMDK files).
For other scenarios, check the PDF documentation.
Be aware that this is supported on vSphere 6.5 and above.
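Since an OVA is just a tar archive containing the OVF descriptor and the VMDK disk(s), that extraction step can be done with the Python standard library (the file name and output directory below are placeholders):

import tarfile

# An .ova file is a tar archive holding the .ovf descriptor and .vmdk disks
with tarfile.open("overcloud-esx-ovsvapp.ova") as ova:
    print(ova.getnames())  # e.g. ['something.ovf', 'something.vmdk', ...]
    ova.extractall(path="extracted_ova")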
As far as I know, there is no appropriate API for deploying an OVF template using a Python package. You can use ovftool; VMware OVF Tool is a command-line utility that allows you to import and export OVF packages to and from many VMware products.
Download ovftool from the VMware site: https://my.vmware.com/web/vmware/details?productId=352&downloadGroup=OVFTOOL350
To install ovftool:
sudo /bin/sh VMware-ovftool-3.5.0-1274719-lin.x86_64.bundle
To deploy an OVA image as a template:
Syntax:
ovftool -dm=thick -ds=3par1 -n=abhi_vm /root/lab/extract/overcloud-esx-ovsvapp.ova vi://root:pwd@10.1.2**.**/datacenter/host/cluster
Use os.system() with that command string to run it from your Python script (see the sketch below).
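A minimal sketch of that call from Python, using subprocess rather than os.system so a failure raises an error (the datastore, VM name, paths, and credentials are the placeholder values from the syntax example above):

import subprocess

# Placeholder values taken from the syntax example above
cmd = [
    "ovftool",
    "-dm=thick",
    "-ds=3par1",
    "-n=abhi_vm",
    "/root/lab/extract/overcloud-esx-ovsvapp.ova",
    "vi://root:pwd@10.1.2**.**/datacenter/host/cluster",
]

# check=True raises CalledProcessError if ovftool exits with a non-zero status
subprocess.run(cmd, check=True)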