How to use gcloud commands programmatically via Python

The Google documentation is a little generic on this topic, and I find it hard to find my way around the different APIs and terms they're using, so I'm wondering if someone could point me in the right direction.
I'm looking for a way to call the gcloud command directly from Python. I've installed gcloud in my Python environment and as an example to follow, I'd like to know how to do the following from Python:
gcloud compute copy-files [source directory or file name] [destination directory or file name]

You should check out gcloud:
https://pypi.python.org/pypi/gcloud

There's nothing magic about uploading files to a Compute Engine VM. I ended up using paramiko to upload files.
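For what it's worth, a minimal sketch of that paramiko approach, assuming SSH access to the VM is already set up; the host, user, key path, and file paths are placeholders:

import os
import paramiko

HOST = "203.0.113.10"  # the VM's external IP (placeholder)
USER = "my-user"
KEY = os.path.expanduser("~/.ssh/google_compute_engine")

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(HOST, username=USER, key_filename=KEY)

# Upload a single file over SFTP; repeat or walk a directory as needed.
sftp = ssh.open_sftp()
sftp.put("local/data.csv", "/home/my-user/data.csv")
sftp.close()
ssh.close()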

You can of course call gcloud from Python directly and not care about the implementation details, or you can try to see what gcloud does:
Try running gcloud compute copy-files with the --dry-run flag. That will expose the underlying scp command it runs and the arguments it passes. Once you know which scp parameters you need, you can recreate the call programmatically using paramiko_scp in Python. More information on this here: How to scp in python?
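A rough sketch of that paramiko_scp idea (the scp package's SCPClient riding on a paramiko connection); the host, user, key path, and paths are placeholders to be filled in from whatever the --dry-run output shows:

import os
import paramiko
from scp import SCPClient

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("203.0.113.10", username="my-user",
            key_filename=os.path.expanduser("~/.ssh/google_compute_engine"))

# Recreate the scp call that gcloud would have run.
with SCPClient(ssh.get_transport()) as scp:
    scp.put("local_dir", remote_path="/home/my-user/", recursive=True)
ssh.close()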

You can use the subprocess.run function in Python to execute commands from your terminal/shell/bash. That is what I have done to execute gcloud commands from Python, rather than using the Python SDK.
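A minimal sketch of that subprocess approach; the paths, instance name, and zone are placeholders (newer SDK versions expose the same operation as gcloud compute scp):

import subprocess

result = subprocess.run(
    ["gcloud", "compute", "copy-files",
     "./local_dir", "my-instance:/remote/dir",
     "--zone", "us-central1-a"],
    capture_output=True, text=True, check=True,  # raises CalledProcessError on failure
)
print(result.stdout)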

Related

Starting multiple containers with configs using the Python Docker SDK

I am using the Docker Python SDK docker-py to create a script that allows starting one or multiple containers (depending on a program argument, e.g. script.py --all or script.py --specific_container), and it has to be possible to start each container with its own configuration (image, container_name, etc.), just like in typical docker-compose.yml files.
So basically, I'm trying to do the same thing docker-compose does, just with the Python Docker SDK.
I've read that some people try to stick with docker-compose by using subprocess, but that is not recommended and I would like to avoid it.
I am searching for possibly existing libraries for this, but I haven't found anything yet. Do you know of anything I could use?
Another option would be to store configuration files for the "specific_container" profiles and for the "all" profile as JSON (?), then parse them and feed them into the Docker SDK's containers.run method, which lets you pass all the options you can also set in a docker-compose file (a sketch of this approach follows below).
Maybe someone knows another, better solution?
Thanks in advance guys.
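A rough sketch of the JSON-profile idea from the question above, assuming docker-py is installed and that each profile's keys mirror the keyword arguments of containers.run(); the file name and keys are made up for illustration:

import json
import docker

client = docker.from_env()

# e.g. profiles/all.json: [{"image": "redis:7", "name": "cache", "detach": true}]
with open("profiles/all.json") as f:
    profiles = json.load(f)

for config in profiles:
    image = config.pop("image")
    # With "detach": true, run() returns a Container object immediately.
    container = client.containers.run(image, **config)
    print("started", container.name)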

Firebase Cloud Functions running a python script - needs dependencies

I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data from NodeJS to my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
- Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the Python script. This almost works, but I run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the Python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
the logs in the Firebase console show that the python script is missing dependencies.
That's because the Node.js runtime targeted by the Firebase CLI doesn't have everything you need to run Python programs.
If you need to run a function that's primarily written in Python, you should not use the Firebase CLI and instead use the Google Cloud tools to target the Python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it is the right way to run Python in Cloud Functions.
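For context, a minimal sketch of what such a function could look like on the Python runtime; the function name, payload shape, and deploy flags are illustrative, not taken from the question:

# main.py
import json
import numpy as np

def run_algorithm(request):
    # HTTP entry point; request is a Flask Request object.
    payload = request.get_json(silent=True) or {}
    result = float(np.sum(payload.get("values", [])))
    return json.dumps({"result": result}), 200, {"Content-Type": "application/json"}

# A requirements.txt next to main.py lists the Python dependencies, e.g.:
#   numpy
#
# Deployed with something like:
#   gcloud functions deploy run_algorithm --runtime python311 --trigger-http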

Authenticating Azure CLI with Python SDK

I am writing some functions to extract data from Azure. I am using the Python subprocess library with the Azure CLI commands, as they are easier and better documented than the Python SDK. My question is whether it is possible to combine the Azure CLI commands with the Python SDK to handle the authentication, since the CLI uses interactive login and doesn't offer many choices.
The goal of this is to incorporate those functions into a bigger script that authenticates and gets all the information we need.
Any ideas or ways of doing this would be appreciated.
Thank you
According to my tests, if you want to call Azure CLI commands from a Python application, you can use the azure-cli package.
For example:
from azure.cli.core import get_default_cli

az_cli = get_default_cli()
# Log in with a service principal, then run a normal CLI command.
az_cli.invoke(['login', '--service-principal', '-u', '<appId>', '-p', '<password>', '--tenant', '<tenant id>'])
az_cli.invoke(['group', 'show', '-n', 'jimtest'])
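As a side note (an assumption on my part, not from the answer above), the SDK itself can also reuse an existing az login session via azure-identity's AzureCliCredential, which may cover the "combine CLI auth with the SDK" part of the question; a minimal sketch, where the subscription id is a placeholder:

from azure.identity import AzureCliCredential
from azure.mgmt.resource import ResourceManagementClient

credential = AzureCliCredential()  # picks up the token from the current `az login` session
client = ResourceManagementClient(credential, "<subscription id>")

for group in client.resource_groups.list():
    print(group.name)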

Run a gsutil command in a Google Cloud Function

I would like to run a gsutil command every x minutes as a cloud function. I tried the following:
# main.py
import os

def sync():
    line = "gsutil -m rsync -r gs://some_bucket/folder gs://other_bucket/other_folder"
    os.system(line)
While the Cloud Function gets triggered, the execution of the line does not work (i.e. the files are not copied from one bucket to the other). However, it works fine when I run it locally in PyCharm or from cmd. What is different about Cloud Functions?
You can use Cloud Run for this; you only need to make a few changes to your code.
Create a container with gsutil and Python installed, for example using gcr.io/google.com/cloudsdktool/cloud-sdk as the base image.
Take care with the service account used when you deploy to Cloud Run, and grant it the correct permissions to access your buckets.
Let me know if you need more guidance.
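A small sketch of what such a Cloud Run service could look like, assuming a container built from that cloud-sdk base image (so gsutil is on the PATH) with Flask installed; the bucket paths and route are placeholders:

# app.py
import os
import subprocess

from flask import Flask

app = Flask(__name__)

@app.route("/sync", methods=["POST"])
def sync():
    # gsutil is available because the image is based on cloud-sdk.
    subprocess.run(
        ["gsutil", "-m", "rsync", "-r",
         "gs://some_bucket/folder", "gs://other_bucket/other_folder"],
        check=True,
    )
    return "synced", 200

if __name__ == "__main__":
    # Cloud Run provides the port to listen on via the PORT env variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))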
Cloud Functions server instances don't have gsutil installed. It works on your local machine because you do have it installed and configured there.
I suggest trying to find a way to do what you want with the Cloud Storage SDK for Python. Or figure out how to deploy gsutil alongside your function, then configure and invoke it from your code, but that might be very difficult.
There's no straightforward option for that.
I think the best approach for Cloud Functions is to use the google-cloud-storage Python library.
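A minimal sketch of that library-based approach inside a background-triggered Cloud Function; it does a plain copy rather than a true incremental rsync, and the bucket and prefix names are placeholders:

from google.cloud import storage

def sync(event, context):
    client = storage.Client()
    src_bucket = client.bucket("some_bucket")
    dst_bucket = client.bucket("other_bucket")

    # Copy every object under folder/ into other_folder/ in the other bucket.
    for blob in client.list_blobs("some_bucket", prefix="folder/"):
        new_name = blob.name.replace("folder/", "other_folder/", 1)
        src_bucket.copy_blob(blob, dst_bucket, new_name)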

GCS Rsync from Python client library?

I am trying to rewrite some bash scripts in Python, and specifically am trying to rewrite a line that executes gsutil -m rsync -r /local/path/to/data gs://path/to/data. However, I am not able to find any reference to rsync functionality in the Python client library documentation.
If anyone has solved this, please let me know. If this functionality is not currently implemented in the client library, does anyone know why?
gsutil is a command line tool and has application-level logic beyond the client library, so not all of the features of gsutil are available in the client library. gsutil does not presently consume the google-cloud-python client library, as that library was developed later.
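If a plain one-way copy is enough, something along these lines with google-cloud-storage can stand in for that rsync line (it does no deletion or checksum-based skipping, so it is not a full rsync); the bucket name and paths are placeholders:

from pathlib import Path

from google.cloud import storage

def upload_dir(local_dir: str, bucket_name: str, prefix: str) -> None:
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    root = Path(local_dir)
    # Walk the local directory and upload every file under the given prefix.
    for path in root.rglob("*"):
        if path.is_file():
            blob = bucket.blob(f"{prefix}/{path.relative_to(root).as_posix()}")
            blob.upload_from_filename(str(path))

upload_dir("/local/path/to/data", "my-bucket", "path/to/data")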
