I am writing some functions to extract data from Azure. I am using the Python subprocess library with Azure CLI commands, as they are easier to use and better documented than the Python SDK. My question is whether it is possible to combine the Azure CLI commands with the Python SDK for authentication, since the CLI uses interactive login by default and doesn't offer many other choices.
The goal of this is to incorporate those functions into a bigger script that authenticates and gets all the information we need.
Any ideas or ways of doing this would be appreciated.
Thank you
Based on my tests, if you want to call Azure CLI commands from a Python application, you can use the azure-cli package.
For example:
from azure.cli.core import get_default_cli

az_cli = get_default_cli()
# Non-interactive login with a service principal (placeholders, not real values)
az_cli.invoke(['login', '--service-principal', '-u', '<appId>', '-p', '<password>', '--tenant', '<tenant id>'])
# Any other CLI command works the same way, e.g. show a resource group
az_cli.invoke(['group', 'show', '-n', 'jimtest'])
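If you would rather keep the subprocess approach from the question, the same non-interactive service principal login also works by shelling out to az directly; here is a minimal sketch (the resource group name and the service principal values are placeholders):

import json
import subprocess

# Non-interactive login with a service principal (placeholder values)
subprocess.run(
    ['az', 'login', '--service-principal', '-u', '<appId>', '-p', '<password>', '--tenant', '<tenant id>'],
    check=True,
)

# Capture a command's JSON output and parse it in Python
completed = subprocess.run(
    ['az', 'group', 'show', '-n', 'jimtest', '-o', 'json'],
    check=True, capture_output=True, text=True,
)
group = json.loads(completed.stdout)
print(group['location'])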
Related
I’m building an App that runs on K8s version 1.21 and the container already includes Python 3.9.2. Do I still need to install the Kubernetes Python client (https://github.com/kubernetes-client/python) if I want to interact with Kubernetes using Python, or am I good?
Thanks,
The Kubernetes Python client is the library that helps you interact with the Kubernetes API.
So if you want to do anything with the Kubernetes API from inside your Python program (e.g. query the currently running Pods), then you need to install the Kubernetes Python client.
However, if your application is just deployed in Kubernetes but does not need to interact with the Kubernetes API, then you don't need it.
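For instance, querying the currently running Pods takes only a few lines with the official client; a minimal sketch, assuming the Pod's service account is allowed to list Pods:

from kubernetes import client, config

# Inside a Pod this picks up the mounted service account credentials;
# outside the cluster you would call config.load_kube_config() instead.
config.load_incluster_config()

v1 = client.CoreV1Api()
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)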
It would help to know more about what you are planning to do. If you are just going to run Python programs, then you don't need this library at all; it is only for accessing the Kubernetes (K8s) REST APIs. Even for the REST APIs you can make plain HTTP calls and handle the requests/responses yourself, or let this library do the heavy lifting. Whether Python runs outside or inside a container or pod, you only need the library to call specific Kubernetes API functionality; it is not required for ordinary Python code to work. A raw-REST sketch follows the reference below.
Ref: https://kubernetes.io/docs/reference/using-api/client-libraries/
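If you prefer to skip the library, here is a rough sketch of calling the REST API directly from inside a Pod with the requests package (the namespace is an example; the token and CA paths are the standard in-Pod service account mounts):

import requests

# Standard in-Pod service account credentials
TOKEN_PATH = '/var/run/secrets/kubernetes.io/serviceaccount/token'
CA_PATH = '/var/run/secrets/kubernetes.io/serviceaccount/ca.crt'

with open(TOKEN_PATH) as f:
    token = f.read()

# List Pods in the 'default' namespace via the raw REST API
resp = requests.get(
    'https://kubernetes.default.svc/api/v1/namespaces/default/pods',
    headers={'Authorization': f'Bearer {token}'},
    verify=CA_PATH,
)
resp.raise_for_status()
for item in resp.json()['items']:
    print(item['metadata']['name'])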
I'm building a website with React and Firebase that utilizes an algorithm I wrote in python. The database and authentication for the project are both handled by Firebase, so I would like to keep the cloud functions in that same ecosystem if possible.
Right now, I'm using the python-shell npm package to send and receive data from NodeJS to my python script.
I have local unit testing set up so I can test the https.onCall functions locally without needing to deploy and test from the client.
When I am testing locally, everything works perfectly.
However, when I push the functions to the cloud and trigger the function from the client, the logs in the Firebase console show that the python script is missing dependencies.
What is the best way to ensure that the script has all the dependencies available to it up on the server?
I have tried:
- Copying the actual dependency folders from my library/.../site-packages and putting them in the same directory under the /functions folder with the python script. This almost works. I just run into an issue with numpy: "No module named 'numpy.core._multiarray_umath'" is printed to the logs in Firebase.
I apologize if this is an obvious answer. I'm new to Python, and the solutions I've found online seem way too elaborate or involve hosting the python code in another ecosystem (like AWS or Heroku). I am especially hesitant to go to all that work because it runs fine locally. If I can just find a way to send the dependencies up with the script, I'm good to go.
Please let me know if you need any more information.
the logs in the Firebase console show that the python script is missing dependencies.
That's because the Node.js runtime targeted by the Firebase CLI doesn't have everything you need to run Python programs.
If you need to run a function that's primarily written in Python, you should not use the Firebase CLI and instead use the Google Cloud tools to target the Python runtime, which should do everything you want. Yes, it might be extra work for you to learn new tools, and you will not be able to use the Firebase CLI, but it is the right way to run Python in Cloud Functions.
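For illustration only, an HTTP function on the Python runtime could look roughly like this; the function name, the numpy dependency and the deploy command are assumptions based on the question, not Firebase specifics:

# main.py -- deployed with something like:
#   gcloud functions deploy run_algorithm --runtime python311 --trigger-http
# Dependencies such as numpy go in requirements.txt next to this file and are
# installed automatically at deploy time, so nothing has to be copied by hand.
import functions_framework
import numpy as np

@functions_framework.http
def run_algorithm(request):
    # Hypothetical handler: read JSON from the client, run the algorithm, return JSON
    data = request.get_json(silent=True) or {}
    values = np.asarray(data.get('values', []), dtype=float)
    return {'result': float(values.sum())}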
I'm working on a Flask API, one of whose endpoints receives a message and publishes it to PubSub. Currently, in order to test that endpoint, I have to manually spin up a PubSub emulator from the command line and keep it running during the test. It works just fine, but it isn't ideal for automated testing.
I wonder if anyone knows a way to spin up a test PubSub emulator from Python? Or if anyone has a better solution for testing such an API?
As far as I know, there is no Python-native Google Cloud PubSub emulator available.
You have a few options, all of which require launching an external program from Python:
Just invoke the gcloud command you mentioned, gcloud beta emulators pubsub start [options], directly from your Python application to start it as an external program (see the sketch after this list).
The PubSub emulator which comes as part of Cloud SDK is a JAR file bootstrapped by the bash script present in CLOUD_SDK_INSTALL_DIR/platform/pubsub-emulator/bin/cloud-pubsub-emulator. You could possibly run this bash script directly.
Here is a StackOverflow answer which covers multiple ways to launch an external program from Python.
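A rough sketch of the first option, assuming the gcloud CLI is on PATH and picking port 8085 arbitrarily:

import os
import subprocess
import time

# Start the emulator as an external process for the duration of the tests
emulator = subprocess.Popen(
    ['gcloud', 'beta', 'emulators', 'pubsub', 'start', '--host-port=localhost:8085'],
)
os.environ['PUBSUB_EMULATOR_HOST'] = 'localhost:8085'
time.sleep(5)  # crude wait; a real test fixture would poll until the port is open

try:
    pass  # ... run the tests that talk to the emulator ...
finally:
    emulator.terminate()
    emulator.wait()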
Also, it is not quite clear from your question how you're calling the PubSub APIs in Python.
For unit tests, you could consider setting up a wrapper over the code which actually invokes the Cloud PubSub APIs, and injecting a fake for this API wrapper. This way, you can test the rest of the code, which invokes just your fake API wrapper instead of the real one, without worrying about starting any external programs.
For integration tests, the PubSub emulator will definitely be useful.
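The fake-wrapper idea might look roughly like this (the wrapper interface and names are made up for illustration):

class PubSubPublisher:
    # Thin wrapper around the real Cloud PubSub publish call (hypothetical interface)
    def __init__(self, publisher_client, topic_path):
        self._client = publisher_client
        self._topic_path = topic_path

    def publish(self, data: bytes):
        return self._client.publish(self._topic_path, data)

class FakePublisher:
    # Drop-in fake for unit tests; records messages instead of publishing them
    def __init__(self):
        self.published = []

    def publish(self, data: bytes):
        self.published.append(data)

# In a unit test the Flask endpoint is wired up with FakePublisher(), so no
# emulator or external process is needed; assert against fake.published.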
This is how I usually do it:
1. I create a Python client class which does publish and subscribe with the topic, project and subscription used in the emulator (see the sketch after these steps).
Note: You need to set PUBSUB_EMULATOR_HOST=localhost:8085 as an environment variable in your Python project.
2. I spin up a pubsub-emulator as a Docker container.
Note: You need to set some environment variables, mount volumes and expose port 8085.
Set the following environment variables for the container:
PUBSUB_EMULATOR_HOST
PUBSUB_PROJECT_ID
PUBSUB_TOPIC_ID
PUBSUB_SUBSCRIPTION_ID
3. Write whatever integration tests you want. Use the publisher or subscriber from the client depending on your test requirements.
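A minimal sketch of such a client class with the google-cloud-pubsub package; the project, topic and subscription IDs are read from the same environment variables as the container, and PUBSUB_EMULATOR_HOST must already be set:

import os
from google.cloud import pubsub_v1

PROJECT_ID = os.environ['PUBSUB_PROJECT_ID']
TOPIC_ID = os.environ['PUBSUB_TOPIC_ID']
SUBSCRIPTION_ID = os.environ['PUBSUB_SUBSCRIPTION_ID']

class PubSubClient:
    def __init__(self):
        # With PUBSUB_EMULATOR_HOST set, both clients talk to the emulator container
        self.publisher = pubsub_v1.PublisherClient()
        self.subscriber = pubsub_v1.SubscriberClient()
        self.topic_path = self.publisher.topic_path(PROJECT_ID, TOPIC_ID)
        self.subscription_path = self.subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

    def publish(self, data: bytes) -> str:
        # Returns the message ID once the publish has completed
        return self.publisher.publish(self.topic_path, data).result()

    def pull(self, max_messages: int = 10):
        response = self.subscriber.pull(
            request={'subscription': self.subscription_path, 'max_messages': max_messages}
        )
        ack_ids = [msg.ack_id for msg in response.received_messages]
        if ack_ids:
            self.subscriber.acknowledge(
                request={'subscription': self.subscription_path, 'ack_ids': ack_ids}
            )
        return [msg.message.data for msg in response.received_messages]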
I have a list of VMs and I'd like to get each VM's status (ReadyRole/Stopped/StoppedDeallocated) using Azure's Python SDK.
I have done this in a bash terminal using Azure CLI commands and a combination of grep, tail and similar utils, but I'd like to do it in a Python script using Azure's SDK.
With the help of the Azure CLI, I run azure vm list in a shell script and then grep my way to the status of the VMs.
I've been looking into servicemanagementservice.py of Azure SDK but I can't find a function like get_role_status(). list_hosted_services() and get_hosted_service_properties don't seem to provide the info I want, unless I'm missing something.
Can anyone point me towards a solution?
Based on my experience, we can get every instance's status using the Azure REST API.
So the Azure SDK for Python should have a similar method, because the functions in the Azure SDK call the same URLs as the REST API.
I tried using the method get_deployment_by_name to get the instance status:
from azure.servicemanagement import ServiceManagementService

subscription_id = '****-***-***-**'
certificate_path = 'CURRENT_USER\\my\\***'  # certificate in the Windows certificate store
sms = ServiceManagementService(subscription_id, certificate_path)
result = sms.get_deployment_by_name("your service name", "your deployment name")
You can get the role instance list from the result and check each instance's properties, as in the sketch below.
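If I remember the legacy Service Management models correctly, the deployment returned above carries a role_instance_list, and each instance reports the status you are after (a sketch, not verified against every SDK version):

# Each role instance reports a status such as ReadyRole, StoppedVM or StoppedDeallocated
for instance in result.role_instance_list:
    print(instance.instance_name, instance.instance_status)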
The Google documentation is a little generic on this topic and I find it hard to get around the different APIs and terms they're using, so I'm wondering if someone could point me in the right direction.
I'm looking for a way to call the gcloud command directly from Python. I've installed gcloud in my Python environment and as an example to follow, I'd like to know how to do the following from Python:
gcloud compute copy-files [Source directory or file name] [destination directory of file name]
You should check out gcloud:
https://pypi.python.org/pypi/gcloud
There's nothing magic about uploading files to a Compute Engine VM. I ended up using paramiko to upload files.
You can of course call gcloud from python directly and not care about the implementation details, or you can try to see what gcloud does:
Try running gcloud compute copy-files with the --dry-run flag. That will expose the scp command it uses underneath and with what arguments. Knowing what scp params you need, you can recreate them programmatically using paramiko_scp in python. More information on this here: How to scp in python?
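A rough paramiko/scp sketch along those lines (the external IP, username and local file are placeholders; the key path assumes the default key that gcloud generates):

import os
import paramiko
from scp import SCPClient  # third-party 'scp' package built on top of paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(
    'VM_EXTERNAL_IP',                     # placeholder
    username='your_user',                 # placeholder
    key_filename=os.path.expanduser('~/.ssh/google_compute_engine'),
)

with SCPClient(ssh.get_transport()) as scp:
    scp.put('local_file.txt', remote_path='/tmp/local_file.txt')

ssh.close()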
You can use the subprocess.run function in Python to execute commands from your terminal/shell/bash. That is what I have done to execute gcloud commands from Python, rather than using the Python SDK.
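For example (source, destination and zone are placeholders; gcloud must already be installed and authenticated):

import subprocess

subprocess.run(
    [
        'gcloud', 'compute', 'copy-files',
        'local_dir/',                       # source (placeholder)
        'my-instance:/home/me/remote_dir',  # destination (placeholder instance)
        '--zone', 'us-central1-a',
    ],
    check=True,
)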