Python Azure SDK Equivalent of CloudConfigurationManager

Is there a Python equivalent of:
Microsoft.Azure.CloudConfigurationManager.GetSetting
Currently I locate the ServiceConfiguration.cscfg file and parse it myself!

I reviewed the source code and API documentation of the Azure SDK for Python, and I couldn't find any feature equivalent to Microsoft.Azure.CloudConfigurationManager.GetSetting in C#.
Meanwhile, it seems the Azure SDK for Python only supports reading the .cscfg file to create a Cloud Service deployment via the service management client, as in configuration = base64.b64encode(open('path_to_.cscfg_file', 'rb').read()), which is from here.
So it seems that parsing the .cscfg file is the only way to get the properties of a cloud service.
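Since the SDK has no built-in equivalent, a small helper can mimic GetSetting by parsing the XML directly. Below is a minimal sketch, assuming the classic ServiceConfiguration schema namespace; the file path, role name, and setting name in the usage line are placeholders.

import xml.etree.ElementTree as ET

# Namespace used by classic ServiceConfiguration.cscfg files
NS = {'sc': 'http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration'}

def get_setting(cscfg_path, role_name, setting_name):
    """Return the value of a ConfigurationSettings entry, or None if absent."""
    root = ET.parse(cscfg_path).getroot()
    for role in root.findall('sc:Role', NS):
        if role.get('name') != role_name:
            continue
        for setting in role.findall('sc:ConfigurationSettings/sc:Setting', NS):
            if setting.get('name') == setting_name:
                return setting.get('value')
    return None

# Usage (hypothetical names):
# value = get_setting('ServiceConfiguration.cscfg', 'WebRole1', 'StorageConnectionString')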

Related

Is there a way to check on past Pivotal Cloud Foundry (PCF) CLI buildpacks?

I'm currently trying to stop using a web proxy that allows internet access from an AWS Virtual Private Cloud, since it will soon no longer be available. I also use that internet access to fetch data from an API endpoint that has past buildpack data, such as the version and name of each buildpack (https://buildpacks.cloudfoundry.org/#/buildpacks). For general context, I'm using Python and AWS.
Despite my research, I haven't been able to find a CLI command that lets me get this data without using the PCF API. Is there any way to do this without internet access?
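One possible offline workaround: the cf CLI's cf buildpacks command lists the buildpacks installed on the targeted foundation (name, position, and filename, which usually embeds the version) without calling the public buildpacks site. A minimal sketch that shells out to it from Python, assuming the cf CLI is installed and already authenticated against the foundation; note this reports what is installed there, not the full release history.

import subprocess

# Run `cf buildpacks` and capture its table output.
result = subprocess.run(['cf', 'buildpacks'], capture_output=True, text=True, check=True)
for line in result.stdout.splitlines():
    print(line)  # rows include buildpack name, position, enabled, locked, filename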

Download or export Azure Databricks notebooks to my local machine in Python using the REST API

I need to automate a way to download Azure Databricks notebooks using Python to my local machine. Please let me know if there are any ways.
Yes, there is an API endpoint to export a notebook.
Refer to the documentation: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/api/latest/workspace#--export
Here's how to make API requests with Python: Making a request to a RESTful API using python
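Putting the two together, here is a minimal sketch of exporting one notebook with the requests library; the workspace host, token, notebook path, and output filename are placeholders.

import base64
import requests

HOST = 'https://<databricks-instance>'   # placeholder, e.g. your workspace URL
TOKEN = '<personal-access-token>'        # placeholder
NOTEBOOK_PATH = '/Users/someone@example.com/my-notebook'  # placeholder

resp = requests.get(
    HOST + '/api/2.0/workspace/export',
    headers={'Authorization': 'Bearer ' + TOKEN},
    params={'path': NOTEBOOK_PATH, 'format': 'SOURCE'},
)
resp.raise_for_status()

# The API returns the notebook content base64-encoded in the 'content' field.
with open('my-notebook.py', 'wb') as f:
    f.write(base64.b64decode(resp.json()['content']))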

How to authenticate the Google Vision API in the cloud inside a Python script?

I am trying to build a Python script and deploy it as an HTTP function/serverless cloud function on Pivotal Cloud Foundry or GKE. I have gone through several articles, and most of them mention using a service account (SA): download its JSON key, set an environment variable to the key's location, and run the script.
But how can I provide a locally downloaded JSON key file when I deploy to the cloud?
I have gone through the links below, but I couldn't follow them as I am new to GCP. Can anyone give me a detailed answer on how to achieve this?
Google Cloud Vision API - Python
Google cloud vision api- OCR
https://cloud.google.com/vision/docs/quickstart-client-libraries#client-libraries-usage-python
According to the docs, during function execution, Cloud Functions uses the service account PROJECT_ID@appspot.gserviceaccount.com as its identity. For instance, when making requests to Google Cloud Platform services using the Google Cloud Client Libraries, Cloud Functions can automatically obtain and use tokens to authorize the services this identity has permission to use.
By default, the runtime service account has the Editor role, which lets it access many GCP services. In your case, you will need to enable the Vision API and grant the default service account the necessary permissions. Check out the Function Identity docs for more details.
Instead of an SA JSON file, you could use an API key if that's easier for you. However, with an API key you will only be able to send image bytes or specify a public image.
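In practice this means the client library needs no key file at all when running on GCP. A minimal sketch, assuming the google-cloud-vision package (2.x-style names) and a hypothetical gs:// image URI:

from google.cloud import vision

# No explicit credentials: on Cloud Functions/GKE the library picks up the
# runtime service account automatically via Application Default Credentials.
client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri='gs://my-bucket/photo.jpg'))  # placeholder URI
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, label.score)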

When accessing BigQuery using a Python API, what is the difference between the Google API client and gcloud?

I searched for a Python API to interact with Google BigQuery and found two packages that provide similar APIs: the Google BigQuery client (part of the Google API Client package, googleapiclient) and the gcloud package.
Here is the documentation on using these two APIs with BigQuery:
Google API Client: googleapiclient
https://developers.google.com/resources/api-libraries/documentation/bigquery/v2/python/latest/index.html
https://cloud.google.com/bigquery/docs/reference/v2/
Google Cloud package: gcloud
http://googlecloudplatform.github.io/gcloud-python/stable/bigquery-usage.html
Both packages are from Google and provide similar functionality for interacting with BigQuery. I have the following questions:
It seems both packages cover a wide range of Google Cloud Platform functionality. In my view, gcloud also provides the command-line tool and local environment setup. Generally, what are the differences between these two packages?
In terms of Python modules, how does their usage differ?
Is there any relation between these two packages?
Which is more suitable for accessing BigQuery?
What kinds of jobs are they suited for?
The googleapiclient client is generated directly from the raw API definition (the definition is a JSON file, hosted here).
Because it is automatically generated, it is not what any sane python programmer would do if they were trying to write a python client for BigQuery. That said, it is the lowest-level representation of the API.
The gcloud client, on the other hand, was what a group of more-or-less sane folks at Google came up with when they tried to figure out what a client should look like for BigQuery. It is really quite nice, and lets you focus on what's important rather than converting results from the strange f/v format used in the BigQuery API into something useful.
Additionally, the documentation for the gcloud API was written by a doc writer. The documentation for the googleapiclient was, like the code, automatically generated from a definition of the API.
My advice, having used both (and having, mostly unsuccessfully, helped design the BigQuery API to try to make the generated client behave reasonably), is to use the gcloud client. It will handle a bunch of low-level details for you and generally make your life easier.
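To make the difference concrete, here is the same trivial query through both clients. This is a sketch, not canonical usage: it assumes Application Default Credentials are configured, uses a placeholder project ID, and uses the modern google.cloud.bigquery import (the gcloud package was later renamed google-cloud).

# 1) Low-level generated client (googleapiclient): raw API payloads.
from googleapiclient.discovery import build

service = build('bigquery', 'v2')
body = {'query': 'SELECT 17 AS answer', 'useLegacySql': False}
result = service.jobs().query(projectId='my-project', body=body).execute()
for row in result['rows']:                       # rows come back in the f/v format
    print([cell['v'] for cell in row['f']])

# 2) Hand-written client (gcloud / google-cloud-bigquery): plain Python rows.
from google.cloud import bigquery

client = bigquery.Client(project='my-project')
for row in client.query('SELECT 17 AS answer').result():
    print(row.answer)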

Get VM status using Azure Python SDK

I have a list of VMs and I'd like to get each VM's status (ReadyRole/Stopped/StoppedDeallocated) using Azure's Python SDK.
I have done this in a bash terminal using Azure CLI commands and a combination of grep, tail, and similar utils, but I'd like to do it in a Python script using Azure's SDK.
With the help of the Azure CLI, I run azure vm list in a shell script and then grep my way to the status of the VMs.
I've been looking into servicemanagementservice.py in the Azure SDK, but I can't find a function like get_role_status(). list_hosted_services() and get_hosted_service_properties() don't seem to provide the info I want, unless I'm missing something.
Can anyone point me towards a solution?
Based on my experience, we can get every instance's status using the Azure REST API.
So the Azure SDK for Python should have an equivalent method, because the SDK's functions call the same URLs as the REST API.
I used the method get_deployment_by_name to get the instance status:
from azure.servicemanagement import ServiceManagementService

subscription_id = '****-***-***-**'          # your subscription ID
certificate_path = 'CURRENT_USER\\my\\***'   # your management certificate
sms = ServiceManagementService(subscription_id, certificate_path)
result = sms.get_deployment_by_name('your service name', 'your deployment name')
You can then get the role instance list from the result and check each instance's properties (the original answer illustrated this with a screenshot of the deployment object).
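Continuing the snippet above, a minimal sketch of reading the statuses, assuming the legacy service management models where the deployment exposes a role_instance_list whose entries carry instance_status values such as ReadyRole or StoppedDeallocated:

# Iterate the deployment returned by get_deployment_by_name above.
for instance in result.role_instance_list:
    print(instance.instance_name, instance.instance_status)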
