Not getting Azure VM metric data values - Python

I am trying to fetch metric values for a VM using the REST API below:
https://management.azure.com/subscriptions/aac11d2f-f03b-454e-9f65-4eb00795f964/resourceGroups/test-rg/providers/Microsoft.Compute/virtualMachines/ubuntu/metrics?api-version=2014-04-01&$filter=%28name.value%20eq%20%27\Memory\Availableand%20timeGrain%20eq%20duration%27PT1M%27%20and%20startTime%20eq%202016-02-18T20%3A26%3A00.0000000Z%20and%20endTime%20eq%202016-03-23T21%3A26%3A00.0000000Z
But it gives me a Bad Request error. Can anyone help me out with this?

Update: I finally found the APIs to list Azure resource metrics. Sharing the REST API link:
https://msdn.microsoft.com/en-us/library/azure/mt743622.aspx
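For reference, a minimal sketch of calling that metric-values endpoint from Python with requests; the api-version and the way you acquire the bearer token are assumptions on my part, so verify them against the linked docs:

import requests

# Hypothetical values; acquire the bearer token from Azure AD for the
# https://management.azure.com/ resource (e.g. via a service principal).
subscription_id = "<subscription-id>"
token = "<bearer-token>"

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    "/resourceGroups/test-rg"
    "/providers/Microsoft.Compute/virtualMachines/ubuntu"
    "/providers/microsoft.insights/metrics"
)
resp = requests.get(
    url,
    params={"api-version": "2016-09-01"},  # assumed version; check the docs
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
for metric in resp.json().get("value", []):
    print(metric["name"]["value"])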

If you want to get VM metrics from the Azure platform, you can follow these documents.
Use the Azure Storage service to store your metrics, then retrieve them with the Storage SDK or REST API, whether you use classic mode or Resource Group mode.
Please refer to the official documentation on how to enable diagnostics settings in the Azure Portal:
https://azure.microsoft.com/en-in/blog/windows-azure-virtual-machine-monitoring-with-wad-extension/
and on how to use the Azure Storage REST API (https://msdn.microsoft.com/en-us/library/azure/dd179355.aspx) and SDK (https://github.com/Azure?utf8=%E2%9C%93&query=storage); a short sketch follows below.
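For example, once the WAD diagnostics extension is writing counters into your storage account, you can read them back from table storage. A minimal sketch, assuming the legacy azure-cosmosdb-table package, the default WADPerformanceCountersTable table name, and hypothetical account values:

from azure.cosmosdb.table.tableservice import TableService

# Storage account the WAD diagnostics extension writes to (hypothetical values).
table_service = TableService(account_name="mystorageaccount", account_key="<account-key>")

# WAD stores performance counters in this table; the counter name is an example.
entities = table_service.query_entities(
    "WADPerformanceCountersTable",
    filter="CounterName eq '\\Memory\\Available Bytes'",
)
for entity in entities:
    print(entity.Timestamp, entity.CounterName, entity.CounterValue)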
From your description, it seems that you used the Application Insights service to show your metrics on Azure. Based on my experience, Application Insights is in preview and is designed for live apps on the Azure platform, such as web apps, Android apps, and so on.

Related

Azure API or SDK to get a list of app registrations and the certificates associated with them

I want to connect my Python application to Azure.
Problem statement:
Get a list of registered apps with their certificate thumbprints and expiry details.
I have searched a lot but can't find a suitable API. Please help me.
You will need to use the Microsoft Graph API for that.
The API operation you would want to use is List applications. The information about the certificates associated with each application will be available in the keyCredentials attribute of the response.
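A minimal sketch of that call using MSAL and requests; the tenant/app values are placeholders, and I'm assuming an app registration with the admin-consented Application.Read.All application permission:

import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

resp = requests.get(
    "https://graph.microsoft.com/v1.0/applications?$select=displayName,keyCredentials",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
resp.raise_for_status()
for app_info in resp.json()["value"]:
    for cred in app_info.get("keyCredentials", []):
        # endDateTime is the certificate's expiry; for certificates,
        # customKeyIdentifier carries the thumbprint.
        print(app_info["displayName"], cred["customKeyIdentifier"], cred["endDateTime"])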
You could use the Azure SDK for this. For authorization, check out the Identity package.
Do you want to host your app in Azure as well? Then you should use Managed Identity. Alternatively, create a service principal for your app; it must be granted read permissions on the app registrations.
If you use the SDK, you only need a credential object created with the Tenant ID, App ID, and Secret (service principal). See here: https://learn.microsoft.com/en-us/azure/developer/python/azure-sdk-authenticate
Basically, I recommend using the SDK instead of calling Graph directly. The SDK takes a lot of work off your hands ;)
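A minimal sketch of the credential object described above, using the azure-identity package; all values are placeholders:

from azure.identity import ClientSecretCredential

# Service principal values (hypothetical).
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<app-id>",
    client_secret="<secret>",
)

# Any azure-* SDK client accepts this credential object; you can also
# pull a raw token for Microsoft Graph yourself:
access_token = credential.get_token("https://graph.microsoft.com/.default")
print(access_token.token[:20], "...")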

How to authenticate an end user with OAuth 2.0 for the BigQuery API, with Python as backend code in a Cloud Function

We have created a Flutter web app that fetches BigQuery data through the BigQuery API from a Cloud Function. We were using a service account for authentication, but since we want to make our application public, we need to use OAuth for end users and OAuth credentials.
I tried to deploy the code from this link for testing on a Cloud Function, but the function keeps running and shuts down with a timeout. I then checked the logs and found the reason: the Cloud Function cannot open a browser for authentication the way it would when run locally.
Logs:
Function execution started
Please visit this URL to authorize this application: https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=XXXXXXXXXXXXXXXX&redirect_uri=http%3A%2F%2Flocalhost%3A8080%2F&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fbigquery&state=IXYIkeZUTaisTMGUGkVbgnohlor7Jx&access_type=offline.
Function execution took 180003 ms, finished with status: 'timeout'
I am confused about how I can authenticate and authorize a user once and reuse those credentials for every other BigQuery API call in our web app.
I think you are missing the point of the use of Cloud Functions. The documentation you shared clearly states:
This guide explains how to authenticate by using user accounts for access to the BigQuery API when your app is installed onto users' machines.
This is never the case for a Cloud Function, since it is hosted on a Google Cloud server and invoked via an HTTP request or a background event.
Because of that, a Cloud Function interacts with other GCP products by using service accounts, and if you want to set up authentication, you will have to do it at the Cloud Function layer. I recommend taking a look at this documentation, which explains the principles of authentication with Cloud Functions.
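To illustrate: inside a Cloud Function, the BigQuery client picks up the function's runtime service account automatically through Application Default Credentials, so there is no interactive flow at all. A minimal sketch with a hypothetical query:

from google.cloud import bigquery

def handler(request):
    # Application Default Credentials resolve to the function's runtime
    # service account; no browser-based OAuth flow is needed or possible.
    client = bigquery.Client()
    rows = client.query("SELECT 1 AS x").result()  # hypothetical query
    return str([row.x for row in rows])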

How to connect to the Google Sheets API on AWS EC2

I am trying to use the Google Sheets API to load data into EC2 using Python.
So, I tried this quickstart.
But I am stuck configuring the OAuth client to get the credentials.json file. I can't figure out which application type to select in the drop-down.
Hope I was clear. Thanks in advance for your time.
Depending on the type of application you want to create, you will have to choose one of the options provided in the drop-down.
Since you want to use a Python script, you can use credentials of type Desktop app and run the same application from your EC2 instance. However, you can always create new credentials in the corresponding GCP project to match the application you are working on (see the sketch after the reference below).
Reference
Sheets API Authorize Requests.
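A minimal sketch of the quickstart flow with a Desktop-type client on a headless EC2 instance. Note that run_console() was available in older google-auth-oauthlib releases; newer releases may require run_local_server() plus SSH port forwarding, so verify against your installed version:

from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets.readonly"]

# credentials.json is the Desktop-type OAuth client file downloaded from
# the GCP console and copied onto the EC2 instance.
flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)

# No local browser on a headless box: print the auth URL, paste the code back.
creds = flow.run_console()

service = build("sheets", "v4", credentials=creds)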

How to use an API key with Google Cloud client libraries (Python/Datastore)?

I have a Google Cloud Platform Datastore instance and would like to run queries against it from a Python client. All of the examples show how to do this given a service account keyfile:
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/[FILE_NAME].json"
However, I plan to run this on a serverless platform and don't want to include a keyfile with my deployment. Instead, I've set up an API key and given it access to Datastore.
How can I then make datastore.Client() aware of this API Key?
from google.cloud import datastore
datastore_client = datastore.Client()
It takes credentials as an optional keyword arg, but I can't figure out how to create the proper credentials object out of an API key.
As an aside
I am amenable to other forms of auth if recommended. Basically, I want to deploy a web application on Google Cloud Platform - what is the standard way to manage service accounts so that keyfiles don't have to be passed around ad hoc?
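Regarding the aside: on GCP's own serverless platforms (App Engine, Cloud Functions, Cloud Run) the usual pattern is to rely on Application Default Credentials resolving to the runtime service account, so no keyfile ships with the deployment. A minimal sketch:

from google.cloud import datastore

# On App Engine / Cloud Functions / Cloud Run, Application Default
# Credentials resolve to the service account attached to the deployment,
# so datastore.Client() needs no keyfile or API key.
client = datastore.Client()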

How to authenticate to a REST endpoint on Google Cloud Firestore API

I am looking at writing a short python/nodejs script that will call out to the exportDocuments API route from Google's Firestore.
This page shows how to use gcloud but it just isn't an option since I am calling from inside an AWS Lambda function.
I am a GCP newbie, but not a Python/REST newbie. I couldn't find an SDK that exposes this endpoint (but maybe I am wrong here).
I poked around the terrible GCP documentation, made a service account, and gave it the Cloud Datastore Import Export Admin role.
I also looked at Google's Application Default Credentials, which don't help me since I am in Lambda.
The one thing I didn't dive into is the http.proto that GCP uses, because I am not familiar with it and it looks like a big rabbit hole.
So does anyone have sample Python or Node.js code for how to make a POST request to a GCP REST endpoint authenticated with a service account? Or is there an SDK that exposes v1beta1 of Firestore? I wasn't able to find one on their docs page.
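One way to sketch this with a service-account key and google-auth's AuthorizedSession; the project ID, bucket, and key path below are hypothetical, and the v1beta1 route is the one from the question:

from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

# Key for the service account with the Import Export Admin role (hypothetical path).
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/datastore"],
)
session = AuthorizedSession(creds)  # a requests.Session that attaches the token

project = "my-project"
url = (
    f"https://firestore.googleapis.com/v1beta1/projects/{project}"
    "/databases/(default):exportDocuments"
)
resp = session.post(url, json={"outputUriPrefix": "gs://my-bucket/backups"})
print(resp.status_code, resp.json())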
