How to authorize Azure Python SDK on VM instance?

In AWS, you can assign a role to a VM, which then authorizes the instance when it makes queries to the AWS SDK. I am looking for similar functionality in Azure, or something that comes close to it.
I found this post, which suggests that this is not possible in the way AWS does it. Are there any workarounds? I really don't want the system administrator to have to log in to the instance and enter their Azure Active Directory credentials to authorize it.

Excellent question :). I would suggest waiting a few days; we have something in progress that seems to fit your need. I created this issue for tracking.
The simplest approach for now would be to create Service Principal credentials for these VMs. To do that, run a post-deployment script that installs the CLI and executes "az ad sp create-for-rbac --sdk-auth > ~/mycredentials.json". Your SDK script can then read this credentials file.
The "create-for-rbac" command already exists if you want to look at it (--sdk-auth is the new option coming), so you can see that you can specify all the scopes and permissions needed on this command.
(I own the Azure SDK for Python at MS)
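To illustrate, here is a minimal sketch of a script on the VM reading that credentials file. It assumes the JSON layout produced by --sdk-auth (clientId, clientSecret, tenantId, subscriptionId) and the azure-identity / azure-mgmt-resource packages; adapt it to whichever management client you actually need:

import json

from azure.identity import ClientSecretCredential
from azure.mgmt.resource import ResourceManagementClient

# Load the file written by `az ad sp create-for-rbac --sdk-auth` (example path).
with open("/home/azureuser/mycredentials.json") as f:
    sp = json.load(f)

# Build a credential object from the service principal values in that file.
credential = ClientSecretCredential(
    tenant_id=sp["tenantId"],
    client_id=sp["clientId"],
    client_secret=sp["clientSecret"],
)

# Use it with any management client, e.g. listing resource groups.
client = ResourceManagementClient(credential, sp["subscriptionId"])
for group in client.resource_groups.list():
    print(group.name)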

Related

Is there a way to check on past Pivot Cloud Foundry (PCF) CLI buildpacks?

I'm currently trying to stop using a web proxy that provides internet access from an AWS Virtual Private Cloud, since it won't be in use much longer. I also use that internet access to fetch data from an API endpoint that has past buildpack data, such as the version and name of the buildpack itself (https://buildpacks.cloudfoundry.org/#/buildpacks). For context, I'm currently using Python and AWS for this.
Despite my research, I haven't been able to find a CLI command that lets me get this data without using the PCF API. Is there any way to do this without internet access?

Google Application Credentials Python (Jupyter Notebook)

I'm trying to start using the Google Analytics Data API v1 as instructed, with Python and a Jupyter Notebook. I follow the instructions at https://developers.google.com/analytics/devguides/reporting/data/v1/quickstart-client-libraries and get to Step 3, "Configure authentication".
There they write that you need to set GOOGLE_APPLICATION_CREDENTIALS="[PATH]". I downloaded this file to my computer and added it to the project folder, but I can't get authenticated using the service account.
On GitHub (https://github.com/googleapis/python-analytics-data#installation) they write that you need to use a virtual environment. Is that so? Will it work without it?
I am using a service account, not OAuth 2.0.
To be clear, GOOGLE_APPLICATION_CREDENTIALS is an environment variable. This variable is used by many of the Google client libraries to load the credentials for any of the APIs. The question I marked this as a duplicate of shows a number of ways to set it.
As you seem to still be a little unsure, here is some additional information.
As stated in the docs:
An easy way to provide service account credentials is by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable; the API client will use the value of this variable to find the service account key JSON file.
You need to set an environment variable on your machine to the path of the service account key file.
There are a number of examples of how to do that:
Authenticating as a service account
Set GOOGLE_APPLICATION_CREDENTIALS in Python project to use Google API
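A minimal sketch of the same idea from inside a notebook or script, assuming the key file has been saved locally (the path and the BetaAnalyticsDataClient example from the google-analytics-data package are only illustrations):

import os

# Point the Google client libraries at the service account key file.
# Replace the path with the location of your own downloaded JSON key.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"

# Any client created after this point picks the credentials up automatically,
# for example the Analytics Data API client:
from google.analytics.data_v1beta import BetaAnalyticsDataClient

client = BetaAnalyticsDataClient()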
I did it. I'll describe how it turned out, for beginners:
https://developers.google.com/analytics/devguides/reporting/data/v1/quickstart-client-libraries
Click the blue "Enable the Google Analytics Data API v1" button: a project is created, a service account is created in Google Cloud, and the Google Analytics Data API is activated. Download the JSON key file.
After that, you need to add the service account to the GA4 property.
Then you need to authenticate using the method from here: https://cloud.google.com/docs/authentication/production#setting_the_environment_variable
This requires passing credentials manually. Using the code in the article, you point it at the path of the JSON file on your computer.
When trying to authenticate the service account, a 403 may be returned: GET storage.googleapis.com/storage/v1/… starting-account-smhpwtovr5jj@test-data-api-1654114095791.iam.gserviceaccount.com does not have storage.buckets.list access to the Google Cloud project.
To fix this, go to IAM and grant the service account a role such as Owner or Storage Admin. After that, GOOGLE_APPLICATION_CREDENTIALS is found and you can proceed with installing the Data API library.
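For reference, the "passing credentials manually" variant from that article looks roughly like this. This is only a sketch: the key file path is an example, and the Cloud Storage client simply mirrors the article's sample (the same credentials object can be passed to the Analytics Data client instead):

from google.oauth2 import service_account
from google.cloud import storage

# Build credentials explicitly from the downloaded key file
# instead of relying on GOOGLE_APPLICATION_CREDENTIALS.
credentials = service_account.Credentials.from_service_account_file(
    "/path/to/service-account-key.json"
)

# Pass the credentials object to whichever client you are using.
client = storage.Client(credentials=credentials, project=credentials.project_id)
print([bucket.name for bucket in client.list_buckets()])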

How to update AzureWebJob after it is created

I created an Azure WebJob manually via the Azure portal, but now I need to update the source code. I have not found a way to re-upload the source code content (a Python script). Is there any CLI I can use to update the job, or do I have to delete the job and re-create it every time I need an update?
There is a guide to deployment via Visual Studio, but only for .NET; is there something similar for Python?
https://learn.microsoft.com/en-us/azure/app-service/webjobs-dotnet-deploy-vs
In the end it is very easy!
It is possible to connect to the App Service via an FTPS client.
The credentials can be found in the Azure Portal under
Home > <App Service Name> > Deployment Center > FTPS credentials
and the source code of the jobs lives in the folder
/site/wwwroot/App_Data/jobs
and that's it!
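If you prefer to script the upload rather than use a graphical FTPS client, a minimal sketch with Python's built-in ftplib could look like this (the host name, credentials and the triggered/myjob subfolder are placeholders; check your own FTPS endpoint and job type in the portal):

from ftplib import FTP_TLS

# FTPS endpoint and credentials from Deployment Center > FTPS credentials (placeholders).
host = "waws-prod-xx-000.ftp.azurewebsites.windows.net"
user = "myapp\\$myapp"
password = "..."

ftps = FTP_TLS(host)
ftps.login(user, password)
ftps.prot_p()  # switch to a protected data channel

# Overwrite the job's script under App_Data/jobs (triggered/continuous and
# the job folder name depend on how the WebJob was created).
ftps.cwd("/site/wwwroot/App_Data/jobs/triggered/myjob")
with open("run.py", "rb") as f:
    ftps.storbinary("STOR run.py", f)

ftps.quit()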

Best practice to publish Dialogflow JSON Service Account credentials

I made a CLI application in Python which uses Google Dialogflow.
As the documentation describes, I created a Service Account and downloaded the JSON key file. Then I loaded it in Python and the application works.
Now I want to publish my software on GitHub and pip, but when I upload all the files I receive an e-mail from Google stating that I am not managing my credentials correctly. And I agree with that.
The problem is that I do not understand how to manage those credentials properly.
There are two solutions:
Either anyone can access your backend and you don't need a service account, because it's public.
Or it's private and you don't publish your key (if the secret is known to everyone, the security is useless, so you might as well make it public!). It's a requirement of the deployment not to commit the key publicly, but to document and explain how users configure their own service account to use your app.
Provide more about your context and what you want to achieve to get better advice. What do you want to protect? Where will you deploy your app? ...
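In practice that usually means your published code loads a key the user supplies rather than one you ship. A minimal sketch (the DIALOGFLOW_KEY_FILE variable name is just an example; from_service_account_file is the pattern offered by the google-cloud-dialogflow clients):

import os
from google.cloud import dialogflow  # pip install google-cloud-dialogflow

# The user sets this to the path of *their own* service account key;
# nothing sensitive ships with the package or the GitHub repo.
key_path = os.environ.get("DIALOGFLOW_KEY_FILE")  # example variable name
if not key_path:
    raise SystemExit("Set DIALOGFLOW_KEY_FILE to your Dialogflow service account JSON key")

sessions = dialogflow.SessionsClient.from_service_account_file(key_path)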

How to determine authentication method while using Google Cloud Platform client libraries locally

I'm currently able to run a local Python script that calls the Google Vision API using the Python client library (specifically, the google-cloud-vision package). However, I'm curious about how it's authenticating. In the Python script that I run locally I do not provide any authentication information. From reading the posts below, it seems that a common way to authenticate when running locally is to set an environment variable to the path of a JSON key file (i.e. export GOOGLE_APPLICATION_CREDENTIALS=path/to/JSON/key/file); however, I don't recall doing this, and if I run printenv, I do not have an environment variable called GOOGLE_APPLICATION_CREDENTIALS.
The below posts provide great details about different ways to authenticate using the client libraries locally, but how can I see/determine exactly how my program is being authenticated? Is there a way to query for this?
"Authenticating to the Cloud Vision API"...including the "Application Default Credentials" part of the above page
"Authenticating Applications With a Client Library" section of Creating and Enabling Service Accounts for Instances
"Providing Credentials to Your Application" section of "Setting Up Authentication for Server to Server Production Capabilities" page
"Setting the Environment Variable" Section of "Getting Started With Authentication" page:
Python client libraries "Getting Started" page:
"Authenticating to a Cloud API Service"
There are 4 different ways for the request to be authenticated without explicitly creating a credentials object.
If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set to the path of a valid service account JSON private key file, then it is used.
If the Google Cloud SDK is installed and has application default credentials set then it is used. Note that if you've done this step once in the past, it will stay valid. (I'm guessing that this is what you're currently using to authenticate.)
If the application is running in the App Engine Standard environment then the credentials and project ID from the App Identity Service are used. (Not applicable here but I'm listing it for completeness' sake.)
If the application is running in Compute Engine or the App Engine flexible environment then the credentials and project ID are obtained from the Metadata Service. (Not applicable here but I'm listing it as well for completeness' sake.)
If no credentials are found using the methods above, DefaultCredentialsError will be raised. Since you're not getting this error, you don't have the environment variable from #1 set, and options #3 and #4 are not applicable, the only remaining option is #2.
The above information can be found on the readthedocs.io Authentication page for google-cloud, and more specifically in the google.auth package documentation.
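If you want to query this from code rather than reason by elimination, the sketch below uses google.auth.default(), which walks the same chain as the client libraries, and prints which credential type was resolved; the class names in the comments are what I would expect for each case:

import google.auth

# Resolves Application Default Credentials using the same lookup order
# described above and returns the credentials plus the detected project ID.
credentials, project_id = google.auth.default()

# The concrete class hints at the source, e.g.
#   google.oauth2.credentials.Credentials               -> gcloud application-default login
#   google.oauth2.service_account.Credentials           -> GOOGLE_APPLICATION_CREDENTIALS key file
#   google.auth.compute_engine.credentials.Credentials  -> Compute Engine metadata server
print(type(credentials))
print(project_id)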
You can check if you have the application default credentials set up by running this command:
gcloud auth application-default print-access-token
If this doesn't return an error but an access token, it means that #2 is set up. Don't share this token with anyone of course...
As some related information: you can check the token printed by the command above here, or using the curl command below (paste the token at the end):
curl -i https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=
This doesn't exactly answer your question, but by process of elimination it should be the correct one...
