I am new to Azure. I am learning the Azure Python SDK and have some doubts.
I am not using any credentials to log in to my Azure account in the code below, yet I can still access the VMs in my subscription. How is that possible?
I am trying to get a list of all VMs using list_all(), which is documented at https://learn.microsoft.com/en-us/python/api/azure-mgmt-compute/azure.mgmt.compute.v2018_10_01.operations.virtualmachinesoperations?view=azure-python#list-all-custom-headers-none--raw-false----operation-config- . How can I get the list of VMs, i.e. how do I iterate over the VirtualMachinePaged object returned by list_all()?
When I tried to print the name of a VM using print(client.virtual_machines.get(resource_group_name='GSLab', vm_name='GSLabVM2')), I got the error Resource group 'GSLab' could not be found. I checked and am sure the resource group's name is 'GSLab', so why am I getting this error?
Here is my code. Thank you, and please suggest any other sources for a better understanding of these concepts if possible.
from azure.common.client_factory import get_client_from_auth_file
from azure.mgmt.compute import ComputeManagementClient
client = get_client_from_auth_file(ComputeManagementClient)
#print(client)
vmlist = client.virtual_machines.list_all()
print(vmlist)
for vm in vmlist:
    print(vm.name)
print(client.virtual_machines.get(resource_group_name='GSLab', vm_name='GSLabVM2'))
Q1: You get the credentials from the authentication file that you set up; the service principal is stored in it.
Q2: Your for loop is the right way to iterate over the VirtualMachinePaged object; print(vmlist) only prints the paged object itself, not the VMs, so just delete that line and the loop will print the VM names.
Q3: The code
client.virtual_machines.get(resource_group_name='GSLab', vm_name='GSLabVM2')
reproduces the error you describe, so you need to check whether the resource group 'GSLab' really exists in the subscription you have set in the authentication file.
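One quick way to verify that is to list the resource groups the authenticated subscription actually contains. A minimal sketch, reusing the same auth-file approach as your code and assuming the azure-mgmt-resource package is installed:
from azure.common.client_factory import get_client_from_auth_file
from azure.mgmt.resource import ResourceManagementClient

# Reuses the auth file referenced by AZURE_AUTH_LOCATION, like the compute client above
resource_client = get_client_from_auth_file(ResourceManagementClient)

# Print every resource group in the subscription; 'GSLab' should appear here
for rg in resource_client.resource_groups.list():
    print(rg.name)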
vmlist = client.virtual_machines.list_all()
for vm in vmlist:
    print(vm.name)
This code is correct, and so is this one:
client.virtual_machines.get(resource_group_name='GSLab', vm_name='GSLabVM2')
If they both return nothing, you authenticated against the wrong subscription; you need to authenticate against the proper one.
A simple way to check that you get some output:
vmlist.next().name
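If you are not sure which subscription you are authenticated against, a small check like this can help (a sketch; it assumes the auth file was created with az ad sp create-for-rbac --sdk-auth and that the older, msrest-based client exposes config.subscription_id):
import json
import os

# The auth file referenced by AZURE_AUTH_LOCATION records the subscription ID
with open(os.environ["AZURE_AUTH_LOCATION"]) as f:
    print(json.load(f)["subscriptionId"])

# The client built from that file should report the same subscription
print(client.config.subscription_id)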
For my current Python project I'm using the Microsoft Azure SDK for Python.
I want to copy a specific blob from one container path to another and have already tested some options, described here.
Overall they are basically "working", but unfortunately the new_blob.start_copy_from_url(source_blob_url) command always leads to an error: ErrorCode:CannotVerifyCopySource.
Is anyone else getting the same error message, or does anyone have an idea how to solve it?
I also tried passing the source_blob_url with a SAS token, but it still doesn't work. I have the feeling this is somehow connected to the access levels of the storage account, but so far I haven't been able to figure it out. Hopefully someone here can help me.
As you have mentioned, you might be receiving this error because of the permissions on the SAS token you are including.
The difference from my code was that I used the blob storage SAS token from the Azure portal instead of generating it directly for the blob client with the Azure function.
To allow access to certain areas of your storage account, a SAS is generated with a set of restrictions such as read/write permissions, allowed services, resource types, start and expiry date/time, allowed IP addresses, etc.
You don't always need to generate it directly for the blob client in code; you can also generate one from the portal, as long as you grant the required permissions.
REFERENCES: Grant limited access to Azure Storage resources using SAS - MSFT Document
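For reference, here is a rough sketch of generating a read SAS for the source blob in code and passing it to start_copy_from_url (the account, key, container and blob names are placeholders; it assumes the azure-storage-blob v12 package and that you have the account key):
from datetime import datetime, timedelta
from azure.storage.blob import BlobServiceClient, BlobSasPermissions, generate_blob_sas

account_name = "<storage-account>"          # placeholder values
account_key = "<account-key>"
src_container, src_blob = "source-container", "folder/file.txt"
dst_container, dst_blob = "dest-container", "folder/file.txt"

# Generate a short-lived, read-only SAS scoped to the source blob
sas = generate_blob_sas(
    account_name=account_name,
    container_name=src_container,
    blob_name=src_blob,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)
source_blob_url = "https://{}.blob.core.windows.net/{}/{}?{}".format(
    account_name, src_container, src_blob, sas
)

# Start the server-side copy to the destination blob
service = BlobServiceClient(
    account_url="https://{}.blob.core.windows.net".format(account_name),
    credential=account_key,
)
new_blob = service.get_blob_client(container=dst_container, blob=dst_blob)
new_blob.start_copy_from_url(source_blob_url)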
I just wanted to know if there is a way to check whether a Python script is running inside a Compute Engine instance or in a local environment.
I want to check this in order to know how to authenticate. For example, when a script runs on a Compute Engine instance and I want to initialize a BigQuery client, I do not need to authenticate explicitly, but when running a script locally I need to authenticate using a service account JSON file.
If I knew whether a script is running locally or in a Compute Engine I would be able to initiate Google services accordingly.
I could put initialization into a try-except statement but maybe there is another way?
Any help is appreciated.
If I understand your question correctly, I think a better solution is provided by Google, called Application Default Credentials. See Best practices to securely auth apps in Google Cloud (thanks #sethvargo) and Application Default Credentials.
Using this mechanism, authentication becomes consistent regardless of where you run your app (on- or off-GCP). See finding credentials automatically.
When you run off-GCP, you set GOOGLE_APPLICATION_CREDENTIALS to point to the service account key file. When you run on-GCP (and, to be clear, you are still authenticating, it's just transparent), you don't set the environment variable, because the library obtains, for example, the Compute Engine instance's service account for you.
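A minimal sketch of what this looks like in practice, assuming the google-cloud-bigquery package: the same code runs unchanged on Compute Engine and locally, provided GOOGLE_APPLICATION_CREDENTIALS is set in the local case.
from google.cloud import bigquery

# No explicit credentials: the library resolves them via Application Default
# Credentials (the instance's service account on GCP, the key file pointed to
# by GOOGLE_APPLICATION_CREDENTIALS locally).
client = bigquery.Client()

for dataset in client.list_datasets():
    print(dataset.dataset_id)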
So I read a bit about Google Cloud authentication and came up with this solution:
import google.auth
from google.auth.exceptions import DefaultCredentialsError
from google.cloud import storage
from google.oauth2 import service_account

try:
    # Succeeds on GCP (e.g. Compute Engine) or when GOOGLE_APPLICATION_CREDENTIALS is set
    credentials, project = google.auth.default()
except DefaultCredentialsError:
    # Fall back to an explicit service account key file when running locally
    credentials = service_account.Credentials.from_service_account_file('/path/to/service_account_json_file.json')

client = storage.Client(credentials=credentials)
What this does is try to retrieve the default Google Cloud credentials (in environments such as Compute Engine), and if that fails, authenticate using a service account JSON file instead.
It might not be the best solution but it works and I hope it will help someone else too.
I would like to access the contents of my Azure Data Lake Gen2 account from my local Python editor. What would be the best way to do this?
I googled, but there are multiple ways to do this, for instance SAS or a service principal.
Could someone please provide any pointers in the right direction?
Thank you.
Use the DataLakeServiceClient documented here to connect:
https://learn.microsoft.com/en-us/python/api/azure-storage-file-datalake/azure.storage.filedatalake.datalakeserviceclient?view=azure-python
And use this to create the credential:
from azure.identity import DefaultAzureCredential
credential = DefaultAzureCredential()
And for anything you want it to access, you need to assign an access role (RBAC; identities without the corresponding permissions cannot perform the operations):
You can try the steps above.
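Putting those pieces together, a minimal sketch might look like this (the account URL and file system name are placeholders; it assumes the azure-identity and azure-storage-file-datalake packages, and that your identity has a data role such as Storage Blob Data Reader on the account):
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = DefaultAzureCredential()

# Placeholder account URL for an ADLS Gen2 (hierarchical namespace) account
service_client = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=credential,
)

# List the paths in one file system (container)
file_system_client = service_client.get_file_system_client("<file-system>")
for path in file_system_client.get_paths():
    print(path.name)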
I have code below that was given to me to list Google Cloud Service Accounts for a specific Project.
import os
from googleapiclient import discovery
from gcp import get_key
"""gets all Service Accounts from the Service Account page"""
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = get_key()
service = discovery.build('iam', 'v1')
project_id = 'projects/<google cloud project>'
request = service.projects().serviceAccounts().list(name=project_id)
response = request.execute()
accounts = response['accounts']
for account in accounts:
    print(account['email'])
This code works perfectly and prints the accounts as I need them. What I'm trying to figure out is:
Where can I go to see how to construct code like this? I found a site that has references to the Python API Client, but I can't seem to figure out how to build the code above from it. I can see the method to list the Service Accounts, but it's still not giving me enough information.
Is there somewhere else I should be going to educate myself? Any information you have is appreciated so I don't pull out the rest of my hair.
Thanks, Eric
Let me share with you this documentation page, where there is a detailed explanation of how to build a script such as the one you shared, and what each line of code means. It is taken from the documentation of ML Engine, not IAM, but it uses the same Python Google API Client Library, so just ignore the references to ML and the rest will be useful for you.
In any case, here it is a commented version of your code, so that you understand it better:
# Imports for the Client API Libraries and the key management
import os
from googleapiclient import discovery
from gcp import get_key
# Look for an environment variable containing the credentials for Google Cloud Platform
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = get_key()
# Build a Python representation of the REST API
service = discovery.build('iam', 'v1')
# Define the Project ID of your project
project_id = 'projects/<google cloud project>'
"""Until this point, the code is general to any API
From this point on, it is specific to the IAM API"""
# Create the request using the appropriate 'serviceAccounts' API
# You can substitute serviceAccounts by any other available API
request = service.projects().serviceAccounts().list(name=project_id)
# Execute the request that was built in the previous step
response = request.execute()
# Process the data from the response obtained with the request execution
accounts = response['accounts']
for account in accounts:
    print(account['email'])
Once you understand the first part of the code, the last lines are specific to the API you are using, which in this case is the Google IAM API. In this link, you can find detailed information on the methods available and what they do.
Then, you can follow the Python API Client Library documentation that you shared in order to see how to call the methods. For instance, in the code you shared, the method used depends on service, which is the Python representation of the API, and then goes down the tree of methods in the last link, as in projects(), then serviceAccounts() and finally the specific list() method, which ends up in request = service.projects().serviceAccounts().list(name=project_id).
Finally, just in case you are interested in the other available APIs, please refer to this page for more information.
I hope the comments I made on your code were of help, and that the documentation shared makes it easier for you to understand how a code like that one could be scripted.
You can use IPython with googleapiclient installed, with something like:
sudo pip install --upgrade google-api-python-client
Then go to the interactive Python console and do:
from googleapiclient import discovery
dir(discovery)
help(discovery)
dir gives all the entries an object has, so:
a = ''
dir(a)
will tell you what you can do with a string object. Doing help(a) will give help for the string object. You can go deeper:
dir(discovery)
# and then for instance
help(discovery.re)
You can run your script step by step, print the result at each step, and do some research; once you have something working, run %history to print out your session, and you have a solution that can be turned into a script.
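As a small sketch of that workflow (it assumes GOOGLE_APPLICATION_CREDENTIALS is already set, as in the snippet further up), dir and help walk you down the same method tree the final script uses:
from googleapiclient import discovery

service = discovery.build('iam', 'v1')

# Explore the generated API surface step by step
print(dir(service))                               # shows projects() among other entries
print(dir(service.projects().serviceAccounts()))  # shows list(), get(), create(), ...
help(service.projects().serviceAccounts().list)   # documents the parameters list() accepts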
I get an error when trying to deallocate a virtual machine with the Python SDK for Azure.
Basically I try something like:
from pprint import pprint
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.compute import ComputeManagementClient

credentials = ServicePrincipalCredentials(client_id, secret, tenant)
compute_client = ComputeManagementClient(credentials, subscription_id, '2015-05-01-preview')
result = compute_client.virtual_machines.deallocate(resource_group_name, vm_name)
pprint(result.result())
-> exception:
msrestazure.azure_exceptions.CloudError: Azure Error: AuthorizationFailed
Message: The client '<some client UUID>' with object id '<same client UUID>' does not have authorization to perform action 'Microsoft.Compute/virtualMachines/deallocate/action' over scope '/subscriptions/<our subscription UUID>/resourceGroups/<resource-group>/providers/Microsoft.Compute/virtualMachines/<our-machine>'.
What I don't understand is that the error message contains an unknown client UUID that I have not used in the credentials.
Python is version 2.7.13 and the SDK version was from yesterday.
What I guess I need is a registration for an application, which I did to get the information for the credentials. I am not quite sure which exact permission(s) I need to register for the application in IAM. When adding an access entry I can only pick existing users, not an application.
So is there any programmatic way to find out which permissions are required for an action and which permissions our client application has?
Thanks!
As #GauravMantri & #LaurentMazuel said, the issue was caused by not assigning a role/permission to the service principal. I had answered another SO thread, Cannot list image publishers from Azure java SDK, which is similar to yours.
There are two ways to resolve the issue: using the Azure CLI, or doing these operations in the Azure portal. Please see the details of my other answer for the first; below I describe the second (older) way.
And since you want to find out these permissions programmatically, you can refer to the REST API Role Definition List to get all role definitions that are applicable at a scope and above, or use the Azure Python SDK Authorization Management client to do it in code via authorization_client.role_definitions.list(scope).
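A rough sketch of that SDK approach (assuming the azure-mgmt-authorization package; exact attribute names can vary a little between SDK versions):
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.authorization import AuthorizationManagementClient

credentials = ServicePrincipalCredentials(client_id=client_id, secret=secret, tenant=tenant)
authorization_client = AuthorizationManagementClient(credentials, subscription_id)

scope = '/subscriptions/' + subscription_id

# All role definitions applicable at this scope and above, with their names
for role in authorization_client.role_definitions.list(scope):
    print(role.role_name)

# Role assignments that currently apply at this scope
for assignment in authorization_client.role_assignments.list_for_scope(scope):
    print(assignment.role_definition_id)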
Hope it helps.
Thank you all for your answers! The best recipe for creating an application and registering it with the right role, Virtual Machine Contributor, is indeed presented at https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-group-create-service-principal-portal
The main issue I had was that there is a bug in adding a role within IAM. I use Add and select "Virtual Machine Contributor". Under "Select" I am presented with a list of users, but not the application that I created for this purpose. Entering the first few letters of my application's name, however, gives a filtered list that does include my application. Registration then finishes and things can proceed.