Is it possible to create an Azure Service Principal with Pulumi? - python

I have Pulumi Python code that creates some Azure resources. Instead of using my user account to create the resources, I created an Azure Service Principal with PowerShell and authenticate with the method below:
pulumi config set azure-native:clientId <clientID>
pulumi config set azure-native:clientSecret <clientSecret> --secret
pulumi config set azure-native:tenantId <tenantID>
pulumi config set azure-native:subscriptionId <subscriptionID>
Now I want to know: is it possible to create an Azure Service Principal with Pulumi, and then authenticate with the Service Principal that has been created?
Another question: is this the right way to do it?
Edit:
As described in these documents: Service Principal
My code is:
import pulumi
from pulumi.output import Output
import pulumi_azure_native as azure_native
import pulumi_azuread as azuread

current = azuread.get_client_config()

example_application = azuread.Application("exampleApplication",
    display_name="example",
    owners=[current.object_id])

example_service_principal = azuread.ServicePrincipal("exampleServicePrincipal",
    application_id=example_application.application_id,
    app_role_assignment_required=False,
    owners=[current.object_id])
I received this error, even though I installed the provider with pip install pulumi-azuread:
ModuleNotFoundError: No module named 'pulumi_azuread'

You certainly can, using the Service Principal resource as part of the Azure Active Directory Provider.
You'll need to define an application first, but in TypeScript it would look something like:
import * as azuread from "@pulumi/azuread";

const application = new azuread.Application("application", {
    displayName: "application",
});

const servicePrincipal = new azuread.ServicePrincipal("servicePrincipal", {
    applicationId: application.applicationId,
});
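For the Python case in the question, a roughly equivalent sketch would be the following (assuming the pulumi-azuread provider is installed in the stack's virtualenv; the ServicePrincipalPassword resource and the exported outputs are my additions to cover the "authenticate with it afterwards" part):

import pulumi
import pulumi_azuread as azuread

# Application registration plus a service principal for it.
application = azuread.Application("application", display_name="application")

service_principal = azuread.ServicePrincipal(
    "servicePrincipal",
    application_id=application.application_id,
)

# A client secret for the new service principal, so something else
# can later authenticate as it.
sp_password = azuread.ServicePrincipalPassword(
    "servicePrincipalPassword",
    service_principal_id=service_principal.id,
)

# Export the values you would feed into `pulumi config set azure-native:*`.
pulumi.export("clientId", application.application_id)
pulumi.export("clientSecret", pulumi.Output.secret(sp_password.value))
pulumi.export("tenantId", azuread.get_client_config().tenant_id)

Note that the stack creating these resources still needs some existing credentials (your user account or an already existing service principal), so this does not remove the initial bootstrap step.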

Related

How to access Azure Key Vault via private endpoint in Python?

For security purposes, I have disabled public access under the Networking tab in Key Vault and have a private endpoint in place. Both the Key Vault and the private endpoint reside in the same resource group. I have an app registration for my application, for which I have granted access under Access policies in the Key Vault.
Using Python SDK,
from azure.keyvault.secrets import SecretClient
from azure.identity import ClientSecretCredential as cs

keyVaultName = "<NAME>"
kvURI = "https://<NAME>.vault.azure.net"
AZ_TENANT_ID = '<AZ_TENANT_ID>'
AZ_CLIENT_ID = '<AZ_CLIENT_ID>'
AZ_CLIENT_SECRET = '<AZ_CLIENT_SECRET>'

credential = cs(
    tenant_id=AZ_TENANT_ID,
    client_id=AZ_CLIENT_ID,
    client_secret=AZ_CLIENT_SECRET)

def set_secret(secretname, secretvalue):
    print(credential)
    secret_client = SecretClient(vault_url=kvURI, credential=credential)
    secret = secret_client.set_secret(secretname, secretvalue, enabled=True)
    sec_dic = {}
    sec_dic['name'] = secret.name
    sec_dic['value'] = secret.value
    sec_dic['properties'] = secret.properties.version
    return sec_dic

xx = set_secret('g', 'ff')
print(xx)
When running this code, I get the following error:
azure.core.exceptions.HttpResponseError: (Forbidden) Public network access is disabled and request is not from a trusted service nor via an approved private link.
Code: Forbidden
Message: Public network access is disabled and request is not from a trusted service nor via an approved private link.
Inner error: {
"code": "ForbiddenByConnection"
}
What am I doing wrong? How do I connect to a Key Vault that has no public access and is reachable only via a private endpoint?
I have reproduced this in my environment and got the expected results.
First, I followed the same process you explained and got the same error you did.
This error appears because the Key Vault's public network access is disabled, so requests are only accepted through the private endpoint.
When you create a private endpoint in a particular virtual network/subnet, the client has to run inside that network, for example on a virtual machine deployed into the same virtual network/subnet.
After connecting to a VM in that virtual network/subnet and running the same code, the Key Vault request succeeds.
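A quick way to check that you are actually going through the private endpoint (a sketch, assuming you run it from the VM inside the virtual network; <NAME> is the placeholder vault name from the question) is to verify that the vault's hostname resolves to the private IP:

import socket

# From inside the VNet this should resolve to the private endpoint's private IP
# (e.g. 10.x.x.x); from outside it resolves to a public IP, which the vault
# now rejects with the Forbidden error above.
print(socket.gethostbyname("<NAME>.vault.azure.net"))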
References:
azure - Unable to get storage account container details - Stack Overflow
Azure Key Vault not allow access via private endpoint connection
Azure functions and Azure KeyVault communicating through service endpoint

How to get a DB password from Azure Key Vault using Python? I am running this Python file on a Google Dataproc cluster

My SQL Server DB password is saved in Azure Key Vault, with DATAREF ID as its identifier. I need that password to create a Spark dataframe from a table in the SQL Server database. I am running this .py file on a Google Dataproc cluster. How can I get that password using Python?
Since you are accessing an Azure service from a non-Azure environment, you will need a service principal. You can use a certificate or a secret; see THIS link for the different methods. You will need to give the service principal proper access, and how you do that depends on whether you are using RBAC or access policies for your key vault.
So the steps you need to follow are:
Create a key vault and create a secret.
Create a service principal (app registration) and store the client ID, client secret and tenant ID.
Give the service principal proper access to the key vault (if you are using access policies) or to the specific secret (if you are using the RBAC model).
The Python documentation for the code is HERE.
The code that will work for you is below:
from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient

tenantid = "<your_tenant_id>"
clientsecret = "<your_client_secret>"
clientid = "<your_client_id>"

my_credentials = ClientSecretCredential(tenant_id=tenantid, client_id=clientid, client_secret=clientsecret)

secret_client = SecretClient(vault_url="https://<your_keyvault_name>.vault.azure.net/", credential=my_credentials)
secret = secret_client.get_secret("<your_secret_name>")

print(secret.name)
print(secret.value)
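Since the goal is to use that password for a Spark dataframe on Dataproc, a minimal follow-up sketch could look like the one below (assuming the Microsoft SQL Server JDBC driver is available on the cluster; the hostname, database, table and user names are placeholders):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read-sqlserver").getOrCreate()

jdbc_url = "jdbc:sqlserver://<jdbc_hostname>:1433;databaseName=<db_name>"

# Use the secret retrieved from Key Vault above as the JDBC password.
df = (spark.read.format("jdbc")
      .option("url", jdbc_url)
      .option("dbtable", "<schema>.<table>")
      .option("user", "<sql_username>")
      .option("password", secret.value)
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .load())

df.show()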

Trigger Azure Runbook whenever a new object is put in an Azure bucket

I want to automate Azure resources (e.g. start/stop VM). Currently I am using an Automation Account runbook and it's working fine, but I need to implement a framework something like this:
1) Trigger the runbook whenever a new object (Excel sheet) is put in the Azure bucket.
2) Read the Excel sheet for input variables.
Below is the runbook code.
Could somebody please tell me the best way to trigger the runbook that suits the above framework?
"""
Azure Automation documentation : https://aka.ms/azure-automation-python-documentation
Azure Python SDK documentation : https://aka.ms/azure-python-sdk
"""
import os
import sys
from azure.mgmt.compute import ComputeManagementClient
import azure.mgmt.resource
import automationassets
def get_automation_runas_credential(runas_connection):
from OpenSSL import crypto
import binascii
from msrestazure import azure_active_directory
import adal
# Get the Azure Automation RunAs service principal certificate
cert = automationassets.get_automation_certificate("AzureRunAsCertificate")
pks12_cert = crypto.load_pkcs12(cert)
pem_pkey = crypto.dump_privatekey(crypto.FILETYPE_PEM,pks12_cert.get_privatekey())
# Get run as connection information for the Azure Automation service principal
application_id = runas_connection["ApplicationId"]
thumbprint = runas_connection["CertificateThumbprint"]
tenant_id = runas_connection["TenantId"]
# Authenticate with service principal certificate
resource ="https://management.core.windows.net/"
authority_url = ("https://login.microsoftonline.com/"+tenant_id)
context = adal.AuthenticationContext(authority_url)
return azure_active_directory.AdalAuthentication(
lambda: context.acquire_token_with_client_certificate(
resource,
application_id,
pem_pkey,
thumbprint)
)
Authenticate to Azure using the Azure Automation RunAs service principal
runas_connection = automationassets.get_automation_connection("AzureRunAsConnection")
azure_credential = get_automation_runas_credential(runas_connection)
Initialize the compute management client with the RunAs credential and specify the subscription to work against.
compute_client = ComputeManagementClient(
azure_credential,
str(runas_connection["SubscriptionId"])
)
print('\nStart VM')
async_vm_start = compute_client.virtual_machines.start(
'resource1', 'vm1')
async_vm_start.wait()
'''
print('\nStop VM')
async_vm_stop=compute_client.virtual_machines.power_off(resource_group_name, vm_name)
async_vm_stop.wait()'''
I believe one way to accomplish your requirement of triggering the runbook whenever a new blob (or, in your words, 'object') is added to an Azure Storage container (or, in your words, 'bucket') is by leveraging an Event Subscription (Event Grid). For related information, refer to this document.
To illustrate: go to Azure Portal -> your Storage account (of kind StorageV2) -> Events tile -> More options -> Logic Apps -> add two steps, one that checks whether a new storage blob was added and one that runs the required runbook.
You may also add further steps, like sending a mail after the runbook execution has completed, etc.
Hope this helps!
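Inside the runbook (or a follow-up Logic App step), reading the uploaded Excel sheet for the input variables could look roughly like the sketch below, assuming the azure-storage-blob and openpyxl packages are importable there; the connection string, container, blob and cell references are placeholders:

import io

from azure.storage.blob import BlobClient
from openpyxl import load_workbook

# Placeholders; in practice these would come from the Event Grid / Logic App
# payload and from an Automation variable or Key Vault secret.
blob = BlobClient.from_connection_string(
    conn_str="<storage_connection_string>",
    container_name="<container>",
    blob_name="<uploaded-sheet>.xlsx",
)

# Download the blob and open it as a workbook.
workbook = load_workbook(io.BytesIO(blob.download_blob().readall()))
sheet = workbook.active

# Read the input variables, e.g. resource group and VM name.
resource_group_name = sheet["A2"].value
vm_name = sheet["B2"].value
print(resource_group_name, vm_name)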

How can I run a Python script in Azure DevOps with Azure Resource Manager credentials?

I have a Python script I want to run in Azure Resource Manager context within an Azure DevOps pipeline task to be able to access Azure resources (like the Azure CLI or Azure PowerShell tasks).
How can I get Azure RM Service Endpoint credentials stored in Azure DevOps passed - as ServicePrincipal/Secret or OAuth Token - into the script?
If I understand the issue correctly, you want to use the Python Azure SDK classes to manage or access Azure resources, rather than using shell or PowerShell commands. I ran across the same issue and used the following steps to solve it.
import sys
from azure.identity import ClientSecretCredential
tenant_id = sys.argv[1]
client_id = sys.argv[2]
client_secret = sys.argv[3]
credentials = ClientSecretCredential(tenant_id, client_id, client_secret)
Add a "User Python version" step to add the correct version of python to your agent's PATH
Add a "Azure CLI" step. The goal here is to install your requirements and execute the script.
Within the Azure CLI step, be sure to check the "Access service principal details in script" box in the Advanced section. This will allow you to pass in the service principal details into your script as arguments.
Pass in the $tenantId $servicePrincipalId $servicePrincipalKey variables as arguments. These variables are pipeline defined so long as the box in step 3 is checked. No action is required on your part to define them.
Setup your Python script to accept the values and setup your
credentials. See the script above
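To confirm the credential works inside the task, a minimal sketch (assuming azure-mgmt-resource is installed and that a subscription ID is passed as a hypothetical fourth argument, which the checkbox does not provide on its own) could list resource groups:

import sys

from azure.identity import ClientSecretCredential
from azure.mgmt.resource import ResourceManagementClient

tenant_id = sys.argv[1]
client_id = sys.argv[2]
client_secret = sys.argv[3]
subscription_id = sys.argv[4]  # hypothetical extra argument, not set by the checkbox

credentials = ClientSecretCredential(tenant_id, client_id, client_secret)
resource_client = ResourceManagementClient(credentials, subscription_id)

# Listing resource groups verifies that the service principal authenticated correctly.
for rg in resource_client.resource_groups.list():
    print(rg.name)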
It depends on what you call a Python script, but either way Azure DevOps has no native support for authenticating the Python SDK (or your custom Python script). You can pass credentials from build\release variables into your script, or try to pull them from the Azure CLI (I think it stores data somewhere under /home/.azure/).
Based on the hint given by 4c74356b41 above, and with some dissecting of the Azure CLI, I created this function that pulls an OAuth token over ADAL for the service principal logged in inside an Azure DevOps "Azure CLI" task:
import os
import json
import adal

_SERVICE_PRINCIPAL_ID = 'servicePrincipalId'
_SERVICE_PRINCIPAL_TENANT = 'servicePrincipalTenant'
_TOKEN_ENTRY_TOKEN_TYPE = 'tokenType'
_ACCESS_TOKEN = 'accessToken'

def get_config_dir():
    return os.getenv('AZURE_CONFIG_DIR', None) or os.path.expanduser(os.path.join('~', '.azure'))

def getOAuthTokenFromCLI():
    token_file = (os.environ.get('AZURE_ACCESS_TOKEN_FILE', None)
                  or os.path.join(get_config_dir(), 'accessTokens.json'))
    with open(token_file) as f:
        tokenEntry = json.load(f)[0]  # just assume first entry

    tenantID = tokenEntry[_SERVICE_PRINCIPAL_TENANT]
    appId = tokenEntry[_SERVICE_PRINCIPAL_ID]
    appPassword = tokenEntry[_ACCESS_TOKEN]

    authURL = "https://login.windows.net/" + tenantID
    resource = "https://management.azure.com/"

    context = adal.AuthenticationContext(authURL, validate_authority=tenantID, api_version=None)
    token = context.acquire_token_with_client_credentials(resource, appId, appPassword)

    return token[_TOKEN_ENTRY_TOKEN_TYPE] + " " + token[_ACCESS_TOKEN]
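As a usage sketch (assuming the requests package is available), the returned "<tokenType> <accessToken>" string can be used directly as the Authorization header for an ARM REST call:

import requests

# Hypothetical usage of the function above: list subscriptions via the ARM REST API.
auth_header = getOAuthTokenFromCLI()
response = requests.get(
    "https://management.azure.com/subscriptions?api-version=2020-01-01",
    headers={"Authorization": auth_header},
)
print(response.status_code, response.json())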

How do I get the project of a service account?

I'm using the Python google.cloud API.
For example, using the metrics module:
from google.cloud import monitoring

client = monitoring.Client()
client.query('my/gcp/metric', minutes=10)
For my GOOGLE_APPLICATION_CREDENTIALS I'm using a service account that has specific access to a GCP project.
Does google.cloud have any modules that can let me derive the project from the service account (like get what project the service account is in)?
This would be convenient because each service account only has access to a single project, so I could set my service account and be able to reference that project in code.
Not sure if this will work, you may need to tweak it:
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
service = discovery.build('yourservicename', '<version>', credentials=credentials)
request = service.projects().list()
response = request.execute()
The Google Cloud Identity and Access Management (IAM) API has a 'serviceAccounts.get' method, which shows the project associated with a service account, as shown here. You need to have proper permissions on the project for the API to work.
The method google.auth.default() returns a tuple (credentials, project_id) if that information is available in the environment.
Also, the client object knows which project it is linked to (either client.project or client.project_id; I'm not sure which one for the Monitoring API).
If you set the service account manually with the GOOGLE_APPLICATION_CREDENTIALS env var, you can open the file and load its JSON. One of the fields in a service account key file is the project ID.
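A short sketch combining both options (assuming GOOGLE_APPLICATION_CREDENTIALS points at a service account key file):

import json
import os

import google.auth

# Option 1: let google.auth resolve the default credentials; the second element
# of the returned tuple is the project ID when it can be determined.
credentials, project_id = google.auth.default()
print(project_id)

# Option 2: read the project_id field directly from the JSON key file.
key_path = os.environ["GOOGLE_APPLICATION_CREDENTIALS"]
with open(key_path) as f:
    print(json.load(f)["project_id"])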
