I am trying to add a reply_url programmatically to an Azure app registration, but I receive an azure.graphrbac.models.graph_error_py3.GraphErrorException: Specified HTTP method is not allowed for the request target.
It fails when I try to update an existing application with new reply_urls.
The SDK I am using is azure-graphrbac==0.61.1.
My code:
from typing import List

from azure.common.credentials import ServicePrincipalCredentials
from azure.graphrbac import GraphRbacManagementClient
from azure.graphrbac.models import ApplicationUpdateParameters


class GraphClient:
    def __init__(self, client_id, client_secret, tenant_id, object_id):
        self._credentials = ServicePrincipalCredentials(
            client_id=client_id,
            secret=client_secret,
            tenant=tenant_id,
            resource="https://graph.windows.net"
        )
        self._graph_client = GraphRbacManagementClient(
            credentials=self._credentials,
            tenant_id=tenant_id
        )
        self._object_id = object_id
        self._application = self._graph_client.applications.get(self._object_id)

    def get_reply_urls(self) -> List[str]:
        return self._application.reply_urls

    def add_reply_url(self, reply_url) -> None:
        reply_urls: list = self.get_reply_urls()
        self._graph_client.applications.patch(
            self._object_id,
            ApplicationUpdateParameters(
                reply_urls=[*reply_urls, reply_url]
            )
        )
I could not reproduce your issue. Using the same version of azure-graphrbac, I tested your code on my side and it works fine.
testclient = GraphClient(client_id="xxxxx", client_secret="xxxxx", tenant_id="xxxxx", object_id="xxxxx")
testclient.add_reply_url(reply_url="http://localhost:8085")
Checking in the portal shows the new reply URL. I also tested the SDK directly; both approaches work.
from azure.common.credentials import ServicePrincipalCredentials
from azure.graphrbac import GraphRbacManagementClient
from azure.graphrbac.models import ApplicationUpdateParameters

_credentials = ServicePrincipalCredentials(
    client_id="xxxxx",
    secret="xxxxx",
    tenant="xxxxx",
    resource="https://graph.windows.net"
)
_graph_client = GraphRbacManagementClient(
    credentials=_credentials,
    tenant_id="xxxxx"
)
app = _graph_client.applications.patch(
    application_object_id="xxxxx",
    parameters=ApplicationUpdateParameters(reply_urls=["http://localhost:8080", "http://localhost:8081"])
)
The update call looks fine; however, it depends on a legacy API (Azure AD Graph) and its tooling. It's strongly recommended to move to Microsoft Graph, which already supports almost all Azure AD Graph operations, applications among them, and will fully support them in the future.
You can use Requests-OAuthlib or the Microsoft Graph Python Client Library for this.
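For illustration, here is a minimal sketch of the same update against Microsoft Graph, using MSAL and plain requests rather than the libraries named above. The tenant ID, client ID, secret, and application object ID are placeholders; in Microsoft Graph the reply URLs live under the application's web.redirectUris property, and the app registration needs an admin-consented Application.ReadWrite.All (or Application.ReadWrite.OwnedBy) application permission.

import requests
import msal

# Placeholders - replace with your own values
TENANT_ID = "xxxxx"
CLIENT_ID = "xxxxx"
CLIENT_SECRET = "xxxxx"
APP_OBJECT_ID = "xxxxx"  # the application's object id, not its app (client) id

# Acquire an app-only token for Microsoft Graph (client credentials flow);
# error handling is omitted for brevity
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

headers = {"Authorization": f"Bearer {token['access_token']}"}
url = f"https://graph.microsoft.com/v1.0/applications/{APP_OBJECT_ID}"

# Read the current redirect URIs, append the new one, and PATCH them back
current = requests.get(url, headers=headers).json()["web"]["redirectUris"]
body = {"web": {"redirectUris": [*current, "http://localhost:8085"]}}
requests.patch(url, headers=headers, json=body).raise_for_status()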
I deployed a model to the model registry on Vertex AI. I added an endpoint too, and I am able to make inferences. Below is the code that I wrote (using Python 3.9.12):
import json
from typing import List

from google.cloud import aiplatform
from google.oauth2 import service_account

# settings is a Pydantic BaseSettings subclass object
credentials_json = json.loads(settings.GCP_VERTEX_SERVICE_ACC)
credentials = service_account.Credentials.from_service_account_info(
    info=credentials_json
)

aiplatform.init(
    project=settings.GCLOUD_PROJECT_NUMBER,
    location=settings.GCLOUD_LOCATION,
    credentials=credentials,
)

endpoint = aiplatform.Endpoint(settings.GCLOUD_SBERT_ENDPOINT_ID)

...

async def do_inference(list_strs: List[str]):
    result = endpoint.predict(instances=list_strs)
    return result.predictions
Right now I'm not able to make asynchronous requests. Is there a way around this? For instance, would using aiplatform_v1beta1.PredictionServiceAsyncClient be a solution? Thanks in advance!
---- EDIT -----
Below is the piece of code that did it for me in case someone else is struggling with the same thing.
import asyncio
import json
from typing import List

from google.cloud import aiplatform_v1beta1
from google.oauth2 import service_account
from google.protobuf import json_format
from google.protobuf.struct_pb2 import Value

# settings is a Pydantic BaseSettings subclass object
credentials_json = json.loads(settings.GCP_VERTEX_SERVICE_ACC)
credentials = service_account.Credentials.from_service_account_info(
    info=credentials_json
)

client_options = {"api_endpoint": f"{settings.GCLOUD_LOCATION}-aiplatform.googleapis.com"}
client = aiplatform_v1beta1.PredictionServiceAsyncClient(credentials=credentials, client_options=client_options)

...

async def do_inference(list_strs: List[str]):
    # endpoint is the full resource name, e.g. built with
    # client.endpoint_path(project, location, endpoint_id)
    request = aiplatform_v1beta1.PredictRequest(endpoint=endpoint)
    request.instances.extend(list_strs)
    response = await client.predict(request)
    predictions = response.predictions
    return predictions

# list_strs is defined elsewhere (elided above)
asyncio.get_event_loop().run_until_complete(do_inference(list_strs))
This code owes a lot to @milad_raesi's answer!
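One usage note: the run_until_complete call is only needed when driving the coroutine from synchronous code. If this runs inside a framework that already manages an event loop (FastAPI, for example), you simply await do_inference(...) from another coroutine instead.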
I am working on deploying resources in Azure using Python, based on provided templates. As a starting point I am working with https://github.com/Azure-Samples/Hybrid-Resource-Manager-Python-Template-Deployment
Using it as-is, I am having an issue at the beginning of the deployment (the deploy function in deployer.py):
def deploy(self, template, parameters):
    """Deploy the template to a resource group."""
    self.client.resource_groups.create_or_update(
        self.resourceGroup,
        {
            'location': os.environ['AZURE_RESOURCE_LOCATION']
        }
    )
The error message is:
Message='ServicePrincipalCredentials' object has no attribute 'get_token'
The statement is correct: ServicePrincipalCredentials has no get_token attribute, though it does have token. Could this be due to an outdated version?
Based on the constructor, the error may come from the credentials creation or the client creation:
def __init__(self, subscription_id, resource_group, pub_ssh_key_path='~/id_rsa.pub'):
    mystack_cloud = get_cloud_from_metadata_endpoint(
        os.environ['ARM_ENDPOINT'])
    # This may be an error, as subscription_id is already provided as a parameter
    subscription_id = os.environ['AZURE_SUBSCRIPTION_ID']
    credentials = ServicePrincipalCredentials(
        client_id=os.environ['AZURE_CLIENT_ID'],
        secret=os.environ['AZURE_CLIENT_SECRET'],
        tenant=os.environ['AZURE_TENANT_ID'],
        cloud_environment=mystack_cloud
    )  # <-- here

    self.subscription_id = subscription_id
    self.resource_group = resource_group
    self.dns_label_prefix = self.name_generator.haikunate()

    pub_ssh_key_path = os.path.expanduser(pub_ssh_key_path)
    # Will raise if file not exists or not enough permission
    with open(pub_ssh_key_path, 'r') as pub_ssh_file_fd:
        self.pub_ssh_key = pub_ssh_file_fd.read()

    self.credentials = credentials
    self.client = ResourceManagementClient(self.credentials, self.subscription_id,
                                           base_url=mystack_cloud.endpoints.resource_manager)  # <-- here
Do you know how I can fix this?
After struggling a little, I found a solution. Just replace
credentials = ServicePrincipalCredentials(
    client_id=os.environ['AZURE_CLIENT_ID'],
    secret=os.environ['AZURE_CLIENT_SECRET'],
    tenant=os.environ['AZURE_TENANT_ID'],
    cloud_environment=mystack_cloud
)
with
self.credentials = DefaultAzureCredential()
The final code looks like:
from azure.identity import DefaultAzureCredential

def __init__(self, subscriptionId, resourceGroup):
    endpoints = get_cloud_from_metadata_endpoint(os.environ.get("ARM_ENDPOINT"))

    self.subscriptionId = subscriptionId
    self.resourceGroup = resourceGroup
    self.credentials = DefaultAzureCredential()
    self.client = ResourceManagementClient(self.credentials, self.subscriptionId,
                                           base_url=endpoints.endpoints.resource_manager)
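For reference, DefaultAzureCredential can still pick up the same service principal. A minimal sketch, assuming the standard azure.identity environment variables; the values shown are placeholders:

import os
from azure.identity import DefaultAzureCredential, ClientSecretCredential

# DefaultAzureCredential's EnvironmentCredential reads these variables,
# so the existing service principal keeps working without code changes
# (values are placeholders; normally they are set outside the process):
os.environ["AZURE_CLIENT_ID"] = "<client-id>"
os.environ["AZURE_CLIENT_SECRET"] = "<client-secret>"
os.environ["AZURE_TENANT_ID"] = "<tenant-id>"
credentials = DefaultAzureCredential()

# Alternatively, pass the service principal explicitly:
credentials = ClientSecretCredential(
    tenant_id=os.environ["AZURE_TENANT_ID"],
    client_id=os.environ["AZURE_CLIENT_ID"],
    client_secret=os.environ["AZURE_CLIENT_SECRET"],
)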
I'm attempting to write a GCP Cloud Function in Python that calls the API for creating an IoT device. The initial challenge seems to be getting the appropriate module (specifically iot_v1) loaded within Cloud Functions so that it can make the call.
Example Python code from Google is located at https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/iot/api-client/manager/manager.py. The specific call desired is shown in "create_es_device". Trying to repurpose that into a Cloud Function (code below) errors out with "ImportError: cannot import name 'iot_v1' from 'google.cloud' (unknown location)"
Any thoughts?
import base64
import logging
import json
import datetime

from google.auth import compute_engine
from apiclient import discovery
from google.cloud import iot_v1


def handle_notification(event, context):
    # Triggered from a message on a Cloud Pub/Sub topic.
    # Args:
    #   event (dict): Event payload.
    #   context (google.cloud.functions.Context): Metadata for the event.
    pubsub_message = base64.b64decode(event['data']).decode('utf-8')
    logging.info('New device registration info: {}'.format(pubsub_message))

    certData = json.loads(pubsub_message)['certs']
    deviceID = certData['device-id']
    certKey = certData['certificate']
    projectID = certData['project-id']
    cloudRegion = certData['cloud-region']
    registryID = certData['registry-id']

    newDevice = create_device(projectID, cloudRegion, registryID, deviceID, certKey)
    logging.info('New device: {}'.format(newDevice))


def create_device(project_id, cloud_region, registry_id, device_id, public_key):
    # from https://cloud.google.com/iot/docs/how-tos/devices#api_1
    client = iot_v1.DeviceManagerClient()
    parent = client.registry_path(project_id, cloud_region, registry_id)

    # Note: You can have multiple credentials associated with a device.
    device_template = {
        # 'id': device_id,
        'id': 'testing_device',
        'credentials': [{
            'public_key': {
                'format': 'ES256_PEM',
                'key': public_key
            }
        }]
    }
    return client.create_device(parent, device_template)
You need to have the google-cloud-iot package listed in your requirements.txt file.
See https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/iot/api-client/manager/requirements.txt
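As a minimal sketch, a requirements.txt covering the imports used in the snippet above might look like the following (pin versions as your project requires; google-api-python-client provides the apiclient import and google-auth provides google.auth):

google-cloud-iot
google-api-python-client
google-auth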
I need to make online predictions from a model that is deployed in Cloud ML Engine. My code in Python is similar to the one found in the docs (https://cloud.google.com/ml-engine/docs/tensorflow/online-predict):
import googleapiclient.discovery

service = googleapiclient.discovery.build('ml', 'v1')
name = 'projects/{}/models/{}'.format(project, model)

if version is not None:
    name += '/versions/{}'.format(version)

response = service.projects().predict(
    name=name,
    body={'instances': instances}
).execute()
However, I receive the "instances" data from outside the script, and I wonder if there is a way to avoid running service = googleapiclient.discovery.build('ml', 'v1') before every request, since it takes time.
PS: this is my very first project on GCP. Thank you.
Something like this will work. You'll want to initialize your service globally, then use that service instance to make your calls.
import os
import googleapiclient.discovery

AI_SERVICE = None


def ai_platform_init():
    global AI_SERVICE

    # Set GCP authentication
    credentials = os.environ.get('GOOGLE_APPLICATION_CREDENTIALS')

    # Path to your credentials
    credentials_path = os.path.join(os.path.dirname(__file__), 'ai-platform-credentials.json')

    if credentials is None and os.path.exists(credentials_path):
        os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = credentials_path

    # Create AI Platform service (MemoryCache is defined further down)
    if os.path.exists(credentials_path):
        AI_SERVICE = googleapiclient.discovery.build('ml', 'v1', cache=MemoryCache())


# Initialize AI Platform on load.
ai_platform_init()
Then later on, you can do something like this:
def call_ai_platform():
    response = AI_SERVICE.projects().predict(name=name,
                                             body={'instances': instances}).execute()
Bonus! In case you were curious about the MemoryCache class in the googleapiclient.discovery call, it was borrowed from another SO answer:
class MemoryCache():
    """A workaround for cache warnings from Google.

    Check out: https://github.com/googleapis/google-api-python-client/issues/325#issuecomment-274349841
    """
    _CACHE = {}

    def get(self, url):
        return MemoryCache._CACHE.get(url)

    def set(self, url, content):
        MemoryCache._CACHE[url] = content
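A hypothetical call site, just to show how the pieces fit together once ai_platform_init() has run; the project, model, and instance payload below are placeholders and would normally come from your request handler:

# Placeholders for illustration; use your own project/model and input schema
name = 'projects/my-project/models/my-model/versions/v1'
instances = [{'input': [1.0, 2.0, 3.0]}]

response = AI_SERVICE.projects().predict(name=name,
                                         body={'instances': instances}).execute()
predictions = response['predictions']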
We are trying to import our existing users into our B2C tenant. For this, we have been trying to use the azure-graphrbac python library.
I have followed this guide to register an application to be used with the graph api.
I'm using the below code to try and create a user:
from azure.graphrbac import GraphRbacManagementClient
from azure.common.credentials import ServicePrincipalCredentials
from azure.graphrbac.models import UserCreateParameters, PasswordProfile

credentials = ServicePrincipalCredentials(
    client_id="<CLIENT ID>",
    secret="<SECRET>",
    tenant="<TENANT ID>"
)

tenant_id = '<myb2ctenant>.onmicrosoft.com'

graphrbac_client = GraphRbacManagementClient(
    credentials,
    tenant_id
)

ucp = UserCreateParameters(
    user_principal_name="my#mail.com",
    account_enabled=True,
    display_name='Martin T',
    mail_nickname='<mymail>',
    additional_properties={
        "signInNames": [{"type": "emailAddress", "value": "<mymail>"}]
    },
    user_type="LocalAccount",
    password_profile=PasswordProfile(
        password='<somepassword>',
        force_change_password_next_login=True
    )
)

user = graphrbac_client.users.create(ucp)
I've made sure that the client id, secret and tenant id are correct. However, I keep getting this error:
GraphErrorException: Access Token missing or malformed.
Does anyone have an idea as to what I might be doing wrong?
As Laurent said, you need to define the resource. The default resource is https://management.core.windows.net/. In your scenario, since you want to create a user, the resource is https://graph.windows.net.
Your code also has some mistakes, so I modified it. The following code works for me.
from azure.graphrbac import GraphRbacManagementClient
from azure.common.credentials import ServicePrincipalCredentials
from azure.graphrbac.models import UserCreateParameters, PasswordProfile

credentials = ServicePrincipalCredentials(
    client_id="",
    secret="",
    resource="https://graph.windows.net",
    tenant=''
)

tenant_id = ''

graphrbac_client = GraphRbacManagementClient(
    credentials,
    tenant_id
)

ucp = UserCreateParameters(
    user_principal_name="",
    account_enabled=True,
    display_name='Martin T',
    # I tested in my lab: if I include these lines, I get an error and cannot create the user.
    # additional_properties={
    #     "signInNames": [{"type": "emailAddress", "value": ""}]
    # },
    # user_type only supports Member or Guest, see
    # https://learn.microsoft.com/en-us/python/api/azure.graphrbac.models.usercreateparameters?view=azure-python
    user_type="Member",
    mail_nickname='shuitest',
    password_profile=PasswordProfile(
        password='',
        force_change_password_next_login=True
    )
)

user = graphrbac_client.users.create(ucp)
See SDK in this link.
Your service principal authentication needs to define "resource":
https://learn.microsoft.com/en-us/python/api/overview/azure/activedirectory
credentials = UserPassCredentials(
    'user#domain.com',   # Your user
    'my_password',       # Your password
    resource="https://graph.windows.net"
)