How to create an Azure Application Gateway using the Python SDK

I'm starting to feel a bit stupid. Has anyone been able to successfully create an Application Gateway using the Python SDK for Azure?
The documentation seems OK, but I'm struggling to find the right value to pass as the 'parameters' argument of
azure.mgmt.network.operations.ApplicationGatewaysOperations application_gateways.create_or_update(). I found a complete working example for load_balancer but can't find anything for Application Gateway. Getting 'string indices must be integers, not str' doesn't help at all. Any help will be appreciated, thanks!
Update: solved. A piece of advice for everyone doing this: look carefully at the type of data required for the Application Gateway params.

I know there is no Python sample for Application Gateway currently; I apologize for that...
Right now I suggest you:
Create the Network client using this tutorial or this one.
Take a look at this ARM template for Application Gateway. The Python parameters will be very close to this JSON (see the sketch after this list). At worst, you can deploy an ARM template using the Python SDK too.
Take a look at the ReadTheDocs page of the create operation; it will give you an idea of what is expected as parameters.
Open an issue on the GitHub tracker, so you can follow along when I write a sample (or at least a unit test you can mimic).
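To illustrate the shape of that 'parameters' argument, here is a minimal, unverified sketch: every name, resource ID, and SKU value is a placeholder, the field layout mirrors the ARM JSON rather than a confirmed sample, and newer azure-mgmt-network releases name the operation begin_create_or_update() instead of create_or_update().
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
GROUP = "myResourceGroup"
APPGW = "myAppGateway"

network_client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Sub-resources of the gateway reference each other by ID, built from the
# gateway's own resource ID.
gw_id = ("/subscriptions/" + SUBSCRIPTION_ID + "/resourceGroups/" + GROUP +
         "/providers/Microsoft.Network/applicationGateways/" + APPGW)

# 'parameters' must be a dict (or an ApplicationGateway model), not a JSON
# string; passing a string is one easy way to get
# "string indices must be integers, not str".
parameters = {
    "location": "eastus",
    "sku": {"name": "Standard_Small", "tier": "Standard", "capacity": 1},
    "gateway_ip_configurations": [{
        "name": "gwipcfg",
        "subnet": {"id": "<subnet-resource-id>"},
    }],
    "frontend_ip_configurations": [{
        "name": "frontendip",
        "public_ip_address": {"id": "<public-ip-resource-id>"},
    }],
    "frontend_ports": [{"name": "port80", "port": 80}],
    "backend_address_pools": [{"name": "backendpool"}],
    "backend_http_settings_collection": [{
        "name": "httpsettings", "port": 80, "protocol": "Http",
        "cookie_based_affinity": "Disabled",
    }],
    "http_listeners": [{
        "name": "listener",
        "frontend_ip_configuration": {"id": gw_id + "/frontendIPConfigurations/frontendip"},
        "frontend_port": {"id": gw_id + "/frontendPorts/port80"},
        "protocol": "Http",
    }],
    "request_routing_rules": [{
        "name": "rule1",
        "rule_type": "Basic",
        "http_listener": {"id": gw_id + "/httpListeners/listener"},
        "backend_address_pool": {"id": gw_id + "/backendAddressPools/backendpool"},
        "backend_http_settings": {"id": gw_id + "/backendHttpSettingsCollection/httpsettings"},
    }],
}

# Long-running operation: the poller resolves once provisioning completes.
poller = network_client.application_gateways.begin_create_or_update(
    GROUP, APPGW, parameters)
app_gateway = poller.result()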
Edit after question in comment:
To get the IP of VM once you have a VM object:
# Gives you the ID of this NIC
nic_id = vm.network_profile.network_interfaces[0].id
# Parse this ID to get the NIC name
nic_name = nic_id.split('/')[-1]
# Get the NIC instance
nic = network_client.network_interfaces.get('RG', nic_name)
# Get the actual IP
private_ip = nic.ip_configurations[0].private_ip_address
Edit:
I finally wrote the sample:
https://github.com/Azure-Samples/network-python-manage-application-gateway
(I work at MS and I'm responsible for the Azure SDK for Python.)

Related

Azure SDK for Python: Copy blobs

For my current Python project I'm using the Microsoft Azure SDK for Python.
I want to copy a specific blob from one container path to another, and have already tested some options, described here.
Overall they are basically "working", but unfortunately the new_blob.start_copy_from_url(source_blob_url) command always leads to an error: ErrorCode:CannotVerifyCopySource.
Is someone getting the same error message here, or has an idea how to solve it?
I was also trying to pass the source_blob_url with a SAS token, but it still doesn't work. I have the feeling that there is some connection to the access levels of the storage account, but so far I wasn't able to figure it out. Hopefully someone here can help me.
As you have mentioned, you might be receiving this error due to permissions while including the SAS token.
The difference in my code was that I used the blob storage sas_token from the Azure portal, instead of generating it directly for the blob client with the Azure function.
To allow access to certain areas of your storage account, a SAS is generated by default with a number of permissions, such as read/write, services, resource type, start and expiration date/time, allowed IP addresses, etc.
You don't always need to generate it directly for the blob client in code; you can generate one from the portal too, as long as you grant the required permissions.
REFERENCES: Grant limited access to Azure Storage resources using SAS - MSFT Document
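For the in-code route, here is a minimal sketch using azure-storage-blob v12; the connection string, container, and blob names are placeholders.
from datetime import datetime, timedelta
from azure.storage.blob import BlobServiceClient, BlobSasPermissions, generate_blob_sas

service = BlobServiceClient.from_connection_string("<connection-string>")
source = service.get_blob_client("source-container", "folder/data.txt")

# A read-only SAS on the source blob lets the copy service fetch it; on a
# private container, omitting this is what triggers CannotVerifyCopySource.
sas = generate_blob_sas(
    account_name=service.account_name,
    container_name="source-container",
    blob_name="folder/data.txt",
    account_key=service.credential.account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)

dest = service.get_blob_client("dest-container", "folder/data.txt")
dest.start_copy_from_url(source.url + "?" + sas)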

Troubleshooting custom domain using Zappa

I'm doing some troubleshooting for a custom domain (purchased via Google). I've updated the custom resource record in Google, done the Route 53 + AWS Certificate Manager steps, and ran zappa certify (as well as zappa update). My first thought: is it supposed to show the AWS link after the custom domain? E.g.,
Your updated Zappa is live!: https://customDomain.com (https://abc123.amazonaws.com/dev)
At the moment, the amazonaws link still works, but the custom domain does not. Curious if this could indicate anything, per anyone's experience?
Resolved my own issue... simple mistake.
In Google Domains, I was using 'Default Name Servers' with CNAME custom records. Instead, I needed to use 'Custom Name Servers' and activate the 4 name servers from Route 53. Hopefully this helps anyone who runs into similar trouble with AWS/Google/Zappa!

gcloud monitoring_v3 query fails on AttributeError 'WhichOneof'

I've been using this lib for a while and everything was working great. I'm using it to query the CPU utilization of Google Cloud machines.
This is my code:
from google.cloud.monitoring_v3.query import Query

query_obj = Query(metric_service_client, project,
                  "compute.googleapis.com/instance/cpu/utilization",
                  minutes=mins_backward_check)
metric_res = query_obj.as_dataframe()
Everything was working fine until lately it started to fail.
I'm getting:
{AttributeError}'WhichOneof'
Debugging it, I see it fails inside the as_dataframe() code, specifically in this part:
data=[_extract_value(point.value) for point in time_series.points]
when it tries to extract the value from the point object.
The _extract_value code seems to use the WhichOneof attribute, which seems to be related to the protobuf lib.
I didn't change any of those libs' versions; does anyone have a clue what causes it to fail now?
If you're confident (!) that you've not changed anything, then this would appear to be Google breaking its API, and you may wish to file an issue on Google's issue tracker on one of these components:
https://issuetracker.google.com/issues/new?component=187228&template=1162638
https://issuetracker.google.com/issues/new?component=187143&template=800102
I think Cloud Monitoring is natively a gRPC-based API, which would explain the protobuf reference.
A good sanity check is to use APIs Explorer and check the method you're using there to see whether you can account for the request|response, perhaps:
https://cloud.google.com/monitoring/api/ref_v3/rest/v3/projects.timeSeries/query
NOTE: Your question may be easy to parse for someone familiar with the Cloud Monitoring Python SDK, but it isn't easy to repro. Please consider providing a simple repro of your issue, including requirements.txt and a full code snippet.
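For reference, a repro for this question could be as small as the following sketch; the project ID and time window are placeholders, and pinning google-cloud-monitoring and protobuf in requirements.txt would let others reproduce the exact environment.
from google.cloud import monitoring_v3
from google.cloud.monitoring_v3.query import Query

client = monitoring_v3.MetricServiceClient()
query = Query(client, "my-gcp-project",
              "compute.googleapis.com/instance/cpu/utilization", minutes=10)
df = query.as_dataframe()  # reportedly raises AttributeError: 'WhichOneof'
print(df.head())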

How to get request hostname from an HTTP-triggered, Python Azure Function request header?

I would greatly benefit from a list of all the available headers that App Service can forward to my (keyword PYTHON) Function. Or if someone knows how to "list-all", that would be awesome.
Through asking questions on SO, I see that the request IP address can be gleaned using:
req.headers.get("X-FORWARDED-FOR")
I need the hostname that a request is coming from.
It looks like this is possible using C# Functions, but I either did it wrong using req.headers.Host or it's not available for Python.
Is it possible using Python?
For this requirement, you just need to use req.headers.get("host"). I tested it on my side; it works fine on the Azure portal.
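For the "list-all" part of the question, here is a minimal sketch of an HTTP-triggered function that echoes every header it receives; the function body is an illustration, not part of the original answer.
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Dump every header App Service forwarded, one per line.
    all_headers = "\n".join(f"{k}: {v}" for k, v in req.headers.items())
    host = req.headers.get("host")  # the hostname the request was addressed to
    return func.HttpResponse(f"host: {host}\n\n{all_headers}")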

Azure Python SDK to get usage details - UsageDetailsOperations Class

I am new to Python.
I need to get the usage details using the Python SDK.
I am able to do the same using the Usage Details API, but unable to do so using the SDK.
I am trying to use the azure.mgmt.consumption.operations.UsageDetailsOperations class. The official docs for UsageDetailsOperations,
https://learn.microsoft.com/en-us/python/api/azure-mgmt-consumption/azure.mgmt.consumption.operations.usage_details_operations.usagedetailsoperations?view=azure-python#list-by-billing-period
specify four parameters to create the object
(i.e. client: client for service requests, config: configuration of service client,
serializer: an object model serializer, deserializer: an object model deserializer).
Of these parameters I only have the client.
I need help understanding how to get the other three parameters, or is there another way to create the UsageDetailsOperations object?
Or is there any other approach to get the usage details?
Thanks!
This class is not designed to be created manually; you need to create a consumption client, which will have an attribute "usage_details" which will be the class in question (instantiated correctly).
There are unfortunately no samples for consumption yet, but creating the client will be similar to creating any other client (see Network client creation for instance, or the sketch below).
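A minimal sketch of that client creation, assuming azure-identity for credentials; the subscription ID and scope are placeholders, and the exact list() signature varies across azure-mgmt-consumption versions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.consumption import ConsumptionManagementClient

subscription_id = "<subscription-id>"
client = ConsumptionManagementClient(DefaultAzureCredential(), subscription_id)

# client.usage_details is the UsageDetailsOperations instance, built for you.
scope = f"/subscriptions/{subscription_id}"
for item in client.usage_details.list(scope):
    print(item.name)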
For consumption, what might help is the tests, since they give some idea of scenarios:
https://github.com/Azure/azure-sdk-for-python/blob/fd643a0/sdk/consumption/azure-mgmt-consumption/tests/test_mgmt_consumption.py
If you're new to Azure and Python, you might want to do this quickstart:
https://learn.microsoft.com/en-us/azure/python/python-sdk-azure-get-started
Feel free to open an issue in the main Python repo, asking for more documentation about this client (this will help prioritize it):
https://github.com/Azure/azure-sdk-for-python/issues
(I work at Microsoft on the Python SDK team.)
