PySpark - REST API call to get Azure Service Bus connection string - python

I am trying to send messages from Databricks to an Azure Service Bus topic using a connection string defined in the PySpark code / stored in Key Vault. Per client policy the keys will be rotated frequently, so they have asked us to use REST API calls to fetch the connection string from the Service Bus every time we need to send a message.
Is there any way I can do this using REST API calls?

Yes, you can use the ListKeys call from the management.azure.com REST API to list them; see below:
https://learn.microsoft.com/en-us/rest/api/servicebus/stable/disaster-recovery-configs/list-keys
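For a Service Bus namespace, the equivalent call is the namespaces/authorizationRules listKeys operation. Here is a minimal sketch of calling it from Python, assuming azure-identity and requests are available on the cluster; the subscription, resource group, namespace, and service principal values are placeholders:

import requests
from azure.identity import ClientSecretCredential

# Service principal credentials (placeholders) with rights to list keys on the namespace
credential = ClientSecretCredential("TENANT-ID", "CLIENT-ID", "CLIENT-SECRET")
token = credential.get_token("https://management.azure.com/.default").token

# ARM listKeys call for a namespace-level authorization rule (names are placeholders)
url = (
    "https://management.azure.com/subscriptions/SUB-ID"
    "/resourceGroups/RG-NAME/providers/Microsoft.ServiceBus"
    "/namespaces/NAMESPACE/authorizationRules/RootManageSharedAccessKey"
    "/listKeys?api-version=2021-11-01"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
connection_string = resp.json()["primaryConnectionString"]

Because the connection string is fetched fresh on each send, key rotation on the client side requires no code change.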

Related

How to integrate Twilio's Voice API service with AWS S3 Storage?

I'm trying to create a short program that calls a user's number, records the conversation using Twilio, and sends the recording to an S3 bucket.
Here's a link that does it to Dropbox instead of S3:
https://www.twilio.com/blog/recording-saving-outbound-voice-calls-python-twilio-dropbox
Here's the code I have so far, which lets me place a call; recorded conversations go to Twilio's online storage:

from twilio.rest import Client

# Account SID and auth token from the Twilio console (placeholders)
client = Client('ACXXXXXXXXXXXXXXXX', 'your_auth_token')

call = client.calls.create(
    record=True,
    url='http://demo.twilio.com/docs/voice.xml',
    to='+15558889988',
    from_='+18889992222'
)
print(call.sid)
Twilio has an inbuilt mechanism to do this; is there a specific use case for which you want to build it yourself? https://www.twilio.com/blog/announcing-external-aws-s3-storage-support-for-voice-recordings
When you create the call you can also create a webhook that tells you when the recording is ready. When you then receive the webhook you can get the file and send it to S3.
...
record=True,
recording_status_callback=callbackURL+"/recordings",
recording_status_callback_event=["completed"],
...
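A sketch of that webhook flow, assuming a Flask app reachable at callbackURL and boto3 with default AWS credentials; the bucket name and Twilio credentials are placeholders:

import boto3
import requests
from flask import Flask, request

app = Flask(__name__)
s3 = boto3.client("s3")

@app.route("/recordings", methods=["POST"])
def recording_ready():
    # Twilio POSTs form fields such as RecordingSid and RecordingUrl when the recording completes
    recording_sid = request.form["RecordingSid"]
    recording_url = request.form["RecordingUrl"] + ".mp3"
    # Fetch the audio from Twilio, authenticating with your account SID and auth token
    audio = requests.get(recording_url, auth=("ACXXXXXXXXXXXXXXXX", "your_auth_token"))
    s3.put_object(Bucket="my-recordings-bucket", Key=recording_sid + ".mp3", Body=audio.content)
    return "", 204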

How to add a filter for an Azure Function Service Bus topic trigger using Python code

My requirement is: I have an Azure Function Service Bus topic trigger written in Python. The Service Bus has one topic with multiple subscriptions in it.
I have to add a SQL filter to a subscription so that a message I send only goes to that subscription if the filter condition is satisfied, and then triggers the function app.
How do I add the filter in Python code? I found multiple references for C#, but I need it for Python.
public async Task SendMessage(MyPayload payload)
{
    string messagePayload = JsonSerializer.Serialize(payload);
    ServiceBusMessage message = new ServiceBusMessage(messagePayload);
    message.ApplicationProperties.Add("goals", payload.Goals);
    try
    {
        // sender is assumed to be an existing ServiceBusSender
        await sender.SendMessageAsync(message);
    }
    catch (Exception ex) { /* handle/log the send failure */ }
}
As a sample I have added the C# code above, where application properties are set in the function app code, so whichever subscription's filter satisfies the condition goals = payload.Goals receives the message.
I want to know how to add application properties in Python Azure Function app code for a Service Bus topic trigger.
Using the Python client SDK for Azure Service Bus, you can apply a SqlRuleFilter and SqlRuleAction before you start processing your messages.
Pseudocode looks like:
servicebus_mgmt_client.create_rule(topic_name, sub_name, filter_name, filter, action)
send_msgs_to_topic()  # set the filtered properties on your message
receive_msgs()        # received messages will carry the properties
See the detailed examples on GitHub.
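A runnable sketch of that pseudocode, assuming azure-servicebus 7.x; the topic, subscription, rule, and connection-string values are placeholders:

from azure.servicebus import ServiceBusClient, ServiceBusMessage
from azure.servicebus.management import ServiceBusAdministrationClient, SqlRuleFilter

CONN_STR = "Endpoint=sb://..."  # placeholder namespace connection string

# Add a SQL rule so the subscription only matches messages where goals = 100
mgmt = ServiceBusAdministrationClient.from_connection_string(CONN_STR)
mgmt.create_rule(
    "my-topic", "my-subscription", "GoalsRule",
    filter=SqlRuleFilter("goals = @goals", parameters={"@goals": 100}),
)
# Delete the $Default rule, otherwise it still matches every message
mgmt.delete_rule("my-topic", "my-subscription", "$Default")

# Send a message carrying the application property the rule filters on
with ServiceBusClient.from_connection_string(CONN_STR) as sb:
    with sb.get_topic_sender("my-topic") as sender:
        sender.send_messages(
            ServiceBusMessage("payload", application_properties={"goals": 100})
        )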

How to initialize a Google BigQuery Client within a Cloud Function using the JWT of a service account passed by API Gateway to the Cloud Function

Goal:
I have set up a Google API Gateway. The backend for the API is a Cloud Function (written in Python) that should query data from Google BigQuery (BQ). To do that, I want to create a BQ Client (google.cloud.bigquery.Client()). The API is accessed by different applications using different service accounts, and each service account has permission to access only specific datasets within my project. The service accounts/applications should therefore only be able to query the datasets they have permission for, which means the BQ Client within the Cloud Function should be initialized with the service account that sent the request to the API.
What I tried:
The API is secured with the following OpenAPI definition so that a JWT signed by the service account SA-EMAIL is required to send a request there:
securityDefinitions:
  sec-def1:
    authorizationUrl: ""
    flow: "implicit"
    type: "oauth2"
    x-google-issuer: "SA-EMAIL"
    x-google-jwks_uri: "https://www.googleapis.com/robot/v1/metadata/x509/SA-EMAIL"
    x-google-audiences: "SERVICE"
For the path that uses my cloud function, I use the following backend configuration:
x-google-backend:
  address: https://PROJECT-ID.cloudfunctions.net/CLOUD-FUNCTION
  path_translation: CONSTANT_ADDRESS
So in the cloud function itself I get the forwarded JWT as X-Forwarded-Authorization and also the already verified base64url encoded JWT payload as X-Apigateway-Api-Userinfo from the API Gateway.
I tried to use the JWT from X-Forwarded-Authorization to obtain credentials:
bearer_token = request.headers.get('X-Forwarded-Authorization')
token = bearer_token.split(" ")[1]
cred = google.auth.credentials.Credentials(token)
At first, this seems to work since cred.valid returns True, but when trying to create the client with google.cloud.bigquery.Client(credentials=cred) it returns the following error in the logs:
google.auth.exceptions.RefreshError: The credentials do not contain
the necessary fields need to refresh the access token. You must
specify refresh_token, token_uri, client_id, and client_secret.
I do not have much experience with auth/OAuth, but I think the tokens/attributes the error says are missing are simply not available in my Cloud Function.
Also, I am not exactly sure why there is a RefreshError, since I don't want to refresh the token (and don't do so explicitly) but just use it again (which might be bad practice?).
Question:
Is it possible to achieve my goal in the way I have tried or in any other way?
Your goal is to catch the credential that called the API Gateway and reuse it in your Cloud Function to call BigQuery.
Sadly, you can't. Why? Because API Gateway prevents you from achieving that (and that's good news for security reasons). The JWT token is correctly forwarded to your Cloud Function, but the signature part has been removed (you receive only the header and the body of the JWT token).
The security verification has been done by API Gateway and you have to rely on that authentication.
What's the solution?
My solution is the following: from the truncated JWT that you receive, you can take the body and extract the service account email. From there, you can use the Cloud Function's service account to impersonate the service account email that you received.
That way, the Cloud Function's service account only needs permission to impersonate these service accounts, and you keep the permissions granted on the original service account.
I don't see other solutions to your issue.
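A sketch of that impersonation approach, assuming the caller is a service account whose address appears in the forwarded payload's "email" claim (an assumption worth verifying for your tokens), and that google-auth and google-cloud-bigquery are installed:

import base64
import json

import google.auth
from google.auth import impersonated_credentials
from google.cloud import bigquery

def handler(request):
    # API Gateway forwards the verified JWT payload, base64url-encoded
    userinfo = request.headers["X-Apigateway-Api-Userinfo"]
    padded = userinfo + "=" * (-len(userinfo) % 4)
    payload = json.loads(base64.urlsafe_b64decode(padded))
    caller_email = payload["email"]  # assumes the SA email is in the "email" claim

    # The function's own service account impersonates the calling service account
    source_credentials, _ = google.auth.default()
    target = impersonated_credentials.Credentials(
        source_credentials=source_credentials,
        target_principal=caller_email,
        target_scopes=["https://www.googleapis.com/auth/bigquery"],
    )
    client = bigquery.Client(credentials=target)
    rows = client.query("SELECT 1").result()  # queries now run as the caller
    return str(list(rows))

The Cloud Function's service account needs the Service Account Token Creator role on each service account it impersonates.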
The JWT that you are receiving from API Gateway is not an OAuth access token, so the JWT payload portion is not a credential you can use for BigQuery Client authorization.
As @guillaume blaquiere pointed out, the payload contains the email address of an identity. If the identity is a service account, you could implement impersonation of that identity; this could be a good solution if you are using multiple identities with API Gateway. If the identity is a user account, you would need to implement domain-wide delegation.
I recommend simply using the service account assigned to the Cloud Function, with the proper roles, to initialize the BigQuery client. Provided that API Gateway is handling authorization to reach your Cloud Function, there is no need for the extra layer of impersonation.
Another option is to store the matching service account JSON key file in Secret Manager and pull it when required to create the BigQuery Client.
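A hedged sketch of that Secret Manager option; the project ID and secret name are placeholders:

import json

from google.cloud import bigquery, secretmanager
from google.oauth2 import service_account

# Pull the service account JSON key stored as a secret (placeholder names)
sm = secretmanager.SecretManagerServiceClient()
secret = sm.access_secret_version(
    name="projects/PROJECT-ID/secrets/bq-sa-key/versions/latest"
)
info = json.loads(secret.payload.data.decode("utf-8"))

# Build credentials from the key material and hand them to the BigQuery client
credentials = service_account.Credentials.from_service_account_info(info)
client = bigquery.Client(credentials=credentials, project=info["project_id"])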

Trying to connect to Azure Cosmos client using Python, gives 104 connection aborted error

Okay, so I have an Azure Cosmos DB subscription where I have created a MongoDB resource. When I use the Python SDK to connect to it, it gives a 104 error, connection reset by peer.
I am not sure what the issue is.
I am using the endpoint with ssl=true and the primary key.
Code:

from azure.cosmos import CosmosClient

endpoint = "http://XXX.mongo.cosmos.azure.com:10255/?ssl=true"
key = 'xxxxxxxxxxxxxxxx'
# <create_cosmos_client>
client = CosmosClient(endpoint, key)
When choosing the MongoDB API, you must use a native MongoDB SDK (in your case, pymongo); the wire protocol is MongoDB, and operations are performed via the same protocol as MongoDB.
Your code is attempting to use the Cosmos DB SDK, which is specific to, and will only work with, the Core (SQL) API.
If you look in the portal blade for your MongoDB-API instance, you'll see examples under the Quick Start tab, each of which uses a MongoDB SDK (or the mongo shell). The same goes for the Connection Strings tab, which shows native MongoDB connection strings (as well as the separate parts of the connection string).
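For example, a minimal pymongo connection sketch, assuming the connection string copied from that blade; the account name and key are placeholders:

from pymongo import MongoClient

# Connection string from the portal's Connection Strings blade (placeholders)
conn_str = (
    "mongodb://ACCOUNT-NAME:PRIMARY-KEY@ACCOUNT-NAME.mongo.cosmos.azure.com:10255/"
    "?ssl=true&replicaSet=globaldb&retrywrites=false"
)
client = MongoClient(conn_str)
db = client["mydatabase"]
db["mycollection"].insert_one({"hello": "world"})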

Cannot connect to EC2 using python boto

I'm a complete noob with Python and boto and trying to establish a basic connection to ec2 services.
I'm running the following code:
import boto

ec2Conn = boto.connect_ec2('username', 'password')
group_name = 'python_central'
description = 'Python Central: Test Security Group.'
group = ec2Conn.create_security_group(group_name, description)
group.authorize('tcp', 8888, 8888, '0.0.0.0/0')
and getting the following error:
AWS was not able to validate the provided access credentials
I've read some posts that this might be due to time difference between my machine and the EC2 server but according to the logs, they are the same:
2016-12-13 19:20:05,132 boto [DEBUG]:StringToSign:
host:ec2.us-east-1.amazonaws.com
x-amz-date:20161213T192005Z
host;x-amz-date
515db222f793e7f96aa93818abf3891c7fd858f6b1b9596f20551dcddd5ca1be
Any idea how to get this connection running?
Thanks!
Calls made to the AWS API require authentication via an Access Key and a Secret Key. These can be obtained from the Identity and Access Management (IAM) console, under the Security Credentials tab for a user.
See: Getting Your Access Key ID and Secret Access Key
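For example, a sketch with the legacy boto library the question uses, authenticating with IAM access keys rather than a username/password; key values are placeholders (boto3 is the modern replacement):

import boto.ec2

# Authenticate with IAM access keys (placeholders)
conn = boto.ec2.connect_to_region(
    "us-east-1",
    aws_access_key_id="AKIAXXXXXXXXXXXXXXXX",
    aws_secret_access_key="YOUR-SECRET-KEY",
)
group = conn.create_security_group("python_central", "Python Central: Test Security Group.")
group.authorize("tcp", 8888, 8888, "0.0.0.0/0")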
If you are unfamiliar with Python, you might find it easier to call AWS services by using the AWS Command-Line Interface (CLI). For example, this single-line command can launch an Amazon EC2 instance:
aws ec2 run-instances --image-id ami-c2d687ad --key-name joe --security-group-id sg-23cb34f6 --instance-type t1.micro
See: AWS CLI run-instances documentation
