I have a time-triggered Azure Function, written in Python, which gets a list of URLs (the list is not static). For every URL I want to trigger another Azure Function and pass the URL to it for further processing.
How can I do this transition from one Azure Function to another? What's the best way to trigger the second function and pass the data to it?
You can do this in one of three ways:
Once your Function ends, call the HTTP-triggered function that you want with a POST request and a body filled with the data that you want to send (see the sketch after this list).
Write the function output to a blob, Cosmos DB, or Postgres, and create a blob/Cosmos/Postgres-triggered function that triggers off of that data.
Create a Durable Function and chain a few functions together!
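For the first option, a minimal sketch of what the timer-triggered function could do, assuming a hypothetical HTTP-triggered function protected by a function key (the URL and key below are placeholders):

import requests  # third-party; add it to requirements.txt

# Placeholder endpoint and key of the second (HTTP-triggered) function
FUNCTION_URL = "https://<your-app>.azurewebsites.net/api/process_url"
FUNCTION_KEY = "<function-key>"

def fan_out(urls):
    for url in urls:
        # POST each URL as a JSON body to the second function
        requests.post(
            FUNCTION_URL,
            params={"code": FUNCTION_KEY},
            json={"url": url},
            timeout=30,
        )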
Good luck :)
How can I do this transition from one Azure Function to another? What's the best way to trigger the second function and pass the data to it?
In your situation, you can loop over the list of URLs and, for each one, call an HTTP-triggered function, putting the URL in the body of the request and processing it inside the HTTP-triggered function. You call the HTTP-triggered function by sending a request to its URL.
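On the receiving side, the HTTP-triggered function could read the URL from the request body; a minimal sketch (the names are illustrative):

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Read the URL sent by the timer-triggered function
    try:
        url = req.get_json().get("url")
    except ValueError:
        url = None
    if not url:
        return func.HttpResponse("Expected a JSON body with a 'url' field", status_code=400)
    # ... process the URL here ...
    return func.HttpResponse(f"Processed {url}", status_code=200)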
I think you should try Durable Functions for this use case. You will have better control over the activities sharing data with one another.
https://learn.microsoft.com/en-us/azure/azure-functions/durable/quickstart-python-vscode
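For example, a Durable Functions orchestrator can fan out one activity call per URL; a rough sketch, assuming an activity function named ProcessUrl (the name is made up):

import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    # The list of URLs is passed in when the orchestration is started
    urls = context.get_input()
    # Fan out: schedule one activity call per URL...
    tasks = [context.call_activity("ProcessUrl", url) for url in urls]
    # ...then fan in: wait until all of them have completed
    results = yield context.task_all(tasks)
    return results

main = df.Orchestrator.create(orchestrator_function)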
Just for reference, I am coming from AWS, so any comparisons would be welcome.
I need to create a function which detects when a blob is placed into a storage container and then downloads the blob to perform some actions on the data in it.
I have created a storage account with a container in it, and a function app with a Python function in it. I have then set up an Event Grid topic and subscription so that blob-creation events trigger the function, and I can verify that this is working. This gives me the URL of the blob, which looks something like https://<name>.blob.core.windows.net/<container>/<blob-name>. However, when I then try to download this blob using BlobClient, I get various errors about not having the correct authentication or key.
Is there a way I can just allow the function to access the container, in the same way that in AWS I would give a Lambda an execution role with S3 permissions, or do I need to create some key to pass through somehow?
Edit: I need this to run ASAP when the blob is put in the container, so as far as I can tell I need to use Event Grid triggers rather than the normal blob triggers.
I need to create a function which detects when a blob is placed into a storage container and then downloads the blob to perform some actions on the data in it.
This can be achieved by using an Azure Blob storage trigger for Azure Functions.
The Blob storage trigger starts a function when a new or updated blob is detected. The blob contents are provided as input to the function.
This last sentence, "The blob contents are provided as input to the function", means the blob can be an input parameter to the Function. This way, there is little or no need for you to download it manually.
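For reference, a minimal sketch of a blob-triggered Python function where the blob contents arrive as the input parameter (the binding name, container path, and connection are illustrative):

import logging
import azure.functions as func

def main(myblob: func.InputStream):
    # 'myblob' is wired up in function.json, roughly:
    #   { "name": "myblob", "type": "blobTrigger",
    #     "path": "mycontainer/{name}", "connection": "AzureWebJobsStorage" }
    logging.info("Processing blob %s (%s bytes)", myblob.name, myblob.length)
    data = myblob.read()  # the blob contents; no manual download needed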
Is there a way in which I can just allow the function to access the container in the same way that in AWS I would give a lambda an execution role with S3 permissions
Have a look at Using Managed Identity between Azure Functions and Azure Storage.
EDIT
Have I understood correctly that the normal blob trigger can have up to 10 minutes of delay?
That is correct: a Blob trigger can have up to 10 minutes of delay before it actually triggers the Function. The second part of the answer still stands, though.
The answer lay somewhere between #rickvdbosch's answer and Abdul's comment. I first had to assign an identity to the function, giving it permission to access the storage account. Then I was able to use the azure.identity.DefaultAzureCredential class to automatically handle the credentials for the BlobClient.
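For anyone landing here, that combination looks roughly like this, assuming the function's managed identity has been given a data-plane role on the storage account (e.g. Storage Blob Data Reader):

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

def download_event_blob(blob_url: str) -> bytes:
    # blob_url is the URL delivered in the Event Grid payload, e.g.
    # https://<name>.blob.core.windows.net/<container>/<blob-name>
    # DefaultAzureCredential picks up the managed identity at runtime.
    client = BlobClient.from_blob_url(blob_url, credential=DefaultAzureCredential())
    return client.download_blob().readall()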
I want to use 2 blob containers to trigger that Azure Function. Is there also a way to recognize which blob container triggered the Azure Function? Please help. Thank you!
There are no plans to support multiple triggers per Function.
Each function has only one trigger but it can have multiple input bindings.
For your need, consider having your blob uploads raise an Event Grid event, and have an Event Grid-triggered function that fires for each uploaded blob.
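A rough sketch of such an Event Grid-triggered function, telling the containers apart via the blob URL in the event payload (the container names are made up):

import azure.functions as func

def main(event: func.EventGridEvent):
    # For blob-created events the payload carries the blob URL, e.g.
    # https://<account>.blob.core.windows.net/<container>/<blob-name>
    blob_url = event.get_json()["url"]
    container = blob_url.split("/")[3]  # first path segment after the host
    if container == "container-a":
        pass  # handle uploads from the first container
    elif container == "container-b":
        pass  # handle uploads from the second container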
Can someone help me with executing a Python function from Azure Data Factory?
I have stored the Python function in a blob and I'm trying to trigger it.
However, I'm not able to do it. Please assist.
Second, can I parameterize the Python function call from ADF?
Have a look at the Azure Function Activity in ADF, which allows you to run Azure Functions in a Data Factory pipeline.
You could then port your Python function into a Python Azure Function.
Also, if you want to pass parameters into the Python function, you can set them in the body property of the activity.
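On the Function side, the body you set on the activity arrives as the HTTP request payload; a minimal sketch (the parameter names are made up, and note that ADF expects a JSON object back from the activity):

import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    body = req.get_json()  # the JSON body set on the ADF activity
    input_path = body.get("inputPath")  # illustrative parameter
    run_date = body.get("runDate")      # illustrative parameter
    # ... run your Python logic with these parameters ...
    return func.HttpResponse(json.dumps({"status": "done"}),
                             mimetype="application/json")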
The Azure Function Activity supports routing. For example, if your app uses the route https://functionAPP.azurewebsites.net/api/functionName/{value}?code=<secret>, then the functionName is functionName/{value}, which you can parameterize to provide the desired functionName at runtime.
I have a view function that initiates user login requests.
It looks something like this:
def initiate_login(request):
    # get request parameters
    return check_user_and_send_otp(login_id)
The request is then processed by another function:
def check_user_and_send_otp(login_id):
    # check if user exists
    return send_otp_to_user(phone_number)
And then by another function:
def send_otp_to_user(phone_number):
    # sends a message to user
    return response
The problem is that while testing my code, I don't want to send messages to a real phone number.
My login test function looks somewhat like this; is it possible to mock it without changing my code?
def test_login_initiator(self):
    response = self.client.post(self.login_url, data=self.login_data,
                                content_type="application/json", **self.headers)
    self.assertEqual(response.status_code, 200)
All these functions that are called by the others are located in separate modules.
If you don't want to actually receive the message on a physical phone, you can use online SMS receivers. Check out this blog.
Additionally, you can send messages through other free online services like Way2sms. You just have to google them. To do this from within Python, you would need web scraping using urllib2/requests and BeautifulSoup, which is a totally new question.
Or you can skip this function by simply commenting out the message-sending code or returning True from the function.
If you wanna live dangerously, think about making a config file with switches that control whether something gets executed or not.
The right thing to do would be to use something like MagicMock and structure the code into proper classes, so that you can create mock objects for each of them.
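Concretely, you can patch the message sender with unittest.mock.patch without touching the production code. The key is to patch the name where it is looked up, i.e. in the module that calls it (the module path below is made up):

from unittest.mock import patch
from django.test import TestCase

class LoginTests(TestCase):
    # check_user_and_send_otp imports send_otp_to_user from its own
    # module, so patch it there, not where it is defined.
    @patch("accounts.services.send_otp_to_user")  # hypothetical path
    def test_login_initiator(self, mock_send):
        mock_send.return_value = {"status": "sent"}  # fake response
        response = self.client.post(
            self.login_url,
            data=self.login_data,
            content_type="application/json",
            **self.headers,
        )
        self.assertEqual(response.status_code, 200)
        mock_send.assert_called_once()  # no real SMS goes out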
I have a use case where I want to invoke my Lambda function whenever an object has been pushed to S3, and then push this notification to Slack.
I know this is vague, but how can I start doing this? How can I basically achieve it? I need to see the structure.
There are a lot of resources available for both integrations (S3+Lambda and Lambda+Slack), so if you can put these together you can make it work.
You can use S3 Event Notifications to trigger a Lambda function directly:
http://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html
Here are some blueprints for integrating Lambda with Slack:
https://aws.amazon.com/blogs/aws/new-slack-integration-blueprints-for-aws-lambda/
Good luck!
You can use S3 Event Notifications to trigger the Lambda function.
In the bucket's properties, create a new event notification for the event type s3:ObjectCreated:Put and set the destination to a Lambda function.
Then, for the Lambda function, write code in Python or Node.js (or whatever you like) that parses the received event and sends it to a Slack webhook URL.
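A minimal sketch of such a handler in Python, assuming the Slack webhook URL is supplied through an environment variable (the variable name is made up):

import json
import os
import urllib.request

def lambda_handler(event, context):
    # Pull the bucket and key out of the S3 event notification
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Post a short message to the Slack incoming webhook
    payload = {"text": f"New object uploaded: s3://{bucket}/{key}"}
    req = urllib.request.Request(
        os.environ["SLACK_WEBHOOK_URL"],  # assumed environment variable
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return {"statusCode": resp.status}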