I have a use case where I want to invoke my Lambda function whenever an object has been pushed to S3, and then push a notification about it to Slack.
I know this is vague, but how can I start doing this? How can I actually achieve it? I need to see the overall structure.
There are a lot of resources available for both integrations (S3 + Lambda and Lambda + Slack), so if you put these together you can make it work.
You can use S3 Event Notifications to trigger a Lambda function directly:
http://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html
Here are some blueprints for integrating Lambda with Slack:
https://aws.amazon.com/blogs/aws/new-slack-integration-blueprints-for-aws-lambda/
Good luck!
You can use S3 Event Notifications to trigger the Lambda function.
In the bucket's properties, create a new event notification for the event type s3:ObjectCreated:Put and set the destination to a Lambda function.
Then, for the Lambda function, write code in Python or Node.js (or whatever you like) that parses the received event and sends a message to a Slack webhook URL.
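For example, a minimal Python handler along these lines would do it (the webhook URL is a placeholder, and only the standard library is used):

import json
import urllib.request

# Placeholder: put your real Slack incoming-webhook URL here (ideally via an env var).
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def lambda_handler(event, context):
    # S3 event notifications arrive as a list of records.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Build a simple Slack message describing the uploaded object.
        payload = {"text": f"New object uploaded: s3://{bucket}/{key}"}

        req = urllib.request.Request(
            SLACK_WEBHOOK_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

    return {"statusCode": 200}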
I have a time-triggered Azure Function written in Python which gets a list of URLs (the list is not static). For every URL I want to trigger another Azure Function and pass the URL to it for further processing.
How can I do this transition from one Azure Function to another? What's the best way to trigger the second function and pass the data to it?
You can do this in one of three ways:
1. Once your function ends, call the HTTP-triggered function you want with a POST request whose body contains the data you want to send.
2. Write the function output to a blob, Cosmos DB, or Postgres, and create a blob/Cosmos/Postgres-triggered function that fires off that input (see the sketch after this list).
3. Create a Durable Function and chain a few functions together!
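For option 2, a rough sketch (the connection string and container name are placeholders, and the blob-triggered function lives in its own function folder with a matching binding in function.json):

from azure.storage.blob import BlobServiceClient
import azure.functions as func

# Placeholders for illustration only.
CONN_STR = "<storage-connection-string>"
CONTAINER = "urls-to-process"

def write_urls(urls):
    # Called from the timer-triggered function: one blob per URL.
    service = BlobServiceClient.from_connection_string(CONN_STR)
    for i, url in enumerate(urls):
        blob = service.get_blob_client(container=CONTAINER, blob=f"url-{i}.txt")
        blob.upload_blob(url, overwrite=True)

# Second function, configured with a blob trigger on the same container.
def main(myblob: func.InputStream):
    url = myblob.read().decode("utf-8")
    # ... process the URL here ...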
Good luck :)
How can I do this transition from one Azure Function to another? What's the best way to trigger the second function and pass the data to it?
In your situation, you can loop over the list of URLs, create a new HTTP-triggered function, put each URL in the body of a request, and process the URL in the HTTP-triggered function. You call the HTTP-triggered function by sending a request to its URL.
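A minimal sketch of that, assuming the requests library and a hypothetical function URL (including its function key):

# File 1: the time-triggered function forwards each URL.
import requests

# Hypothetical URL of the HTTP-triggered function, including its function key.
FUNC_URL = "https://<app-name>.azurewebsites.net/api/process_url?code=<function-key>"

def forward_urls(urls):
    # Called once the URL list has been gathered.
    for url in urls:
        resp = requests.post(FUNC_URL, json={"url": url}, timeout=30)
        resp.raise_for_status()

# File 2: the HTTP-triggered function (e.g. process_url/__init__.py).
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    url = req.get_json()["url"]
    # ... process the URL ...
    return func.HttpResponse("ok")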
I think you should try Durable Functions for this use case. You will have better control over the activities and how they share data with one another.
https://learn.microsoft.com/en-us/azure/azure-functions/durable/quickstart-python-vscode
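With the Python Durable Functions SDK, a minimal fan-out sketch could look like this (the activity name "ProcessUrl" is just an example):

import azure.durable_functions as df

# Orchestrator: fans out one activity call per URL and waits for all of them.
def orchestrator_function(context: df.DurableOrchestrationContext):
    urls = context.get_input()  # e.g. the list collected by the starter function
    tasks = [context.call_activity("ProcessUrl", url) for url in urls]
    results = yield context.task_all(tasks)
    return results

main = df.Orchestrator.create(orchestrator_function)

# Activity function "ProcessUrl" (in its own function folder):
def process_url(url: str) -> str:
    # ... fetch and process the URL ...
    return f"processed {url}"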
I want to use API Gateway to send a message to SQS which then needs to trigger Lambda. After calculations are finished within Lambda, I need to pass the results back to API Gateway. In other words, something like this:
GET request --> API Gateway --> SQS --> Lambda --> (back to the same SQS?) --> API Gateway
I have set up all the necessary permissions, meaning that I can call API Gateway and send a message to SQS, which then triggers Lambda (I can see in CloudWatch that Lambda received the message). However, I cannot get the Lambda response back to API Gateway...
Does anybody have some advice, or a tutorial/blog post about this? I have watched various YouTube videos and searched posts on SO but didn't find a solution to my problem.
AWS Lambda can handle a large number of concurrent invocations. The default limit is 1,000 and can be increased via a support ticket to hundreds of thousands.
If you want to use SQS to smooth out intermittent request spikes, then the Lambda function invocations will be asynchronous with respect to the caller's/client's API Gateway call, and you need some other means of feeding the Lambda invocation result back to the API Gateway caller/client.
One possibility is a callback URL on the caller's/client's side that your Lambda invokes once it has processed the message. Alternatively, you can store the Lambda invocation result somewhere (such as S3 or DynamoDB) and let the caller/client poll periodically for it (check whether it is ready and, if so, retrieve it).
Either way, once you use SQS to decouple API Gateway invocations from their processing by your Lambda function, that processing is asynchronous to the API Gateway caller/client request. The caller's/client's HTTP request will return right away, without waiting for the Lambda invocation result.
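For the polling variant, a sketch of the SQS-triggered Lambda (the table name, key schema, and message format are assumptions) that stores each result under a correlation ID the client can poll for later through a separate "get result" endpoint:

import json
import boto3

# Hypothetical results table with "request_id" as its partition key.
table = boto3.resource("dynamodb").Table("calculation-results")

def handler(event, context):
    # SQS-triggered Lambdas receive a batch of messages in event["Records"].
    for record in event["Records"]:
        body = json.loads(record["body"])
        request_id = body["request_id"]  # correlation ID the client was given

        result = do_calculation(body)    # your actual processing

        # Store the result so the client can poll for it later
        # (e.g. via a second API Gateway route backed by a small "get result" Lambda).
        table.put_item(Item={"request_id": request_id, "result": result})

def do_calculation(body):
    # Placeholder for the real work.
    return sum(body.get("numbers", []))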
I want to use two blob containers to trigger the same Azure Function. Is there also a way to recognize which blob storage triggered the Azure Function? Please help. Thank you! (Python)
There are no plans to support multiple triggers per Function.
Each function has only one trigger, but it can have multiple input bindings.
For your need, have your blob uploads raise an Event Grid event, and have an Event Grid-triggered function which is fired for each uploaded blob.
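A sketch of such an Event Grid-triggered function in Python; the storage account and container can be read from the event's topic and subject:

import azure.functions as func

def main(event: func.EventGridEvent):
    # For blob-created events the subject looks like:
    # /blobServices/default/containers/<container>/blobs/<blob-name>
    container = event.subject.split("/containers/")[1].split("/")[0]

    # The topic identifies the storage account that raised the event,
    # so you can tell which account/container fired the function.
    storage_account = event.topic.split("/")[-1]

    data = event.get_json()   # includes e.g. the blob URL
    blob_url = data.get("url")

    # ... branch on container / storage_account as needed ...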
How do I add a configured test event if we are uploading the code as a zip archive?
Normally, the "Configure test event" option is available in the Lambda console.
You can invoke your Lambda with a configured test event from the AWS console regardless of whether the code was uploaded as an archive or edited inline.
See the dropdown next to the Test button; once you have created your event, just press the button as usual.
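If you would rather not touch the console at all, you can also invoke the function with an arbitrary test payload programmatically, for example with boto3 (the function name and payload below are placeholders):

import json
import boto3

client = boto3.client("lambda")

response = client.invoke(
    FunctionName="my-function",                 # placeholder name
    Payload=json.dumps({"key": "test value"}),  # your test event
)
print(response["Payload"].read().decode("utf-8"))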
Why would you need to configure test events when uploading a Lambda? Those are just meant for triggering the Lambda during testing. If you want to add events to the Lambda on deployment, you can do it like so:
functions:
  createUser: # Function name
    handler: handler.createUser # Reference to file handler.js & exported function 'createUser'
    events: # All events associated with this function
      - http:
          path: users/create
          method: post
I would take a look at Serverless; it makes it very simple to deploy resources to AWS.
I need to script the entire Lambda function creation so that it can be used to deploy the solution to various environments using GitHub.
Currently my Lambda function (.py) is in a script, but the S3 trigger is added only through the AWS console.
How do I add the event trigger, either on the S3 bucket or on the Lambda function, through scripting? I am not allowed to use the AWS console, but I still want to use the Lambda triggers.
There has to be a way, but I can't find a working solution. Any help is really appreciated.
Serverless Framework is what I use; it makes it simple to build complex services with AWS resources and events. Have a look at https://serverless.com/framework/docs/providers/aws/events/s3/
This is all you need:
functions:
  users:
    handler: mypythonfilename.handler
    events:
      - s3:
          bucket: mybucketname
          event: s3:ObjectCreated:*
It basically builds the CloudFormation template for you and deploys the Lambda with the serverless deploy command.
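For completeness, mypythonfilename.py then only needs to expose a matching handler; a minimal sketch:

# mypythonfilename.py
def handler(event, context):
    # The S3 notification delivers one or more records per invocation.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"New object: s3://{bucket}/{key}")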