How to add a configure (test) event for a zip file in AWS - Python

How do I add a configure (test) event if the function code is uploaded as a zip?
Normally the configure-event option is available as shown below.

You can invoke your Lambda with a configured test event from the AWS console regardless of whether the code was uploaded as an archive or edited inline.
See the dropdown next to the Test button; once you have created your event, just press the button as usual.
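If you prefer to fire the same test event outside the console, a minimal boto3 sketch could look like this (the function name and payload below are hypothetical placeholders, not part of the original answer):

import json
import boto3

lambda_client = boto3.client("lambda")

# Hypothetical test payload, replace with your own event
test_event = {"key1": "value1", "key2": "value2"}

response = lambda_client.invoke(
    FunctionName="my-zip-uploaded-function",   # hypothetical function name
    InvocationType="RequestResponse",          # synchronous invocation
    Payload=json.dumps(test_event),
)

# The function's return value comes back in the response payload stream
print(json.loads(response["Payload"].read()))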

Why would you need to configure test events when uploading a Lambda? Those are only meant for triggering the Lambda for testing. If you want to add events to the Lambda on deployment, you can do it like so:
functions:
  createUser: # Function name
    handler: handler.createUser # Reference to file handler.js & exported function 'createUser'
    events: # All events associated with this function
      - http:
          path: users/create
          method: post
I would take a look at Serverless; it makes it very simple to deploy resources to AWS.
Serverless
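If the service is written in Python rather than Node, the handler referenced in the config above would live in a Python module instead of handler.js. A minimal sketch, assuming a module named handler.py and an API Gateway-style response (both assumptions, not part of the original answer):

# handler.py -- hypothetical Python equivalent of the Node handler above
import json

def createUser(event, context):
    # The API Gateway request body arrives as a JSON string
    body = json.loads(event.get("body") or "{}")
    # ... create the user here ...
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "user created", "input": body}),
    }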

Related

How to write a Cloud Function to listen and respond to events such as when a file is created, changed, or removed?

In GCP, I am having trouble finding an example on how to listen and respond to Cloud Storage events such as when a file is created, changed, or removed.
I tried looking at the GCP docs on how to do this, but there is nothing there. I am looking for a simple Python example using Cloud Functions to listen and respond to when a file is created, changed, or removed in my GCS Bucket.
You can trigger a Cloud Function V2 on a Cloud Storage event:
gcloud functions deploy YOUR_FUNCTION_NAME \
  --gen2 \
  --trigger-event-filters="type=EVENT_TYPE" \
  --trigger-event-filters="bucket=YOUR_STORAGE_BUCKET" \
  ...

- YOUR_FUNCTION_NAME is the Cloud Function name
- EVENT_TYPE is the GCS object event (object finalized, object deleted, ...). Check the doc linked above for the different event types
- YOUR_STORAGE_BUCKET is the GCS bucket concerned by the event
For the code of the Cloud Function, you can check this link:
import functions_framework

# Register a CloudEvent function with the Functions Framework
@functions_framework.cloud_event
def my_cloudevent_function(cloud_event):
    # Your code here
    # Access the CloudEvent data payload via cloud_event.data
    data = cloud_event.data
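As a sketch of what the handler body might do for a Cloud Storage event (the bucket and name keys are part of the Cloud Storage event payload; the function name and log message are assumptions):

import functions_framework

# Hypothetical handler name; deploy it with the gcloud command shown above
@functions_framework.cloud_event
def handle_gcs_event(cloud_event):
    data = cloud_event.data
    bucket = data["bucket"]   # bucket that fired the event
    name = data["name"]       # object that was created / deleted / changed
    print(f"Object {name} in bucket {bucket} triggered this function")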

Can Azure blob storage functions bind with multiple inputs?

I want to use 2 blob containers to trigger that Azure Function. Is there also a way to recognize which blob storage triggered the Azure Function? Please help. Thank you! (Python)
There are no plans to support multiple triggers per Function.
Each function has only one trigger, but it can have multiple input bindings.
For your need, consider having your blob uploads raise an Event Grid event, and have an Event Grid-triggered function that is fired for each blob uploaded.
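A rough Python sketch of such an Event Grid-triggered function, assuming the standard blob event subject format (/blobServices/default/containers/&lt;container&gt;/blobs/&lt;blob&gt;), which is what lets you tell which container fired the event; the function body and logging are assumptions:

import logging
import azure.functions as func

# function.json for this function would declare an "eventGridTrigger" binding named "event"
def main(event: func.EventGridEvent):
    subject = event.subject
    # Subject of a blob event: /blobServices/default/containers/<container>/blobs/<blob>
    container = subject.split("/containers/")[1].split("/blobs/")[0]
    blob_name = subject.split("/blobs/")[1]
    logging.info("Blob %s was uploaded to container %s", blob_name, container)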

Create AWS SageMaker endpoint with Lambda function

I created an endpoint in AWS SageMaker and it works well. I created a Lambda function (Python 3.6) that takes files from S3, invokes the endpoint, and then puts the output in a file in S3.
I wonder if I can create the endpoint on every event (a file uploaded to an S3 bucket) and then delete the endpoint afterwards.
Yes you can. Use an S3 event notification for object-created events to call a Lambda that creates the SageMaker endpoint.
This example shows how to make an object-created event trigger a Lambda:
https://docs.aws.amazon.com/lambda/latest/dg/with-s3.html
You can use the Python SDK to create the SageMaker endpoint:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sagemaker.html#SageMaker.Client.create_endpoint
But creating an endpoint can be slow, so you may need to wait for it before invoking it.
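A rough Lambda sketch of that flow, assuming an existing endpoint config, that the uploaded S3 object is the inference payload, and that the model accepts CSV; the endpoint and config names, output prefix, and content type are all assumptions:

import boto3

s3 = boto3.client("s3")
sagemaker = boto3.client("sagemaker")
runtime = boto3.client("sagemaker-runtime")

ENDPOINT_NAME = "my-on-demand-endpoint"   # hypothetical
ENDPOINT_CONFIG = "my-endpoint-config"    # must already exist

def handler(event, context):
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Spin up the endpoint and wait until it is InService.
    # Note: this can take several minutes, so watch the Lambda timeout limit.
    sagemaker.create_endpoint(EndpointName=ENDPOINT_NAME, EndpointConfigName=ENDPOINT_CONFIG)
    sagemaker.get_waiter("endpoint_in_service").wait(EndpointName=ENDPOINT_NAME)

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    result = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="text/csv",   # assumption: model accepts CSV
        Body=body,
    )["Body"].read()

    s3.put_object(Bucket=bucket, Key=f"output/{key}", Body=result)

    # Tear the endpoint down again once the prediction is stored
    sagemaker.delete_endpoint(EndpointName=ENDPOINT_NAME)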

Need to code AWS lambda function with S3 triggers

I need to code the entire Lambda function creation so the solution can be deployed to various environments using GitHub.
Currently my Lambda function (.py) is in a script, but the S3 trigger is only added through the AWS console.
How do I add the event trigger, either on the S3 bucket or on the Lambda function, through scripting? I am not allowed to use the AWS console, but I still want to use Lambda triggers.
There has to be a way, but I can't find a working solution. Any help is really appreciated.
Serverless Framework is what I use; it makes it simple to build complex services with AWS resources and events. Have a look at https://serverless.com/framework/docs/providers/aws/events/s3/
This is all you need:
functions:
  users:
    handler: mypythonfilename.handler
    events:
      - s3:
          bucket: mybucketname
          event: s3:ObjectCreated:*
It basically builds the CloudFormation for you, and deploys the lambda with the serverless deploy command.
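The handler itself (mypythonfilename.handler in the config above) then receives the standard S3 event; a minimal sketch of what it might look like (the print is just an example):

import urllib.parse

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 events
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        print(f"New object s3://{bucket}/{key}")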

How to write an AWS Lambda function with S3 and Slack integration

I have a use case where I want to invoke my Lambda function whenever an object has been pushed to S3, and then push a notification to Slack.
I know this is vague, but how can I start? How can I achieve this? I need to see the structure.
There are a lot of resources available for both integrations (S3+Lambda and Lambda+Slack), so if you put these together you can make it work.
You can use S3 Event Notifications to trigger a lambda function directly:
http://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html
Here are some blueprints to integrate lambda with slack:
https://aws.amazon.com/blogs/aws/new-slack-integration-blueprints-for-aws-lambda/
Good luck!
You can use S3 Event Notifications to trigger the Lambda function.
In the bucket's properties, create a new event notification for an event type of s3:ObjectCreated:Put and set the destination to a Lambda function.
Then, for the Lambda function, write code in Python or NodeJS (or whatever you like) that parses the received event and sends it to a Slack webhook URL.
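A minimal Python sketch of that Lambda; the webhook URL is a placeholder and the message format is just an example:

import json
import urllib.request

# Placeholder: keep the real webhook URL in an environment variable or a secret store
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        message = {"text": f"New object uploaded: s3://{bucket}/{key}"}
        req = urllib.request.Request(
            SLACK_WEBHOOK_URL,
            data=json.dumps(message).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)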
