My Lambda Python function is returning null even after successful execution

I am a beginner with AWS services and Python. I used the code below in a Lambda function to connect to RDS, and I invoke it through API Gateway.
After the code executes successfully, it returns null.
#!/usr/bin/python
import sys
import logging
import pymysql
import json

rds_host = "host"
name = "name"
password = "password"
db_name = "DB"
port = 3306

def save_events(event):
    """
    This function fetches content from mysql RDS instance
    """
    result = []
    conn = pymysql.connect(rds_host, user=name, passwd=password,
                           db=db_name, connect_timeout=30)
    with conn.cursor() as cur:
        cur.execute("SELECT * FROM exercise WHERE bid = '1'")
        for row in cur:
            result.append(list(row))
        print("Data from RDS...")
        print(result)
        cur.close()
    print(json.dumps({'bodyParts': result}))

def lambda_handler(event, context):
    save_events(event)

As pointed out in a comment by @John Gordon, you need to return something from your lambda_handler function.
It should be something like:
def lambda_handler(event, context):
    save_events(event)
    return {
        "statusCode": 200,
        "result": "Here is my result"
    }
Additionally, I don't see any return statement from save_events either.
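A minimal sketch of how the two functions could be tied together so the query result actually reaches API Gateway; this assumes the Lambda proxy integration, which expects a statusCode and a JSON string in body:
import json
import pymysql

rds_host = "host"
name = "name"
password = "password"
db_name = "DB"

def save_events(event):
    """Fetch rows from the RDS instance and return them as a list."""
    result = []
    conn = pymysql.connect(host=rds_host, user=name, passwd=password,
                           db=db_name, connect_timeout=30)
    with conn.cursor() as cur:
        cur.execute("SELECT * FROM exercise WHERE bid = '1'")
        for row in cur:
            result.append(list(row))
    return result  # return the data instead of only printing it

def lambda_handler(event, context):
    result = save_events(event)
    # With Lambda proxy integration, API Gateway expects this response shape.
    return {
        "statusCode": 200,
        "body": json.dumps({"bodyParts": result})
    }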

Related

Does my lambda function go inside my main python script?

I don't know how to write a Lambda function. Here is my main_script.py, which executes two stored procedures: it inserts records every day, then finds the difference between yesterday's and today's records and writes them to a table.
import logging
import pymysql as pm
import os
import json

class className:
    env=None
    config=None

    def __init__(self, env_filename):
        self.env=env_filename
        self.config=self.get_config()

    def get_config(self):
        with open(self.env) as file_in:
            return json.load(file_in)

    def DB_connection(self):
        config=className.get_config(self)
        username=config["exceptions"]["database-secrets"]["aws_secret_username"]
        password=config["exceptions"]["database-secrets"]["aws_secret_password"]
        host=config["exceptions"]["database-secrets"]["aws_secret_host"]
        port=config["exceptions"]["database-secrets"]["aws_secret_port"]
        database=config["exceptions"]["database-secrets"]["aws_secret_db"]
        return pm.connect(
            user=username,
            password=password,
            host=host,
            port=port,
            database=database
        )

    def run_all(self):
        def test_function(self):
            test_function_INSERT_QUERY = "CALL sp_test_insert();"
            test_function_EXCEPTIONS_QUERY = "CALL sp_test_exceptions();"
            test = self.config["exceptions"]["functions"]["test_function"]
            if test:
                with self.DB_connection() as cnxn:
                    with cnxn.cursor() as cur:
                        try:
                            cur.execute(test_function_INSERT_QUERY)
                            print("test_function_INSERT_QUERY insertion query ran successfully, {} records updated.".format(cur.rowcount))
                            cur.execute(test_function_EXCEPTIONS_QUERY)
                            print("test_function_EXCEPTIONS_QUERY exceptions query ran successfully, {} exceptions updated.".format(cur.rowcount))
                        except pm.Error as e:
                            print(f"Error: {e}")
                        except Exception as e:
                            logging.exception(e)
                        else:
                            cnxn.commit()

        test_function(self)

def main():
    cwd=os.getcwd()
    vfc=(cwd+"\_config"+".json")
    ve=className(vfc)
    ve.run_all()

if __name__ == "__main__":
    main()
Would I write my lambda_handler function inside my script above or have it as a separate script?
def lambda_handler(event, context):
    #some code
I would treat lambda_handler(event, context) as the equivalent of main(), with the exception that you do not need the if __name__ ... clause, because you never run a Lambda function from the console.
You would also need to use the boto3 library, which abstracts away AWS services and their APIs. Have a look at the tutorial to get started.
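For instance, here is a rough sketch of how the question's script could be adapted, assuming the class and its run_all() method stay as they are and the config JSON ships inside the deployment package (the file name _config.json and the return payload are illustrative):
import os
from main_script import className  # assuming the question's script is packaged as main_script.py

def lambda_handler(event, context):
    # LAMBDA_TASK_ROOT points at the unpacked deployment package; fall back to cwd when run locally.
    config_path = os.path.join(os.environ.get("LAMBDA_TASK_ROOT", os.getcwd()), "_config.json")
    ve = className(config_path)
    ve.run_all()
    return {"statusCode": 200, "body": "run_all completed"}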
As the first order of business, I would move the DB credentials out of the file system and into a secure datastore. You can of course configure Lambda environment variables, but Systems Manager Parameter Store is more secure and very easy to call from the code, e.g.:
import boto3

ssm = boto3.client('ssm', region_name='us-east-1')

def lambda_handler(event, context):
    password = ssm.get_parameters(Names=['/pathto/password'], WithDecryption=True)['Parameters'][0]['Value']
    return {"password": password}
There is a more advanced option, Secrets Manager, which for a small fee will even rotate passwords for you (it is fully integrated with the Relational Database Service).
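A minimal sketch of reading credentials from Secrets Manager instead (the secret name rds/my-db-credentials is a placeholder; get_secret_value returns the payload under SecretString):
import json
import boto3

secrets = boto3.client('secretsmanager', region_name='us-east-1')

def lambda_handler(event, context):
    # 'rds/my-db-credentials' is a placeholder secret name.
    secret = secrets.get_secret_value(SecretId='rds/my-db-credentials')
    creds = json.loads(secret['SecretString'])  # e.g. {"username": ..., "password": ...}
    return {"statusCode": 200, "body": "fetched credentials for " + creds["username"]}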

AWS Lambda function returning same values unless redeployed

I have a simple AWS Lambda function connected to a MySQL RDS instance. When I update a field in my app's UI, the change shows up in the database when viewed from MySQL Workbench, but the Lambda function keeps returning the old value until I redeploy it, after which it returns the new, correct value.
"""Search Function for Lambda"""
from urllib.parse import unquote
import json
import pymysql
# Configuration Values
##CONFIG VALUES REMOVED
# Connection
connection = pymysql.connect(ENDPOINT, user=USERNAME,
passwd=PASSWORD, db=DATABASE_NAME)
def get(event, context):
"""Takes searches key from event and searches the database on that key"""
print(context)
cursor = connection.cursor()
search_key = unquote(event['pathParameters']['search_key'])
cmd = ('SELECT * from LCI_Data WHERE Description Like "{}"'.format("%"+search_key+"%"))
cursor.execute(cmd)
result = [dict((cursor.description[i][0], value)
for i, value in enumerate(row)) for row in cursor.fetchall()]
response = {
"statusCode": 200,
"body": json.dumps(result),
"headers": {
"Access-Control-Allow-Origin": "*",
"Access-Control-Allow-Credentials": "true"
}
}
return response
Moving
connection = pymysql.connect(ENDPOINT, user=USERNAME,
                             passwd=PASSWORD, db=DATABASE_NAME)
into the get() function solved my issue. The module-level connection is created once and then reused across warm invocations, which is the likely reason it kept serving stale results; creating the connection inside the handler ensures each request reads the current state of the database.
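A minimal sketch of the per-invocation layout, assuming the same ENDPOINT/USERNAME/PASSWORD/DATABASE_NAME configuration as above; it also swaps the string-formatted LIKE filter for a parameterized query, which is a separate improvement:
from urllib.parse import unquote
import json
import pymysql

def get(event, context):
    """Search LCI_Data on the key supplied in the path parameters."""
    # Create the connection per invocation so each request reads fresh data.
    connection = pymysql.connect(ENDPOINT, user=USERNAME,
                                 passwd=PASSWORD, db=DATABASE_NAME)
    try:
        with connection.cursor() as cursor:
            search_key = unquote(event['pathParameters']['search_key'])
            # Let the driver escape the search value instead of formatting it into the SQL.
            cursor.execute('SELECT * FROM LCI_Data WHERE Description LIKE %s',
                           ("%" + search_key + "%",))
            result = [dict(zip([col[0] for col in cursor.description], row))
                      for row in cursor.fetchall()]
    finally:
        connection.close()
    return {
        "statusCode": 200,
        "body": json.dumps(result),
        "headers": {
            "Access-Control-Allow-Origin": "*",
            "Access-Control-Allow-Credentials": "true"
        }
    }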

Import Query String from AWS API Gateway into Lambda Python Function

I am working on a Lambda function that imports query string information into a database, but I'm running into trouble accessing the query string itself in my Python code. I have set up the proper string variables within API Gateway and also enabled Lambda proxy integration in the Integration Request section.
Some articles and previous answers said I should access it with:
event["queryStringParameters"]['querystring1']
I've set up a handler, and I'm curious how to pass the request body into the my_handler function.
Here's a snippet of the code:
import logging, traceback, os

# environment variables
ep = os.environ["EP"]
p = os.environ["PORT"]
du = os.environ["USER"]
pw = os.environ["PASSWORD"]
db = os.environ["DATABASE"]

# query string variables
def my_handler(event):
    servername = event["queryStringParameters"]["servername"]
    hostDesc = event["queryStringParameters"]["description"]
    hostRegion = event["queryStringParameters"]["region"]
    response = servername + hostDesc + hostRegion
    return {
        'status code' : 200,
        'body' : json.dumps(response)
    }
This turned out to be a problem elsewhere in the script. I was trying to build a SQL query later in the script as a global string by reusing the handler results. What should be done instead is to define the query as a function and call it from the handler: once the handler has extracted the variables from the event body, it can pass them into that function.
For example:
import logging, traceback, json

def query(hostServername, hostDesc, hostRegion):
    return "SELECT * FROM TABLE_NAME WHERE host ='"+hostServername+"' AND '"+hostDesc+"' AND '"+hostRegion+"'"

# query string variables
def my_handler(event):
    server = event["queryStringParameters"]["servername"]
    host = event["queryStringParameters"]["description"]
    region = event["queryStringParameters"]["region"]
    try:
        cnx = some_database_connect_function()
        cursor = cnx.cursor()
        try:
            cursor.execute(query(server, host, region))
            return {
                'Status Code' : 200
            }
        except:
            return log_err("ERROR: Cannot execute cursor.\n{}".format(
                traceback.format_exc()))
    except:
        return log_err("ERROR: Cannot connect to database from handler.\n{}".format(
            traceback.format_exc()))
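As an aside, because the query is built from user-supplied query string values, a parameterized query is the safer pattern. Here is a small sketch, reusing the placeholder some_database_connect_function and log_err from the answer above and assuming a PyMySQL-style connection; the column names are illustrative:
import traceback

def my_handler(event, context):
    server = event["queryStringParameters"]["servername"]
    desc = event["queryStringParameters"]["description"]
    region = event["queryStringParameters"]["region"]
    try:
        cnx = some_database_connect_function()   # placeholder, as in the answer above
        with cnx.cursor() as cursor:
            # Let the driver escape the values instead of concatenating strings into the SQL.
            cursor.execute(
                "SELECT * FROM TABLE_NAME WHERE host = %s AND description = %s AND region = %s",
                (server, desc, region))
            rows = cursor.fetchall()
        return {'statusCode': 200, 'body': "{} rows matched".format(len(rows))}
    except Exception:
        return log_err("ERROR: query failed.\n{}".format(traceback.format_exc()))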

Return json using pymysql and Mysql Aurora DB

I am using pymysql to connect to an AWS MySQL Aurora DB. I created a Lambda function, which should return the data as JSON for use in a JS framework.
def lambda_handler(event, context):
    responses = []
    try:
        conn = pymysql.connect(rds_host, user=name, passwd=password, db=db_name, connect_timeout=5)
    except pymysql.MySQLError as e:
        ...
    with conn.cursor() as cur:
        cur.execute('SELECT * FROM Plans')
        conn.commit()
        for row in cur:
            resp = {
                "id": row[0],
                "name": row[1],
                ...
                ...
                "type": row[14],
            }
            responses.append(resp)
    return responses
This code returns a list of dicts.
I tried to use the json module, result = json.dumps(responses), but that returns a str.
How can I get JSON?
The json module won't return you a JSON object.
json.loads() will return a Python object, using this formatting table.
json.dumps() will return a JSON-formatted str, using this formatting table.
You can create a .json file with the following code:
with open('path/to/myfile.json', 'w') as myfile:
    json.dump(responses, myfile)
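If the goal is to hand the data to a JS framework through API Gateway, a common pattern with the Lambda proxy integration is to serialize the list into the response body rather than returning the list itself. A minimal sketch (fetch_plans is a hypothetical helper wrapping the query from the question):
import json

def lambda_handler(event, context):
    responses = fetch_plans()  # hypothetical helper wrapping the SELECT above
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        # API Gateway passes this string through as the JSON response body.
        "body": json.dumps(responses, default=str)  # default=str handles dates/Decimals
    }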

Mocking a python database query

I am trying to test by mocking a database query, but I'm receiving an error:
AssertionError: Expected call: execute()
Not called
and create_table() is reported as not defined.
I want execute() to be called, and I want create_table() to return a response so I can assert against pre-defined values.
app.py
from flask import Flask, g, jsonify
import mysql.connector

app = Flask(__name__)

@app.before_request
def before_request():
    g.db = mysql.connector.connect(user='root', password='root', database='mysql')

def create_table():
    try:
        cur = g.db.cursor()  # here g is imported from the Flask module
        cur.execute('CREATE TABLE IF NOT EXISTS Man (id INT NOT NULL AUTO_INCREMENT PRIMARY KEY, name VARCHAR(40))')
        data = dict(Table='Man is created')
        resp = jsonify(data)
        cur.close()
        return resp
    except Exception as e:
        return jsonify(dict(error=str(e)))
test.py
import unittest
from app import *
from mock import patch

class Test(unittest.TestCase):
    def test_create(self):
        with patch("app.g") as mock_g:
            mock_g.db.cursor()
            mock_g.execute.assert_called_with()
            resp = create_table()
            assertEqual(json, '{"Table":"Testmysql is created","Columns": ["id","name","email"]}')
What am I doing wrong? Can someone please tell me how to fix it?
I believe you need to commit your changes before closing the cursor, or the execute won't take effect. Try committing on the connection, e.g. g.db.commit(), before cursor.close().
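For the mocking itself, here is a sketch of how the assertion could be wired up with the standard-library unittest.mock, assuming the app.py layout from the question. The key points are asserting on the cursor returned by the patched g.db.cursor() and doing so after create_table() has run; jsonify is patched as well so the test does not need a Flask application context:
import unittest
from unittest.mock import patch

from app import create_table

class TestCreateTable(unittest.TestCase):
    def test_create(self):
        with patch("app.g") as mock_g, patch("app.jsonify"):
            # create_table() calls g.db.cursor(); grab the mock cursor it will receive.
            mock_cursor = mock_g.db.cursor.return_value
            create_table()
            # Assert execute() was called on the cursor, not on g itself.
            mock_cursor.execute.assert_called_once()

if __name__ == "__main__":
    unittest.main()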
