I'm new to Python. I have a bit of code in Python within a Lambda function that updates a value in a DynamoDB table (ebsDaysToExpire). That works. I get stuck when I want to get that new updated value back so I can pass it later in the script as part of a send_mail function.
I've tried adding in response = table.get_item statements but I just can't get that to work.
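For reference, this is roughly the shape of the get_item call I was attempting (a sketch only, using the volID key and ebsDaysToExpire attribute from my table; the working approach further down uses ReturnValues instead):

# Sketch of the get_item attempt, not the final solution
response = table.get_item(Key={'volID': vid})
if 'Item' in response:
    xdays = response['Item']['ebsDaysToExpire']

Here is the code I started with: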
if response['Count'] == 0: #volume not being tracked in table
    try:
        response = table.put_item(
            Item={
                'volID': vid,
                'ebsDaysToExpire': 7,
                'snapshotStatus': 'incomplete',
                'snapshotDate': 'incomplete',
                'lifecycleStatus': 'start_7',
                'snapshotID': 'incomplete',
                'snapshotDaysToExpire': '30'
            },
            ConditionExpression='attribute_not_exists(volID)'
        )
    except ClientError as e:
        print(e.response['Error']['Message'])
else:
    try:
        response = table.update_item(
            Key={
                'volID': vid
            },
            UpdateExpression='set ebsDaysToExpire = ebsDaysToExpire + :val',
            ExpressionAttributeValues={
                ':val': decimal.Decimal(-1)
            },
            ReturnValues='UPDATED_NEW'
        )
    except ClientError as e:
        print(e.response['Error']['Message'])
This is what my code looks like now. It returns the new value from the DynamoDB table after table.update_item updates it (via ReturnValues='UPDATED_NEW'), and that returned value gets passed along as xdays. Thanks to ohlr for his assistance.
if response['Count'] == 0: #volume not being tracked in table
    try:
        response = table.put_item(
            Item={
                'volID': vid,
                'ebsDaysToExpire': 7,
                'snapshotStatus': 'incomplete',
                'snapshotDate': 'incomplete',
                'lifecycleStatus': 'start_7',
                'snapshotID': 'incomplete',
                'snapshotDaysToExpire': '30'
            },
            ConditionExpression='attribute_not_exists(volID)'
        )
    except ClientError as e:
        print(e.response['Error']['Message'])
else:
    try:
        response = table.update_item(
            Key={
                'volID': vid
            },
            UpdateExpression='set ebsDaysToExpire = ebsDaysToExpire + :val',
            ExpressionAttributeValues={
                ':val': decimal.Decimal(-1)
            },
            ReturnValues='UPDATED_NEW'
        )
        xdays = response['Attributes']['ebsDaysToExpire']
        print(xdays)
    except ClientError as e:
        print(e.response['Error']['Message'])
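From there xdays gets passed into my send_mail step. A minimal sketch of that step using SES (the addresses and subject below are placeholders, not my actual values):

import boto3

ses = boto3.client('ses')

def send_mail(vid, xdays):
    # Sketch only: source and destination addresses are placeholders
    ses.send_email(
        Source='ebs-monitor@example.com',
        Destination={'ToAddresses': ['ops-team@example.com']},
        Message={
            'Subject': {'Data': 'EBS volume %s expires in %s days' % (vid, xdays)},
            'Body': {'Text': {'Data': 'Volume %s has %s days left before expiration.' % (vid, xdays)}}
        }
    )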
I have recently uploaded contact records to HubSpot using Postman. Here is a raw JSON data example and POST method that I use to successfully upload a contact:
https://api.hubapi.com/crm/v3/objects/contacts?hapikey={{hapikey}}
{properties": {
"smbapi": "yes",
"email": "fcgrinding#junkstermail.com",
"business_name":"Forest City Grinding Inc",
"srvc_address_1":"3544 Stenstrom Rd",
"srvc_city_1":"",
"srvc_state_1":"IL",
"srvc_zip_1":"61109",
"proposal_date":"2021-12-07",
"proposal_start_date": "2022-01-01",
"udc_code_1": "COMED",
"eog":"electric",
"fixedprice_1_gas_mcf": 6.63,
"fixedprice_2_gas_mcf": 6.11,
"fixedprice_3_gas_mcf": 5.9,
"term_1": 12,
"term_2": 24,
"term_3": 36,
"smb_bdm_name": "Timothy Chin",
"smb_bdm_phone": "833-999-9999",
"smb_bdm_email": "tim.chin#junkstermail.com"
}
}
Next, I created a Python Lambda function to automate this process, because we want to ingest CSV files that may have many records to extract. So I constructed the dictionary to look the same as the string above, which had worked fine with Postman. However, when I try a POST API call to HubSpot using my dictionary payload, I get this error:
Invalid input JSON : Cannot build ObjectSchemaEgg, Invalid input JSON
on line 1, column 2: Cannot build ObjectSchemaEgg, some of required
attributes are not set [name, labels]
Here is the processed dictionary string that my code constructed for the API call:
{'properties': '{"smbapi": "yes", "business_name": "Forest City Grinding Inc", "srvc_address_1": "4844 Stenstrom Rd", "srvc_state_1": "IL", "srvc_zip_1": "61109", "proposal_date": "2021-12-07", "proposal_start_date": "2022-01-01", "udc_code_1": "COMED", "fixedprice_1": "6.63", "fixedprice_2": "6.11", "fixedprice_3": "5.9", "term_2": "24", "term_3": "36", "smb_bdm_name": "Gary Wysong", "smb_bdm_phone": "833-389-0881", "smb_bdm_email": "gary.wysong#constellation.com"}'}
Here is my Lambda code in full (give special attention to both the call to post_to_hubspot() and the post_to_hubspot() function itself). The code that loads the DynamoDB table is working correctly:
import boto3
import json
import decimal
from botocore.exceptions import ClientError
from boto3.dynamodb.conditions import Key, Attr
import re
import pandas as pd
import numpy as np
import os
import datetime
from os import urandom
import email
import base64
import requests
from datetime import datetime, timedelta, timezone
import mailparser
import calendar
global payload_data
landing_zone_bucket_name = str(os.environ['BUCKETNAME'])
s3 = boto3.resource('s3')
landing_zone_bucket = s3.Bucket(landing_zone_bucket_name )
s3r = boto3.client('s3')
dynamodb = boto3.resource('dynamodb', region_name='us-west-2')
table = dynamodb.Table(str(os.environ['DYNAMOTABLE']))
unprocessed_records_table = dynamodb.Table(str(os.environ['UNPROCESSEDTABLE']))
email_table = dynamodb.Table(str(os.environ['EMAILSTATUSTABLE']))
endpoint_url=os.environ['ENDPOINT_URL']
access_key = os.environ['ACCESSKEY']
now = datetime.now()
today_date = datetime.strftime(now,'%d')
today_month = datetime.strftime(now,'%m')
today_year = datetime.strftime(now,'%Y')
time_stamp = datetime.now().strftime('%Y%m%d%H%M%S')
payload_data = {}
#WRITE RECORDS TO DYNAMO
def dynamoPut(dObj,table_name=None):
try:
for each in list(dObj['Information']):
if dObj['Information'][each]:
dObj['Information'][each] = str(dObj['Information'][each])
else:
del dObj['Information'][each]
dObj['Month'] = today_month
dObj['Year'] = today_year
dObj['Day'] = today_date
for each in list(dObj):
if dObj[each] != '':
dObj[each] = dObj[each]
else:
del dObj[each]
if table_name != None:
response = unprocessed_records_table.put_item(Item = dObj)
else:
response = table.put_item(Item = dObj)
if response['ResponseMetadata']['HTTPStatusCode'] == 200:
return True
else:
return False
except Exception as e:
print(e)
return False
def dynamoPutFileName(filename,source_type):
try:
dObj = {}
dObj['id'] = urandom(20).hex()
dObj['CreatedAt'] = str(datetime.now())
dObj['FileName'] = filename
dObj['Type'] = source_type
dObj['EmailSent'] = False
response = email_table.put_item(Item = dObj)
if response['ResponseMetadata']['HTTPStatusCode'] == 200:
return True
else:
return False
except Exception as e:
print(e)
return False
def parse_csv_hubspot(event, obj):
#parsing CSV file to write to dynamo
try:
def auto_truncate(val):
return val[:255 ]
print('<< IN PARSE CSV HUBSPOT >>')
print(event)
csv = pd.read_csv(obj['Body'], encoding = "ISO-8859-1")
csv_nn = csv.replace(np.nan, 'null', regex=True)
d = csv_nn.to_dict(orient='records')
source_id = urandom(20).hex()
file_name = event['file_path'].split('/')[-1]
print('<< FILE NAME >>', file_name)
for each in d:
try:
dbObj = {}
#PASSING THE EXTERNAL KEY
UniqueKey = ''
if 'smbapi' in each and each['smbapi'] != 'null':
dbObj['smbapi' ] = each['smbapi']
print('<< SMB API>>', dbObj['smbapi' ])
if 'business_name' in each and each['business_name'] != 'null':
dbObj['business_name'] = each['business_name']
print('<< BUSINESS NAME >>', dbObj['business_name'])
if 'srvc_address_1' in each and each['srvc_address_1'] != 'null':
dbObj['srvc_address_1'] = each['srvc_address_1']
print('<< ADDRESS 1 >>', dbObj['srvc_address_1'])
if 'srvc_city_1' in each and each['srvc_city_1'] != 'null':
dbObj['srvc_city_1'] = each['srvc_city_1']
if 'srvc_state_1' in each and each['srvc_state_1'] != 'null':
dbObj['srvc_state_1'] = each['srvc_state_1']
if 'srvc_zip_1' in each and each['srvc_zip_1'] != 'null':
dbObj['srvc_zip_1']= str(each['srvc_zip_1']).zfill(5)
if 'proposal_date' in each and each['proposal_date'] != 'null':
dbObj['proposal_date']= try_parsing_date(each['proposal_date']).date().isoformat()
if 'proposal_start_date' in each and each['proposal_start_date'] != 'null':
dbObj['proposal_start_date']= try_parsing_date(each['proposal_start_date']).date().isoformat()
if 'udc_code_1' in each and each['udc_code_1'] != 'null':
dbObj['udc_code_1']= each['udc_code_1']
if 'eog' in each and each['eog'] != 'null':
dbObj['eog']= each['eog']
if 'fixedprice_1' in each and each['fixedprice_1'] != 'null':
dbObj['fixedprice_1']= each['fixedprice_1']
if 'fixedprice_2' in each and each['fixedprice_2'] != 'null':
dbObj['fixedprice_2']= each['fixedprice_2']
if 'fixedprice_3' in each and each['fixedprice_3'] != 'null':
dbObj['fixedprice_3']= each['fixedprice_3']
if 'fixedprice_1_gas_therm' in each and each['fixedprice_1_gas_therm'] != 'null':
dbObj['fixedprice_1_gas_therm']= each['fixedprice_1_gas_therm']
if 'fixedprice_2_gas_therm' in each and each['fixedprice_2_gas_therm'] != 'null':
dbObj['fixedprice_2_gas_therm']= each['fixedprice_2_gas_therm']
if 'fixedprice_3_gas_therm' in each and each['fixedprice_3_gas_therm'] != 'null':
dbObj['fixedprice_3_gas_therm']= each['fixedprice_3_gas_therm']
if 'fixedprice_1_gas_ccf' in each and each['fixedprice_1_gas_ccf'] != 'null':
dbObj['fixedprice_1_gas_ccf']= each['fixedprice_1_gas_ccf']
if 'fixedprice_2_gas_ccf' in each and each['fixedprice_2_gas_ccf'] != 'null':
dbObj['fixedprice_2_gas_ccf']= each['fixedprice_2_gas_ccf']
if 'fixedprice_3_gas_ccf' in each and each['fixedprice_3_gas_ccf'] != 'null':
dbObj['fixedprice_3_gas_ccf']= each['fixedprice_3_gas_ccf']
if 'fixedprice_1_gas_dth' in each and each['fixedprice_1_gas_dth'] != 'null':
dbObj['fixedprice_1_gas_dth']= each['fixedprice_1_gas_dth']
if 'fixedprice_2_gas_dth' in each and each['fixedprice_2_gas_dth'] != 'null':
dbObj['fixedprice_2_gas_dth']= each['fixedprice_2_gas_dth']
if 'fixedprice_3_gas_dth' in each and each['fixedprice_3_gas_dth'] != 'null':
dbObj['fixedprice_3_gas_dth']= each['fixedprice_3_gas_dth']
if 'fixedprice_1_gas_mcf' in each and each['fixedprice_1_gas_mcf'] != 'null':
dbObj['fixedprice_1_gas_mcf']= each['fixedprice_1_gas_mcf']
if 'fixedprice_2_gas_mcf' in each and each['fixedprice_2_gas_mcf'] != 'null':
dbObj['fixedprice_2_gas_mcf']= each['fixedprice_2_gas_mcf']
if 'fixedprice_3_gas_mcf' in each and each['fixedprice_3_gas_mcf'] != 'null':
dbObj['fixedprice_3_gas_mcf']= each['fixedprice_3_gas_mcf']
if 'term_1' in each and each['term_1'] != 'null':
dbObj['term_1']= each['term_1']
if 'term_2' in each and each['term_2'] != 'null':
dbObj['term_2']= each['term_2']
if 'term_3' in each and each['term_3'] != 'null':
dbObj['term_3']= each['term_3']
if 'smb_bdm_name' in each and each['smb_bdm_name'] != 'null':
dbObj['smb_bdm_name']= each['smb_bdm_name']
if 'smb_bdm_phone' in each and each['smb_bdm_phone'] != 'null':
if '.' in str(each['smb_bdm_phone']):
dbObj['smb_bdm_phone']= str(int(float(each['smb_bdm_phone'])))
else:
dbObj['smb_bdm_phone']= str(each['smb_bdm_phone'])
if 'smb_bdm_email' in each and each['smb_bdm_email'] != 'null' and each['smb_bdm_email'].strip() != '' and each['smb_bdm_email'] != None:
dbObj['smb_bdm_email']= each['smb_bdm_email']
print('<< OBJ >> ',dbObj)
N = urandom(20).hex()
now = str(datetime.now())
#<< END of HUBSPOT INGESTION >>
# table.put_item(
Item = {
'CogId' : str(N),
'CreatedAt': now,
'ExternalId': UniqueKey,
'Information' : dbObj,
'SourceBucket': landing_zone_bucket_name,
'SourcePath' : event['file_path'],
'Source' : 'HubSpot',
'SourceId' : source_id,
'SourceFileName': time_stamp + '_' + file_name
}
#WRITE-TO-DYNAMO
files_processing = dynamoPut(Item)
if not files_processing:
print('Writing {} record to dynamodb Failed'.format(Item))
except Exception as e:
print(e)
N = urandom(20).hex()
Item = {
'CogId' : str(N),
'CreatedAt': now,
'Information' : each,
'SourceBucket': landing_zone_bucket_name,
'SourcePath' : event['file_path'],
'Source' : 'HubSpot',
'message': str(e),
'SourceId' : source_id,
'ExternalId': UniqueKey
}
files_processing = dynamoPut(Item,'Fail')
pass
temp_file_name = time_stamp + '_' + file_name
isert_file_name = dynamoPutFileName(temp_file_name,'HubSpot')
post_to_hubspot(dbObj)
return True
except Exception as e:
print(e)
new_folder_path = os.environ['CSV_NEW_FOLDER_HUBSPOT']
unprocessed_folder_path = os.environ['CSV_ERROR_FOLDER_HUBSPOT']
# MOVING PROCESSED FILES FROM NEW TO UNPROCESSED FOLDER
move_file_to_processed = moving_files_new_to_processed(event, new_folder_path,unprocessed_folder_path)
return False
def try_parsing_date(text):
for fmt in ('%m/%d/%Y','%Y-%m-%dT%H:%M:%S-%f', '%m/%d/%y', '%Y-%m-%d', '%m.%d.%Y','%Y-%m-%dT%I', '%Y-%m-%dT%I%p', '%Y-%m-%dT%H:%M:%S.%f', '%Y-%m-%dT%H:%M:%S.%f+','%Y-%m-%dT%H:%M:%S'):#2018-11-20T08:05:54-0500
try:
return datetime.strptime(text, fmt)
except ValueError:
print('in except')
pass
return ValueError('no valid date format found')
def post_to_hubspot(list_contacts):
print('<< IN POST-To-HUBSPOT >>')
data_string = json.dumps(list_contacts)
payload_data = {"properties": data_string}
print('<< dbOBJ LIST >> ',payload_data)
response = requests.request("POST", endpoint_url+access_key, headers={'Content-Type': 'application/json'}, data=payload_data)
token_response=json.loads(response.text)
print('<< TOKEN RESPONSE >>',token_response)
def moving_files_new_to_processed(event, new_folder,processed_folder):
#MOVING-FILES-TO-PROCESSED
try:
copy_source = {
'Bucket': landing_zone_bucket_name,
'Key': event['file_path']
}
path = event['file_path']
processed_folder = processed_folder + time_stamp + '_'
new_key = path.replace(new_folder, processed_folder)
new_obj = landing_zone_bucket.Object(new_key)
new_obj.copy(copy_source)
s3.Object(landing_zone_bucket_name, event['file_path']).delete()
return True
except Exception as e:
print(e)
return False
def lambda_handler(event,context):
print("Starting to Push Records to Dynamo Lambda")
print(event)
try:
parse_flag = False
new_folder_path = ''
processed_folder_path = ''
#Gets file path and calls required function to parse it out
key = str(os.environ['CSV_NEW_FOLDER_HUBSPOT'])
obj = s3r.get_object(Bucket=landing_zone_bucket_name, Key=event['file_path'])
print('after obj')
print(os.environ['CSV_NEW_FOLDER_HUBSPOT'])
print('in HubSpot parse_csv')
parse_csv_func = parse_csv_hubspot(event, obj)
# Checks if parse_csv return empty dictionary
if parse_csv_func:
parse_flag = True
new_folder_path = os.environ['CSV_NEW_FOLDER_HUBSPOT']
processed_folder_path = os.environ['CSV_PROCESSED_FOLDER_HUBSPOT']
else:
print('File Format not Supported for {}'.format(event['file_path']))
if parse_flag:
# UPLOADING CONTACT.MOVING PROCESSED FILES FROM NEW TO PROCESSED FOLDER
#print('<< PAYLOAD >> ',payload)
#response = requests.request("POST", "https://api.hubapi.com/crm/v3/schemas/?hapikey="+access_key, headers={'Content-Type': 'application/json'}, data=json.dumps(str(payload)))
#token_response=json.loads(response.text)
#print('<< TOKEN RESPONSE >>',token_response)
#MOVING PROCESSED FILES FROM NEW TO PROCESSED FOLDER
move_file_to_processed = moving_files_new_to_processed(event, new_folder_path,processed_folder_path)
if move_file_to_processed:
print('File {} moved Successfully from {} to {}'.format(event['file_path'],new_folder_path,processed_folder_path))
else:
print('Moving {} file from new to processing folder Failed'.format(event['file_path']))
except Exception as e:
print(e)
What could be the problem? Thanks for your help.
The problem was caused by two issues:
The dictionary should have been passed through json.dumps() to convert it to a JSON string when doing the POST, so the dictionary didn't need to change its structure. Here's the response from the POST:
<< TOKEN RESPONSE >> {
"id": "135120801",
"properties": {
"business_name": "Millers Brand Oats",
"createdate": "2021-12-21T02:31:12.452Z",
"fixedprice_1": "6.63",
"fixedprice_2": "6.11",
"fixedprice_3": "5.9",
"hs_all_contact_vids": "135120801",
"hs_is_contact": "true",
"hs_is_unworked": "true",
"hs_marketable_until_renewal": "false",
"hs_object_id": "135120801",
"hs_pipeline": "contacts-lifecycle-pipeline",
"lastmodifieddate": "2021-12-21T02:31:12.452Z",
"proposal_date": "2021-12-07",
"proposal_start_date": "2022-01-01",
"smb_bdm_email": "Tim.Chu#junkster.com",
"smb_bdm_name": "Tim Chu",
"smb_bdm_phone": "833-999-9999",
"smbapi": "yes",
"srvc_address_1": "4844 Stenstrom Rd",
"srvc_state_1": "IL",
"srvc_zip_1": "61109",
"term_2": "24",
"term_3": "36",
"udc_code_1": "COMED"
},
"createdAt": "2021-12-21T02:31:12.452Z",
"updatedAt": "2021-12-21T02:31:12.452Z",
"archived": false
}
I was using the wrong endpoint:
https://api.hubapi.com/crm/v3/schemas/
instead of:
https://api.hubapi.com/crm/v3/objects/contacts/
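Putting both fixes together, my corrected post_to_hubspot ends up looking roughly like this (a sketch; it assumes endpoint_url now points at the /crm/v3/objects/contacts/ endpoint with the hapikey query string):

def post_to_hubspot(contact_properties):
    print('<< IN POST-To-HUBSPOT >>')
    # json.dumps() the whole payload rather than only the inner dict,
    # so HubSpot receives {"properties": {...}} as real JSON
    payload_data = {"properties": contact_properties}
    response = requests.request(
        "POST",
        endpoint_url + access_key,
        headers={'Content-Type': 'application/json'},
        data=json.dumps(payload_data)
    )
    token_response = json.loads(response.text)
    print('<< TOKEN RESPONSE >>', token_response)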
Now I just need to find out why the AWS Lambda POSTs allow for duplicate contacts to be created in HubSpot while Postman POSTs prohibit duplicate contacts to be created.
I am calling this function to put an item if it (state) does not exist, following what I am referring to from here: How do I conditionally insert an item into a dynamodb table using boto3.
def put_items_if_doesnt_exist():
    dynamodb = boto3.resource('dynamodb', region_name='us-east-1')
    try:
        table = dynamodb.Table('awssolutions-ssm-hybrid-table')
        response = table.put_item(
            Item={
                'name': 'Execution',
                'state': 'Locked',
            },
            ConditionExpression='attribute_not_exists(state) AND attribute_not_exists(name)'
        )
    except ClientError as e:
        # Ignore the ConditionalCheckFailedException
        if e.response['Error']['Code'] != 'ConditionalCheckFailedException':
            raise
The problem here is that state is a reserved word, and therefore it fails with the error:
[ERROR] ClientError: An error occurred (ValidationException) when calling the PutItem operation: Invalid ConditionExpression: Attribute name is a reserved keyword; reserved keyword: state
Any suggestions on how to handle this?
This is where ExpressionAttributeNames come in: they let you work around reserved words. You add a placeholder with the # prefix and specify what it maps to in the ExpressionAttributeNames parameter.
def put_items_if_doesnt_exist():
    dynamodb = boto3.resource('dynamodb', region_name='us-east-1')
    try:
        table = dynamodb.Table('awssolutions-ssm-hybrid-table')
        response = table.put_item(
            Item={
                'name': 'Execution',
                'state': 'Locked',
            },
            ConditionExpression='attribute_not_exists(#state) AND attribute_not_exists(#name)',
            ExpressionAttributeNames={'#state': 'state', '#name': 'name'}
        )
    except ClientError as e:
        # Ignore the ConditionalCheckFailedException
        if e.response['Error']['Code'] != 'ConditionalCheckFailedException':
            raise
I am a bit new to DynamoDB. Below is the error I get when trying to get the max id of my DynamoDB table in a Python Lambda function, following the instructions in this StackOverflow post:
Dynamodb max value
An error occurred (ValidationException) when calling the Query operation: Invalid KeyConditionExpression: The expression can not be empty;\"}"
see my lambda function code below
import json
import boto3
TABLE_NAME = 'user-profiles'
dynamo_DB = boto3.resource('dynamodb')

def lambda_handler(event, context):
    user_id = event['user_id']
    email = event['email']
    bvn = event['bvn']
    password = event['password']
    phone = event['phone']
    gender = event['gender']
    output = ''
    if len(user_id) > 1 and len(password) > 5:
        try:
            table = dynamo_DB.Table(TABLE_NAME)
            values = list(table.query(
                KeyConditionExpression='',
                ScanIndexForward=False,
                Limit=1
            ))
            max_id = values[0]['id']
            new_id = max_id + 1
            Item = {
                'id': str(new_id),
                'profile-id': str(new_id),
                'user_id': user_id,
                'email': email,
                'bvn': bvn,
                'password': password,
                'phone': phone,
                'gender': gender
            }
            table.put_item(Item=Item)
            output += 'Data Inserted To Dynamodb Successfully'
        except Exception as e:
            output += 'error with dynamo registration ' + str(e)
            # print(output)
    else:
        output += 'invalid user or password entered, this is ' \
            'what i received:\nusername: ' \
            + str(user_id) + '\npassword: ' + str(password)
    return {
        "statusCode": 200,
        "body": json.dumps({
            "message": output,
        }),
    }
    # print(output)
You cannot query with an empty KeyConditionExpression; if you need to read all records from the table you have to use scan, but scan does not support ScanIndexForward to order the records.
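For completeness, reading everything back with scan would look roughly like this (a sketch; it ignores pagination via LastEvaluatedKey and computes the max on the client):

response = table.scan()
items = response['Items']
# scan returns items unordered, so find the max id client-side
max_id = max(int(item['id']) for item in items) if items else 0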
It seems like you're trying to implement primary key incrementation. I want to warn you that this approach is fragile, because you can easily hit a race condition.
What I would suggest:
I guess you are using id as a primary key (aka partition key), which is okay. What I would do is upsert an extra record in the table holding, say, an increment value:
increment = table.update_item(
    Key={'id': 'increment'},
    UpdateExpression='ADD #increment :increment',
    ExpressionAttributeNames={'#increment': 'increment'},
    ExpressionAttributeValues={':increment': 1},
    ReturnValues='UPDATED_NEW',
)
new_id = increment['Attributes']['increment']
This call updates the existing record with id 'increment' and stores the new incremented number in it; if it is the very first call, the record is created with increment: 1, and subsequent calls keep incrementing it. ReturnValues='UPDATED_NEW' means the call returns the value after the update, so you get your new id back.
Put this code in place of the part where you query for the last record,
so your code would look like this:
import json
import boto3
TABLE_NAME = 'user-profiles'
dynamo_DB = boto3.resource('dynamodb')

def lambda_handler(event, context):
    user_id = event['user_id']
    email = event['email']
    bvn = event['bvn']
    password = event['password']
    phone = event['phone']
    gender = event['gender']
    output = ''
    if len(user_id) > 1 and len(password) > 5:
        try:
            table = dynamo_DB.Table(TABLE_NAME)
            increment = table.update_item(
                Key={'id': 'increment'},
                UpdateExpression='ADD #increment :increment',
                ExpressionAttributeNames={'#increment': 'increment'},
                ExpressionAttributeValues={':increment': 1},
                ReturnValues='UPDATED_NEW',
            )
            new_id = increment['Attributes']['increment']
            Item = {
                'id': str(new_id),
                'profile-id': str(new_id),
                'user_id': user_id,
                'email': email,
                'bvn': bvn,
                'password': password,
                'phone': phone,
                'gender': gender
            }
            table.put_item(Item=Item)
            output += 'Data Inserted To Dynamodb Successfully'
        except Exception as e:
            output += 'error with dynamo registration ' + str(e)
            # print(output)
    else:
        output += 'invalid user or password entered, this is ' \
            'what i received:\nusername: ' \
            + str(user_id) + '\npassword: ' + str(password)
    return {
        "statusCode": 200,
        "body": json.dumps({
            "message": output,
        }),
    }
    # print(output)
and you're good.
Extra thoughts:
And to be 100% sure that there is no race condition on incrementation, you can implement a locking mechanism this way: before incrementing, put an extra record with id 'lock' and a lock attribute with any value, using ConditionExpression='attribute_not_exists(lock)'. Then do the increment and release the lock by deleting the 'lock' record. While that record exists, a second attempt to take the lock fails the condition that the lock attribute does not exist and throws ConditionalCheckFailedException (you can catch the error and tell the user that the record is locked, or whatever). A rough Python sketch of the same idea follows the JavaScript example below.
Here is an example in JavaScript sorry:
module.exports.DynamoDbClient = class DynamoDbClient {
constructor(tableName) {
this.dynamoDb = new DynamoDB.DocumentClient();
this.tableName = tableName;
}
async increment() {
await this.lock();
const {Attributes: {increment}} = await this.dynamoDb.update({
TableName: this.tableName,
Key: {id: 'increment'},
UpdateExpression: 'ADD #increment :increment',
ExpressionAttributeNames: {'#increment': 'increment'},
ExpressionAttributeValues: {':increment': 1},
ReturnValues: 'UPDATED_NEW',
}).promise();
await this.unlock();
return increment;
}
async lock(key) {
try {
await this.dynamoDb.put({
TableName: this.tableName,
Item: {id: 'lock', _lock: true},
ConditionExpression: 'attribute_not_exists(#lock)',
ExpressionAttributeNames: {'#lock': '_lock'},
}).promise();
} catch (error) {
if (error.code === 'ConditionalCheckFailedException') {
throw new LockError(`Key is locked.`);
}
throw error;
}
}
unlock() {
return this.delete({id: 'lock'});
}
async delete(key) {
await this.dynamoDb.delete({
TableName: this.tableName,
Key: key,
}).promise();
}
}
// usage
const client = new DynamoDbClient('table');
const newId = await client.increment();
...
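For your Python code, a rough boto3 equivalent of the same lock-then-increment idea might look like this (a sketch only; it assumes the same table resource and 'id' partition key as in your handler):

from botocore.exceptions import ClientError

def locked_increment(table):
    # Take the lock; this fails if another process already holds it
    try:
        table.put_item(
            Item={'id': 'lock', '_lock': True},
            ConditionExpression='attribute_not_exists(#lock)',
            ExpressionAttributeNames={'#lock': '_lock'},
        )
    except ClientError as e:
        if e.response['Error']['Code'] == 'ConditionalCheckFailedException':
            raise RuntimeError('Key is locked.')
        raise
    try:
        # Increment the counter record while holding the lock
        increment = table.update_item(
            Key={'id': 'increment'},
            UpdateExpression='ADD #increment :increment',
            ExpressionAttributeNames={'#increment': 'increment'},
            ExpressionAttributeValues={':increment': 1},
            ReturnValues='UPDATED_NEW',
        )
        return increment['Attributes']['increment']
    finally:
        # Release the lock even if the increment fails
        table.delete_item(Key={'id': 'lock'})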
Code is below
import boto3
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('test')

def lambda_handler(event, context):
    response = table.update_item(
        Key={
            'id': "100",
            'name': "David"
        })
I have created a DynamoDB table test; my primary key is id, which is a string.
In DynamoDB, my table value for id 100 is John and I need to update it to David. Above is the code. Why is this schema error being thrown?
Full error is below
"errorMessage": "An error occurred (ValidationException) when calling the UpdateItem operation: The document path provided in the update expression is invalid for update",
"errorType": "ClientError",
I tried the code below:
import boto3
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('test')

def lambda_handler(event, context):
    response = table.update_item(
        Key={
            'id': '100'
        },
        UpdateExpression='SET name = :val1',
        ExpressionAttributeValues={
            ':val1': 'David'
        })
Adding one more table to replicate the case.
To put the item: Output >> Success
First create table newTable in DynamoDB
import boto3

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table('newTable')
    response = table.put_item(
        Item={
            'username': 'Ac',
            'first_name': 'DEF',
            'last_name': 'FHI',
            'age': 10,
            'account': 'GOld'
        })
How to get the item ? Output >> Error
import boto3

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table('newTable')
    response = table.get_item(
        Key={
            'username': 'Ac'
        }
    )
    print(response)
Error >> Response:
"errorMessage": "An error occurred (ValidationException) when calling the GetItem operation: The provided key element does not match the schema",
"errorType": "ClientError",
Answer to the second one:
get_item and update_item need the exact item to be updated, not batches, so you also need to provide the corresponding sort key.
Courtesy #Sairsreenivas
import boto3

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table('newTable')
    # response = table.put_item(
    #     Item={
    #         'username': 'Ac',
    #         'first_name': 'DEF',
    #         'last_name': 'GH',
    #         'age': 10,
    #         'account': 'GOld'
    #     })
    # try:
    #     response = table.get_item(Key={'username': 'Mak'})
    # except Exception as e:
    #     print(e.response['Error']['Message'])
    # else:
    #     return response['Item']
    # item = response['Item']
    # print(item)
    # Get Item
    response = table.get_item(Key={'username': 'Ac', 'last_name': 'GH'})
    print(response['Item'])
    table.update_item(
        Key={
            'username': 'Ac', 'last_name': 'GH'
        },
        UpdateExpression='SET age = :value1',
        ExpressionAttributeValues={
            ':value1': 20
        }
    )
    print("After update \n")
    response = table.get_item(Key={'username': 'Ac', 'last_name': 'GH'})
    print(response['Item'])
I'm creating a Python app using the requests module. I recently added multiprocessing to speed it up a bit, but I started to get some strange errors like [Errno 1] _ssl.c:1428: error:1408F10B:SSL routines:SSL3_GET_RECORD:wrong version number or [Errno 1] _ssl.c:1428: error:1408F119:SSL routines:SSL3_GET_RECORD:decryption failed or bad record mac.
The code looks like this:
def hometables_one(conn, request, s, hostname, payload, company):
date1 = request.query.get('date1', '')
date2 = request.query.get('date2', '')
prijmyCelk = 0;
vydajeCelk = 0;
neuhrPrijCelk = 0;
neuhrVydCelk = 0;
dph = 0;
dbNazev = company['dbNazev'];
nazev = company['nazev'];
if date1 and date2:
try:
r = s.get("%s/c/%s/faktura-vydana/(duzpPuv between %s %s)/$sum.json" % (hostname, dbNazev, date1[0], date2[0]), params=payload, verify=False)
r.raise_for_status()
except requests.exceptions.RequestException as err:
#response.write(ujson.dumps({ "success": False, "errors": { "reason": str(err)}}))
#return response
conn.send({ "success": False, "errors": { "reason": str(err)}})
conn.close()
return None
else:
try:
r = s.get("%s/c/%s/faktura-vydana/$sum.json" % (hostname, dbNazev), params=payload, verify=False)
r.raise_for_status()
except requests.exceptions.RequestException as err:
#response.write(ujson.dumps({ "success": False, "errors": { "reason": str(err)}}))
#return response
conn.send({ "success": False, "errors": { "reason": str(err)}})
conn.close()
return None
obj_vydana = r.json()
data_vydana = obj_vydana['winstrom']['sum']['sumDoklUcetni']['values']
prijmyCelk = float(data_vydana['sumDoklCelkem']['value'])
neuhrVydCelk = float(data_vydana['sumDoklZbyvaUh']['value'])
dph_vydane = float(data_vydana['sumDoklDphCelk']['value'])
if date1 and date2:
try:
r = s.get("%s/c/%s/faktura-prijata/(duzpPuv between %s %s)/$sum.json" % (hostname, dbNazev, date1[0], date2[0]), params=payload, verify=False)
r.raise_for_status()
except requests.exceptions.RequestException as err:
#response.write(ujson.dumps({ "success": False, "errors": { "reason": str(err)}}))
#return response
conn.send({ "success": False, "errors": { "reason": str(err)}})
conn.close()
return None
else:
try:
r = s.get("%s/c/%s/faktura-prijata/$sum.json" % (hostname, dbNazev), params=payload, verify=False)
r.raise_for_status()
except requests.exceptions.RequestException as err:
#response.write(ujson.dumps({ "success": False, "errors": { "reason": str(err)}}))
#return response
conn.send({ "success": False, "errors": { "reason": str(err)}})
conn.close()
return None
obj_prijata = r.json();
data_prijata = obj_prijata['winstrom']['sum']['sumDoklUcetni']['values']
vydajeCelk = float(data_prijata['sumDoklCelkem']['value'])
neuhrPrijCelk = float(data_prijata['sumDoklZbyvaUh']['value'])
dph_prijate = float(data_prijata['sumDoklDphCelk']['value'])
if prijmyCelk != 0:
result = {
"corporation": nazev,
"dbName": dbNazev,
"prijmyCelk": "%s €" % prijmyCelk,
"nakladyCelk": "%s €" % vydajeCelk,
"ziskCelk": "%s €" % (prijmyCelk-vydajeCelk),
"marzaCelk": ((prijmyCelk-vydajeCelk)/prijmyCelk*100),
"neuhrVydCelk": "%s €" % neuhrVydCelk,
"neuhrPrijCelk": "%s €" % neuhrPrijCelk,
"dph": "%s €" % (dph_vydane-dph_prijate),
}
else:
result = {
"corporation": nazev,
"dbName": dbNazev,
"prijmyCelk": "%s €" % prijmyCelk,
"nakladyCelk": "%s €" % vydajeCelk,
"ziskCelk": "%s €" % (prijmyCelk-vydajeCelk),
"marzaCelk": 0,
"neuhrVydCelk": "%s €" % neuhrVydCelk,
"neuhrPrijCelk": "%s €" % neuhrPrijCelk,
"dph": "%s €" % (dph_vydane-dph_prijate),
}
conn.send(result)
conn.close()
return None
#####################################################################################
def hometables(request):
s = requests.Session()
response = HTTPResponse()
hostname = request.query.get('hostname', '')[0]
auth = request.query.get('auth', '')[0]
p_queue = []
result = []
json_r = {"success": True}
payload = {'authSessionId': request.query.get('auth', '')[0]}
try:
r = s.get("%s/c.json" % hostname, params=payload, verify=False)
r.raise_for_status()
except requests.exceptions.RequestException as err:
response.write(ujson.dumps({ "success": False, "errors": { "reason": str(err)}}))
return response
obj = r.json()
data = obj['companies']['company']
data = make_list(data)
parent_conn, child_conn = Pipe()
for company in data:
p_queue.append(Process(target=hometables_one, args=(child_conn, request, s, hostname, payload, company))) #create a new process with hometables_one function
p_queue[-1].start()
for p in p_queue:
received_data = parent_conn.recv()
if "success" not in received_data:
result.append(received_data)
p.join()
else:
response.write(ujson.dumps(received_data)) #error in hometables_one function
return response
json_r["data"] = result
response.write(ujson.dumps(json_r))
return response
In this part
try:
r = s.get("%s/c.json" % hostname, params=payload, verify=False)
r.raise_for_status()
except requests.exceptions.RequestException as err:
response.write(ujson.dumps({ "success": False, "errors": { "reason": str(err)}}))
return response
obj = r.json()
data = obj['companies']['company']
data = make_list(data)
I get a JSON request with all companies currently in system and then I run the hometables_one function for each of them. The final data may look like this:
[{"createDt":"2014-01-28T00:00:00+01:00","dbNazev":"sveatlo","id":"4","licenseGroup":"null","nazev":"Sveatlo","show":"true","stavEnum":"ESTABLISHED","watchingChanges":"false"}]
or like this:
[{"createDt":"2014-01-28T00:00:00+01:00","dbNazev":"sveatlo","id":"4","licenseGroup":"null","nazev":"Sveatlo","show":"true","stavEnum":"ESTABLISHED","watchingChanges":"false"},{"createDt":"2014-01-28T00:00:00+01:00","dbNazev":"sveatlo1","id":"4","licenseGroup":"null","nazev":"Sveatlo1","show":"true","stavEnum":"ESTABLISHED","watchingChanges":"false"}]
In the first case, when there is just one item, the hometables_one function runs without any problems, but adding another item results in the error [Errno 1] _ssl.c:1428: error:1408F10B:SSL routines:SSL3_GET_RECORD:wrong version number or [Errno 1] _ssl.c:1428: error:1408F119:SSL routines:SSL3_GET_RECORD:decryption failed or bad record mac. Another thing is that when I run the code without multiprocessing, i.e. with the content of the hometables_one function inlined in the for loop of the hometables function, it runs without any problems.
Why am I getting these errors? Could anybody help me please?
Thanks for any answer
I have experienced similar problems. I think this error is the result of multiple processes trying to access the same SSL connection. What you can try is to introduce a random delay for each process before they fire off the request:
time.sleep(random.randrange(10))
r = s.get("%s/c.json" % hostname, params=payload, verify=False)