Need help in fetching a particular value from the json output - python

I need to obtain the Tag values from the code below. It first fetches the cluster Id and then passes it to describe_cluster; the response comes back as JSON. I'm trying to fetch a particular value from this "Cluster" JSON using "get". However, it returns an error: "'str' object has no attribute 'get'". Please suggest.
Here is a reference link of boto3 which I'm referring:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/emr.html#EMR.Client.describe_cluster
import boto3
import json
from datetime import timedelta

REGION = 'us-east-1'
emrclient = boto3.client('emr', region_name=REGION)
snsclient = boto3.client('sns', region_name=REGION)

def lambda_handler(event, context):
    EMRS = emrclient.list_clusters(
        ClusterStates=['STARTING', 'RUNNING', 'WAITING']
    )
    clusters = EMRS["Clusters"]
    for cluster_details in clusters:
        id = cluster_details.get("Id")
        describe_cluster = emrclient.describe_cluster(
            ClusterId=id
        )
        cluster_values = describe_cluster["Cluster"]
        for details in cluster_values:
            tag_values = details.get("Tags")
            print(tag_values)

The error is in the last part of the code.
describe_cluster = emrclient.describe_cluster(
    ClusterId=id
)
cluster_values = describe_cluster["Cluster"]
for details in cluster_values:  # ERROR HERE
    tag_values = details.get("Tags")
    print(tag_values)
The value returned from describe_cluster is a dictionary, and its "Cluster" entry is also a dictionary, so you don't need to iterate over it. You can access cluster_values.get("Tags") directly.
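A minimal sketch of the corrected loop, keeping the variable names from the question (Tags comes back as a list of {'Key': ..., 'Value': ...} dictionaries):

for cluster_details in clusters:
    cluster_id = cluster_details.get("Id")
    describe_cluster = emrclient.describe_cluster(ClusterId=cluster_id)
    cluster_values = describe_cluster["Cluster"]
    # "Cluster" is a dict, so read "Tags" from it directly instead of iterating over it
    tag_values = cluster_values.get("Tags")
    print(tag_values)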

Related

AWS lambda updating an item in a dynamodb table - Parameter validation failed:\nMissing required parameter

I have a DynamoDB table called agent. agentId is the hash key.
It has 100 items/records.
I want to update the item whose agentId is 7 with a new agentName.
I am using the boto3 documentation:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#DynamoDB.Table.update_item
I keep getting this error:
"errorMessage": "Parameter validation failed:\nMissing required parameter in input:
\"Key\"\nUnknown parameter in input: \"key\", must be one of:
TableName, Key, AttributeUpdates, Expected, ConditionalOperator, ReturnValues,
ReturnConsumedCapacity, ReturnItemCollectionMetrics, UpdateExpression,
ConditionExpression, ExpressionAttributeNames, ExpressionAttributeValues",
In the boto3 documentation I see only "key" as the REQUIRED attribute. Not sure why it is complaining.
import json
import os
import boto3
from pprint import pprint
from boto3.dynamodb.conditions import Attr

tableName = os.environ.get('TABLE')
fieldName = os.environ.get('FIELD')
keytofind = os.environ.get('FILTER')
fieldname = "agentRole"
dbclient = boto3.resource('dynamodb')

def lambda_handler(event, context):
    tableresource = dbclient.Table(tableName)
    count = tableresource.item_count
    response = tableresource.scan(FilterExpression=Attr('agentRole').eq('Receiver'))
    response2 = tableresource.update_item(
        key={'agentId': 7},
        UpdateExpression="SET agentName= :name",
        ExpressionAttributeValues={'name': 'Mahindra'},
        ReturnValues="UPDATED_NEW"
    )
    pprint(response2)
Modified code that works after fixing the case-sensitive "Key" (note that the ExpressionAttributeValues placeholder also needs the ":" prefix to match the ":val1" used in the UpdateExpression):
response2 = tableresource.update_item(
    Key={'agentId': 7},
    UpdateExpression='SET agentName= :val1',
    ExpressionAttributeValues={':val1': 'Mahindra'}
)

AWS boto3: how to get hourly price of a specific instance id

I'm trying to write a Python script using boto3 to get the hourly price of an instance, given the instance ID. To be clear, I'm not talking about the costs you can get from Cost Explorer; I mean the nominal hourly price, for example of an EC2 instance.
I've already found some examples using "boto3.client('pricing',...)" and a bunch of parameters and filters as in:
https://www.saisci.com/aws/how-to-get-the-on-demand-price-of-ec2-instances-using-boto3-and-python/
which also requires region code to region name conversion.
I would like not to have to specify every instance detail and parameter for that query.
Can anybody help me to find a way to get that info just having the ec2 instance ID?
Thanks in advance.
You have to pass all that information. If you want to write a script that takes an instance ID and returns the hourly price, then you would first need to use the instance ID to lookup the instance details, and then pass those details to the pricing query.
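For instance, here is a rough sketch of that two-step lookup (instance ID, then instance details, then a Pricing API query). The helper name, the region-to-location mapping, and the filter values below are assumptions you would need to adapt to your own instances:

import json
import boto3

# Assumed mapping from region code to the "location" name the Pricing API expects
REGION_NAMES = {'us-east-1': 'US East (N. Virginia)'}

def hourly_on_demand_price(instance_id, region='us-east-1'):
    ec2 = boto3.client('ec2', region_name=region)
    pricing = boto3.client('pricing', region_name='us-east-1')  # the Pricing API endpoint lives in us-east-1

    # Step 1: look up the instance details from the instance ID
    reservations = ec2.describe_instances(InstanceIds=[instance_id])['Reservations']
    instance_type = reservations[0]['Instances'][0]['InstanceType']

    # Step 2: query the Pricing API with those details
    # (filter values assume a plain Linux, shared-tenancy, on-demand instance)
    response = pricing.get_products(
        ServiceCode='AmazonEC2',
        Filters=[
            {'Type': 'TERM_MATCH', 'Field': 'instanceType', 'Value': instance_type},
            {'Type': 'TERM_MATCH', 'Field': 'location', 'Value': REGION_NAMES[region]},
            {'Type': 'TERM_MATCH', 'Field': 'operatingSystem', 'Value': 'Linux'},
            {'Type': 'TERM_MATCH', 'Field': 'tenancy', 'Value': 'Shared'},
            {'Type': 'TERM_MATCH', 'Field': 'preInstalledSw', 'Value': 'NA'},
            {'Type': 'TERM_MATCH', 'Field': 'capacitystatus', 'Value': 'Used'},
        ],
    )

    # Each PriceList entry is a JSON string; pull the first on-demand price dimension out of it
    product = json.loads(response['PriceList'][0])
    on_demand_term = next(iter(product['terms']['OnDemand'].values()))
    price_dimension = next(iter(on_demand_term['priceDimensions'].values()))
    return float(price_dimension['pricePerUnit']['USD'])

# Example usage with a placeholder instance ID
print(hourly_on_demand_price('i-0123456789abcdef0'))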
You have to specify most of the information but not all of it.
For example, region_name is optional if you:
Have configured the AWS CLI on the machine where your Python script is running (i.e. ~/.aws/config is present and the region is configured).
OR
You are running the Python script on an AWS resource that has a role attached to it with a policy that allows you to retrieve the spot pricing information.
For example, I am able to run this script that retrieves my current spot instances and gets their current hourly cost, and calculates a bid price for me based on the spot price history for that particular instance type without specifying the region anywhere:
#!/usr/bin/env python3
import boto3
import json
from datetime import datetime
from datetime import timedelta
from collections import namedtuple

def get_current_pricing():
    pricing = []
    ec2_client = boto3.client('ec2')
    ec2_resource = boto3.resource('ec2')
    response = ec2_client.describe_spot_instance_requests()
    spot_instance_requests = response['SpotInstanceRequests']
    for instance_request in spot_instance_requests:
        if instance_request['State'] == 'active':
            instance = ec2_resource.Instance(instance_request['InstanceId'])
            for tag in instance.tags:
                if tag['Key'] == 'Name':
                    application = tag['Value']
                    break
            price = {
                'application': application,
                'instance_type': instance_request['LaunchSpecification']['InstanceType'],
                'current_price': float(instance_request['SpotPrice']),
                'bid_price': get_bid_price(instance_request['LaunchSpecification']['InstanceType'])
            }
            pricing.append(price)
    return pricing

def get_bid_price(instancetype):
    instance_types = [instancetype]
    start = datetime.now() - timedelta(days=1)
    ec2 = boto3.client('ec2')
    price_dict = ec2.describe_spot_price_history(
        StartTime=start,
        InstanceTypes=instance_types,
        ProductDescriptions=['Linux/UNIX']
    )
    if len(price_dict.get('SpotPriceHistory')) > 0:
        PriceHistory = namedtuple('PriceHistory', 'price timestamp')
        # Collect the full price history, then sort to find the most recent price
        price_list = []
        for item in price_dict.get('SpotPriceHistory'):
            price_list.append(PriceHistory(round(float(item.get('SpotPrice')), 5), item.get('Timestamp')))
        price_list.sort(key=lambda tup: tup.timestamp, reverse=True)
        bid_price = round(float(price_list[0][0]), 5)
        # Add 10% leeway on top of the most recent spot price
        leeway = round(float(bid_price / 100 * 10), 5)
        bid_price = round(float(bid_price + leeway), 5)
        return bid_price
    else:
        raise ValueError(f'Invalid instance type: {instancetype} provided. '
                         'Please provide correct instance type.')

if __name__ == '__main__':
    current_pricing = get_current_pricing()
    print(json.dumps(current_pricing, indent=4, default=str))

Filter aws ec2 snapshots by current date

How to filter AWS EC2 snapshots by current day?
I'm filtering snapshots by the tag Disaster_Recovery with the value Full, using the Python code below, and I also need to filter them by the current day.
import boto3

region_source = 'us-east-1'
client_source = boto3.client('ec2', region_name=region_source)

# Getting all snapshots as per specified filter
def get_snapshots():
    response = client_source.describe_snapshots(
        Filters=[{'Name': 'tag:Disaster_Recovery', 'Values': ['Full']}]
    )
    return response["Snapshots"]

print(*get_snapshots(), sep="\n")
Solved it with the code below:
import boto3
from datetime import date

region_source = 'us-east-1'
client_source = boto3.client('ec2', region_name=region_source)
date_today = date.isoformat(date.today())

# Getting all snapshots as per specified filter
def get_snapshots():
    response = client_source.describe_snapshots(
        Filters=[{'Name': 'tag:Disaster_Recovery', 'Values': ['Full']}]
    )
    return response["Snapshots"]

# Getting snapshots that were created today
snapshots = [s for s in get_snapshots() if s["StartTime"].strftime('%Y-%m-%d') == date_today]
print(*snapshots, sep="\n")
This could do the trick:
import boto3
from datetime import date

region_source = 'us-east-1'
client_source = boto3.client('ec2', region_name=region_source)

# Getting all snapshots as per specified filter
def get_snapshots():
    response = client_source.describe_snapshots(
        Filters=[{'Name': 'tag:Disaster_Recovery', 'Values': ['Full']}]
    )
    snapshotsInDay = []
    for snapshot in response["Snapshots"]:
        if snapshot["StartTime"].strftime('%Y-%m-%d') == date.isoformat(date.today()):
            snapshotsInDay.append(snapshot)
    return snapshotsInDay

print(*get_snapshots(), sep="\n")
After reading the docs, the rest is a simple date comparison.
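As an aside, since boto3 returns StartTime as a timezone-aware datetime, the comparison can also be done on date objects instead of formatted strings; a minimal sketch, reusing get_snapshots() from above:

from datetime import datetime, timezone

# Snapshot StartTime values are UTC datetimes, so compare against today's UTC date
today_utc = datetime.now(timezone.utc).date()
snapshots_today = [s for s in get_snapshots() if s["StartTime"].date() == today_utc]
print(*snapshots_today, sep="\n")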

Using Boto3 to create loop on specific folder

I am testing new data feeds that arrive as XML. The data will be stored in S3 in the following format:
2018\1\2\1.xml
2018\1\3\1.xml
2018\1\3\2.xml
etc. So, multiple .xml files are possible on one day. Also, important to note that there are folders in this bucket that I do NOT want to pull. So I have to target a very specific directory.
There is no date/time stamp within the files, so I need to go off something like the created/modified time. To do this I'm thinking of building a dictionary with folder+XML file as the key and the created/modified timestamp as the value, then using that dict to re-pull all the objects.
Here's what I've tried:
import boto3
from pprint import pprint

client = boto3.client('s3')
paginator = client.get_paginator('list_objects_v2')
result = paginator.paginate(
    Bucket='bucket',
    Prefix='folder/folder1/folder2')

bucket_object_list = []
for page in result:
    pprint(page)
    if "Contents" in page:
        for key in page["Contents"]:
            keyString = key["Key"]
            pprint(keyString)
            bucket_object_list.append(keyString)

s3 = boto3.resource('s3')
obj = s3.Object('bucket', 'bucket_object_list')
obj.get()["Contents"].read().decode('utf-8')
pprint(obj.get())
sys.exit()
This is throwing an error from the key within the obj = s3.Object('cluster','key') line.
Traceback (most recent call last):
  File "s3test2.py", line 25, in <module>
    obj = s3.Object('cluster', key)
NameError: name 'key' is not defined
The MaxItems is purely for testing purposes, although it's interesting that this translates to 1000 when run.
NameError: name 'key' is not defined
As far as the error is concerned, it's because key is not defined.
From this documentation:
Object(bucket_name, key)
Creates an Object resource:
object = s3.Object('bucket_name', 'key')
Parameters
bucket_name(string) -- The Object's bucket_name identifier. This must be set.
key(string) -- The Object's key identifier. This must be set.
You need to assign an object key name to the 'key' you're using in the code. The key name is the "name" (i.e. the unique identifier) by which your file is stored in the S3 bucket.
Code based on what you posted:
import boto3

client = boto3.client('s3')
paginator = client.get_paginator('list_objects_v2')
result = paginator.paginate(Bucket='bucket_name', Prefix='folder/folder1/folder2')

bucket_object_list = []
for page in result:
    if "Contents" in page:
        for key in page["Contents"]:
            keyString = key["Key"]
            print(keyString)
            bucket_object_list.append(keyString)

print(bucket_object_list)

s3 = boto3.resource('s3')
for file_name in bucket_object_list:
    obj = s3.Object('bucket_name', file_name)
    print(obj.get())
    print(obj.get()["Body"].read().decode('utf-8'))
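Note that the object's content is read from the "Body" field of the get() response; the "Contents" key used in the original snippet only exists in list_objects_v2 responses, not in the response returned by get().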

How to delete snapshot based on the description?

I am trying to delete any previous snapshot whose description contains tags_description. How can I do so? The following code throws an error:
def call_cleaner(data):
    regions = ['us-west-2', 'eu-central-1', 'ap-southeast-1']
    for index, region in enumerate(regions):
        for ip_address, tags_descrip, regions_az, volume_id in data[index]:
            ec2 = boto3.resource('ec2', region, aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY)
            delete_snapshot = ec2.describe_snapshots(Filters=tags_descrip)
            for snap in delete_snapshot['Snapshots']:
                print "Deleting snapshot %s" % snap['SnapshotId']
                ec2.delete_snapshot(SnapshotId=snap['SnapshotId'])
ERROR
delete_snapshot = ec2.describe_snapshots(Filters=tags_descrip)
AttributeError: 'ec2.ServiceResource' object has no attribute 'describe_snapshots'
In order to use these functions in boto3, you need to use the EC2 client, which can be initialized like this: client = boto3.client('ec2'). (Basically, replace ec2.* with client.*.)
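For example, a minimal sketch of the corrected loop body, assuming tags_descrip holds the description text to match (note that Filters expects a list of Name/Values dictionaries, not a bare string):

client = boto3.client('ec2', region_name=region,
                      aws_access_key_id=ACCESS_KEY,
                      aws_secret_access_key=SECRET_KEY)
# Filter snapshots by their description, then delete each match
response = client.describe_snapshots(
    Filters=[{'Name': 'description', 'Values': [tags_descrip]}]
)
for snap in response['Snapshots']:
    print("Deleting snapshot %s" % snap['SnapshotId'])
    client.delete_snapshot(SnapshotId=snap['SnapshotId'])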
