Boto3 Not Locating Credentials with DynamoDB Encryption SDK

For some reason, the credentials supplied in boto3's Session are not being picked up by the EncryptedTable method of dynamodb-encryption-sdk. The same credentials work if I just use the unencrypted table method directly from boto3.
import boto3
from dynamodb_encryption_sdk import EncryptedTable
from dynamodb_encryption_sdk.material_providers.aws_kms import AwsKmsCryptographicMaterialsProvider
from environs import Env

env = Env()
env.read_env('local.env', False)

session = boto3.Session(aws_access_key_id=env('AWS_ACCESS_ID'),
                        aws_secret_access_key=env('AWS_SECRET_KEY'),
                        region_name=env('AWS_REGION'))

dynamodb = session.resource('dynamodb')
table = dynamodb.Table('accounts-table')

aws_cmk_id = env('AWS_CMK_ID')
aws_kms_cmp = AwsKmsCryptographicMaterialsProvider(key_id=aws_cmk_id)

encrypted_table = EncryptedTable(
    table=table,
    materials_provider=aws_kms_cmp,
)

plaintext_item = {
    'account_id': '4548',
    'account_name': 'Blah',
}

encrypted_table.put_item(Item=plaintext_item)
This is what I get while executing this code:
File "/Users/nirmalnatarajan/venvs/account-postman/lib/python3.7/site-packages/botocore/auth.py", line 357, in add_auth
raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials
Any idea what I might be doing wrong? Appreciate your help.

If you name your .env variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, they will be picked up automatically from os.environ and you don't need a Session at all.
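For example, a minimal sketch (assuming local.env is renamed to use the standard variable names; AWS_DEFAULT_REGION is the region variable boto3 reads from the environment):

import boto3
from environs import Env

# environs loads the .env entries into os.environ, where boto3 looks for the
# standard names AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION.
env = Env()
env.read_env('local.env', False)

dynamodb = boto3.resource('dynamodb')  # no explicit Session needed
table = dynamodb.Table('accounts-table')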

Try passing the session to the AwsKmsCryptographicMaterialsProvider. Note that the parameter expects a botocore session, not a boto3 one; a boto3 Session wraps a botocore session in its (private) _session attribute:

aws_kms_cmp = AwsKmsCryptographicMaterialsProvider(
    key_id=aws_cmk_id,
    botocore_session=session._session,  # the botocore session inside the boto3 Session
)
Alternatively, I think you could set it as the default session.
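A sketch of that, using boto3's setup_default_session helper (which replaces the default session that every later client and resource falls back to):

import boto3

# Any client created afterwards -- including the ones dynamodb-encryption-sdk
# builds internally -- will fall back to these credentials.
boto3.setup_default_session(
    aws_access_key_id=env('AWS_ACCESS_ID'),
    aws_secret_access_key=env('AWS_SECRET_KEY'),
    region_name=env('AWS_REGION'),
)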

Related

Only get new data from DynamoDB using Python

I'm trying to export data from a DynamoDB transactions table using Python. So far I've been able to get all the data from the table, but I would like to add a filter that lets me get only the data from a certain date up to today.
There is a field called CreatedAt that indicates the time when the transaction was made; I was thinking of using this field to filter the new data.
This is the code I've been using to query the table. It would be really helpful if anyone could tell me how to apply this filter in this script.
import boto3
import pandas as pd
from boto3.dynamodb.conditions import Attr

aws_access_key_id = '*****'
aws_secret_access_key = '*****'
region = '****'

dynamodb = boto3.resource(
    'dynamodb',
    aws_access_key_id=aws_access_key_id,
    aws_secret_access_key=aws_secret_access_key,
    region_name=region
)

transactions_table = dynamodb.Table('transactions_table')
result = transactions_table.scan()
items = result['Items']

df_transactions_table = pd.json_normalize(items)
print(df_transactions_table)
Thanks!
Boto3 supports a FilterExpression as part of a DynamoDB scan or query, which will achieve the filtering on that field. See the Boto3 documentation.
Note that a FilterExpression is applied after the items are read, so the scan still consumes the same amount of read capacity units.
You need to use FilterExpression, which would look like the following:

import boto3
import pandas as pd
from boto3.dynamodb.conditions import Attr

aws_access_key_id = '*****'
aws_secret_access_key = '*****'
region = '****'

dynamodb = boto3.resource(
    'dynamodb',
    aws_access_key_id=aws_access_key_id,
    aws_secret_access_key=aws_secret_access_key,
    region_name=region
)

transactions_table = dynamodb.Table('transactions_table')

# Keep only items whose CreatedAt is after the given date; plain string
# comparison works because ISO-8601 dates sort lexicographically.
result = transactions_table.scan(
    FilterExpression=Attr('CreatedAt').gt('2020-08-10'),
)
items = result['Items']

df_transactions_table = pd.json_normalize(items)
print(df_transactions_table)
You can learn more from the docs on Boto3 Scan and FilterExpression.
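One caveat worth adding: a single Scan call reads at most 1 MB of data, and the filter is applied after the read, so a filtered scan can come back nearly empty while more matches remain. A sketch of paginating with LastEvaluatedKey (scan_all is a hypothetical helper name):

from boto3.dynamodb.conditions import Attr

def scan_all(table, filter_expression):
    # Follow DynamoDB's pagination until every page has been read.
    items = []
    kwargs = {'FilterExpression': filter_expression}
    while True:
        response = table.scan(**kwargs)
        items.extend(response['Items'])
        # LastEvaluatedKey is only present when more pages remain.
        if 'LastEvaluatedKey' not in response:
            return items
        kwargs['ExclusiveStartKey'] = response['LastEvaluatedKey']

items = scan_all(transactions_table, Attr('CreatedAt').gt('2020-08-10'))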
Some advice: please do not hard-code your keys the way you have done in this code; use an IAM role. If you are testing locally, configure the AWS CLI, which will provide credentials that you can assume when testing. That way you won't make a mistake and share keys on GitHub etc.
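For example, after running aws configure locally, the same resource can be built without any keys in the code (a sketch; 'default' is whatever profile name you configured):

import boto3

# Credentials come from ~/.aws/credentials, written by `aws configure`.
session = boto3.Session(profile_name='default')
dynamodb = session.resource('dynamodb')
transactions_table = dynamodb.Table('transactions_table')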

Python Boto3 to Aws Sdk for blob storage

This code retrieves the buckets of an Amazon S3-compatible storage (not Amazon AWS itself, but the Zadara-compatible cloud storage) and IT WORKS:
import boto3
from botocore.client import Config

session = boto3.session.Session()
s3_client = session.client(
    service_name='s3',
    region_name='IT',
    aws_access_key_id='xyz',
    aws_secret_access_key='abcedf',
    endpoint_url='https://nothing.com:443',
    config=Config(signature_version='s3v4'),
)

print('Buckets')
boto3.set_stream_logger(name='botocore')
print(s3_client.list_buckets())
I am trying to use the same approach to access the storage via C# and the AWS SDK, but I always obtain the error "The request signature we calculated does not match the signature you provided. Check your key and signing method.".
AmazonS3Config config = new AmazonS3Config();
config.AuthenticationServiceName = "s3";
config.ServiceURL = "https://nothing.com:443";
config.SignatureVersion = "s3v4";
config.AuthenticationRegion = "it";
AmazonS3Client client = new AmazonS3Client(
    "xyz",
    "abcdef",
    config);
ListBucketsResponse r = await client.ListBucketsAsync();
What can I do? Why is it not working? I can't find a solution.
I also tried to trace debug info:
Python
boto3.set_stream_logger(name='botocore')
C#
AWSConfigs.LoggingConfig.LogResponses = ResponseLoggingOption.Always;
AWSConfigs.LoggingConfig.LogMetrics = true;
AWSConfigs.LoggingConfig.LogTo = Amazon.LoggingOptions.SystemDiagnostics;
AWSConfigs.AddTraceListener("Amazon", new System.Diagnostics.ConsoleTraceListener());
but for C# it is not logging the whole request.
Any suggestions?

How to use dialogflow Client In Python

What I am trying to do is get the response from Dialogflow in Python:
import dialogflow
from google.api_core.exceptions import InvalidArgument

DIALOGFLOW_PROJECT_ID = 'imposing-fx-333333'
DIALOGFLOW_LANGUAGE_CODE = 'en'
SESSION_ID = 'some-session-id'  # any string identifying the conversation
GOOGLE_APPLICATION_CREDENTIALS = 'imposing-fx-333333-e6e3cb9e4adb.json'
text_to_be_analyzed = "Hi! I'm David and I'd like to eat some sushi, can you help me?"

session_client = dialogflow.SessionsClient()
session = session_client.session_path(DIALOGFLOW_PROJECT_ID, SESSION_ID)
text_input = dialogflow.types.TextInput(text=text_to_be_analyzed,
                                        language_code=DIALOGFLOW_LANGUAGE_CODE)
query_input = dialogflow.types.QueryInput(text=text_input)

try:
    response = session_client.detect_intent(session=session, query_input=query_input)
except InvalidArgument:
    raise

print("Query text:", response.query_result.query_text)
print("Detected intent:", response.query_result.intent.display_name)
print("Detected intent confidence:", response.query_result.intent_detection_confidence)
print("Fulfillment text:", response.query_result.fulfillment_text)
And I am getting "unable to verify credentials":
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
This is my first question on Stack Overflow :) I know I may have done many things wrong.
You need to export a Service Account Key (JSON) file from your Dialogflow agent's Google Cloud project, and set an environment variable GOOGLE_APPLICATION_CREDENTIALS to the file path of the JSON file that contains your service account key. (Assigning it to a plain Python variable, as in your snippet, has no effect; it must be in the process environment.) Then you can make calls to Dialogflow.
Steps to get Service Account Key:
Make sure you are using Dialogflow v2.
Go to general settings and click on your Service Account. This will redirect you to Google Cloud Platform project’s service account page.
The next step is to create a new key for the service account: choose JSON as the key type, follow the instructions, and a JSON file will be downloaded to your computer. This file will be used as GOOGLE_APPLICATION_CREDENTIALS.
Now in code,
import os
import dialogflow

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/file.json"

project_id = "your_project_id"
session_id = "your_session_id"
language_code = "en"
text = "your text to analyze"

session_client = dialogflow.SessionsClient()
session = session_client.session_path(project_id, session_id)

text_input = dialogflow.types.TextInput(text=text, language_code=language_code)
query_input = dialogflow.types.QueryInput(text=text_input)
response_dialogflow = session_client.detect_intent(session=session, query_input=query_input)
This one works too, in case you want to pick the credentials file up from the file system. The recommended way is using env variables, though:
import json
from google.cloud import dialogflow_v2
from google.oauth2 import service_account

creds_file = "/path/to/json/file.json"
dialogflow_key = json.load(open(creds_file))

credentials = service_account.Credentials.from_service_account_info(dialogflow_key)
session_client = dialogflow_v2.SessionsClient(credentials=credentials)

if session_client is not None:
    print("it works: " + session_client.DEFAULT_ENDPOINT)
else:
    print("does not work")
I forgot to add the main article, sorry. Here it is:
https://googleapis.dev/python/google-auth/latest/user-guide.html#service-account-private-key-files

get aws_secret_access_key to store in ~/.boto

I'm trying to store the user's aws_access_key_id and aws_secret_access_key in the ~/.boto file.
I'm already storing aws_access_key_id correctly, but I don't know how I can get aws_secret_access_key so I can store it in ~/.boto as well.
Do you know how I can get aws_secret_access_key?
import os
import boto.iam.connection

username = "user"
connection = boto.iam.connect_to_region("us-east-1")
connection.create_access_key(username)

k = connection.get_all_access_keys(username)
ackey = k['list_access_keys_response']['list_access_keys_result']['access_key_metadata'][0]['access_key_id']
# and how to return the aws_secret_access_key??

with open(os.path.expanduser("~/.boto"), "w") as f:
    f.write("[Credentials]")
    f.write("\n")
    f.write("aws_access_key_id = " + ackey)
    f.write("\n")
    f.write("aws_secret_access_key = " + ??)
The secret access key associated with AWS API credentials is returned via the API only when the access key is created. You must store it at that point, because it is never returned by the IAM service again. If you change your code to something like this, you can capture the secret key at creation time:
connection = boto.iam.connect_to_region("us-east-1")
response = connection.create_access_key(username)
secret_access_key = response['create_access_key_response']['create_access_key_result']['access_key']['secret_access_key']
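Putting that together with the file writing from the question, a sketch (it assumes the snake_case response layout shown above, which also carries the new access_key_id):

import os
import boto.iam.connection

username = "user"
connection = boto.iam.connect_to_region("us-east-1")

# The secret is only present in this one response; it can never be fetched again.
response = connection.create_access_key(username)
access_key = response['create_access_key_response']['create_access_key_result']['access_key']

with open(os.path.expanduser("~/.boto"), "w") as f:
    f.write("[Credentials]\n")
    f.write("aws_access_key_id = " + access_key['access_key_id'] + "\n")
    f.write("aws_secret_access_key = " + access_key['secret_access_key'] + "\n")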

Boto: Dynamically get aws_access_key_id and aws_secret_access_key in Python code from config?

I have my aws_access_key_id and aws_secret_access_key stored in ~/.boto, and I was wondering if there is a way to retrieve these values in my Python code using Boto, as I need to insert them into my SQL statement to copy a CSV file from S3.
This should work:
import boto
access_key = boto.config.get_value('Credentials', 'aws_access_key_id')
secret_key = boto.config.get_value('Credentials', 'aws_secret_access_key')
Here's a helper that will fall back to ~/.aws/credentials if boto.config doesn't find anything. I didn't look into it in great detail, but Boto 2 doesn't appear to look in ~/.aws/credentials.
import os
import boto

def get_aws_credentials():
    # Boto 2 reads the [Credentials] section of ~/.boto.
    aws_access_key_id = boto.config.get_value('Credentials', 'aws_access_key_id')
    aws_secret_access_key = boto.config.get_value('Credentials', 'aws_secret_access_key')
    # Boto 2 doesn't appear to look in ~/.aws/credentials, so parse it by hand.
    if aws_access_key_id is None or aws_secret_access_key is None:
        with open(os.path.expanduser("~/.aws/credentials")) as f:
            for line in f:
                try:
                    key, val = line.strip().split('=')
                    key, val = key.strip(), val.strip()  # tolerate spaces around '='
                    if key == 'aws_access_key_id':
                        aws_access_key_id = val
                    elif key == 'aws_secret_access_key':
                        aws_secret_access_key = val
                except ValueError:
                    # Section headers and blank lines don't split into two parts.
                    pass
    return aws_access_key_id, aws_secret_access_key
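Use via:
aws_access_key_id, aws_secret_access_key = get_aws_credentials()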
Because the AWS credentials & Boto files both use the .ini format, you can parse them with ConfigParser. Here's an example of parsing the ~/.aws/credentials file (this is Python 2, but it should be easy enough to port to Python 3):
import os
from os.path import expanduser
import ConfigParser

def read_credentials_from_config_section(section_name):
    # Parsing ~/.aws/credentials, but it's just as easy to parse ~/.boto.
    aws_credentials_path = os.path.join(expanduser("~"), '.aws', 'credentials')
    c = ConfigParser.ConfigParser()
    c.read(aws_credentials_path)
    return c.get(section_name, 'aws_access_key_id'), c.get(section_name, 'aws_secret_access_key')
Use via:
k, s = read_credentials_from_config_section('default')
If you want to use the ~/.boto file instead, modify the above code to read it and adjust for its naming conventions -- the code is very similar; a sketch follows.
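For instance, a sketch of that ~/.boto variant (assuming the standard [Credentials] section used elsewhere in this thread):

import os
import ConfigParser

def read_boto_credentials():
    # ~/.boto keeps the keys under [Credentials] rather than under a profile name.
    c = ConfigParser.ConfigParser()
    c.read(os.path.expanduser("~/.boto"))
    return (c.get('Credentials', 'aws_access_key_id'),
            c.get('Credentials', 'aws_secret_access_key'))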
An alternative way to read the ~/.aws/credentials file (assuming you have the awscli installed) is to shell out to the AWS CLI and let it deal with the details. This is a lot slower, though (it takes ~1.5s to run on my machine, which is unacceptable for a lot of use cases):
import subprocess

aws_profile_name = 'default'  # or whichever profile you configured
print subprocess.check_output(['aws', 'configure', 'get', 'aws_access_key_id', '--profile', aws_profile_name])
print subprocess.check_output(['aws', 'configure', 'get', 'aws_secret_access_key', '--profile', aws_profile_name])
