I would like to use a boto3 config object to configure connection timeout and other attributes when interacting with DynamoDB through boto3. I have already written my code using a boto3 resource, and all the examples I have been able to find leverage a boto3 client instead when using a config object.
Is it possible to use a config object with a boto3 resource, and if not, why?
I learned that a boto3 resource object does in fact accept a parameter for a config object. So I was able to define a config object in my wrapper class:
from botocore.config import Config
..
self.config = Config(
    connect_timeout=1,
    read_timeout=1
)
And then later do this:
self.dynamodb = boto3.resource('dynamodb', config=self.config)
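Putting those pieces together outside the wrapper class, a minimal self-contained sketch (the table name is a placeholder) looks like this:
import boto3
from botocore.config import Config

# Short timeouts so a hung connection fails fast; the values are illustrative.
config = Config(
    connect_timeout=1,
    read_timeout=1
)

dynamodb = boto3.resource('dynamodb', config=config)
table = dynamodb.Table('my-table')  # placeholder table name
print(table.table_status)           # triggers a DescribeTable call using the config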
I am trying to download files from a public AWS S3 bucket from this website with Python scripts, for example the first object at the link. I tried boto3 and got a NoCredentialsError:
s3 = boto3.resource('s3')
bucket = s3.Bucket('oedi-data-lake')
keys = []
for obj in bucket.objects.filter(Prefix='nrel-pds-building-stock/end-use-load-profiles-for-us-building-stock/2022/resstock_tmy3_release_1/building_energy_models/upgrade=10/'):
    if obj.key.endswith('bldg0000001-up10.zip'):
        keys.append(obj.key)
print(keys)
I also found the post Download file/folder from Public AWS S3 with Python, no credentials, and tried the following:
import requests
headers = {'Host' : 'oedi-data-lake.s3.amazonaws.com'}
url = 'https://oedi-data-lake.s3.amazonaws.com/nrel-pds-building-stock/end-use-load-profiles-for-us-building-stock/2022/resstock_tmy3_release_1/building_energy_models/upgrade=10/bldg0000001-up10.zip'
r = requests.get(url)
but got an SSLCertVerificationError.
Please help. :)
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Thank you, jhashimoto!
But with the following code, I still get the NoCredentialsError:
import boto3
from botocore import UNSIGNED
from botocore.config import Config
s3 = boto3.resource("s3", config=Config(signature_version=UNSIGNED))
s3_client = boto3.client('s3')
s3_client.download_file('oedi-data-lake', 'nrel-pds-building-stock/end-use-load-profiles-for-us-building-stock/2022/resstock_tmy3_release_1/building_energy_models/upgrade=10/bldg0000001-up10.zip', 'bldg1.zip')
I also read can-i-use-boto3-anonymously and changed the code as below:
import boto3
from botocore import UNSIGNED
from botocore.config import Config
client = boto3.client('s3', aws_access_key_id='', aws_secret_access_key='')
client._request_signer.sign = (lambda *args, **kwargs: None)
client.download_file('oedi-data-lake', 'nrel-pds-building-stock/end-use-load-profiles-for-us-building-stock/2022/resstock_tmy3_release_1/building_energy_models/upgrade=10/bldg0000001-up10.zip', 'bldg01.zip')
and got an SSLCertVerificationError.
Is this something caused by my company's security policy?
Sorry for the naive questions; I am completely new to AWS.
Thank you so much.
To access a bucket that allows anonymous access, configure the client not to use credentials.
import boto3
from botocore import UNSIGNED
from botocore.config import Config
s3 = boto3.resource("s3", config=Config(signature_version=UNSIGNED))

bucket = s3.Bucket('oedi-data-lake')
keys = []
for obj in bucket.objects.filter(Prefix='nrel-pds-building-stock/end-use-load-profiles-for-us-building-stock/2022/resstock_tmy3_release_1/building_energy_models/upgrade=10/'):
    if obj.key.endswith('bldg0000001-up10.zip'):
        keys.append(obj.key)
print(keys)
# output:
# ['nrel-pds-building-stock/end-use-load-profiles-for-us-building-stock/2022/resstock_tmy3_release_1/building_energy_models/upgrade=10/bldg0000001-up10.zip']
python - Can I use boto3 anonymously? - Stack Overflow
Yes. Your credentials are used to sign all the requests you send out, so what you have to do is configure the client to not perform the signing step at all.
Note:
Unrelated to the main topic, but the AWS Python SDK team does not intend to add new features to the resource interface. You can use the client interface instead.
Resources — Boto3 Docs 1.26.54 documentation
The AWS Python SDK team does not intend to add new features to the resources interface in boto3. Existing interfaces will continue to operate during boto3's lifecycle. Customers can find access to newer service features through the client interface.
Added at 2023/01/21 12:00:
Here is sample code using the client interface.
import boto3
from botocore import UNSIGNED
from botocore.config import Config
s3_client = boto3.client('s3', config=Config(signature_version=UNSIGNED))
s3_client.download_file('oedi-data-lake', 'nrel-pds-building-stock/end-use-load-profiles-for-us-building-stock/2022/resstock_tmy3_release_1/building_energy_models/upgrade=10/bldg0000001-up10.zip', 'bldg1.zip')
I have my config file set up with multiple profiles and I am trying to assume an IAM role, but all the articles I see about assuming roles start by creating an STS client with
import boto3
client = boto3.client('sts')
which makes sense, but the problem is that it gives me an error when I try it that way. When I pass a profile that exists in my config file, however, it works. Here is the code:
import boto3
session = boto3.Session(profile_name="test_profile")
sts = session.client("sts")
response = sts.assume_role(
    RoleArn="arn:aws:iam::xxx:role/role-name",
    RoleSessionName="test-session"
)
new_session = boto3.Session(
    aws_access_key_id=response['Credentials']['AccessKeyId'],
    aws_secret_access_key=response['Credentials']['SecretAccessKey'],
    aws_session_token=response['Credentials']['SessionToken']
)
When other people assume roles in their code without passing in a profile, how does that even work? Does boto3 automatically grab the default profile from the config file, or something like that, in their case?
Yes. This line:
client = boto3.client('sts')
tells boto3 to create a session using the default credentials.
The credentials can be provided in the ~/.aws/credentials file. If the code is running on an Amazon EC2 instance, boto3 will automatically use the credentials of the IAM Role attached to the instance.
Credentials can also be passed via Environment Variables.
See: Credentials — Boto3 documentation
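For example, a minimal sketch assuming valid default credentials are already available (the account ID and role name are placeholders):
import boto3

# No profile is named, so boto3 falls back to the default credential chain:
# environment variables, ~/.aws/credentials, then an instance role.
sts = boto3.client("sts")

response = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/role-name",  # placeholder ARN
    RoleSessionName="test-session"
)
print(response["Credentials"]["AccessKeyId"])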
When I simply run the following code, I always get this error.
import boto3 as boto
import sys
import json
role_to_assume_arn="arn:aws:iam::xxxxxxxxxxxx:role/AWSxxxx_xxxxxxAdminaccess_xxxxx24fexxx"
role_session_name='AssumeRoleSession1'
sts_client = boto.client('sts')
assumed_role_object = sts_client.assume_role(
    RoleArn="arn:aws:iam::xxxxxxxxxxxx:role/AWSxxxx_xxxxxxAdminaccess_xxxxx24fexxx",
    RoleSessionName="Sess1",
)
creds = assumed_role_object['Credentials']
sts_assumed_role = boto.client('sts',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken'],
)
rds_client = boto.client('rds',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken']
)
I don't want to set and change temporary session keys frequently; instead I want them to be set directly in code, like the code I've just written.
Am I wrong? Is there a way to set the credentials directly in the program like this, or not?
Or is it mandatory to provide the credentials in "~/.aws/credentials"?
I assume you are running this code on your local machine.
The STS client you created is expecting an access key and secret access key.
You have to either configure them using a credentials file, or hardcode your access key and secret access key directly, like below (not recommended):
client = boto3.client('sts', aws_access_key_id=key, aws_secret_access_key=sec_key, region_name=region_name)
https://docs.aws.amazon.com/sdk-for-php/v3/developer-guide/guide_credentials_profiles.html
If you are running this code on an EC2 instance, install boto3 and run aws configure. Follow the link below.
https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html
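Applied to the code in the question, a rough sketch of the hardcoded variant (the key values are placeholders; again, not recommended for real projects) would be:
import boto3

# Placeholder credentials; prefer a credentials file or an instance role.
sts_client = boto3.client(
    'sts',
    aws_access_key_id='AKIA...',
    aws_secret_access_key='...',
    region_name='us-east-1'
)

creds = sts_client.assume_role(
    RoleArn="arn:aws:iam::xxxxxxxxxxxx:role/AWSxxxx_xxxxxxAdminaccess_xxxxx24fexxx",
    RoleSessionName="Sess1"
)['Credentials']

# The temporary credentials can then be passed to other clients.
rds_client = boto3.client(
    'rds',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken']
)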
I am trying to mock AWS S3 API calls using boto2.
I created a local S3 endpoint using localstack and can use it easily with boto3, as below:
import boto3
s3_client = boto3.client('s3', endpoint_url='http://localhost:4572')
bucket_name = 'my-bucket'
s3_client.create_bucket(Bucket=bucket_name)
But I did not find a way to do this using boto2. Is there any way, preferably using ~/.boto or ~/.aws/config?
I tried providing an endpoint with boto2, but it failed:
import boto
boto.s3.S3RegionInfo(name='test-s3-region', endpoint='http://127.0.0.1:4572/')
s3 = boto.s3.connect_to_region('test-s3-region')
print s3.get_bucket('test-poc')
error:
AttributeError: 'NoneType' object has no attribute 'get_bucket'
I am looking to use local endpoints for all AWS services for testing purposes.
This works for me:
import boto
from boto.s3.connection import S3Connection
region = boto.s3.S3RegionInfo(name='test-s3-region', endpoint='http://127.0.0.1:4572/', connection_cls=S3Connection)
conn = region.connect()
print conn.get_bucket('test-poc')
You need to set the connection_cls attribute, which is None by default.
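An equivalent approach, as a sketch assuming localstack is listening on 127.0.0.1:4572 over plain HTTP and accepts dummy credentials, is to build the connection directly:
from boto.s3.connection import S3Connection, OrdinaryCallingFormat

# Dummy credentials are fine for localstack; path-style addressing avoids
# bucket-name DNS lookups against localhost.
conn = S3Connection(
    aws_access_key_id='test',
    aws_secret_access_key='test',
    host='127.0.0.1',
    port=4572,
    is_secure=False,
    calling_format=OrdinaryCallingFormat()
)
print conn.get_bucket('test-poc')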
I want Boto3 to get the access and secret key from a config file instead of hard coding them. On my Linux server I set the following environment variable AWS_SHARED_CREDENTIALS_FILE with the value /app/.aws/credentials. In /app/.aws/ I put a file with the name credentials with the following content:
[default]
aws_access_key_id = abcd
aws_secret_access_key = abcd
Of course I used the actual keys instead of abcd.
Python:
import boto3
from botocore.config import Config

conn = boto3.client('s3',
    region_name="eu-west-1",
    endpoint_url="endpoint",
    aws_access_key_id=aws_access_key_id,
    aws_secret_access_key=aws_secret_access_key,
    config=Config(signature_version="s3", s3={'addressing_style': 'path'})
)
However it says name 'aws_access_key_id' is not defined. How can I fix it? Thanks.
Edit:
>>> os.environ['AWS_SHARED_CREDENTIALS_FILE']
'/app/.aws/credentials'
If you have a credentials file with AWS credentials already created, you don't need to specify them when instantiating your client. The following should work:
import boto3
from botocore.config import Config

conn = boto3.client('s3',
    region_name="eu-west-1",
    endpoint_url="endpoint",
    config=Config(signature_version="s3", s3={'addressing_style': 'path'})
)
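As a quick sanity check (not part of the answer above), you can inspect which credentials boto3 actually resolved:
import boto3

# Shows the credentials picked up from the default chain
# (environment variables, AWS_SHARED_CREDENTIALS_FILE, instance role, ...).
session = boto3.Session()
creds = session.get_credentials()
print(creds.method)      # e.g. 'shared-credentials-file'
print(creds.access_key)  # should match the key in /app/.aws/credentials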
If you are running your application on an EC2 instance, you can also assign an S3 role to the instance and have boto assume that role. This prevents you from having to store keys on your instance.
Look at the section "Assume Role Provider" in the docs:
http://boto3.readthedocs.io/en/latest/guide/configuration.html
Link to IAM roles as well:
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html