Requirement: Find the unencrypted S3 buckets in an AWS account and add a tag to each of them.
Implemented so far:
import boto3
from botocore.exceptions import ClientError

# Retrieve the list of existing buckets
s3 = boto3.client('s3')
response = s3.list_buckets()

# Find out unencrypted bucket list
for bucket in response['Buckets']:
    try:
        enc = s3.get_bucket_encryption(Bucket=bucket["Name"])
    except ClientError as e:
        if e.response['Error']['Code'] == 'ServerSideEncryptionConfigurationNotFoundError':
            print('Bucket with no server-side encryption: %s' % (bucket['Name']))
        else:
            print("Bucket with unexpected error: %s, unexpected error: %s" % (bucket['Name'], e))
The following line gives me the unencrypted bucket list:
print('Bucket with no server-side encryption: %s' % (bucket['Name']))
Result:
Bucket with no server-side encryption: xyz1
Bucket with no server-side encryption: xyz2
Need support for the following:
I can get the list of unencrypted S3 buckets, but I am not sure how to use the output from the except block and utilize the unencrypted bucket names to add a tag later.
If you declare a list outside of your try/except, you can access it later on.
E.g.:
import boto3
from botocore.exceptions import ClientError

# this is our new list
buckets = []

# Retrieve the list of existing buckets
s3 = boto3.client('s3')
response = s3.list_buckets()

# Find out unencrypted bucket list
for bucket in response['Buckets']:
    try:
        enc = s3.get_bucket_encryption(Bucket=bucket["Name"])
    except ClientError as e:
        if e.response['Error']['Code'] == 'ServerSideEncryptionConfigurationNotFoundError':
            # add the bucket name to our new list
            buckets.append(bucket['Name'])
            print('Bucket with no server-side encryption: %s' % (bucket['Name']))
        else:
            print("Bucket with unexpected error: %s, unexpected error: %s" % (bucket['Name'], e))

# now you can use the "buckets" variable and it will contain all the unencrypted buckets
for bucket in buckets:
    print(bucket)
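From there, tagging is one boto3 call per collected bucket. A minimal sketch (the tag key/value are placeholders, not from the original post); note that put_bucket_tagging replaces the bucket's entire tag set, so if the buckets already carry tags you would want to merge with get_bucket_tagging first:

# tag every unencrypted bucket collected above (tag key/value are examples)
for name in buckets:
    s3.put_bucket_tagging(
        Bucket=name,
        Tagging={
            'TagSet': [
                {'Key': 'encryption', 'Value': 'none'}
            ]
        }
    )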
Related
I have been trying to use Secrets Manager in a Lambda function in AWS.
I am using Secrets Manager to store my Redshift credentials and want to use the sample code provided by AWS Secrets Manager to retrieve the secret via the Lambda function.
I have set up a secret in Secrets Manager which contains my Redshift credentials (username, password).
I am trying to set up a Lambda function which would get the secrets from Secrets Manager; below is the sample code:
import boto3
import base64
from botocore.exceptions import ClientError

def lambda_handler(event, context):
    def get_secret():
        secret_name = "test/MySecret"
        region_name = "eu-west-2"

    # Create a Secrets Manager client
    session = boto3.session.Session()
    client = session.client(
        service_name='secretsmanager',
        region_name=region_name
    )

    try:
        get_secret_value_response = client.get_secret_value(
            SecretId=secret_name
        )
    except ClientError as e:
        if e.response['Error']['Code'] == 'DecryptionFailureException':
            # Secrets Manager can't decrypt the protected secret text using the provided KMS key.
            # Deal with the exception here, and/or rethrow at your discretion.
            raise e
        elif e.response['Error']['Code'] == 'InternalServiceErrorException':
            # An error occurred on the server side.
            # Deal with the exception here, and/or rethrow at your discretion.
            raise e
        elif e.response['Error']['Code'] == 'InvalidParameterException':
            # You provided an invalid value for a parameter.
            # Deal with the exception here, and/or rethrow at your discretion.
            raise e
        elif e.response['Error']['Code'] == 'InvalidRequestException':
            # You provided a parameter value that is not valid for the current state of the resource.
            # Deal with the exception here, and/or rethrow at your discretion.
            raise e
        elif e.response['Error']['Code'] == 'ResourceNotFoundException':
            # We can't find the resource that you asked for.
            # Deal with the exception here, and/or rethrow at your discretion.
            raise e
    else:
        # Decrypts secret using the associated KMS CMK.
        # Depending on whether the secret is a string or binary, one of these fields will be populated.
        if 'SecretString' in get_secret_value_response:
            secret = get_secret_value_response['SecretString']
        else:
            decoded_binary_secret = base64.b64decode(get_secret_value_response['SecretBinary'])
I am getting the following errors whilst running the lambda function:
{
"errorMessage": "name 'secret_name' is not defined",
"errorType": "NameError",
"stackTrace": [
" File \"/var/task/lambda_function.py\", line 22, in lambda_handler\n SecretId = secret_name\n"
]
}
I have defined secret_name at the start of the Lambda function, but I am still getting the "'secret_name' is not defined" error. Any suggestions on how I can fix the issue?
The thing is, Python cannot get the value of the secret_name variable because it is local to a function:

def get_secret():
    secret_name = "test/MySecret"
    region_name = "eu-west-2"

So if you instead just use secret_name = "test/MySecret" without the function part, the sample code should work.
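For example, a minimal sketch of the handler with secret_name and region_name in the same scope as the get_secret_value call (assuming the secret was stored as a JSON key/value pair of username and password, which is not shown in the question):

import json
import boto3
from botocore.exceptions import ClientError

def lambda_handler(event, context):
    secret_name = "test/MySecret"
    region_name = "eu-west-2"

    # Create a Secrets Manager client
    client = boto3.session.Session().client(
        service_name='secretsmanager',
        region_name=region_name
    )
    try:
        response = client.get_secret_value(SecretId=secret_name)
    except ClientError as e:
        raise e

    # assumption: the secret is JSON, e.g. {"username": "...", "password": "..."}
    credentials = json.loads(response['SecretString'])
    # use credentials['username'] / credentials['password'] to connect to Redshift here
    return {'statusCode': 200}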
Add the VersionStage parameter in your code:

get_secret_value_response = client.get_secret_value(
    SecretId=secret_name, VersionStage='AWSCURRENT')
Hi, I am trying to turn on default S3 encryption on all the buckets in my account using the Python boto3 script below.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
response = s3.list_buckets()

for bucket in response['Buckets']:
    enc = s3.get_bucket_encryption(Bucket=bucket['Name'])
    s3.put_bucket_encryption(
        Bucket=bucket['Name'],
        ServerSideEncryptionConfiguration={
            'Rules': [
                {
                    'ApplyServerSideEncryptionByDefault': {
                        'SSEAlgorithm': 'AES256'
                    }
                },
            ]
        }
    )
But I am struggling with my code, which is not working and gives this error:
File "apply.py", line 10, in <module>
enc = s3.get_bucket_encryption(Bucket=bucket['Name'])
File "/Users/hhaqqani/Library/Python/2.7/lib/python/site-packages/botocore/client.py", line 272, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/Users/hhaqqani/Library/Python/2.7/lib/python/site-packages/botocore/client.py", line 576, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (ServerSideEncryptionConfigurationNotFoundError) when calling the GetBucketEncryption operation: The server side encryption configuration was not found
Make sure you are passing the bucket name: use Bucket=bucket['Name'] (not Bucket=enc) in your call to put_bucket_encryption.
Note also that the call to get_bucket_encryption will throw an exception if the bucket does not actually have encryption configured. While that might seem odd, that's the way it works (see boto3/issues/1899 for more details). So, to handle this potential exception:
SSECNF = 'ServerSideEncryptionConfigurationNotFoundError'

try:
    bucket = client.get_bucket_encryption(Bucket=bucket['Name'])
    # check current encryption here, if it's not what you want then update it
    # check bucket['ServerSideEncryptionConfiguration']['Rules']
except client.exceptions.ClientError as e:
    if e.response['Error']['Code'] == SSECNF:
        s3.put_bucket_encryption(...)
    else:
        print("Unexpected error: %s" % e)
Please see the fixed code below; thanks @jarmod for the quick help.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
response = s3.list_buckets()
client = boto3.client('s3')
SSECNF = 'ServerSideEncryptionConfigurationNotFoundError'

for bucket in response['Buckets']:
    try:
        bucket = client.get_bucket_encryption(Bucket=bucket['Name'])
        # check current encryption here, if it's not what you want then update it
        # check bucket['ServerSideEncryptionConfiguration']['Rules']
    except client.exceptions.ClientError as e:
        if e.response['Error']['Code'] == SSECNF:
            s3.put_bucket_encryption(
                Bucket=bucket['Name'],
                ServerSideEncryptionConfiguration={
                    'Rules': [
                        {
                            'ApplyServerSideEncryptionByDefault': {
                                'SSEAlgorithm': 'AES256'
                            }
                        },
                    ]
                })
        else:
            print("Unexpected error: %s" % e)
I have the following code, posted below, which gets the list of all S3 buckets on AWS, and I am trying to write Python code that checks whether each bucket is encrypted, but I am having trouble figuring out how to do that. Can anyone tell me how to modify my code to do that? I tried online examples and looked at the documentation.
My code is:
from __future__ import print_function
import boto3
import os
os.environ['AWS_DEFAULT_REGION'] = "us-east-1"
# Create an S3 client
s3 = boto3.client('s3')
# Call S3 to list current buckets
response = s3.list_buckets()
# Get a list of all bucket names from the response
buckets = [bucket['Name'] for bucket in response['Buckets']]
# Print out the bucket list
print("Bucket List: %s" % buckets)
I tried the following snippets but they don't work:
s3 = boto3.resource('s3')
bucket = s3.Bucket('my-bucket-name')
for obj in bucket.objects.all():
    key = s3.Object(bucket.name, obj.key)
    print(key.server_side_encryption)
and
#!/usr/bin/env python
import boto3

s3_client = boto3.client('s3')
head = s3_client.head_object(
    Bucket="<S3 bucket name>",
    Key="<S3 object key>"
)
if 'ServerSideEncryption' in head:
    print(head['ServerSideEncryption'])
It's first worth understanding a few things about S3 and encryption.
1. When you enable default encryption on an S3 bucket, you're actually configuring a server-side encryption configuration rule on the bucket that will cause S3 to encrypt every object uploaded to the bucket after the rule was configured.
2. Unrelated to #1, you can apply an S3 bucket policy to a bucket, denying any uploads of objects that are not encrypted. This will prevent you from adding unencrypted data but it will not automatically encrypt anything. (A sketch of such a policy follows this list.)
3. You can encrypt uploads on an object-by-object basis; encryption does not have to be bucket-wide.
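For point #2, a rough sketch of such a policy applied with boto3 (the bucket name is a placeholder, and the statements follow the widely used "deny unencrypted PutObject" pattern, shown here only as an illustration):

import json
import boto3

s3 = boto3.client('s3')
bucket_name = 'my-bucket-name'  # placeholder

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyIncorrectEncryptionHeader",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::" + bucket_name + "/*",
            "Condition": {"StringNotEquals": {"s3:x-amz-server-side-encryption": "AES256"}}
        },
        {
            "Sid": "DenyUnencryptedObjectUploads",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::" + bucket_name + "/*",
            "Condition": {"Null": {"s3:x-amz-server-side-encryption": "true"}}
        }
    ]
}

s3.put_bucket_policy(Bucket=bucket_name, Policy=json.dumps(policy))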
So, to find out which buckets fall into category #1 (those that will automatically encrypt anything uploaded to them), you can do this:
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
response = s3.list_buckets()

for bucket in response['Buckets']:
    try:
        enc = s3.get_bucket_encryption(Bucket=bucket['Name'])
        rules = enc['ServerSideEncryptionConfiguration']['Rules']
        print('Bucket: %s, Encryption: %s' % (bucket['Name'], rules))
    except ClientError as e:
        if e.response['Error']['Code'] == 'ServerSideEncryptionConfigurationNotFoundError':
            print('Bucket: %s, no server-side encryption' % (bucket['Name']))
        else:
            print("Bucket: %s, unexpected error: %s" % (bucket['Name'], e))
This will result in output like this:
Bucket: mycats, no server-side encryption
Bucket: mydogs, no server-side encryption
Bucket: mytaxreturn, Encryption: [{'ApplyServerSideEncryptionByDefault': {'SSEAlgorithm': 'AES256'}}]
I am trying to write my streamed data (JSON format) to an S3 bucket. I am using the code below but am not able to write anything; there is no error while executing the code, but no JSON files appear in S3.
import boto3
from tweepy.streaming import StreamListener

class TweetsListener(StreamListener):
    def __init__(self, path):
        self.path = path

    def on_data(self, data):
        try:
            s3 = boto3.resource('s3')
            s3.put_object(Bucket='bucket', Body=data.encode('UTF-8'), Key='/A/' + self.path + '/test.json')
            return True
        except BaseException as e:
            print("Error on_data: %s" % str(e))
            return True

    def on_error(self, status):
        print(status)
        return True
From what I can tell, you are trying to use put_object action on boto3's S3 Service Resource instead of the S3 Client.
The Service Resource doesn't have a put_object method.
Also, you should remove the leading / in the Key, and make sure that your bucket is already created.
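Under those assumptions, the on_data method might look something like this (bucket name and key prefix kept from the question):

import boto3

s3_client = boto3.client('s3')  # the client, not the resource, exposes put_object

def on_data(self, data):        # drop-in replacement for the method in TweetsListener
    try:
        s3_client.put_object(
            Bucket='bucket',                       # the bucket must already exist
            Key='A/' + self.path + '/test.json',   # no leading '/'
            Body=data.encode('UTF-8')
        )
        return True
    except BaseException as e:
        print("Error on_data: %s" % str(e))
        return True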
I want to convert m4a files uploaded to S3 to mp3. The files would only be 15 seconds long at most, so using Elastic Transcoder would be overkill. I downloaded a static binary from https://www.johnvansickle.com/ffmpeg/, but I am very new to AWS and I'm still not sure how uploading binaries works. How would I include it so that I could convert a file?
import boto3
import urllib

print('Loading function')
s3 = boto3.client('s3')

def lambda_handler(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.unquote_plus(event['Records'][0]['s3']['object']['key'].encode('utf8'))
    try:
        # convert to mp3
        # upload to bucket
        pass
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
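One common pattern, sketched here purely as an illustration (it assumes the static ffmpeg binary is bundled in the deployment package next to the handler, which is not confirmed by the question), is to copy the binary to /tmp, make it executable, and shell out to it:

import os
import shutil
import subprocess
import boto3

s3 = boto3.client('s3')

def convert_to_mp3(bucket, key):
    # /var/task (the deployment package) is read-only at runtime, so copy the
    # bundled binary to /tmp where it can be made executable
    ffmpeg = '/tmp/ffmpeg'
    if not os.path.exists(ffmpeg):
        shutil.copy(os.path.join(os.path.dirname(__file__), 'ffmpeg'), ffmpeg)
        os.chmod(ffmpeg, 0o755)

    src, dst = '/tmp/input.m4a', '/tmp/output.mp3'
    s3.download_file(bucket, key, src)                     # fetch the uploaded m4a
    subprocess.check_call([ffmpeg, '-y', '-i', src, dst])  # convert to mp3
    s3.upload_file(dst, bucket, os.path.splitext(key)[0] + '.mp3')

Note that uploading the result to the same bucket can re-trigger the function, so in practice the mp3 would usually go to a different bucket or to a prefix excluded from the event notification.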