I have the code posted below, which gets the list of all S3 buckets on AWS. I am trying to write Python code that checks whether those buckets are encrypted, but I am having trouble figuring out how to do that. Can anyone tell me how to modify my code to do this? I have tried online examples and looked at the documentation.
My code is:
from __future__ import print_function
import boto3
import os
os.environ['AWS_DEFAULT_REGION'] = "us-east-1"
# Create an S3 client
s3 = boto3.client('s3')
# Call S3 to list current buckets
response = s3.list_buckets()
# Get a list of all bucket names from the response
buckets = [bucket['Name'] for bucket in response['Buckets']]
# Print out the bucket list
print("Bucket List: %s" % buckets)
I tried the following snippets, but they don't work:
s3 = boto3.resource('s3')
bucket = s3.Bucket('my-bucket-name')
for obj in bucket.objects.all():
    key = s3.Object(bucket.name, obj.key)
    print(key.server_side_encryption)
and
#!/usr/bin/env python
import boto3
s3_client = boto3.client('s3')
head = s3_client.head_object(
    Bucket="<S3 bucket name>",
    Key="<S3 object key>"
)
if 'ServerSideEncryption' in head:
    print(head['ServerSideEncryption'])
It's first worth understanding a few things about S3 and encryption.
1. When you enable default encryption on an S3 bucket, you're actually configuring a server-side encryption configuration rule on the bucket that will cause S3 to encrypt every object uploaded to the bucket after the rule was configured.
2. Unrelated to #1, you can apply an S3 bucket policy to a bucket, denying any uploads of objects that are not encrypted. This will prevent you from adding unencrypted data, but it will not automatically encrypt anything.
3. You can encrypt uploads on an object-by-object basis; encryption does not have to be bucket-wide (see the sketch below).
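As an aside on #3, here is a minimal sketch of encrypting a single upload; the bucket name, key, and body are placeholders:
import boto3

s3 = boto3.client('s3')

# Request SSE-S3 (AES256) encryption for this one object only;
# other objects in the bucket are unaffected.
s3.put_object(
    Bucket='my-bucket-name',   # placeholder
    Key='example.txt',         # placeholder
    Body=b'hello world',
    ServerSideEncryption='AES256'
)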
So, to find out which buckets fall into category #1 (those that will automatically encrypt anything uploaded to them), you can do this:
import boto3
from botocore.exceptions import ClientError
s3 = boto3.client('s3')
response = s3.list_buckets()
for bucket in response['Buckets']:
    try:
        enc = s3.get_bucket_encryption(Bucket=bucket['Name'])
        rules = enc['ServerSideEncryptionConfiguration']['Rules']
        print('Bucket: %s, Encryption: %s' % (bucket['Name'], rules))
    except ClientError as e:
        if e.response['Error']['Code'] == 'ServerSideEncryptionConfigurationNotFoundError':
            print('Bucket: %s, no server-side encryption' % (bucket['Name']))
        else:
            print("Bucket: %s, unexpected error: %s" % (bucket['Name'], e))
This will result in output like this:
Bucket: mycats, no server-side encryption
Bucket: mydogs, no server-side encryption
Bucket: mytaxreturn, Encryption: [{'ApplyServerSideEncryptionByDefault': {'SSEAlgorithm': 'AES256'}}]
Related
Requirement: find the unencrypted S3 buckets in an AWS account and add a tag to each of them.
Implemented so far:
import boto3
from botocore.exceptions import ClientError
# Retrieve the list of existing buckets
s3 = boto3.client('s3')
response = s3.list_buckets()
# Find out unencrypted bucket list
for bucket in response['Buckets']:
    try:
        enc = s3.get_bucket_encryption(Bucket=bucket["Name"])
    except ClientError as e:
        if e.response['Error']['Code'] == 'ServerSideEncryptionConfigurationNotFoundError':
            print('Bucket with no server-side encryption: %s' % (bucket['Name']))
        else:
            print("Bucket with unexpected error: %s, unexpected error: %s" % (bucket['Name'], e))
The following line gives me the unencrypted bucket list:
print('Bucket with no server-side encryption: %s' % (bucket['Name']))
Result:
Bucket with no server-side encryption: xyz1
Bucket with no server-side encryption: xyz2
Need support for the following
I can get the list of unencrypted S3 buckets, but I am not sure how to capture the bucket names from the except block so I can use them to add a tag later.
If you declare a list outside of your try/except, you can access it later on.
E.g.
import boto3
from botocore.exceptions import ClientError
#this is our new list
buckets = []
# Retrieve the list of existing buckets
s3 = boto3.client('s3')
response = s3.list_buckets()
# Find out unencrypted bucket list
for bucket in response['Buckets']:
    try:
        enc = s3.get_bucket_encryption(Bucket=bucket["Name"])
    except ClientError as e:
        if e.response['Error']['Code'] == 'ServerSideEncryptionConfigurationNotFoundError':
            # add the bucket name to our new list
            buckets.append(bucket['Name'])
            print('Bucket with no server-side encryption: %s' % (bucket['Name']))
        else:
            print("Bucket with unexpected error: %s, unexpected error: %s" % (bucket['Name'], e))

# now you can use the "buckets" variable and it will contain all the unencrypted buckets
for bucket in buckets:
    print(bucket)
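To finish the original requirement, a minimal sketch of tagging each collected bucket could look like this; the tag key and value ('Encrypted'/'False') are assumptions, and note that put_bucket_tagging replaces any existing tags on the bucket rather than appending to them:
# Tag every unencrypted bucket; the key/value pair is a made-up example.
for name in buckets:
    s3.put_bucket_tagging(
        Bucket=name,
        Tagging={'TagSet': [{'Key': 'Encrypted', 'Value': 'False'}]}
    )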
I have a Flask Python REST API which is called by another Flask REST API.
The input to my API is one parquet file (a FileStorage object) plus ECS connection and bucket details.
I want to save the parquet file to ECS in a specific folder using boto or boto3.
The code I have tried:
# imports assumed from context
from tempfile import NamedTemporaryFile
import boto.s3.key
from boto.s3.connection import S3Connection, OrdinaryCallingFormat

def uploadFileToGivenBucket(self, inputData, file):
    BucketName = inputData.ecsbucketname
    calling_format = OrdinaryCallingFormat()
    client = S3Connection(inputData.access_key_id, inputData.secret_key, port=inputData.ecsport,
                          host=inputData.ecsEndpoint, debug=2,
                          calling_format=calling_format)
    #client.upload_file(BucketName, inputData.filename, inputData.folderpath)
    bucket = client.get_bucket(BucketName, validate=False)
    key = boto.s3.key.Key(bucket, inputData.filename)
    fileName = NamedTemporaryFile(delete=False, suffix=".parquet")
    file.save(fileName)
    with open(fileName.name) as f:
        key.send_file(f)
But it is not working, and it gives me an error like:
signature_host = '%s:%d' % (self.host, port)
TypeError: %d format: a number is required, not str
I tried Google but had no luck. Can anyone help me with this or share any sample code for the same?
After a lot of trial and error and time, I finally got the solution. I am posting it for everyone else who is facing the same issue. (Incidentally, the TypeError above happens because the port is passed as a string where boto's '%d' format expects an integer.)
You need to use Boto3, and here is the code...
import logging
from tempfile import NamedTemporaryFile

import boto3
from botocore.exceptions import ClientError

def uploadFileToGivenBucket(self, inputData, file):
    BucketName = inputData.ecsbucketname
    f = NamedTemporaryFile(delete=False, suffix=".parquet")
    file.save(f)
    endpointurl = "<your endpoints>"
    s3_client = boto3.client('s3', endpoint_url=endpointurl,
                             aws_access_key_id=inputData.access_key_id,
                             aws_secret_access_key=inputData.secret_key)
    try:
        # note the trailing slash so the filename lands inside the folder
        newkey = 'yourfolderpath/anotherfolder/' + inputData.filename
        response = s3_client.upload_file(f.name, BucketName, newkey)
    except ClientError as e:
        logging.error(e)
        return False
    return True
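As a side note, the temp file may not be needed at all: assuming file is a werkzeug FileStorage object, as in the question, boto3's upload_fileobj can stream it directly. A minimal sketch, reusing s3_client, BucketName, and newkey from above:
# Stream the FileStorage straight to the bucket, skipping the temp file.
file.stream.seek(0)  # rewind in case the stream was already read
s3_client.upload_fileobj(file.stream, BucketName, newkey)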
So I have a file.csv in my bucket 'test'. I'm creating a new session and I want to download the contents of this file:
import boto3

session = boto3.Session(
    aws_access_key_id=KEY,
    aws_secret_access_key=SECRET_KEY
)
s3 = session.resource('s3')
obj = s3.Bucket('test').objects.filter(Prefix='file.csv')
This returns a collection, but is there a way to fetch the file directly? Without any loops, I want to do something like:
s3.Bucket('test').objects.get(key='file.csv')
I could achieve the same result without passing credentials like this:
s3 = boto3.client('s3')
obj = s3.get_object(Bucket='test', Key='file.csv')
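get_object returns a dict whose 'Body' entry is a streaming object, so the file contents can be read directly; for example:
# 'Body' is a StreamingBody; read() returns the raw bytes of file.csv.
content = obj['Body'].read().decode('utf-8')
print(content)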
If you take a look at the client method:
import boto3
s3_client = boto3.client('s3')
s3_client.download_file('mybucket', 'hello.txt', '/tmp/hello.txt')
and the resource method:
import boto3
s3 = boto3.resource('s3')
s3.meta.client.download_file('mybucket', 'hello.txt', '/tmp/hello.txt')
you'll notice that you can convert from the resource to the client with meta.client.
So, combine it with your code to get:
session = boto3.Session(aws_access_key_id=KEY, aws_secret_access_key=SECRET_KEY)
s3 = session.resource('s3')
s3.meta.client.download_file('mybucket', 'hello.txt', '/tmp/hello.txt')
I like mpu.aws.s3_download, but I'm biased ;-)
It works like this:
import os
import boto3
def s3_download(bucket_name, key, destination, profile_name, exists_strategy='raise'):
    session = boto3.Session(profile_name=profile_name)
    s3 = session.resource('s3')
    if os.path.isfile(destination):
        if exists_strategy == 'raise':
            raise RuntimeError('File \'{}\' already exists.'
                               .format(destination))
        elif exists_strategy == 'abort':
            return
    s3.Bucket(bucket_name).download_file(key, destination)
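A usage example, assuming a configured AWS profile named 'default' (the bucket and key are placeholders):
s3_download('mybucket', 'hello.txt', '/tmp/hello.txt', profile_name='default')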
For authentication, I recommend using environment variables. See boto3: Configuring Credentials for details.
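For reference, a minimal sketch of the environment-variable approach; the values are placeholders and are set outside the code (shell, CI config, etc.):
# With these set in the environment, boto3 finds credentials automatically:
#   AWS_ACCESS_KEY_ID=AKIA...
#   AWS_SECRET_ACCESS_KEY=...
#   AWS_DEFAULT_REGION=us-east-1
import boto3

s3 = boto3.resource('s3')  # no explicit keys or profile needed
s3.Bucket('mybucket').download_file('hello.txt', '/tmp/hello.txt')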
You can use the following boto3 method:
download_file(Bucket, Key, Filename, ExtraArgs=None, Callback=None, Config=None)
s3 = boto3.resource('s3')
s3.meta.client.download_file('mybucket', 'hello.txt', '/tmp/hello.txt')
Find more details here: download_file()
I want to convert m4a files uploaded to S3 to mp3. The files will only be 15 seconds max, so using Elastic Transcoder would be overkill. I downloaded a static binary from https://www.johnvansickle.com/ffmpeg/, but I am very new to AWS and I'm still not sure how uploading binaries works. How would I include it so that I could convert a file?
import boto3
import urllib.parse

print('Loading function')

s3 = boto3.client('s3')

def lambda_handler(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
    try:
        # convert to mp3
        # upload to bucket
        pass  # placeholder -- see the sketch below
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
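For the conversion itself, one common pattern is to bundle the static ffmpeg binary in the deployment package (or a Lambda layer), download the triggering object to /tmp, shell out to ffmpeg, and upload the result. A minimal sketch under those assumptions, reusing the s3 client above; the ./bin/ffmpeg path, the output naming, and the packaging step are all assumptions, and the binary must be marked executable:
import os
import subprocess

# Assumed location of the bundled static ffmpeg binary inside the package.
FFMPEG = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'bin', 'ffmpeg')

def convert_and_upload(bucket, key):
    # /tmp is the only writable path in the Lambda environment.
    local_m4a = '/tmp/input.m4a'
    local_mp3 = '/tmp/output.mp3'
    s3.download_file(bucket, key, local_m4a)
    # -y overwrites any existing output; ffmpeg infers formats from extensions.
    subprocess.check_call([FFMPEG, '-y', '-i', local_m4a, local_mp3])
    # Upload next to the original with an .mp3 suffix (naming is an assumption).
    s3.upload_file(local_mp3, bucket, os.path.splitext(key)[0] + '.mp3')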
I cannot download a file, or even get a listing, of a public S3 bucket with boto3.
The code below works with my own bucket, but not with the public one:
def s3_list(bucket, s3path_or_prefix):
    bsession = boto3.Session(aws_access_key_id=settings.AWS['ACCESS_KEY'],
                             aws_secret_access_key=settings.AWS['SECRET_ACCESS_KEY'],
                             region_name=settings.AWS['REGION_NAME'])
    s3 = bsession.resource('s3')
    my_bucket = s3.Bucket(bucket)
    items = my_bucket.objects.filter(Prefix=s3path_or_prefix)
    return [ii.key for ii in items]
I get an AccessDenied error with this code. The bucket is not my own and I cannot set permissions on it, but I am sure it is open for public read.
I had a similar issue in the past. I found the key to this problem in https://github.com/boto/boto3/issues/134.
You can use an undocumented trick:
import boto3
import botocore.handlers

def s3_list(bucket, s3path_or_prefix, public=False):
    bsession = boto3.Session(aws_access_key_id=settings.AWS['ACCESS_KEY'],
                             aws_secret_access_key=settings.AWS['SECRET_ACCESS_KEY'],
                             region_name=settings.AWS['REGION_NAME'])
    client = bsession.client('s3')
    if public:
        client.meta.events.register('choose-signer.s3.*', botocore.handlers.disable_signing)
    result = client.list_objects(Bucket=bucket, Delimiter='/', Prefix=s3path_or_prefix)
    return [obj['Prefix'] for obj in result.get('CommonPrefixes')]
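Alternatively, botocore exposes a documented way to create an anonymous client via the UNSIGNED signature version, which achieves the same thing without registering an event handler. A minimal sketch; the bucket and prefix are placeholders:
import boto3
from botocore import UNSIGNED
from botocore.client import Config

# All requests from this client are sent unsigned, i.e. anonymously.
client = boto3.client('s3', config=Config(signature_version=UNSIGNED))
result = client.list_objects(Bucket='some-public-bucket', Prefix='some/prefix/')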