Getting AccessDenied when trying to retrieve a file from S3 - python

I am trying to retrieve a file from an S3 bucket (using boto3), but I keep getting "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied". Below is how I created the S3 bucket:
# S3
bucket = s3.Bucket(
    self,
    "testS3Bucket",
    bucket_name=f"test_s3_bucket",
    versioned=True,
)
bucket.grant_read_write(service_lambda)
and the method I used to get the file from S3:
def download_file(self, file_name, s3_bucket):
    try:
        file = self.s3.Bucket(s3_bucket).Object(file_name)
        file_content = file.get()["Body"].read()
    except ClientError as e:
        if e.response["Error"]["Code"] == "404":
            log.error("File does not exist for partner")
            return {}
        else:
            raise e
    except Exception as e:
        raise e
    return file_content.decode()
I even went as far as adding the "s3:*" action to the IAM policy statement, but I still got the same error. I was able to use the command below to successfully retrieve the file's metadata, so I don't think it is a permissions error.
aws s3api head-object --bucket test_s3_bucket --key configuration.txt
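One way to narrow this down, as a minimal sketch using the bucket and key names from the question: run HeadObject and GetObject back to back under the same credentials the failing code uses. Note that HeadObject is itself authorized by s3:GetObject, so the successful CLI call suggests the CLI may have been running under a different principal (your user) than the code (the Lambda role):

import boto3
from botocore.exceptions import ClientError

# Print which principal these credentials resolve to, so the CLI test
# and the failing code can be compared like-for-like.
print(boto3.client("sts").get_caller_identity()["Arn"])

s3 = boto3.client("s3")
for op in ("head_object", "get_object"):
    try:
        getattr(s3, op)(Bucket="test_s3_bucket", Key="configuration.txt")
        print(op, "OK")
    except ClientError as e:
        print(op, "failed:", e.response["Error"]["Code"])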

Related

ExpiredToken error when using python moto lib to mock s3

I am new to Python. Does anyone know how moto mock tests work? I am testing code that fetches data from S3, converts it, and uploads the result back to S3. I used the moto library to mock S3, but it raises "An error occurred (ExpiredToken) when calling the ListObjects operation" when my code calls S3.
Do I need to mock AWS credentials? How should I mock them? (I checked the moto library and there is no such thing as mock_arn/mock_credential.)
Thank you in advance.
Here is my code: inside the parse_requests method, it fetches data from S3, converts it, then uploads to S3.
import boto3
from moto import mock_logs, mock_s3
from unittest import TestCase

class Test_processing_data(TestCase):
    def setUp(self):
        self.mock_s3 = mock_s3()
        self.mock_s3.start()
        self.mock_logs = mock_logs()
        self.mock_logs.start()
        self.bucket_region = "us-east-1"
        self.bucket_name = "test-bucket"
        self.s3_client = boto3.client("s3", region_name=self.bucket_region)
        self.s3_client.create_bucket(Bucket=self.bucket_name)

    @mock_s3
    def test_parse_requests(self):
        # parse_requests is the code under test (defined elsewhere)
        bucket_name = "test-bucket"
        prefix = "test/model/main/"
        execution_date = "2022/09/13/12"
        parse_requests(execution_date, bucket_name, prefix)
Here is the error message:
self = <botocore.client.S3 object at 0x10abf18e0>
operation_name = 'ListObjects'
api_params = {'Bucket': 'campaign-performance-forecasting-offline', 'EncodingType': 'url', 'Prefix': 'test/model/main/2022/09/13/12'}
...
        if http.status_code >= 300:
            error_code = parsed_response.get("Error", {}).get("Code")
            error_class = self.exceptions.from_code(error_code)
>           raise error_class(parsed_response, operation_name)
E           botocore.exceptions.ClientError: An error occurred (ExpiredToken) when calling the ListObjects operation: The provided token has expired.

../../venv/lib/python3.8/site-packages/botocore/client.py:914: ClientError
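The ExpiredToken error here usually means boto3 is still picking up real, stale credentials from your environment instead of moto's fakes. moto's documentation recommends setting dummy credentials before any boto3 client or session is created; a minimal sketch (the values are arbitrary placeholders):

import os

# Fake credentials so the code under test can never reach real AWS;
# set these before boto3 creates any client or session.
os.environ["AWS_ACCESS_KEY_ID"] = "testing"
os.environ["AWS_SECRET_ACCESS_KEY"] = "testing"
os.environ["AWS_SECURITY_TOKEN"] = "testing"
os.environ["AWS_SESSION_TOKEN"] = "testing"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"

Also make sure parse_requests creates its boto3 client after mock_s3.start() has run (or inside the decorated test); a client created at import time, before the mock starts, will still talk to real AWS.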

Unable to download file from S3 because "A client error (403) occurred when calling the HeadObject operation: Forbidden"

I am trying to download a file from an S3 bucket I created through the AWS CDK, but I get this error: "A client error (403) occurred when calling the HeadObject operation: Forbidden". At first I thought it was because I hadn't added the s3:GetObject action to the IAM policy statement, but I still get the error after adding it. Below is how I created the bucket:
# S3
bucket = s3.Bucket(
    self, "testS3Bucket", bucket_name=f"test_s3_bucket"
)
service_lambda.add_to_role_policy(
    iam.PolicyStatement(
        effect=iam.Effect.ALLOW,
        actions=[
            "s3:ListBucket",
            "s3:PutObject",
            "s3:PutObjectAcl",
            "s3:GetObject",
            "s3:HeadObject",
        ],
        resources=[bucket.arn_for_objects("*")],
    )
)
Here is the code where I download the file from S3:
def download_file(self, file_name, s3_bucket):
    try:
        file = self.s3.Object(s3_bucket, file_name).load()
    except ClientError as e:
        if e.response["Error"]["Code"] == "404":
            log.error("File does not exist for partner")
            return {}
        else:
            raise e
    except Exception as e:
        raise e
    return file
Does anybody know how I can get past this issue?
A simpler way to grant your lambda appropriate permissions would be something like this:
bucket = s3.Bucket(
    self, "testS3Bucket", bucket_name=f"test_s3_bucket"
)
bucket.grant_read_write(service_lambda.role)
Based on the docs:

If an encryption key is used, permission to use the key for encrypt/decrypt will also be granted.

Give that a try and see if you still receive a permissions error.
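If you keep the explicit PolicyStatement instead, two details are worth checking: s3:ListBucket is evaluated against the bucket ARN itself, not against object ARNs, and s3:HeadObject is not a real IAM action (HeadObject is authorized by s3:GetObject). A sketch under the question's names (service_lambda, bucket):

service_lambda.add_to_role_policy(
    iam.PolicyStatement(
        effect=iam.Effect.ALLOW,
        # GetObject/PutObject act on objects, so they take the object ARNs;
        # HeadObject is covered by s3:GetObject.
        actions=["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
        resources=[bucket.arn_for_objects("*")],
    )
)
service_lambda.add_to_role_policy(
    iam.PolicyStatement(
        effect=iam.Effect.ALLOW,
        # ListBucket acts on the bucket itself, so it needs the bucket ARN.
        actions=["s3:ListBucket"],
        resources=[bucket.bucket_arn],
    )
)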

S3 Default server side encryption on large number of buckets using Python boto3

Hi, I am trying to turn on default S3 encryption on all the buckets in an account using the Python boto3 script below.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
response = s3.list_buckets()

for bucket in response['Buckets']:
    enc = s3.get_bucket_encryption(Bucket=bucket['Name'])
    s3.put_bucket_encryption(
        Bucket=bucket['Name'],
        ServerSideEncryptionConfiguration={
            'Rules': [
                {
                    'ApplyServerSideEncryptionByDefault': {
                        'SSEAlgorithm': 'AES256'
                    }
                },
            ]
        }
    )
But I am struggling with my code, which is not working; it gives this error:
File "apply.py", line 10, in <module>
enc = s3.get_bucket_encryption(Bucket=bucket['Name'])
File "/Users/hhaqqani/Library/Python/2.7/lib/python/site-packages/botocore/client.py", line 272, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/Users/hhaqqani/Library/Python/2.7/lib/python/site-packages/botocore/client.py", line 576, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (ServerSideEncryptionConfigurationNotFoundError) when calling the GetBucketEncryption operation: The server side encryption configuration was not found
Make sure you pass the right bucket name: your call to put_bucket_encryption should use Bucket=bucket['Name'], not Bucket=enc.
Note also that the call to get_bucket_encryption will throw an exception if the bucket does not actually have encryption configured. While that might seem odd, that's the way it works (see boto3/issues/1899 for more details). So, to handle this potential exception:
SSECNF = 'ServerSideEncryptionConfigurationNotFoundError'

try:
    bucket = client.get_bucket_encryption(Bucket=bucket['Name'])
    # check current encryption here, if it's not what you want then update it
    # check bucket['ServerSideEncryptionConfiguration']['Rules']
except client.exceptions.ClientError as e:
    if e.response['Error']['Code'] == SSECNF:
        s3.put_bucket_encryption(...)
    else:
        print("Unexpected error: %s" % e)
Please see the fixed code below; thanks @jarmod for the quick help.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
response = s3.list_buckets()
client = boto3.client('s3')
SSECNF = 'ServerSideEncryptionConfigurationNotFoundError'

for bucket in response['Buckets']:
    try:
        bucket = client.get_bucket_encryption(Bucket=bucket['Name'])
        # check current encryption here, if it's not what you want then update it
        # check bucket['ServerSideEncryptionConfiguration']['Rules']
    except client.exceptions.ClientError as e:
        if e.response['Error']['Code'] == SSECNF:
            s3.put_bucket_encryption(
                Bucket=bucket['Name'],
                ServerSideEncryptionConfiguration={
                    'Rules': [
                        {
                            'ApplyServerSideEncryptionByDefault': {
                                'SSEAlgorithm': 'AES256'
                            }
                        },
                    ]
                }
            )
        else:
            print("Unexpected error: %s" % e)

BadDigest when calling the PutObject operation s3 boto

This code works fine
import uuid

import boto3

def upload(request):
    try:
        get_file = request.POST['file'].file
        d_filename = 'foo-' + uuid.uuid4().hex
        s3 = boto3.resource('s3')
        s3.Bucket('Bucket_name').put_object(Key=d_filename, Body=get_file, ContentType=ContentType)
        return d_filename
    except Exception as e:
        log.error(str(e))
        return 'error'
but when I want to put the md5 hash of the file in the filename, it throws the error "An error occurred (BadDigest) when calling the PutObject operation (reached max retries: 4): The Content-MD5 you specified did not match what we received."
import hashlib

import boto3

def upload(request):
    try:
        get_file = request.POST['file'].file
        d_filename = 'foo-' + str(hashlib.md5(get_file.read()).hexdigest())
        s3 = boto3.resource('s3')
        s3.Bucket('Bucket_name').put_object(Key=d_filename, Body=get_file, ContentType=ContentType)
        return d_filename
    except Exception as e:
        log.error(str(e))
        return 'error'
I am not trying to calculate the md5 of a file that is already uploaded, or trying to set it on the file; I just want the md5 in the name of the file.
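The usual explanation for this BadDigest: get_file.read() leaves the file pointer at end-of-file, so the body actually streamed to S3 no longer matches the digest computed for the request. A minimal sketch of the common fix, rewinding before upload (log and ContentType are assumed to exist as in the question):

import hashlib

import boto3

def upload(request):
    try:
        get_file = request.POST['file'].file
        d_filename = 'foo-' + hashlib.md5(get_file.read()).hexdigest()
        get_file.seek(0)  # rewind: read() consumed the stream
        s3 = boto3.resource('s3')
        s3.Bucket('Bucket_name').put_object(Key=d_filename, Body=get_file, ContentType=ContentType)
        return d_filename
    except Exception as e:
        log.error(str(e))
        return 'error'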

S3ResponseError: 403 Forbidden using boto

I have a script that copies files from one S3 account to another S3 account. It was working before, that's for sure. Then I tried it today and it doesn't work any more; it gives me the error S3ResponseError: 403 Forbidden. I'm 100% sure the credentials are correct, and I can download keys from both accounts manually using the AWS console.
Code
def run(self):
    while True:
        # Remove and return an item from the queue
        key_name = self.q.get()
        k = Key(self.s_bucket, key_name)
        d_key = Key(self.d_bucket, k.key)
        if not d_key.exists() or k.etag != d_key.etag:
            print 'Moving {file_name} from {s_bucket} to {d_bucket}'.format(
                file_name=k.key,
                s_bucket=source_bucket,
                d_bucket=dest_bucket
            )
            # Create a new key in the bucket by copying another existing key
            acl = self.s_bucket.get_acl(k)
            self.d_bucket.copy_key(d_key.key, self.s_bucket.name, k.key, storage_class=k.storage_class)
            d_key.set_acl(acl)
        else:
            print 'File exist'
        self.q.task_done()
Error:
File "s3_to_s3.py", line 88, in run
self.d_bucket.copy_key( d_key.key, self.s_bucket.name, k.key, storage_class=k.storage_class)
File "/usr/lib/python2.7/dist-packages/boto/s3/bucket.py", line 689, in copy_key
response.reason, body)
S3ResponseError: S3ResponseError: 403 Forbidden
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>0729E8ADBD7A9E60</RequestId><HostId>PSbbWCLBtLAC9cjW+52X1fUSVErnZeN79/w7rliDgNbLIdCpc9V0bPi8xO9fp1od</HostId></Error>
Try this: copy the key from the source bucket to the destination bucket using boto's Key class.
source_key_name = 'image.jpg'  # for example
# returns a Key object
source_key = source_bucket.get_key(source_key_name)
# use Key.copy
source_key.copy(destination_bucket, source_key_name)
Regarding the copy function: you can set preserve_acl to True and the ACL will be copied from the source key.
Boto's Key.copy signature:
def copy(self, dst_bucket, dst_key, metadata=None,
         reduced_redundancy=False, preserve_acl=False,
         encrypt_key=False, validate_dst_bucket=True):
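Putting those together, a minimal usage sketch (the connections, bucket names, and credentials are placeholders, not from the question):

from boto.s3.connection import S3Connection

# One connection per account; credentials are placeholders.
src_conn = S3Connection('SRC_ACCESS_KEY', 'SRC_SECRET_KEY')
dst_conn = S3Connection('DST_ACCESS_KEY', 'DST_SECRET_KEY')

source_bucket = src_conn.get_bucket('source-bucket-name')
destination_bucket = dst_conn.get_bucket('dest-bucket-name')

source_key = source_bucket.get_key('image.jpg')
# preserve_acl=True copies the source ACL in one step, replacing the
# separate get_acl/set_acl round-trip from the original script.
source_key.copy(destination_bucket, source_key.name, preserve_acl=True)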
