How can I use boto3 resource to read a KMS encrypted file from S3 bucket?
Below is the snippet that I am using to read a non-encrypted file -
s3 = boto3.resource('s3')
obj = s3.Object(bucket_name, key)
body = obj.get()['Body'].read()
print(' body = {}'.format(body))
There's a helpful answer at Do I need to specify the AWS KMS key when I download a KMS-encrypted object from Amazon S3?
No, you don’t need to specify the AWS KMS key ID when you download an
SSE-KMS-encrypted object from an S3 bucket. Instead, you need the
permission to decrypt the AWS KMS key.
So, you don't need to provide any KMS information on a GetObject request (which is what the boto3 resource-level methods are doing under the covers), unless you're using SSE-C (customer-provided keys). You just need permission to use the KMS key for decryption; S3 and KMS will do the rest for you.
You can configure the IAM policy associated with the Lambda function’s IAM role per the linked article.
Related
I need to download a file from this URL https://desafio-rkd.s3.amazonaws.com/disney_plus_titles.csv with Python. I tried to do it with requests.get, but it returns access denied. I understand that I have to authenticate. I have the access key and the secret key, but I don't know how to do it.
Help me, please?
The preferred way would be to use the boto3 library for Amazon S3. It has a download_file() command, for which you would use:
import boto3
s3_client = boto3.client('s3')
s3_client.download_file('desafio-rkd', 'disney_plus_titles.csv', 'disney_plus_titles.csv')
The parameters are: Bucket, Key, local filename to use when saving the file
Also, you will need to provide an Access Key and Secret Key. The preferred way to do this is to store them in a credentials file. This can be done by using the AWS Command-Line Interface (CLI) aws configure command.
See: Credentials — Boto3 documentation
I need to upload a csv directly to AWS S3 with public access.
My current code is:
import boto3
s3 = boto3.resource('s3', aws_access_key_id='xxx', aws_secret_access_key='yyy')
s3.Bucket('test1234542').upload_file('C:/Users/output1.csv', 'output1.csv')
Unfortunately the permission is private, and I don't know how to change the code to upload it with public access directly.
At the moment I have to go to the bucket manually, open the "Permissions" tab, click "Public Access", and then tick "Read object".
Does someone know a python code to add the public access permission?
Best
Michi
When you upload a file to AWS S3, it is private by default.
To give public access to the file, first change the public access settings of the bucket so that changes to its objects' permissions are allowed.
Change the public access settings of the bucket.
Code to upload a file with public access:
import boto3
s3 = boto3.resource('s3')
s3.Bucket('bucket_name').upload_file('data.csv', 'data.csv')
file_object = s3.Bucket('bucket_name').Object('data.csv')
print(file_object.Acl().put(ACL='public-read'))
Hope it helps!
You could use this to make the S3 bucket public. Make sure your EC2 instance has the correct instance profile attached and that it has the right permissions to interact with S3 (possibly S3 full access). Use the following link and let me know if it helps!
S3 Bucket Access Public
Provide the ACL argument in your put_object statement and the uploaded object will be public. Refer to this documentation.
client.put_object(
    ACL='public-read',
    Body='file that is to be uploaded',
    Bucket='name of the bucket',
    Key='object key'
)
I am trying to upload the image to S3 and then have AWS Rekognition fetch it from S3 for face detection, but Rekognition cannot do that.
Here is my code - uploading and then detecting:
import boto3
s3 = boto3.client('s3')
s3.put_object(
    ACL='public-read',
    Body=open('/Users/1111/Desktop/kitten800300/kitten.jpeg', 'rb'),
    Bucket='mobo2apps',
    Key='kitten_img.jpeg'
)
rekognition = boto3.client('rekognition')
response = rekognition.detect_faces(
    Image={
        'S3Object': {
            'Bucket': 'mobo2apps',
            'Name': 'kitten_img.jpeg',
        }
    }
)
this produces an error:
Unable to get object metadata from S3. Check object key, region and/or access permissions.
Why is that?
About the permissions: I am authorized with AWS root access keys, so I have full access to all resources.
Here are the few things that you can do:
Make sure the region of the S3 bucket is the same as the one you call Rekognition in; otherwise it won't work. The S3 service is global, but every bucket is created in a specific region, and your AWS clients should use that same region.
Make sure the access keys of the user or role have the right set of permissions for the resource.
Make sure the file is actually uploaded.
Make sure there is no bucket policy applied that revokes access.
You can enable logging on your S3 bucket to see errors.
Make sure the bucket is not versioned. If versioned, specify the object version.
Make sure the object has the correct set of ACLs defined.
If the object is encrypted, make sure you have permission to use that KMS key to decrypt the object.
You may have to wait a while until the image upload is done.
The code runs through smoothly, so your jpeg starts uploading, and Rekognition begins trying to detect the face before the upload is finished. Since the upload is not complete when detect_faces runs, Rekognition cannot find the object in your S3 bucket. Add a short wait between the two calls.
Is there a way to generate a presigned URL for a private S3 file encrypted with a KMS key, using Python or the AWS CLI?
I searched the boto3 library docs and the AWS CLI docs, but I didn't find anything about KMS.
Thank you in advance.
I have been given a bucket name and an ARN, as below:
arn:aws:iam::<>:user/user-name
I was also given an access key.
I know that this can be done using boto.
Connect to s3 bucket using IAM ARN in boto3
As in the above link, do I need to use 'sts'?
If so, why was I provided with an access key?
First, I recommend you install the AWS Command-Line Interface (CLI), which provides a command-line for accessing AWS.
You can then store your credentials in a configuration file by running:
aws configure
It will prompt you for the Access Key and Secret Key, which will be stored in a credentials file.
Then, you will want to refer to S3 — Boto 3 documentation to find out how to access Amazon S3 from Python.
Here's some sample code:
import boto3
client = boto3.client('s3', region_name='ap-southeast-2')  # Change as appropriate
client.upload_file('/tmp/hello.txt', 'mybucket', 'hello.txt')