Using Boto3 and Python: How to Make a Folder Public - python

I want to make a folder called img, which already exists within my private bucket, public. I'm using Boto3, and I just want to make this folder public, nothing else, using a script.
This is how I'm currently connecting to the bucket and how far I have got:
import boto3

ACCESS_KEY_ID = 'xxxxx'
ACCESS_KEY_SECRET = 'xxxx'
bucket_name = 'mybucket'
sourceDir = "../../docs/buildHTML/html/"
destDir = ''
r = boto3.setup_default_session(region_name='eu-west-1')
s3 = boto3.resource('s3', aws_access_key_id=ACCESS_KEY_ID, aws_secret_access_key=ACCESS_KEY_SECRET)
bucket = s3.Bucket(bucket_name)
So I have the bucket and this works. How do I now make the folder img that already exists public?

You need to add a policy to the bucket, something like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadImages",
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject"],
            "Resource": ["arn:aws:s3:::mybucket/abc/img/*"]
        }
    ]
}
You can do this through the AWS console, or any of the SDKs. In boto3, I think you do it like this:
bucket = s3.Bucket(bucket_name)
response = bucket.Policy().put(
    Policy='<policy string here>'
)
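Equivalently, here is a minimal sketch with the low-level client (the bucket name and prefix are placeholders, adjust them to your own bucket):
import json

import boto3

s3_client = boto3.client('s3', region_name='eu-west-1')

# Policy allowing anonymous read of everything under the img/ prefix only.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadImages",
        "Effect": "Allow",
        "Principal": "*",
        "Action": ["s3:GetObject"],
        "Resource": ["arn:aws:s3:::mybucket/abc/img/*"]
    }]
}

s3_client.put_bucket_policy(Bucket='mybucket', Policy=json.dumps(policy))
Note that the bucket's Block Public Access settings must allow public bucket policies, otherwise the call will be rejected.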

Related

How can we upload multiple images using a presigned_post URL in S3?

I am trying to upload multiple images to S3 from a React application using AWS API Gateway.
I have tried the approach below:
Set up an API Gateway endpoint that targets a Lambda function.
Lambda function code:
import json
import boto3

def lambda_handler(event, context):
    print(event)
    s3 = boto3.client('s3', region_name='us-east-1')
    bucket_name = 'testimagesbucketupload'
    URL = s3.generate_presigned_post(
        Bucket=bucket_name,
        Key="${filename}",
        # Conditions=[
        #     ["starts-with", "$success_action_redirect", ""],
        #     ["eq", "$userid", "test"],
        # ],
        ExpiresIn=3600)
    data = {"url": URL['url'], "fields": URL['fields']}
    print(type(data))
    # print(data)
    return data
Using the above code I am able to upload a single image from both the web and Postman, but now I want to upload multiple images using this URL, and I also want to retrieve the images for preview.
If anyone has done this, please help. Thanks in advance.
I tried presigned_post and presigned-url to achieve this, but I am still not able to.
You'll need to create one URL per image, but you can use a loop to create all of them. I think something like this could work for you:
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3', region_name='us-east-1')
    bucket_name = 'testimagesbucketupload'
    image_list = event['image_list']
    data = []
    for image in image_list:
        URL = s3.generate_presigned_post(
            Bucket=bucket_name,
            Key=image,
            ExpiresIn=3600)
        data.append({"url": URL['url'], "fields": URL['fields']})
    return data
Note that you need to pass a list of image names in the event.
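For example, a caller could invoke the handler with a test event and then POST each file to its presigned URL. This is only an illustrative sketch: the event shape and file names are assumptions, and in your setup the call would go through API Gateway rather than invoking the handler directly.
import requests

image_names = ["cat.jpg", "dog.jpg"]
presigned_posts = lambda_handler({"image_list": image_names}, None)

for name, post in zip(image_names, presigned_posts):
    with open(name, "rb") as f:
        # A presigned POST needs all returned form fields plus the file itself under the "file" key.
        resp = requests.post(post["url"], data=post["fields"], files={"file": (name, f)})
    print(name, resp.status_code)  # S3 returns 204 No Content on success by default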
For the preview, you could use a presigned URL to return the image as a temporarily public URL:
from botocore.client import Config
import boto3

s3 = boto3.client('s3', config=Config(signature_version='s3v4'), region_name="your_region")
presigned_url = s3.generate_presigned_url('get_object',
                                          Params={'Bucket': "your_bucket",
                                                  'Key': "your_file_key"},
                                          ExpiresIn=3600)
print(presigned_url)
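The returned presigned URL can then be used directly as the src of an image tag in the React app until it expires.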

How to point GOOGLE_APPLICATION_CREDENTIALS to a json file stored in an S3 Bucket?

I have my credentials stored in an S3 bucket and can access the file using the boto3 library, but how can I point os.environ['GOOGLE_APPLICATION_CREDENTIALS'] to the file stored in S3?
import json
import os

import boto3

client = boto3.client(
    "s3",
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key
)
credentials = client.get_object(Bucket=bucket_name, Key='name-of-file.json')
GOOGLE_CREDENTIALS = json.loads(credentials["Body"].read().decode('utf-8'))

# THIS DOES NOT WORK
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = GOOGLE_CREDENTIALS
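One common workaround (a minimal sketch, assuming the same access_key, secret_key and bucket_name as above) is to write the downloaded JSON to a temporary file and point the environment variable at that path, since GOOGLE_APPLICATION_CREDENTIALS expects a file path rather than a dict:
import os
import tempfile

import boto3

client = boto3.client(
    "s3",
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key
)
obj = client.get_object(Bucket=bucket_name, Key='name-of-file.json')

# Persist the credentials JSON to a temporary file; delete=False keeps it on disk
# after the handle is closed so the Google client libraries can read it later.
with tempfile.NamedTemporaryFile(mode='w', suffix='.json', delete=False) as f:
    f.write(obj["Body"].read().decode('utf-8'))
    credentials_path = f.name

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = credentials_path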

Upload multiple files from S3 to Frame IO

On file upload to S3, I am triggering a Lambda function which generates an S3 URL and creates a file in Frame.io. Whenever I try to upload many files at once to S3, the files are not created properly in Frame.io and a "Preview Unsupported" error is thrown (for mp4 files, which are supported by default). To fix this issue, I tried to use an index as a request parameter, which worked only for uploads of 2 or 3 files. If I try to upload more files, the same error arises. Please find the Lambda function code below.
import requests
import boto3
import json
import urllib.parse
import mimetypes
from botocore.config import Config
import os

s3_client = boto3.client('s3', config=Config(signature_version='s3v4'))
client = boto3.client('ssm')

def lambda_handler(event, context):
    print(event)
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    if not key.endswith('/'):
        if key.find('/') >= 0:
            temp_key = key.rsplit('/', 1)
            key = temp_key[1]
        print(key)
        size = event['Records'][0]['s3']['object']['size']
        frameioIndex = int(client.get_parameter(Name='/frameio/asset/index_Dev')['Parameter']['Value']) - 1
        print(frameioIndex)
        s3_url = s3_client.generate_presigned_url("get_object", Params={"Bucket": bucket, "Key": key})
        response = requests.post(
            os.environ['FRAMEIO_BASE_API_URL'] + "assets" + "/" + os.environ['FRAMEIO_PROJECT_ID'] + "/" + "children",
            data=json.dumps({"type": "file", "name": key, "filesize": size,
                             "filetype": mimetypes.guess_type(key)[0],
                             "source": {"url": s3_url}, "index": frameioIndex}),
            headers={"Authorization": "Bearer " + os.environ['FRAMEIO_TOKEN'],
                     "Content-type": "application/json"})
        client.put_parameter(Name='/frameio/asset/index_Dev', Value=str(frameioIndex), Type='String', Overwrite=True)
        print(response)
        return {
            'statusCode': 200,
            'body': json.dumps('Successfully uploaded the asset!')
        }
    return {
        'statusCode': 200,
        'body': json.dumps('Uploaded object is not a file!')
    }
The issue was resolved by changing the variable name 'key' to 'filename' on the line key = temp_key[1] and using filename in the requests API call. The issue occurred because I was overriding the full object key with the bare filename and then passing it to the generate_presigned_url method to generate the S3 URL.
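A minimal sketch of the corrected key handling (the names here are illustrative, not the exact fix from the post): keep the full S3 object key for generate_presigned_url and use only the trailing filename as the Frame.io asset name.
import mimetypes
import urllib.parse

def build_frameio_payload(s3_client, bucket, raw_key, size, index):
    key = urllib.parse.unquote_plus(raw_key, encoding='utf-8')   # full key, e.g. "uploads/clip.mp4"
    filename = key.rsplit('/', 1)[-1]                            # short name, e.g. "clip.mp4"
    s3_url = s3_client.generate_presigned_url("get_object", Params={"Bucket": bucket, "Key": key})
    return {
        "type": "file",
        "name": filename,                    # display name shown in Frame.io
        "filesize": size,
        "filetype": mimetypes.guess_type(filename)[0],
        "source": {"url": s3_url},           # presigned URL is built from the full key
        "index": index,
    }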

AWS presigned URLS location constraint is incompatible for the region specific endpoint this request was sent to

I am using a Lambda to create a pre-signed URL to download files that land in an S3 bucket.
The code works and I get a URL, but when trying to access it I get:
af-south-1 location constraint is incompatible for the region-specific endpoint this request was sent to.
Both the bucket and the Lambda are in the same region.
I'm at a loss as to what is actually happening; any ideas or solutions would be greatly appreciated.
My code is below:
import json
import boto3
import boto3.session

def lambda_handler(event, context):
    session = boto3.session.Session(region_name='af-south-1')
    s3 = session.client('s3')
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        url = s3.generate_presigned_url(ClientMethod='get_object',
                                        Params={'Bucket': bucket,
                                                'Key': key},
                                        ExpiresIn=400)
        print(url)
Set endpoint_url='https://s3.af-south-1.amazonaws.com' when creating the S3 client:
s3_client = session.client('s3',
                           region_name='af-south-1',
                           endpoint_url='https://s3.af-south-1.amazonaws.com')
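Presigned URLs generated from this client will then point at the regional endpoint, which is what the location-constraint error is complaining about.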
Could you please try using the boto3 client directly rather than via the session, and generate the pre-signed URL:
import boto3
import requests

# Get the service client.
s3 = boto3.client('s3', region_name='af-south-1')

# Generate the URL to get 'key-name' from 'bucket-name'
url = s3.generate_presigned_url(
    ClientMethod='get_object',
    Params={
        'Bucket': 'bucket-name',
        'Key': 'key-name'
    }
)
You could also have a look at these: 1 & 2, which describe the same issue.

Creating a CSV file and then attaching it to an email using boto3

I have a report I created. It basically pulls data, manipulates it, then sends the report to an S3 bucket. What I would like to know is how I can pull that CSV from the S3 bucket and email it out. I send it to S3 for long-term retention initially.
other code
..
..
..
copy_source = {'Bucket': target, 'Key': 'mycsv.csv'}
s3client.copy_object(CopySource=copy_source, Bucket=target, Key=dated_file)
s3client.delete_object(Bucket=target, Key='generic.csv')
I would like to attach the CSV located in the S3 bucket to the email using boto3, but something goes wrong.
Is it possible?
Let's say target = s3://mys3bucket.
UPDATE: I have found a solution using boto3's get_object.
This will send the email and attach the CSV to it:
import boto3
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

msg = MIMEMultipart()
new_body = "The following EC2 servers are up and running"
text_part = MIMEText(new_body, _subtype="html")
msg.attach(text_part)

filename = 'generic.csv'
msg["To"] = "randal1981#gmail.com"
msg["From"] = "randal1981#gmail.com"

# Pull the CSV from S3 and attach it to the message
s3_client = boto3.client('s3', 'us-west-1')
s3_object = s3_client.get_object(Bucket=target, Key=filename)
body = s3_object['Body'].read()

part = MIMEApplication(body)
part.add_header("Content-Disposition", 'attachment', filename=filename)
msg.attach(part)

ses_aws_client = boto3.client('ses', 'us-west-1')
ses_aws_client.send_raw_email(RawMessage={"Data": msg.as_bytes()})
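Note that if SES is still in sandbox mode, both the From and To addresses must be verified identities, otherwise send_raw_email will be rejected.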
I have posted in the edit of the original question how I was able to actually send the attachment via email. What I was trying to understand was how to pull the attachment from S3; I did not explain myself very well. I found a solution using the boto3 call get_object, and it worked well with my code. I hope it can help someone else.
