Upload multiple PDF files through API Gateway and Lambda (AWS, Python)

I am having an issue passing the file name while uploading to S3 through API Gateway and Lambda. How can I pass file names dynamically, so that users can upload multiple files and the original file names are preserved when they land in S3? Here is my code:
import base64
import json
import logging

import boto3

logger = logging.getLogger()

def lambda_handler(event, context):
    try:
        s3 = boto3.client("s3")
        content = base64.b64decode(event["body"])
        s3.put_object(Bucket=BUCKET_NAME, Key="test.pdf", Body=content)
        return {
            "statusCode": 200,  # API Gateway expects "statusCode", not "statuscode"
            "body": json.dumps("complete"),
        }
    except Exception as e:
        logger.error("Error uploading files: {}".format(e))
        raise
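One way to make the key dynamic is to have the client send the original file name alongside the body, for example as a query string parameter. A minimal sketch, assuming a Lambda proxy integration (the bucket name and the `filename` parameter are my own choices, not from the original code):

```python
import base64
import json

BUCKET_NAME = "my-upload-bucket"  # hypothetical bucket name

def get_filename(event):
    # With a Lambda proxy integration, API Gateway places query string
    # parameters under event["queryStringParameters"] (None when absent).
    params = event.get("queryStringParameters") or {}
    return params.get("filename", "upload.pdf")

def lambda_handler(event, context):
    import boto3  # available in the Lambda runtime
    s3 = boto3.client("s3")
    content = base64.b64decode(event["body"])
    key = get_filename(event)
    s3.put_object(Bucket=BUCKET_NAME, Key=key, Body=content)
    return {"statusCode": 200, "body": json.dumps(f"uploaded {key}")}
```

The client would then call e.g. `POST /upload?filename=report.pdf` once per file; for true multi-file requests in a single call you would need to parse a multipart body instead.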

Related

CORS error appears when I upload multiple files at once to an S3 bucket with boto3

I have a question about S3 uploads; I use the boto3 library in Python to connect to S3.

def get_client():
    return boto3.client(
        's3',
        aws_access_key_id='AWS_ACCESS_KEY',
        aws_secret_access_key='AWS_SECRET_KEY',
    )
My multiple-file upload code is below:
for item in img_files:
    file_type = ''
    uuid_key = uuid.uuid4()
    fs = FileSystemStorage()
    filename = fs.save(f'{S3_TEMP_FOLDER}{uuid_key}-{item.name}', item)
    file_o_name, file_o_ext = os.path.splitext(filename)
    if file_o_ext.lower() in ['.jpg', '.png', '.jpeg']:
        file_type = 'image'
    else:
        file_type = 'video'
    uploaded_file_path = fs.path(filename)
    try:
        s3_client.upload_file(uploaded_file_path, bucket_name, f'{folder_key}/{uuid_key}-{item.name}', ExtraArgs={'ACL': 'public-read'})
        uploaded_data = {
            "file_name": item.name,
            "file_key": f'{folder_key}/{uuid_key}-{item.name}',
            "bucket_name": bucket_name,
            "folder_id": get_folder_id.id,
            "created_by": request.user.username,
            "updated_by": request.user.username,
            "bucket_id": bucket.id,
            "file_type": file_type
        }
        S3FileManagement.objects.create(**uploaded_data)
    except Exception as e:
        logger.error({'message': str(e)})
        logger.error({'message': S3_FOLDER_MESSAGE['S3_FILE_UPLOAD_FAILED']})
        return Response({'message': S3_FOLDER_MESSAGE['S3_FILE_UPLOAD_FAILED']}, status=status.HTTP_400_BAD_REQUEST)
When I upload about 5-10 files at a time there is no problem, but when I upload more than 30 files, some of them fail with 502/CORS errors. Not all of them fail: for example, files 0-15 upload normally, file 16 errors, files 17-25 upload normally, file 26 errors, and files 27-30 upload normally.
In my Django settings I have allowed CORS:
CORS_ORIGIN_ALLOW_ALL = True
CORS_ALLOWED_ORIGINS = [
    '..{url}..vercel.app',
]
CSRF_TRUSTED_ORIGINS = [
    '..{url}..vercel.app',
]
Additionally, I am running the Python backend on aaPanel. Are any more settings needed?
For the frontend I use ReactJS and Ant Design.
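Intermittent 502s under load are often a proxy or worker timeout rather than a browser-side CORS problem (the browser reports a CORS error whenever the failed response lacks CORS headers). One way to take the backend out of the upload path entirely is to hand the frontend presigned URLs and let it PUT directly to S3. A sketch, assuming the same `folder_key`/UUID key scheme as above (the helper names are my own, and the bucket would also need its own CORS configuration allowing PUT from the site's origin):

```python
import uuid

def build_object_key(folder_key: str, filename: str) -> str:
    # Mirrors the key scheme used above: '<folder>/<uuid>-<original name>'
    return f"{folder_key}/{uuid.uuid4()}-{filename}"

def presign_upload(s3_client, bucket_name: str, key: str, expires: int = 3600) -> str:
    # The browser PUTs the file straight to S3 with this URL, so large
    # batches never pass through Django or the aaPanel proxy.
    return s3_client.generate_presigned_url(
        "put_object",
        Params={"Bucket": bucket_name, "Key": key},
        ExpiresIn=expires,
    )
```

The Django view would then only generate keys and URLs (cheap and fast), and record the metadata once the frontend confirms each upload.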

Change directory of xlsx file in s3 bucket using AWS Lambda

The goal of my code is to change the directory of a file every 24 hours (because every day a new one is created by another Lambda function). I want to get the current file from my S3 bucket and write it to another directory in the same bucket. Currently this line does not work: s3.put_object(Body=response, Bucket=bucket, Key=fileout), and I get this error: "errorMessage": "Parameter validation failed:\nInvalid type for parameter Body", "errorType": "ParamValidationError". What does the error mean, and what is needed in order to store the response in the history directory?
import boto3
import json

s3 = boto3.client('s3')
bucket = "some-bucket"

def lambda_handler(event, context):
    file = 'latest/some_file.xlsx'
    response = s3.get_object(Bucket=bucket, Key=file)
    fileout = 'history/some_file.xlsx'
    s3.put_object(Body=response, Bucket=bucket, Key=fileout)
    return {
        'statusCode': 200,
        'body': json.dumps(data),
    }
The response variable in your code holds the entire get_object response dict, not just the xlsx file itself. Get the Body from the response (and read it into bytes) before passing it to put_object:

response = s3.get_object(Bucket=bucket, Key=file)['Body'].read()
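Since the source and destination are in the same bucket, an alternative is a server-side copy with copy_object, which avoids downloading the body at all. A sketch, assuming the same bucket and keys as above (the helper names are my own):

```python
def copy_source(bucket: str, key: str) -> dict:
    # CopySource in the dict form that boto3's copy_object accepts
    return {"Bucket": bucket, "Key": key}

def archive_file(s3, bucket: str, src_key: str, dst_key: str, delete_source: bool = False):
    # Server-side copy: S3 duplicates the object itself, so the Lambda
    # never reads the bytes and there is no Body parameter to get wrong.
    s3.copy_object(Bucket=bucket, Key=dst_key,
                   CopySource=copy_source(bucket, src_key))
    if delete_source:
        # optionally turn the copy into a move
        s3.delete_object(Bucket=bucket, Key=src_key)
```

Usage would be archive_file(s3, bucket, 'latest/some_file.xlsx', 'history/some_file.xlsx').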

How to save a file in a folder within a S3 bucket using Flask?

I am trying to save images in a folder within my S3 bucket, but they always land in the root of the bucket, not the folder. Here is what my code looks like:
s3 = boto3.client(
    "s3",
    aws_access_key_id=Config.S3_ACCESS_KEY,
    aws_secret_access_key=Config.S3_SECRET_ACCESS_KEY
)

def upload_file_to_s3(file, acl="public-read"):
    try:
        s3.upload_fileobj(
            file,
            Config.S3_BUCKET,
            file.filename,
            ExtraArgs={
                "ACL": acl,
                "ContentType": file.content_type
            }
        )
    except Exception as e:
        print("Something happened: ", e)
        return e
    return f"{Config.S3_DOMAIN}profiles_name/{file.filename}"
profiles_name is the name of the folder. How do I achieve this?
This should work for you: prefix the object key with the folder path (S3 has no real directories; the "folder" is just part of the key):
s3.upload_fileobj(
    file,
    Config.S3_BUCKET,
    f"path/to/folder/within/bucket/{file.filename}",
    ExtraArgs={
        "ACL": acl,
        "ContentType": file.content_type
    }
)

Boto3 upload_file is silently failing

I am trying to upload a file to an S3 bucket, and I have used the following code to do so.
Code
accessKey = ''
secretKey = ''
session = boto3.Session(aws_access_key_id=accessKey, aws_secret_access_key=secretKey)
s3 = session.resource('s3')
try:
    response = s3.Object(bucket_name, 'sample.docx').upload_file(Filename='C:/Users/Anushka/Desktop/sample.docx')
except Exception as e:
    return e
The code does not do anything, not even raise an error, and if I print response, None gets printed to the shell. I am not able to understand what the problem with the code is.
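For what it's worth, upload_file returns None on success, so printing None does not by itself indicate a failure, and the except block returns the exception instead of raising it, which can silently swallow real errors. One way to confirm the object actually landed is a HEAD request; a sketch, assuming the same bucket and key (the helper name is my own):

```python
def upload_succeeded(s3, bucket: str, key: str) -> bool:
    """Confirm an object exists by issuing a HEAD request for it."""
    try:
        # head_object raises a ClientError (404/403) when the key is
        # absent, so a clean return means the object is in the bucket.
        s3.head_object(Bucket=bucket, Key=key)
        return True
    except Exception:
        return False
```

Here `s3` would be a boto3 S3 client (boto3.client("s3")); with the resource API, the equivalent check is s3.Object(bucket, key).load().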

Write Twitter data to s3 bucket

I am trying to write my streamed data (JSON format) to an S3 bucket. I am using the code below, but I am not able to write anything: there is no error while executing it, but no JSON files appear in S3.
import boto3
from tweepy.streaming import StreamListener

class TweetsListener(StreamListener):
    def __init__(self, path):
        self.path = path

    def on_data(self, data):
        try:
            s3 = boto3.resource('s3')
            s3.put_object(Bucket='bucket', Body=data.encode('UTF-8'), Key='/A/' + self.path + '/test.json')
            return True
        except BaseException as e:
            print("Error on_data: %s" % str(e))
            return True

    def on_error(self, status):
        print(status)
        return True
From what I can tell, you are trying to use the put_object action on boto3's S3 service resource instead of the S3 client.
The service resource does not have a put_object method, and the broad except in on_data returns True either way, so the stream keeps running as if nothing went wrong.
Also, you should remove the leading / in the Key, and make sure that your bucket is already created.
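Putting those fixes together, a sketch of the corrected write path (assuming a boto3 S3 client created elsewhere and an existing bucket; the helper names are my own):

```python
def build_key(path: str, filename: str = "test.json") -> str:
    # No leading slash: a key beginning with '/' puts the object under an
    # empty-named "folder", which is easy to miss in the S3 console.
    return f"A/{path}/{filename}"

def write_tweet(s3_client, bucket: str, path: str, data: str) -> None:
    # s3_client must come from boto3.client("s3"); the service resource
    # returned by boto3.resource("s3") has no put_object method.
    s3_client.put_object(
        Bucket=bucket,
        Key=build_key(path),
        Body=data.encode("utf-8"),
    )
```

Inside on_data you would then call write_tweet(boto3.client("s3"), "bucket", self.path, data), and let errors propagate (or at least log them) instead of returning True from the except block.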
