I used this Python code to upload a file to an S3 bucket from Lambda, and I'm getting an error:
{
  "errorMessage": "Syntax error in module 'lambda_function': (unicode error) 'unicodeescape' codec can't decode bytes in position 2-3: truncated \\UXXXXXXXX escape (lambda_function.py, line 10)",
  "errorType": "Runtime.UserCodeSyntaxError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\" Line 10\n    file_name= \"C:\\Users\\smanthriprag\\Pictures\\Screenshots\\s.jpeg\"\n"
  ]
}
import json
import boto3
from botocore.exceptions import ClientError

def lambda_handler(file_name, bucket, object_name=None):
    file_name = "C:\Users\smanthriprag\Pictures\Screenshots\s.jpeg"
    bucket = "serverlesswebapp0406"

    # If S3 object_name was not specified, use file_name
    if object_name is None:
        object_name = file_name

    # Upload the file
    s3_client = boto3.client('s3')
    try:
        response = s3_client.upload_file(file_name, bucket, object_name)
        print('Step 3: upload done')
    except ClientError as e:
        logging.error(e)
        return False
    return True['response']
An AWS Lambda function written in Python must use the following handler signature:
def lambda_handler(event, context):
If information is passed into the Lambda function, it is made available via the event parameter. The contents of event depend on how the function is triggered (e.g. triggered by S3, triggered by SQS, or run via an Invoke() command).
Your program has an incorrect definition for the handler function.
See: Lambda function handler in Python - AWS Lambda
I am trying to retrieve a file from an S3 bucket (using boto3), but I keep getting "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied". Below is how I created the S3 bucket:
# S3
bucket = s3.Bucket(
    self,
    "testS3Bucket",
    bucket_name=f"test_s3_bucket",
    versioned=True,
)
bucket.grant_read_write(service_lambda)
and the method I used to get the file from S3:
def download_file(self, file_name, s3_bucket):
    try:
        file = self.s3.Bucket(s3_bucket).Object(file_name)
        file_content = file.get()["Body"].read()
    except ClientError as e:
        if e.response["Error"]["Code"] == "404":
            log.error("File does not exist for partner")
            return {}
        else:
            raise e
    except Exception as e:
        raise e
    return file_content.decode()
I even went as far as adding the "s3:*" action to the IAM policy statement, but I still got the same error. I was able to use the command below to retrieve the file's metadata successfully, so I don't think it is a permissions error.
aws s3api head-object --bucket test_s3_bucket --key configuration.txt
The goal of my code is to change the directory of a file every 24 hours (because every day a new one is created by another Lambda function). I want to get the current file from my S3 bucket and write it to another directory in the same bucket. Currently, this line of the code does not work: s3.put_object(Body=response, Bucket=bucket, Key=fileout), and I get this error:
"errorMessage": "Parameter validation failed:\nInvalid type for parameter Body", "errorType": "ParamValidationError"
What does the error mean, and what is needed in order to be able to store the response in the history directory?
import boto3
import json

s3 = boto3.client('s3')
bucket = "some-bucket"

def lambda_handler(event, context):
    file = 'latest/some_file.xlsx'
    response = s3.get_object(Bucket=bucket, Key=file)
    fileout = 'history/some_file.xlsx'
    s3.put_object(Body=response, Bucket=bucket, Key=fileout)
    return {
        'statusCode': 200,
        'body': json.dumps(data),
    }
The response variable in your code stores more than just the actual xlsx file: get_object returns a dict of metadata in which the file content sits under the 'Body' key. You should get the body from the response and pass it to the put_object method.
response = s3.get_object(Bucket=bucket, Key=file)['Body']
I created a Lambda function that scans all the S3 buckets for names containing the word "log". For those buckets I apply a lifecycle rule: delete all kinds of files under the bucket once they are older than 30 days. But when I trigger the code I get this format error and couldn't fix it. Is my code or format wrong?
This is my python code:
import logging
import datetime, os, json
import boto3
from botocore.exceptions import ClientError

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    response = s3.list_buckets()

    # Output the bucket names
    print('Existing buckets:')
    for bucket in response['Buckets']:
        # print(f' {bucket["Name"]}')
        if 'log' in bucket['Name']:
            BUCKET = bucket["Name"]
            print(BUCKET)
            try:
                policy_status = s3.put_bucket_lifecycle_configuration(
                    Bucket=BUCKET,
                    LifecycleConfiguration={
                        'Rules': [
                            {
                                'Expiration': {'Days': 30, 'ExpiredObjectDeleteMarker': True},
                                'Status': 'Enabled',
                            }
                        ]
                    })
            except ClientError as e:
                print("Unable to apply bucket policy.\nReason: {0}".format(e))
The error
Existing buckets:
sample-log-case
sample-bucket-log-v1
template-log-v3-mv
Unable to apply bucket policy.
Reason:An error occurred (MalformedXML) when calling the PutBucketLifecycleConfiguration operation: The XML you provided was not well-formed or did not validate against our published schema
I took the example below from the boto3 site; I just changed the parameters and didn't touch the format, but I get the same format issue for it too:
import boto3

client = boto3.client('s3')
response = client.put_bucket_lifecycle_configuration(
    Bucket='bucket-sample',
    LifecycleConfiguration={
        'Rules': [
            {
                'Expiration': {
                    'Days': 3650,
                },
                'Status': 'Enabled'
            },
        ],
    },
)
print(response)
According to the documentation, the 'Filter' rule is required if the LifecycleRule does not contain a 'Prefix' element (the 'Prefix' element is no longer used). 'Filter', in turn, requires exactly one of 'Prefix', 'Tag' or 'And'.
Adding the following to your 'Rules' in LifecycleConfiguration will solve the problem:
'Filter': {'Prefix': ''}
Hi, I am trying to turn on default S3 encryption on all the buckets in an account using the Python boto3 script below.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
response = s3.list_buckets()

for bucket in response['Buckets']:
    enc = s3.get_bucket_encryption(Bucket=bucket['Name'])
    s3.put_bucket_encryption(
        Bucket=bucket['Name'],
        ServerSideEncryptionConfiguration={
            'Rules': [
                {
                    'ApplyServerSideEncryptionByDefault': {
                        'SSEAlgorithm': 'AES256'
                    }
                },
            ]
        }
    )
But I am struggling with my code, which is not working and gives this error:
File "apply.py", line 10, in <module>
    enc = s3.get_bucket_encryption(Bucket=bucket['Name'])
File "/Users/hhaqqani/Library/Python/2.7/lib/python/site-packages/botocore/client.py", line 272, in _api_call
    return self._make_api_call(operation_name, kwargs)
File "/Users/hhaqqani/Library/Python/2.7/lib/python/site-packages/botocore/client.py", line 576, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (ServerSideEncryptionConfigurationNotFoundError) when calling the GetBucketEncryption operation: The server side encryption configuration was not found
You are passing the wrong bucket name. Change Bucket=enc to Bucket=bucket['Name'] in your call to put_bucket_encryption.
Note also that the call to get_bucket_encryption will throw an exception if the bucket does not actually have encryption configured. While that might seem odd, that's the way it works (see boto3/issues/1899 for more details). So, to handle this potential exception:
SSECNF = 'ServerSideEncryptionConfigurationNotFoundError'

try:
    bucket = client.get_bucket_encryption(Bucket=bucket['Name'])
    # check current encryption here, if it's not what you want then update it
    # check bucket['ServerSideEncryptionConfiguration']['Rules']
except client.exceptions.ClientError as e:
    if e.response['Error']['Code'] == SSECNF:
        s3.put_bucket_encryption(...)
    else:
        print("Unexpected error: %s" % e)
Please see the fixed code below. Thanks @jarmod for the quick help.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
response = s3.list_buckets()
client = boto3.client('s3')
SSECNF = 'ServerSideEncryptionConfigurationNotFoundError'

for bucket in response['Buckets']:
    try:
        bucket = client.get_bucket_encryption(Bucket=bucket['Name'])
        # check current encryption here, if it's not what you want then update it
        # check bucket['ServerSideEncryptionConfiguration']['Rules']
    except client.exceptions.ClientError as e:
        if e.response['Error']['Code'] == SSECNF:
            s3.put_bucket_encryption(
                Bucket=bucket['Name'],
                ServerSideEncryptionConfiguration={
                    'Rules': [
                        {
                            'ApplyServerSideEncryptionByDefault': {
                                'SSEAlgorithm': 'AES256'
                            }
                        },
                    ]
                })
        else:
            print("Unexpected error: %s" % e)
This code works fine:

import boto3

def upload(request):
    try:
        get_file = request.POST['file'].file
        d_filename = 'foo-' + uuid.uuid4().hex
        s3 = boto3.resource('s3')
        s3.Bucket('Bucket_name').put_object(Key=d_filename, Body=get_file, ContentType=ContentType)
        return d_filename
    except Exception, e:
        log.error(str(e))
        return 'error'
but when I want to put the md5 hash of the file in the filename, it throws an error: "An error occurred (BadDigest) when calling the PutObject operation (reached max retries: 4): The Content-MD5 you specified did not match what we received."
import boto3
import hashlib

def upload(request):
    try:
        get_file = request.POST['file'].file
        d_filename = 'foo-' + str(hashlib.md5(get_file.read()).hexdigest())
        s3 = boto3.resource('s3')
        s3.Bucket('Bucket_name').put_object(Key=d_filename, Body=get_file, ContentType=ContentType)
        return d_filename
    except Exception, e:
        log.error(str(e))
        return 'error'
I am not trying to calculate the md5 of a file already uploaded, or to set it on the upload; I just want the md5 in the name of the file.