An error occurred (404) when calling the PutObject operation: Not Found - python

I want to upload files to S3 with this code:
import boto3

session = boto3.session.Session()
s3_client = session.resource(
    's3',
    endpoint_url=credentials.get('endpoint_url'),
    aws_access_key_id=credentials.get('access_key'),
    aws_secret_access_key=credentials.get('secret_key'),
)
bucket = s3_client.Bucket("test")
bucket.upload_file("req.txt", "req.txt")
and I get this error:
boto3.exceptions.S3UploadFailedError: Failed to upload req.txt to test/req.txt: An error occurred (404) when calling the PutObject operation: Not Found
I also tried this code:
session = boto3.session.Session()
s3_client = session.client(
    service_name='s3',
    endpoint_url=credentials.get('endpoint_url'),
    aws_access_key_id=credentials.get('access_key'),
    aws_secret_access_key=credentials.get('secret_key'),
)

def upload_file(filename, bucket_name, name_in_bucket):
    s3_client.upload_file(filename, bucket_name, name_in_bucket)

upload_file('req.txt', 'test_mmdaz', 'testfile.txt')
and I get the same error.
Any ideas?

As @RobertoS pointed out in the comments, my mistake was that the bucket test does not exist in my S3 object storage service.
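For anyone hitting the same 404, here is a minimal sketch (reusing the credentials dict and the placeholder bucket name from the snippets above) that checks whether the bucket exists and creates it before uploading:

import boto3
from botocore.exceptions import ClientError

session = boto3.session.Session()
s3_client = session.client(
    's3',
    endpoint_url=credentials.get('endpoint_url'),
    aws_access_key_id=credentials.get('access_key'),
    aws_secret_access_key=credentials.get('secret_key'),
)

bucket_name = "test"
try:
    # head_bucket raises a ClientError if the bucket does not exist
    s3_client.head_bucket(Bucket=bucket_name)
except ClientError:
    # create the bucket first, then upload
    s3_client.create_bucket(Bucket=bucket_name)

s3_client.upload_file("req.txt", bucket_name, "req.txt")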

Related

ExpiredToken error when using python moto lib to mock s3

I am new to Python. Does anyone know how moto mock tests work? I am testing code that fetches data from S3, converts it, and uploads it back to S3. I used the moto library to mock S3, but I get "An error occurred (ExpiredToken) when calling the ListObjects operation" when my code calls S3.
Do I need to mock AWS credentials? How should I mock them? (I checked the moto library and there is nothing like mock_arn/mock_credential.)
Thank you in advance.
Here is my code; inside the parse_requests method, it fetches data from S3, converts it, and uploads it back to S3.
import boto3
from unittest import TestCase
from moto import mock_s3, mock_logs

class Test_processing_data(TestCase):
    def setUp(self):
        self.mock_s3 = mock_s3()
        self.mock_s3.start()
        self.mock_logs = mock_logs()
        self.mock_logs.start()
        self.bucket_region = "us-east-1"
        self.bucket_name = "test-bucket"
        self.s3_client = boto3.client("s3", region_name=self.bucket_region)
        self.s3_client.create_bucket(Bucket=self.bucket_name)

    @mock_s3
    def test_parse_requests(self):
        bucket_name = "test-bucket"
        prefix = "test/model/main/"
        execution_date = "2022/09/13/12"
        parse_requests(execution_date, bucket_name, prefix)
Here is the error message:
self = <botocore.client.S3 object at 0x10abf18e0>
operation_name = 'ListObjects'
api_params = {'Bucket': 'campaign-performance-forecasting-offline', 'EncodingType': 'url', 'Prefix': 'test/model/main/2022/09/13/12'}
...
if http.status_code >= 300:
error_code = parsed_response.get("Error", {}).get("Code")
error_class = self.exceptions.from_code(error_code)
> raise error_class(parsed_response, operation_name)
E botocore.exceptions.ClientError: An error occurred (ExpiredToken) when calling the ListObjects operation: The provided token has expired.
../../venv/lib/python3.8/site-packages/botocore/client.py:914: ClientError
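A common cause of this ExpiredToken error with moto (an assumption here, since the full setup isn't shown) is that boto3 falls back to real, expired credentials from the environment, or that a client is created before the mock has started. A minimal sketch of a setUp that pins dummy credentials before any client is built:

import os
import boto3
from unittest import TestCase
from moto import mock_s3

class Test_processing_data(TestCase):
    def setUp(self):
        # Dummy credentials so boto3 never reaches for real (expired) ones.
        os.environ["AWS_ACCESS_KEY_ID"] = "testing"
        os.environ["AWS_SECRET_ACCESS_KEY"] = "testing"
        os.environ["AWS_SESSION_TOKEN"] = "testing"
        os.environ["AWS_DEFAULT_REGION"] = "us-east-1"
        # Start the mock before creating any S3 client.
        self.mock_s3 = mock_s3()
        self.mock_s3.start()
        self.s3_client = boto3.client("s3", region_name="us-east-1")
        self.s3_client.create_bucket(Bucket="test-bucket")

    def tearDown(self):
        self.mock_s3.stop()

If parse_requests builds its own S3 client at module import time, importing it only after the mock has started (for example inside the test) matters for the same reason.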

Accessing QLDB ledger in another AWS account

I'm having trouble accessing a QLDB ledger in another AWS account.
I have granted the necessary IAM permissions for cross-account access.
I set the credentials on the EC2 instance where my Python script is running, using the code below.
import os
import boto3

sts_client = boto3.client("sts", region_name=region)
response = sts_client.assume_role(
    RoleArn="arn:aws:iam::xxx:role/xxx-ec2",
    RoleSessionName="RoleSessionname",
)
os.environ["AWS_ACCESS_KEY_ID"] = response["Credentials"]["AccessKeyId"]
os.environ["AWS_SECRET_ACCESS_KEY"] = response["Credentials"]["SecretAccessKey"]
os.environ["AWS_SESSION_TOKEN"] = response["Credentials"]["SessionToken"]
os.environ["AWS_DEFAULT_REGION"] = region
but I keep getting the error below:
in _get_session
raise ExecuteError(e, True, True)
pyqldb.errors.ExecuteError: Error containing the context of a failure during execute.
botocore.errorfactory.BadRequestException: An error occurred (BadRequestException) when calling the SendCommand operation: The Ledger with name my-ledger is not found
The error is thrown during execution of the code below.
qldb_driver = QldbDriver(ledger_name='my-ledger', region_name='us-east-1')
result = qldb_driver.execute_lambda(lambda x: read_table(x, table_name))
I found out that credentials can be passed to the QldbDriver constructor, per https://github.com/awslabs/amazon-qldb-driver-python/blob/master/pyqldb/driver/qldb_driver.py#L103
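For example, a minimal sketch that builds a boto3 session from the assumed-role credentials and hands it to the driver (boto3_session is the constructor parameter referenced in the link above; the role ARN, ledger name, and read_table call are placeholders from the question):

import boto3
from pyqldb.driver.qldb_driver import QldbDriver

sts_client = boto3.client("sts", region_name=region)
creds = sts_client.assume_role(
    RoleArn="arn:aws:iam::xxx:role/xxx-ec2",
    RoleSessionName="RoleSessionname",
)["Credentials"]

# Session scoped to the assumed role in the other account.
cross_account_session = boto3.Session(
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
    region_name="us-east-1",
)

qldb_driver = QldbDriver(ledger_name="my-ledger", boto3_session=cross_account_session)
result = qldb_driver.execute_lambda(lambda x: read_table(x, table_name))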

Unable to download file from S3 because "A client error (403) occurred when calling the HeadObject operation: Forbidden"

I am trying to download a file from an S3 bucket I created through AWS CDK, but I get the error "A client error (403) occurred when calling the HeadObject operation: Forbidden". At first I thought it was because I hadn't added the s3:GetObject action to the IAM policy statement, but I still get that error after adding it. Below is how I created the bucket:
# S3
bucket = s3.Bucket(
    self, "testS3Bucket", bucket_name=f"test_s3_bucket"
)
service_lambda.add_to_role_policy(
    iam.PolicyStatement(
        effect=iam.Effect.ALLOW,
        actions=[
            "s3:ListBucket",
            "s3:PutObject",
            "s3:PutObjectAcl",
            "s3:GetObject",
            "s3:HeadObject",
        ],
        resources=[bucket.arn_for_objects("*")],
    )
)
Here is the code where I download the file from S3:
def download_file(self, file_name, s3_bucket):
    try:
        file = self.s3.Object(s3_bucket, file_name).load()
    except ClientError as e:
        if e.response["Error"]["Code"] == "404":
            log.error("File does not exist for partner")
            return {}
        else:
            raise e
    except Exception as e:
        raise e
    return file
Does anybody know how I can get past this issue?
A simpler way to grant your lambda appropriate permissions would be something like this:
bucket = s3.Bucket(
    self, "testS3Bucket", bucket_name=f"test_s3_bucket"
)
bucket.grant_read_write(service_lambda.role)
Based on the docs:
If an encryption key is used, permission to use the key for encrypt/decrypt will also be granted.
Give that a try and see if you still receive a permissions error.

botocore.exceptions.ClientError: An error occurred (MalformedXML) when calling the CompleteMultipartUpload operation

I am transferring a file from one S3 bucket to another. The file size is more than 8 GB.
Please find the code snippet below:
import boto3
from botocore.client import Config

s3_client = boto3.client(
    "s3",
    aws_access_key_id="myaccesskey",
    aws_secret_access_key="mysecretaccesskey",
    config=Config(signature_version='s3v4', read_timeout=65),
)

copy_source_object = {'Bucket': source_bucket_name, 'Key': file_key_name}
s3_client.copy(copy_source_object, trgt_bckt, trgt_key)
But I am getting this error: botocore.exceptions.ClientError: An error occurred (MalformedXML) when calling the CompleteMultipartUpload operation: The XML you provided was not well-formed or did not validate against our published schema
Please let me know how to resolve this issue.
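Not a confirmed fix, but since an 8 GB object has to go through a multipart copy, one thing worth trying is to control the managed transfer explicitly with a TransferConfig; the part sizes below are only illustrative:

import boto3
from boto3.s3.transfer import TransferConfig
from botocore.client import Config

s3_client = boto3.client(
    "s3",
    aws_access_key_id="myaccesskey",
    aws_secret_access_key="mysecretaccesskey",
    config=Config(signature_version='s3v4', read_timeout=65),
)

# Larger parts keep the copy well under the 10,000-part limit for an 8 GB object.
transfer_config = TransferConfig(
    multipart_threshold=100 * 1024 * 1024,
    multipart_chunksize=100 * 1024 * 1024,
    use_threads=True,
)

copy_source_object = {'Bucket': source_bucket_name, 'Key': file_key_name}
s3_client.copy(copy_source_object, trgt_bckt, trgt_key, Config=transfer_config)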

S3 unit tests boto client

I'm having issues writing a unit test for an S3 client; it seems the test is trying to use a real S3 client rather than the mocked one I created for the test. Here is my example:
@pytest.fixture(autouse=True)
def moto_boto(self):
    # setup: start moto server and create the bucket
    mocks3 = mock_s3()
    mocks3.start()
    res = boto3.resource('s3')
    bucket_name: str = f"{os.environ['BUCKET_NAME']}"
    res.create_bucket(Bucket=bucket_name)
    yield
    # teardown: stop moto server
    mocks3.stop()

def test_with_fixture(self):
    from functions.s3_upload_worker import (
        save_email_in_bucket,
    )
    client = boto3.client('s3')
    bucket_name: str = f"{os.environ['BUCKET_NAME']}"
    client.list_objects(Bucket=bucket_name)
    save_email_in_bucket(
        "123AZT",
        os.environ["BUCKET_FOLDER_NAME"],
        email_byte_code,
    )
This results in the following error:
botocore.exceptions.ClientError: An error occurred (ExpiredToken) when calling the PutObject operation: The provided token has expired.
The code I am testing looks like this:
def save_email_in_bucket(message_id, bucket_folder_name, body):
    s3_key = "".join([bucket_folder_name, "/", str(message_id), ".json"])
    s3_client.put_object(
        Bucket=bucket,
        Key=s3_key,
        Body=json.dumps(body),
        ContentType="application-json",
    )
    LOGGER.info(
        f"Saved email with message ID {message_id} in bucket folder {bucket_folder_name}"
    )
Not accepting this as an answer, but it may be useful for anyone who ends up here: I found a workaround where, if I create the S3 client inside the function I am testing rather than globally, this approach works. I would still prefer to find an actual solution.
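For reference, a minimal sketch of an alternative to that workaround: keep the client global in the worker module, but have the fixture re-point it at a client created after moto has started (the module path and environment variables follow the snippets above; the dummy credentials and fixture shape are assumptions):

import os
import boto3
import pytest
from moto import mock_s3

@pytest.fixture(autouse=True)
def moto_boto():
    # Dummy credentials and region so boto3 never falls back to real (expired) ones.
    os.environ["AWS_ACCESS_KEY_ID"] = "testing"
    os.environ["AWS_SECRET_ACCESS_KEY"] = "testing"
    os.environ["AWS_SESSION_TOKEN"] = "testing"
    os.environ["AWS_DEFAULT_REGION"] = "us-east-1"
    mocks3 = mock_s3()
    mocks3.start()
    boto3.resource("s3").create_bucket(Bucket=os.environ["BUCKET_NAME"])
    # Replace the module-level client with one created while the mock is active.
    from functions import s3_upload_worker
    s3_upload_worker.s3_client = boto3.client("s3")
    yield
    mocks3.stop()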
