Creating bucket error on AWS using Python

I want to create a bucket to upload some wav files to it. I am able to create a bucket manually in the location I need, but when I try to create one from Python with a location constraint:
session = boto3.Session(aws_access_key_id=aws_access_key_id, aws_secret_access_key=aws_secret_access_key)
s3 = session.resource('s3')
s3.create_bucket(Bucket='test-asterisk1', CreateBucketConfiguration={'LocationConstraint': 'eu-central-1'})
I get the following error:
Traceback (most recent call last):
File "create_bucket.py", line 10, in <module>
s3.create_bucket(Bucket='asterisk1', CreateBucketConfiguration={'LocationConstraint': 'ap-south-1'})
File "/home/dileep/.local/lib/python2.7/site-packages/boto3/resources/factory.py", line 520, in do_action
response = action(self, *args, **kwargs)
File "/home/dileep/.local/lib/python2.7/site-packages/boto3/resources/action.py", line 83, in __call__ response = getattr(parent.meta.client, operation_name)(**params)
File "/home/dileep/.local/lib/python2.7/site-packages/botocore/client.py", line 314, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/dileep/.local/lib/python2.7/site-packages/botocore/client.py", line 612, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (IllegalLocationConstraintException) when calling the CreateBucket operation: The ap-south-1 location constraint is incompatible for the region specific endpoint this request was sent to.
I am on an Indian IP trying to create a bucket in 'us-west-2'; could that be the problem?
So I tried changing the location constraint, one at a time:
"LocationConstraint": "EU"|"eu-west-1"|"us-west-1"|"us-west-2"|"ap-south-1"|"ap-southeast-1"|"ap-southeast-2"|"ap-northeast-1"|"sa-east-1"|"cn-north-1"|"eu-central-1"
but whatever location I try, it gives me the same error.
So I tried creating the bucket with boto instead of boto3:
import boto
from boto.s3.connection import Location
s3 = boto.connect_s3(aws_access_key_id, aws_secret_access_key)
s3.create_bucket('test-asterisk1', location=Location.USWest2)
It throws this error:
File "s2t_amazon.py", line 27, in <module>
s3.create_bucket('test-asterisk2', location=Location.USWest2)
File "/home/dileep/.local/lib/python2.7/site-packages/boto/s3/connection.py", line 623, in create_bucket
response.status, response.reason, body)
boto.exception.S3CreateError: S3CreateError: 409 Conflict
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>BucketAlreadyOwnedByYou</Code><Message>Your previous request to create the named bucket succeeded and you already own it.
</Message><BucketName>test-asterisk2</BucketName>
<RequestId>EAF26BA152FD20A5</RequestId><HostId>ep0WFZEb1mIjEgbYIY4BGGuOTi5HSutYd3XTKgFjWmRMnGG0ajj5TLF4/t1amJQsOZdZQrqGnoE=</HostId></Error>
I have checked whether the bucket was created, and it was not created by either method. Can anyone suggest what the problem could be?

This works:
import boto3
s3_client = boto3.client('s3', region_name = 'eu-central-1')
s3_client.create_bucket(Bucket='my-bucket', CreateBucketConfiguration={'LocationConstraint': 'eu-central-1'})
The important thing to realise is that the command must be sent to the region where the bucket is being created. Thus, you need to specify the region when creating the client and also when creating the bucket.
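For the resource-based call from the question, the same rule applies. A minimal sketch, assuming the same credential variables are defined, would be:
import boto3

# The session region must match the LocationConstraint, otherwise S3
# raises IllegalLocationConstraintException.
session = boto3.Session(
    aws_access_key_id=aws_access_key_id,          # assumed to be defined
    aws_secret_access_key=aws_secret_access_key,  # assumed to be defined
    region_name='eu-central-1'
)
s3 = session.resource('s3')
s3.create_bucket(
    Bucket='test-asterisk1',
    CreateBucketConfiguration={'LocationConstraint': 'eu-central-1'}
)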

Related

Boto3 S3 - Connection was closed before we received a valid response from endpoint

Hello everyone.
Currently we have a problem with boto3 in Python. We are trying to get an object from AWS S3:
s3 = boto3.client("s3", region_name="us-east-1")
obj = s3.get_object(Bucket=bucket, Key=key)['Body'].read()
But in some cases AWS returns an error: "Connection was closed before we received a valid response from endpoint URL"
This error happens completely randomly; there is no common pattern to the failures, it is an intermittent error. In many cases it reads the file correctly, but in other cases the error appears.
This happens inside Elastic Beanstalk on AWS; it is not a local test. I mention this because some similar forum threads suggest it may be an antivirus, firewall, or VPN issue, and we are not using any of these inside AWS.
We initially tried adding the region_name in the code, but it did not work.
Here are the boto3 logs when the error is raised:
File "/opt/python/current/app/customscripts/scripts/backend_940.py", line 67, in read
lines = s3.get_object(Bucket=s3_url.bucket, Key=s3_url.key)['Body'].read().decode()
File "/opt/python/run/venv/local/lib/python3.6/site-packages/botocore/client.py", line 391, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/opt/python/run/venv/local/lib/python3.6/site-packages/botocore/client.py", line 706, in _make_api_call
operation_model, request_dict, request_context)
File "/opt/python/run/venv/local/lib/python3.6/site-packages/botocore/client.py", line 725, in _make_request
return self._endpoint.make_request(operation_model, request_dict)
File "/opt/python/run/venv/local/lib/python3.6/site-packages/botocore/endpoint.py", line 106, in make_request
return self._send_request(request_dict, operation_model)
I appreciate any help
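One thing that may be worth trying for intermittent endpoint errors like this is raising the client's built-in retry count. A minimal sketch, assuming a reasonably recent botocore and with illustrative values:
import boto3
from botocore.config import Config

# Retry failed requests more aggressively before giving up.
config = Config(retries={'max_attempts': 10, 'mode': 'standard'})
s3 = boto3.client("s3", region_name="us-east-1", config=config)
obj = s3.get_object(Bucket=bucket, Key=key)['Body'].read()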

Amazon Transcribe on S3 Upload: "[ERROR] BadRequestException: URI provided doesn't point to an S3 object"

I'm trying out Amazon Transcribe on a collection of media files, adapting the sample docs code and using this series as a reference so it works with any upload to my designated media S3 folder, but I'm having issues with my test file.
UPLOAD BUCKET/FOLDER path:
'MediaFileUri': https://us-west-2.console.aws.amazon.com/s3/buckets/upload-asr/mediaupload/file.mp4
I've verified that the file exists and the bucket permissions grant access to the Amazon Transcribe service. I am able to start a manual transcription job with the same URL, but not with the SDK; I've also hardcoded the path above in the function, with no result. I appreciate it might be a URL path issue, but I haven't seen much on the subject, so I'm checking for an obvious error.
import json
import time
import boto3
from urllib.request import urlopen

def lambda_handler(event, context):
    transcribe = boto3.client("transcribe")
    s3 = boto3.client("s3")
    if event:
        file_obj = event["Records"][0]
        bucket_name = str(file_obj['s3']['bucket']['name'])
        file_name = str(file_obj['s3']['object']['key'])
        file_type = file_name.split(".")[1]
        s3_uri = create_uri(bucket_name, file_name)
        job_name = context.aws_request_id
        transcribe.start_transcription_job(TranscriptionJobName=job_name,
                                           Media={'MediaFileUri': s3_uri},
                                           OutputBucketName="bucket-name",
                                           MediaFormat=file_type,
                                           LanguageCode="en-US")

def create_uri(bucket_name, file_name):
CloudWatch Log Failure Report:
[ERROR] BadRequestException: An error occurred (BadRequestException) when calling the StartTranscriptionJob operation:
The URI that you provided doesn't point to an S3 object. Make sure that the object exists and try your request again.
Traceback (most recent call last):
File "/var/task/lambda_function.py", line 25, in lambda_handler
LanguageCode = "en-US")
File "/var/runtime/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/var/runtime/botocore/client.py", line 623, in _make_api_call
raise error_class(parsed_response, operation_name)
SIMILAR:
https://forums.aws.amazon.com/thread.jspa?messageID=876906&#876906
It works for me using this format:
Media={
    'MediaFileUri': f'https://s3-us-west-2.amazonaws.com/{BUCKET}/{KEY}'
},
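Based on that, a minimal sketch of the missing create_uri helper, assuming the bucket lives in us-west-2, could be:
def create_uri(bucket_name, file_name):
    # Build an S3 object URL in the regional endpoint form Transcribe accepts,
    # rather than the S3 console URL.
    return "https://s3-us-west-2.amazonaws.com/{}/{}".format(bucket_name, file_name)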

Creation of more than one bucket using Amazon S3 on CEPH cluster fails with Error Code 503

I am new to Ceph and Amazon S3. I am trying to create more than one bucket with the following Python script, however I am getting the error below.
import boto
import boto.s3.connection

access_key = 'xxx'
secret_key = 'xxxxxxx=='
conn = boto.connect_s3(
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key,
    host='127.0.0.1',
    port=8000,
    is_secure=False,
    calling_format=boto.s3.connection.OrdinaryCallingFormat(),
)
bucket = conn.create_bucket('my-new-bucket')
bucket2 = conn.create_bucket('my-new-bucket2')
bucket3 = conn.create_bucket('my-new-bucket3')
Below is the error message:
Traceback (most recent call last):
File "PythonS3.py", line 8, in <module>
bucket2 = conn.create_bucket('my-new-bucket2')
File "/home/vivekanand/.local/lib/python2.7/site-packages/boto/s3/connection.py", line 619, in create_bucket
data=data)
File "/home/vivekanand/.local/lib/python2.7/site-packages/boto/s3/connection.py", line 671, in make_request
retry_handler=retry_handler
File "/home/vivekanand/.local/lib/python2.7/site-packages/boto/connection.py", line 1071, in make_request
retry_handler=retry_handler)
File "/home/vivekanand/.local/lib/python2.7/site-packages/boto/connection.py", line 1028, in _mexe
raise BotoServerError(response.status, response.reason, body)
boto.exception.BotoServerError: BotoServerError: 503 Slow Down
<?xml version="1.0" encoding="UTF-8"?><Error><Code>SlowDown</Code></Error>
One bucket gets created successfully, but creation of the second bucket fails with error code 503.
Amazon is throttling your bucket create operations. Amazon does not want you to create one bucket after another in rapid succession, and you are limited in the number of buckets you can create (100). I don't know why Amazon throttles bucket create operations; perhaps it is an expensive operation internally to set up the storage.
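If you do need to create several buckets in a row, a minimal workaround sketch is to back off and retry on the SlowDown error (the create_bucket_with_retry helper and its delay values are illustrative, not part of the original code):
import time
import boto.exception

def create_bucket_with_retry(conn, name, attempts=5, delay=2):
    # Retry on 503 SlowDown, doubling the wait between attempts.
    for attempt in range(attempts):
        try:
            return conn.create_bucket(name)
        except boto.exception.BotoServerError as e:
            if e.status != 503 or attempt == attempts - 1:
                raise
            time.sleep(delay)
            delay *= 2

bucket2 = create_bucket_with_retry(conn, 'my-new-bucket2')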

credentials issue in Python while using AWS S3

This is the error I am getting:
ERROR:boto:Unable to read instance data, giving up
Traceback (most recent call last):
File "<ipython-input-62-476f799f9e0f>", line 2, in <module>
conn = boto.connect_s3()
File "/usr/local/lib/python2.7/dist-packages/boto/__init__.py", line 141, in connect_s3
return S3Connection(aws_access_key_id, aws_secret_access_key, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 191, in __init__
validate_certs=validate_certs, profile_name=profile_name)
File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 569, in __init__
host, config, self.provider, self._required_auth_capability())
File "/usr/local/lib/python2.7/dist-packages/boto/auth.py", line 993, in get_auth_handler
'Check your credentials' % (len(names), str(names)))
NoAuthHandlerFound: No handler was ready to authenticate. 1 handlers were checked. ['HmacAuthV1Handler'] Check your credentials
This error message appears while establishing a connection with AWS S3Connection.
I want to establish a connection with AWS S3 and read CSV files.
Please help me out. I am using Python 2.7.12.
Now I am using this code:
import boto
import time
from boto.s3.connection import S3Connection
conn = S3Connection('<aws access key>','<aws secret key>')
print conn
from boto.s3.connection import Location
print '\n'.join(i for i in dir(Location) if i[0].isupper())
conn.create_bucket('egp-shared-prod/egp-prod-c2c1/',
                   location=Location.DEFAULT)
And it shows this error:
File "<ipython-input-69-4b49d719d4ca>", line 15, in <module>
conn.create_bucket('egp-shared-prod/egp-prod-c2c1/', location=Location.DEFAULT)
File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 616, in create_bucket
data=data)
File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 668, in make_request
retry_handler=retry_handler
File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 1071, in make_request
retry_handler=retry_handler)
File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 1030, in _mexe
raise ex
gaierror: [Errno -2] Name or service not known
I tried your code and my testing has found that the error is related to your bucket name of egp-shared-prod/egp-prod-c2c1/.
The Bucket Restrictions and Limitations documentation says:
Bucket names can contain lowercase letters, numbers, and hyphens.
Slashes are not permitted, and they also seem to upset the boto code.
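For example, replacing the slashes with hyphens (the bucket name here is only illustrative) should let the same call go through:
conn.create_bucket('egp-shared-prod-egp-prod-c2c1', location=Location.DEFAULT)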
Boto (the official AWS Python bindings), which you are using, expects you to save your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in environment variables, like so:
export AWS_ACCESS_KEY_ID='AK123'
export AWS_SECRET_ACCESS_KEY='abc123'
You can pass AWS credentials explicitly like this:
import boto3

# Connection with S3:
s3 = boto3.resource(
    service_name='s3',
    region_name='us-east-1',
    aws_secret_access_key='',
    aws_access_key_id=''
)
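Once the resource is created you can verify the connection, for example by listing your buckets:
for bucket in s3.buckets.all():
    print(bucket.name)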

Python boto connecting to S3 throwing error

Hello, I am trying to connect Python to S3 in the Frankfurt region using boto 2.43, where I want to print the contents of a bucket.
Following is my code:
from boto.s3.connection import S3Connection

hostname = 's3.eu-central-1.amazonaws.com'
conn = S3Connection(aws_access_key_id, aws_secret_access_key, host=hostname)
bucket_name = conn.get_bucket('jd-eu01-isg-analytics-data-from-us01', validate=False)
for key in bucket_name.list(prefix='EU_Scripts_For_Frankfurt/'):
    print key
When I execute it, it throws the following error:
File "/usr/lib/python2.7/site-packages/boto/s3/bucketlistresultset.py", line 34, in bucket_lister
encoding_type=encoding_type)
File "/usr/lib/python2.7/site-packages/boto/s3/bucket.py", line 473, in get_all_keys
'', headers, **params)
File "/usr/lib/python2.7/site-packages/boto/s3/bucket.py", line 399, in _get_all
query_args=query_args)
File "/usr/lib/python2.7/site-packages/boto/s3/connection.py", line 668, in make_request
retry_handler=retry_handler
File "/usr/lib/python2.7/site-packages/boto/connection.py", line 1071, in make_request
retry_handler=retry_handler)
File "/usr/lib/python2.7/site-packages/boto/connection.py", line 927, in _mexe
request.authorize(connection=self)
File "/usr/lib/python2.7/site-packages/boto/connection.py", line 377, in authorize
connection._auth_handler.add_auth(self, **kwargs)
File "/usr/lib/python2.7/site-packages/boto/auth.py", line 727, in add_auth
**kwargs)
File "/usr/lib/python2.7/site-packages/boto/auth.py", line 546, in add_auth
string_to_sign = self.string_to_sign(req, canonical_request)
File "/usr/lib/python2.7/site-packages/boto/auth.py", line 486, in string_to_sign
sts.append(self.credential_scope(http_request))
File "/usr/lib/python2.7/site-packages/boto/auth.py", line 468, in credential_scope
region_name = self.determine_region_name(http_request.host)
File "/usr/lib/python2.7/site-packages/boto/auth.py", line 662, in determine_region_name
return region_name
UnboundLocalError: local variable 'region_name' referenced before assignment
How do I resolve this issue? Is this because of the boto version? Any solutions please.
Use boto3, it is easier. boto 2 is no longer supported by AWS; boto is already deprecated and there is no intention to add features.
The error pops up because it can't find the region. The boto/boto3 API checks the region name during service initialization. If you didn't specify it, it looks for the default region name defined in the credential file or an environment variable (e.g. ~/.aws/config).
This is true even if you explicitly specify the S3 endpoint URL. If you don't want to hardcode the credentials and region name, then you must set up AWS credentials as specified here: credential configuration.
Making boto/boto3 use the credential file/environment variables will make your code cleaner and more flexible, e.g. you can even use STS without changing the code. For example:
import boto3

# You can choose between a service resource or a service client.
s3 = boto3.client("s3")
response = s3.list_objects_v2(
    Bucket="jd-eu01-isg-analytics-data-from-us01",
    Prefix="EU_Scripts_For_Frankfurt"
)
for content in response["Contents"]:
    print(content["Key"])
Nevertheless, you can still hardcode the access key ID, secret key, region name, etc. by passing the parameters when you initialize boto/boto3 resources (the boto API is similar):
import boto3

s3 = boto3.client(
    "s3",
    region_name="eu-central-1",
    aws_access_key_id="xxxxxxxx",
    aws_secret_access_key="yyyyyyy")
