The following does not work.
From the boto3 docs:
http://boto3.readthedocs.io/en/latest/guide/s3.html#generating-presigned-urls
This is my script, with placeholder bucket and key values:
import boto3
import requests
from botocore.client import Config

# Get the service client.
s3 = boto3.client('s3', config=Config(signature_version='s3v4'))

# Generate the URL to get 'key-name' from 'bucket-name'
url = s3.generate_presigned_url(
    ClientMethod='get_object',
    Params={
        'Bucket': 'mybucketname',
        'Key': 'myObject.txt'
    }
)

print(url)
response = requests.get(url)
print(response)
S3 responds with a 403:
<Error>
<Code>AccessDenied</Code>
<Message>Access Denied</Message>
<RequestId>B5681E888657E2A1</RequestId>
<HostId>
FMS7oPPOXt4I0KXPPQwdBx2fyxze+ussMmy/BOWLVFusWMoU2zAErE08ez34O6VhSYRvIYFm7Bs=
</HostId>
</Error>
You need to provide AWS credentials to your boto3 client. Docs here.
If you need help getting access to your credentials on AWS, you can look here.
import boto3

client = boto3.client(
    's3',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    aws_session_token=SESSION_TOKEN,
)

# Or via the Session
session = boto3.Session(
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    aws_session_token=SESSION_TOKEN,
)
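Putting the two together, a minimal sketch (ACCESS_KEY and SECRET_KEY are placeholders for your own credentials; the bucket and key are the ones from the question):

import boto3
from botocore.client import Config

# Placeholder credentials; in practice prefer a shared credentials file or
# an IAM role so keys never appear in source code.
s3 = boto3.client(
    's3',
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    config=Config(signature_version='s3v4'),
)

url = s3.generate_presigned_url(
    ClientMethod='get_object',
    Params={'Bucket': 'mybucketname', 'Key': 'myObject.txt'},
)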
Related
I am trying to upload multiple images to S3 from a React application using AWS API Gateway.
I have tried the following approach:
Set up an API Gateway endpoint that targets a Lambda function.
Lambda function code:
import json
import boto3

def lambda_handler(event, context):
    print(event)
    s3 = boto3.client('s3', region_name='us-east-1')
    bucket_name = 'testimagesbucketupload'
    URL = s3.generate_presigned_post(
        Bucket=bucket_name,
        Key="${filename}",
        # Conditions=[
        #     ["starts-with", "$success_action_redirect", ""],
        #     ["eq", "$userid", "test"],
        # ],
        ExpiresIn=3600)
    data = {"url": URL['url'], "fields": URL['fields']}
    print(type(data))
    # print(data)
    return data
Using the code above I am able to upload a single image from both the web app and Postman, but now I want to upload multiple images using this URL, and also retrieve the images for preview.
If anyone has done this, please help.
Thanks in advance.
I tried presigned_post and presigned URLs to achieve this, but I am still not able to.
You'll need to create one URL per image, but you can use a loop to create all of them. I think something like this could work for you:
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3', region_name='us-east-1')
    bucket_name = 'testimagesbucketupload'
    image_list = event['image_list']
    data = []
    for image in image_list:
        URL = s3.generate_presigned_post(
            Bucket=bucket_name,
            Key=image,
            ExpiresIn=3600)
        data.append({"url": URL['url'], "fields": URL['fields']})
    return data
Note that you need to pass a list of images in the event (event['image_list']).
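For reference, a minimal sketch of how a client could consume one of the returned url/fields pairs (the local file name is a placeholder; the fields come straight from generate_presigned_post):

import requests

entry = data[0]  # one {"url": ..., "fields": ...} pair returned by the Lambda
with open('local-image.jpg', 'rb') as f:
    # The presigned fields must be posted as form data alongside the file.
    resp = requests.post(entry['url'], data=entry['fields'], files={'file': f})
print(resp.status_code)  # 204 on success by default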
For the preview, you could use a presigned URL to serve the image at a temporarily public URL:
from botocore.client import Config
import boto3

s3 = boto3.client('s3', config=Config(signature_version='s3v4'),
                  region_name="your_region")
presigned_url = s3.generate_presigned_url('get_object',
                                          Params={'Bucket': "your_bucket",
                                                  'Key': "your_file_key"},
                                          ExpiresIn=3600)
print(presigned_url)
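The returned presigned URL can then be used directly as the image source in the front end until it expires.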
I have my credentials stored in an S3 bucket and can access the file using the boto3 library, but how can I point os.environ['GOOGLE_APPLICATION_CREDENTIALS'] to the file stored in S3?
import json
import os

import boto3

client = boto3.client(
    "s3",
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key
)

credentials = client.get_object(Bucket=bucket_name, Key='name-of-file.json')
GOOGLE_CREDENTIALS = json.loads(credentials["Body"].read().decode('utf-8'))

# THIS DOES NOT WORK: os.environ values must be strings, not dicts
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = GOOGLE_CREDENTIALS
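For what it's worth, GOOGLE_APPLICATION_CREDENTIALS is expected to hold a filesystem path, not the JSON contents, which is why assigning the parsed dict fails. A minimal sketch of a workaround, writing the downloaded JSON to a temporary file and pointing the variable at it (the tempfile approach is an assumption, not from the original post):

import json
import os
import tempfile

# The Google client libraries expect a path in GOOGLE_APPLICATION_CREDENTIALS,
# so write the credentials JSON fetched from S3 to a local file first.
with tempfile.NamedTemporaryFile(mode='w', suffix='.json', delete=False) as f:
    json.dump(GOOGLE_CREDENTIALS, f)
    credentials_path = f.name

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = credentials_path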
I want to connect to the Selling Partner API in Python using boto3.
I understand the AssumeRole steps to get temporary credentials for a session client. But sp-api is not in the list of AWS services boto3 handles. Is there a reference for the SP-API to use with Python, or what would be the equivalent of s3 = boto3.client('s3') for the SP-API?
I also had issues and questions like yours. I managed to connect successfully; maybe this can help:
import boto3
from requests_auth_aws_sigv4 import AWSSigV4

# ======== GET AUTH ========
# Credentials for the IAM user created following the docs
amw_client = boto3.client(
    'sts',
    aws_access_key_id=access_key,        # your IAM user's access key
    aws_secret_access_key=secret_key,    # your IAM user's secret key
    region_name=region
)

# ROLE created following the docs
# The STS assume-role policy must be included in the role
res = amw_client.assume_role(
    RoleArn='arn:aws:iam::xxxx:role/xxxx',
    RoleSessionName='SellingPartnerAPI'
)

Credentials = res["Credentials"]
AccessKeyId = Credentials["AccessKeyId"]
SecretAccessKey = Credentials["SecretAccessKey"]
SessionToken = Credentials["SessionToken"]

# Sign SP-API requests with the temporary credentials
aws_auth = AWSSigV4('execute-api',
                    aws_access_key_id=AccessKeyId,
                    aws_secret_access_key=SecretAccessKey,
                    aws_session_token=SessionToken,
                    region=region
                    )
import requests

# ======== GET ACCESS TOKEN ========
body = {
    'grant_type': 'refresh_token',
    'client_id': amazon_app_client_id,
    'refresh_token': amazon_app_refresh_token,
    'client_secret': amazon_app_client_secret
}
h = {'Content-Type': 'application/json'}
access_token_response = requests.post(
    'https://api.amazon.com/auth/o2/token', json=body, headers=h)
access_token = access_token_response.json().get('access_token')

# ======== CONSUME API ========
# request_url is the SP-API endpoint plus the operation path you want to call
resp = requests.get(
    request_url, auth=aws_auth, headers={'x-amz-access-token': access_token})
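For illustration, request_url is the SP-API endpoint plus an operation path; a hypothetical example (the host, path, and marketplace ID below are assumptions for illustration, not from the original answer):

# Hypothetical: the getOrders operation on the North America endpoint
endpoint = 'https://sellingpartnerapi-na.amazon.com'
request_url = endpoint + '/orders/v0/orders?MarketplaceIds=ATVPDKIKX0DER&CreatedAfter=2023-01-01'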
Let me know if I can help further :)
I am using a Lambda to create a pre-signed URL to download files that land in an S3 bucket.
The code works and I get a URL, but when trying to access it I get:
af-south-1 location constraint is incompatible for the region-specific endpoint this request was sent to.
Both the bucket and the Lambda are in the same region.
I'm at a loss as to what is actually happening; any ideas or solutions would be greatly appreciated.
My code is below:
import json
import boto3
import boto3.session

def lambda_handler(event, context):
    session = boto3.session.Session(region_name='af-south-1')
    s3 = session.client('s3')
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        url = s3.generate_presigned_url(ClientMethod='get_object',
                                        Params={'Bucket': bucket,
                                                'Key': key},
                                        ExpiresIn=400)
        print(url)
Set endpoint_url='https://s3.af-south-1.amazonaws.com' while generating the s3_client:
s3_client = session.client('s3',
                           region_name='af-south-1',
                           endpoint_url='https://s3.af-south-1.amazonaws.com')
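Putting it together with the handler above, a minimal sketch (bucket and key are placeholders):

import boto3.session

# The regional endpoint avoids the af-south-1 location-constraint error when
# the presigned URL is later resolved.
session = boto3.session.Session(region_name='af-south-1')
s3_client = session.client('s3',
                           region_name='af-south-1',
                           endpoint_url='https://s3.af-south-1.amazonaws.com')
url = s3_client.generate_presigned_url(ClientMethod='get_object',
                                       Params={'Bucket': 'bucket-name',
                                               'Key': 'key-name'},
                                       ExpiresIn=400)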
Could you please try using the boto3 client directly rather than via the session, and generate the pre-signed URL:
import boto3
import requests

# Get the service client.
s3 = boto3.client('s3', region_name='af-south-1')

# Generate the URL to get 'key-name' from 'bucket-name'
url = s3.generate_presigned_url(
    ClientMethod='get_object',
    Params={
        'Bucket': 'bucket-name',
        'Key': 'key-name'
    }
)
You could also have a look at these two related questions (1 & 2), which resemble the same issue.
I am trying to use boto3 for Amazon Mechanical Turk. I was trying to get the client using the following code:
import boto3

endpoint_url = 'https://mturk-requester.us-east-1.amazonaws.com'
aws_access_key_id = <aws_access_key_id>
aws_secret_access_key = <aws_secret_access_key>
region_name = 'us-east-1'

client = boto3.client('mturk',
                      aws_access_key_id=aws_access_key_id,
                      aws_secret_access_key=aws_secret_access_key,
                      region_name=region_name,
                      endpoint_url=endpoint_url
                      )
But I am getting the following error about an unknown service name:
botocore.exceptions.UnknownServiceError: Unknown service: 'mturk'. Valid service
names are: acm,..., xray
Why is 'mturk' not in this list? The code I am using is taken from the MTurk developer website.
Any suggestion is welcome! Thanks in advance!
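On the error itself: UnknownServiceError generally means the installed botocore release predates the service ('mturk' only shipped in releases from early 2017 onward). A quick check of what the local install supports:

import boto3

# If 'mturk' is missing here, upgrading boto3 (pip install --upgrade boto3)
# should make the client available.
print(boto3.__version__)
print('mturk' in boto3.session.Session().get_available_services())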