I wrote a Python script to download some files from an S3 bucket. The script works just fine on one machine but breaks on another.
Here is the exception I get: botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden.
I am pretty sure it's related to some system configuration, or something in the registry, but I don't know what exactly. Both machines are running Windows 7 and Python 3.5.
Any suggestions?
The issue was actually caused by the system time being incorrect. Once I corrected the system time, the problem went away.
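For context: AWS signs every request with a timestamp and rejects requests (often surfacing as 403s) when the client clock drifts more than about 15 minutes. Here is a minimal sketch for spotting such drift, assuming outbound HTTPS access; the endpoint is just an example, since any AWS endpoint returns a Date header:
import email.utils
import urllib.request
from datetime import datetime, timezone

# Compare the local clock against the Date header of an AWS response.
resp = urllib.request.urlopen('https://s3.amazonaws.com')
server_time = email.utils.parsedate_to_datetime(resp.headers['Date'])
skew = abs((datetime.now(timezone.utc) - server_time).total_seconds())
print('Clock skew vs AWS: %.0f seconds' % skew)  # hundreds of seconds suggest a bad system clock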
So, Forbidden means you don't have access to perform the operation. Check that you have read permission on that specific bucket and that you have supplied valid IAM keys. Below is a sample policy granting read and list access to a bucket.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "statement1",
      "Effect": "Allow",
      "Action": [
        "s3:List*",
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::bucketname",
        "arn:aws:s3:::bucketname/*"
      ]
    }
  ]
}
More info here:
Specifying Permissions in a Policy
Writing IAM Policies: How to Grant Access to an Amazon S3 Bucket
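As a quick sanity check (a sketch, not tied to any particular setup), you can also confirm which identity boto3 is actually signing requests with before digging into bucket policies:
import boto3

# Prints the account and the user/role ARN your requests are signed as;
# a surprise here usually means the wrong profile or stale keys.
sts = boto3.client('sts')
identity = sts.get_caller_identity()
print(identity['Account'], identity['Arn'])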
I am trying to download files from an S3 bucket using the Access Key ID and Secret Access Key provided by https://db.humanconnectome.org. However, even though I am able to navigate the database and find the files (as I have configured my credentials via the AWS CLI), attempting to download them results in the following error:
"botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden"
With the same credentials, I can browse the same database and download the files manually via a cloud storage browser such as Cyberduck, so however Cyberduck accesses the data, it does not trigger a 403 Forbidden error.
I have also verified that boto3 is able to access my AWS credentials, and I have also tried hardcoding them.
How I am attempting to download the data is very straightforward, and replicates the boto3 docs example: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-example-download-file.html
import boto3

s3 = boto3.client('s3',
                  aws_access_key_id=ACCESS_KEY_ID,
                  aws_secret_access_key=ACCESS_KEY)

s3.download_file(Bucket=BUCKET_NAME, Key=FILE_KEY, Filename=FILE_NAME)
This should download the file to the location and file given by FILE_NAME, but instead invokes the 403 Forbidden error.
You'll need to pass the bucket region as well when downloading the file. Try configuring region using the CLI or pass region_name when creating the client.
s3 = boto3.client('s3',
                  aws_access_key_id=ACCESS_KEY_ID,
                  aws_secret_access_key=ACCESS_KEY,
                  region_name=AWS_REGION)
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/quickstart.html
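If you are not sure which region the bucket lives in, one way to look it up (a sketch, assuming your credentials allow s3:GetBucketLocation) is:
import boto3

s3 = boto3.client('s3')
# GetBucketLocation reports None for us-east-1, so map it explicitly.
location = s3.get_bucket_location(Bucket=BUCKET_NAME)['LocationConstraint']
print(location or 'us-east-1')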
I know this might sound ridiculous, but make sure you don't have a typo in your bucket name or anything like that.
I worked so long trying to fix this, only to realize I had added an extra letter in the env variable I had set for my S3 bucket.
It's odd that you get a Forbidden error as opposed to a Not Found error, but you do (S3 deliberately returns 403 instead of 404 when you lack list permission, so that it doesn't leak whether a bucket or key exists).
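A small sketch for ruling out name typos quickly (the bucket name below is a placeholder):
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
try:
    s3.head_bucket(Bucket='my-bucket-name')  # substitute your bucket name
    print('Bucket exists and is accessible')
except ClientError as e:
    # A '404' code means the bucket does not exist, so a typo is likely;
    # '403' means it exists but you lack access.
    print('head_bucket failed:', e.response['Error']['Code'])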
I am running the k-means example in SageMaker:
from sagemaker import KMeans
data_location = 's3://{}/kmeans_highlevel_example/data'.format(bucket)
output_location = 's3://{}/kmeans_example/output'.format(bucket)
kmeans = KMeans(role=role,
                train_instance_count=2,
                train_instance_type='ml.c4.8xlarge',
                output_path=output_location,
                k=10,
                data_location=data_location)
When I run the following lines, an Access Denied error appears:
%%time
kmeans.fit(kmeans.record_set(train_set[0]))
The error returned:
ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
I also read other questions, but their answers do not solve my problem.
Would you please look at my case?
To be able to train a job in SageMaker, you need to pass in an AWS IAM role allowing SageMaker to access your S3 bucket.
The error means that SageMaker does not have permission to write files to the bucket that you specified.
You can find the permissions that you need to add to your role here: https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-roles.html#sagemaker-roles-createtrainingjob-perms
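If you are working inside a SageMaker notebook, a quick sketch for confirming which role your jobs actually run under (so you know which role to edit in IAM):
from sagemaker import get_execution_role

# The ARN printed here is the role that needs s3:PutObject on the output bucket.
role = get_execution_role()
print(role)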
Another thing to consider: if you are using an encrypted bucket that requires KMS decryption, make sure to also include the KMS-related permissions (e.g. kms:Decrypt and kms:GenerateDataKey) in the role.
I've noticed that sometimes the error shown is PutObject operation: Access Denied while the failure is actually KMS-related.
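A hedged sketch for checking whether the bucket has default KMS encryption configured (the bucket name is a placeholder):
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')
try:
    enc = s3.get_bucket_encryption(Bucket='my-output-bucket')
    # Look for SSEAlgorithm == 'aws:kms'; if present, the role also needs KMS permissions.
    print(enc['ServerSideEncryptionConfiguration']['Rules'])
except ClientError:
    print('No default encryption configured (or no permission to read it)')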
I faced the same problem. My SageMaker notebook instance wasn't able to read or write files to my S3 bucket. The first step of troubleshooting is locating the role attached to your SageMaker instance; you can find it on the notebook instance's details page in the SageMaker console.
Then go to that specific role in IAM and attach another policy to it.
I attached S3 full access, but you can create a custom policy.
I was getting confused because I was logged in as the admin user. However, when you work from a SageMaker instance, your own user's policies/roles are not what is used to perform actions; the instance's role is.
In my case, I had simply forgotten to rename the S3 bucket from the default name given to something unique.
I've deployed an endpoint in SageMaker and was trying to invoke it through my Python program. I had tested it using Postman and it worked perfectly. Then I wrote the invocation code as follows:
import boto3
import pandas as pd
import io
import numpy as np

def np2csv(arr):
    csv = io.BytesIO()
    np.savetxt(csv, arr, delimiter=',', fmt='%g')
    return csv.getvalue().decode().rstrip()

runtime = boto3.client('runtime.sagemaker')

payload = np2csv(test_X)
runtime.invoke_endpoint(
    EndpointName='<my-endpoint-name>',
    Body=payload,
    ContentType='text/csv',
    Accept='Accept'
)
Now when I run this, I get a validation error:
ValidationError: An error occurred (ValidationError) when calling the InvokeEndpoint operation: Endpoint <my-endpoint-name> of account <some-unknown-account-number> not found.
While using Postman I had given my access key and secret key, but I'm not sure how to pass them when using the SageMaker APIs, and I'm not able to find it in the documentation either.
So my question is: how can I use the SageMaker API from my local machine to invoke my endpoint?
I also had this issue, and it turned out my region was wrong.
Silly, but worth a check!
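A quick way to see which region boto3 resolves by default (a one-liner sketch):
import boto3

# None means no default region is configured at all.
print(boto3.session.Session().region_name)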
When you are using any of the AWS SDKs (including the one for Amazon SageMaker), you need to configure the credentials of your AWS account on the machine that you are using to run your code. If you are using your local machine, you can use the AWS CLI flow. You can find detailed instructions on the Python SDK page: https://aws.amazon.com/developers/getting-started/python/
Please note that when you are deploying the code to a different machine, you will have to make sure that you give the EC2 instance, ECS task, Lambda function, or any other target a role that allows calling this specific endpoint. While on your local machine it can be OK to give yourself admin rights or other permissive permissions, when you are deploying to a remote instance you should restrict the permissions as much as possible. For example, a policy that allows invoking only this one endpoint:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "sagemaker:InvokeEndpoint",
      "Resource": "arn:aws:sagemaker:*:1234567890:endpoint/<my-endpoint-name>"
    }
  ]
}
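Putting it together, here is a minimal sketch of invoking the endpoint from a local machine; the profile and region names are placeholders for your own configuration:
import boto3

# Credentials come from the named profile set up via `aws configure`;
# the region must match where the endpoint was deployed.
session = boto3.Session(profile_name='default', region_name='us-east-1')
runtime = session.client('runtime.sagemaker')

response = runtime.invoke_endpoint(
    EndpointName='<my-endpoint-name>',
    Body=payload,  # e.g. the CSV payload built in the question
    ContentType='text/csv',
)
print(response['Body'].read())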
Based on Jack's answer, I ran aws configure, changed the default region name, and it worked.
I'm trying to execute a query in Athena, but it fails.
Code:
client.start_query_execution(
    QueryString="CREATE DATABASE IF NOT EXISTS db;",
    QueryExecutionContext={'Database': 'db'},
    ResultConfiguration={
        'OutputLocation': "s3://my-bucket/",
        'EncryptionConfiguration': {
            'EncryptionOption': 'SSE-S3'
        }
    })
But it raises the following exception:
botocore.errorfactory.InvalidRequestException: An error occurred (InvalidRequestException)
when calling the StartQueryExecution operation: The S3 location provided to save your
query results is invalid. Please check your S3 location is correct and is in the same
region and try again. If you continue to see the issue, contact customer support
for further assistance.
However, if I go to the Athena console, go to Settings, and enter the same S3 location (for example, s3://my-bucket/), the query runs fine.
What's wrong with my code? I've used the APIs of several other services (e.g., S3) successfully, but with this one I believe I'm passing some incorrect parameter. Thanks.
Python: 3.6.1. Boto3: 1.4.4
I had to add an 'athena-' prefix to my bucket name to get it to work. For example, instead of:
"s3://my-bucket/"
try:
"s3://athena-my-bucket/"
EDIT: As suggested by Justin, AWS later added support for Athena by adding an athena- prefix to the bucket. Please upvote his answer.
Accepted Answer:
The S3 location provided to save your query results is invalid. Please check your S3 location is correct and is in the same region and try again.
Since it works when you use the console, it is likely the bucket is in a different region than the one you are using in Boto3. Make sure you use the correct region (the one that worked in the console) when constructing the Boto3 client. By default, Boto3 will use the region configured in the credentials file.
Alternatively, try boto3.client('athena', region_name='<region>').
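One way to make sure the Athena client and the results bucket end up in the same region (a sketch, reusing the bucket name from the question):
import boto3

# GetBucketLocation reports None for us-east-1.
s3 = boto3.client('s3')
bucket_region = s3.get_bucket_location(Bucket='my-bucket')['LocationConstraint'] or 'us-east-1'
athena = boto3.client('athena', region_name=bucket_region)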
Ran into the same issue and needed to specify the S3 bucket in the client.
In my case, the IAM role didn't have all the permissions it needed for the S3 bucket. I gave the IAM role the following permissions for the Athena results bucket.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "s3:GetObject",
        "s3:ListBucket",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::athena_results_bucket",
        "arn:aws:s3:::athena_results_bucket/*"
      ],
      "Effect": "Allow"
    }
  ]
}
I received the OP's error, attempted Justin's answer, and got the following error:
SYNTAX_ERROR: line 1:15: Schema TableName does not exist
This means it was not able to find the tables that I had previously created through the AWS Athena UI.
The simple solution was to use dclaze's answer instead. The two answers cannot be used simultaneously, or you will get back the initial (OP) error.
I am able to upload an image file using:
s3 = session.resource('s3')
bucket = s3.Bucket(S3_BUCKET)
bucket.upload_file(file, key)
However, I want to make the file public too. I tried looking for functions to set the ACL for the file, but it seems boto3 has changed its API and removed some functions. Is there a way to do it in the latest release of boto3?
To upload and set permission to publicly-readable in one step, you can use:
bucket.upload_file(file, key, ExtraArgs={'ACL':'public-read'})
See https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html#the-extraargs-parameter
I was able to do it using the ObjectAcl API:
s3 = boto3.resource('s3')
object_acl = s3.ObjectAcl('bucket_name', 'object_key')
response = object_acl.put(ACL='public-read')
For details: http://boto3.readthedocs.io/en/latest/reference/services/s3.html#objectacl
Adi's approach works. However, if you are like me, you might run into an access denied issue; this is normally caused by missing permissions for the user.
I fixed it by adding the following to the Action array:
"s3:GetObjectAcl",
"s3:PutObjectAcl"
In recent versions of boto3, ACL is available as a regular parameter on put_object, it seems, both when using the S3 client and the resource. You can just specify ACL="public-read" without having to wrap it in ExtraArgs or use the ObjectAcl API.
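A short sketch of that regular-parameter form using put_object, via both the client and the resource (the bucket and key names are placeholders, and the bucket must allow ACLs):
import boto3

# Client interface: ACL is passed directly to put_object.
s3 = boto3.client('s3')
s3.put_object(Bucket='my-bucket', Key='my-key', Body=b'data', ACL='public-read')

# Resource interface: same parameter on the bucket's put_object.
bucket = boto3.resource('s3').Bucket('my-bucket')
bucket.put_object(Key='my-key', Body=b'data', ACL='public-read')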
Set ACL="public-read" as mentioned above. Also, make sure your bucket policy's Resource line has both the bare ARN and the /* ARN formats; not having both can cause strange permissions problems.
...
"Resource": ["arn:aws:s3:::my_bucket/*", "arn:aws:s3:::my_bucket"]