I am going through a bootcamp and one project is to create a bucket in AWS using a Python script. I keep getting an Access Denied error and I'm not sure why. Any help would be much appreciated.
@samrah, there are many different scenarios behind Access Denied.
The main one: your admin (if someone else manages the AWS account and granted you access) did not give you permission to create a bucket.
The other likely cause is an issue with your access keys.
It's always a good idea to share a code snippet, so the community can help more easily.
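For reference, a minimal bucket-creation sketch with boto3 looks like this (the bucket name and region are placeholders); if this raises AccessDenied, the problem is almost always the caller's IAM policy missing s3:CreateBucket rather than the code itself:

import boto3

# Placeholder region; note that for us-east-1 you must omit
# CreateBucketConfiguration entirely.
s3 = boto3.client('s3', region_name='us-east-2')
s3.create_bucket(
    Bucket='my-unique-bucket-name',  # placeholder; bucket names are globally unique
    CreateBucketConfiguration={'LocationConstraint': 'us-east-2'},
)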
For my current Python project I'm using the Microsoft Azure SDK for Python.
I want to copy a specific blob from one container path to another and have already tested some options, described here.
Overall they basically "work", but unfortunately the new_blob.start_copy_from_url(source_blob_url) command always leads to an error: ErrorCode:CannotVerifyCopySource.
Is anyone getting the same error message here, or has an idea how to solve it?
I also tried passing the source_blob_url with a SAS token, but it still doesn't work. I have the feeling there is some connection to the access levels of the storage account, but so far I haven't been able to figure it out. Hopefully someone here can help me.
As you mentioned, you might be receiving this error due to missing permissions on the SAS token.
The difference from my code was that I used the blob storage sas_token from the Azure portal instead of generating it directly for the blob client with the Azure SDK.
To allow access to certain areas of your storage account, a SAS is generated with a set of constraints such as permissions (read/write), allowed services, resource types, start and expiry date/time, allowed IP addresses, and so on.
You don't always need to generate the SAS directly for the blob client in code; you can generate one from the portal too, as long as you grant the required permissions.
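As a minimal sketch (account, container, and blob names are placeholders), generating a read SAS for the source blob in code and using it as the copy source looks roughly like this:

from datetime import datetime, timedelta
from azure.storage.blob import BlobServiceClient, BlobSasPermissions, generate_blob_sas

account_name = "mystorageaccount"  # placeholder
account_key = "<account-key>"      # placeholder
container = "source-container"
blob_name = "path/to/source.txt"

# SAS granting read access to the single source blob for one hour
sas_token = generate_blob_sas(
    account_name=account_name,
    container_name=container,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)

source_blob_url = f"https://{account_name}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"

service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,
)
new_blob = service.get_blob_client(container="dest-container", blob="path/to/dest.txt")
new_blob.start_copy_from_url(source_blob_url)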
REFERENCE: Grant limited access to Azure Storage resources using SAS (Microsoft documentation)
I am kind of stuck right now.
Basically, I am trying to retrieve files from an S3 bucket with boto3.
It all used to work, but now the administrators of the S3 bucket have revoked permissions on the parent folders, because the bucket is shared with a lot of companies whose data we must not access (for obvious reasons).
Basically this is how we used to get to our files:
objects = s3_bucket.objects.filter(Prefix=s3_folder)
And then we just iterated over those files as shown in the documentation:
for obj in objects:
    ...  # process each object here
But now this just throws an error:
An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
What I am unable to figure out now is how I can connect to an S3 bucket but only retrieve the object list starting from a specific path.
Let's say my files are under /all-data/reports/company-2/data/csv/year-month-day/*****.csv
I know the entire path from /all-data/ to /csv/, but I need to list the contents of the csv folder (the year-month-day folders). Also, I can only retrieve the object list starting from /company-2.
I cannot find any function in the boto3 documentation that lets me either connect directly to a subfolder of the bucket or otherwise navigate through the bucket down to /csv, where I could start listing objects.
If any of you could help me out, I would really appreciate it; I could not find anyone else facing this problem yet :)
So basically it seems like my only option here is to get the admin to set up an IAM rule, as explained here: https://www.reddit.com/r/aws/comments/ofh4dp/navigate_to_subfolders_in_s3_with_boto3_without/h4ceohy?utm_source=share&utm_medium=web2x&context=3
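For reference, once the admin's policy allows s3:ListBucket for the deeper prefix, listing only the date folders under the known path can look roughly like this (the bucket name is a placeholder):

import boto3

s3 = boto3.client('s3')
resp = s3.list_objects_v2(
    Bucket='my-bucket',                             # placeholder
    Prefix='all-data/reports/company-2/data/csv/',  # start listing from here
    Delimiter='/',                                  # group keys by "subfolder"
)

# CommonPrefixes holds the year-month-day "folders" under the csv prefix
for cp in resp.get('CommonPrefixes', []):
    print(cp['Prefix'])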
Within a framework, I am building some functions that run on the main FaaS providers (AWS, GCP, Azure, Alicloud). The main function is essentially an elif chain on an environment variable, deciding which function to call ("do stuff on AWS", "do stuff on GCP", etc.). The functions essentially just read from the appropriate database (AWS -> DynamoDB, GCP -> Firestore, Azure -> Cosmos).
When uploading my zip to Google Cloud Functions through their web portal, I get the following error:
Function failed on loading user code. Error message: You must specify a region.
I'm concerned it has something to do with my Pipfile.lock and a clash with the AWS dependencies, but I'm not sure. I cannot find anywhere online where someone has had this error message with GCP (certainly not through the online console); I only see results for this error with AWS.
My requirements.txt file is simply:
google-cloud-firestore==1.4.0
The Pipfile.lock contains the Google requirements but doesn't state the region anywhere. However, when using the GCP console, it automatically uploads to us-central1.
Found the answer in a Google Group. If anyone else has this problem, it's because you're importing boto and uploading to GCP. GCP says it's boto's fault. So you can either split up your code so that you only bring in the necessary GCP files, or wrap your imports in ifs based on environment variables (a sketch of the latter follows below).
The response from the GCP Product Manager was: "Hi all -- closing this out. Turns out this wasn't an issue in Cloud Functions/gcloud. The error was one emitted by the boto library: "You must specify a region.". This was confusing because the concept of region applies to AWS and GCP. We're making a tweak to our error message so that this should hopefully be a little more obvious in the future."
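A minimal sketch of the import-wrapping workaround (the PROVIDER variable name and handler shape are assumptions, not from the original code):

import os

PROVIDER = os.environ.get('PROVIDER', 'gcp')

def read_record(key):
    if PROVIDER == 'aws':
        import boto3  # only imported when actually running on AWS
        table = boto3.resource('dynamodb').Table('my-table')  # placeholder table
        return table.get_item(Key={'id': key}).get('Item')
    elif PROVIDER == 'gcp':
        from google.cloud import firestore  # keeps boto out of the GCP deployment
        db = firestore.Client()
        return db.collection('my-collection').document(key).get().to_dict()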
import boto3

def getthings():
    client = boto3.client('iot', region_name='name')
    response = client.list_things(
        nextToken='string',
        maxResults=123,
        attributeName='string',
        attributeValue='string',
        thingTypeName='string',
    )
I am a beginner in Python. The code above is from the AWS documentation and should list the things in AWS IoT, but I get the following error:
InvalidRequestException: An error occurred (InvalidRequestException)
when calling the ListThings operation.
What is the problem?
Though there are not many details to this exception, the following steps could help resolve it faster.
You can try with the Python console first.
Make sure you have the AWS CLI installed.
Run aws configure to set up your access key and secret key.
By the way, the following code worked to determine the list of things; note that it drops the literal placeholder values (such as nextToken='string') copied from the documentation example:
client = boto3.client('iot')
response = client.list_things(maxResults=123, thingTypeName='appropriate thing type')
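If there are more things than maxResults, a paginated loop along these lines (assuming default credentials and region are configured) collects them all:

import boto3

client = boto3.client('iot')
things = []
kwargs = {'maxResults': 123}
while True:
    resp = client.list_things(**kwargs)
    things.extend(resp.get('things', []))
    token = resp.get('nextToken')
    if not token:
        break
    kwargs['nextToken'] = token  # pass the real token, not a placeholder
print(len(things), 'things found')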
Hi, I am trying to create a Lambda function in AWS Lambda, just following the Python tutorial.
After following all the steps, I get a "Service Error" when creating the function.
I found a forum discussion about this: link to the forum.
I checked all the environments, but it still gives me the service error.
Is it a configuration issue or a problem with the account?
I am totally new to AWS and would appreciate any help with this.
I was having this problem, but it disappeared when I selected "Choose an existing role" instead of letting the wizard create a role under "Lambda function handler and role". If I create a role manually, I can then assign it to the Lambda afterwards. Maybe that'll help you, too?
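The same workaround also works outside the wizard; a hypothetical sketch with boto3 (the function name, role ARN, and zip path are placeholders) that creates the function with a pre-existing execution role:

import boto3

lam = boto3.client('lambda')
with open('function.zip', 'rb') as f:  # placeholder deployment package
    lam.create_function(
        FunctionName='my-function',                                  # placeholder
        Runtime='python3.9',
        Role='arn:aws:iam::123456789012:role/existing-lambda-role',  # pre-created role
        Handler='lambda_function.lambda_handler',
        Code={'ZipFile': f.read()},
    )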
I had this problem, too. In my case, I had "Enable active tracing" selected. I un-checked that option and it got rid of the error. Maybe I had a permissions problem with X-Ray? Who knows; I didn't investigate further.