In my current project, my objective is to access video files (in .mp4 format) from an AWS S3 bucket.
I have created an S3 bucket named videostreambucketpankesh. It is public, with the following permissions:
The Access Control List (ACL) of the videostreambucketpankesh bucket is as follows:
The bucket policy of the videostreambucketpankesh bucket is as follows:
Now the bucket “videostreambucketpankesh” contains several subfolders (strictly speaking, key prefixes — S3 has no real sub-buckets), including one named “video”. This folder contains some .mp4 files (as shown in the image below).
My problem is that some files (such as firetruck.mp4 and ambulance.mp4) can be accessed directly in the browser: when I click their object URLs, they play.
However, I am not able to play other .mp4 files (39cf9079-7b65-4aa8-8913-8a6b924021d3.mp4, 45fd1749-95aa-488c-ac2f-be8673b8416e.mp4, 8ba187f2-5148-49f6-9acc-2459e41f547b.mp4) in the browser when I click their object URLs.
Please note that I uploaded those three .mp4 files programmatically, using the following Python code:
def upload_to_s3(local_file, bucket, s3_file):
    # Upload the local file under the video/ prefix with the mp4 Content-Type
    with open(local_file, 'rb') as data:
        s3_client.put_object(Key="video/" + s3_file, Body=data,
                             ContentType='video/mp4', Bucket=bucket)
    print("Upload successful")
However, I am not able to play these .mp4 files in my Google Chrome browser (they do play in VLC player). Can you please suggest how I can resolve this issue?
Select the files and look at Properties / Metadata.
It should show Content-Type: video/mp4, like this:
When uploading via the browser, the metadata is automatically set based upon the filetype.
If you are uploading via your own code, you can set the metadata like this:
s3_client.upload_file('video.mp4', bucketname, key, ExtraArgs={'ContentType': "video/mp4"})
or
bucket.put_object(Key=key, Body=data, ContentType='video/mp4')
See: AWS Content Type Settings in S3 Using Boto3
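If you upload many different file types, you can derive the Content-Type the same way the console does, using Python's standard mimetypes module. A minimal sketch (the fallback default here matches what S3 assigns when no type is given, but treat it as an assumption):

```python
import mimetypes

def guess_content_type(filename, default="binary/octet-stream"):
    # Mirror the console's behaviour: infer the Content-Type from the extension.
    ctype, _ = mimetypes.guess_type(filename)
    return ctype or default

# Then pass the result into put_object()/upload_file(), e.g.
# s3_client.put_object(..., ContentType=guess_content_type("video/clip.mp4"))
```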
Related
I have written code that creates a folder, but how can I access that folder to write a CSV file into it?
# Creating folder on S3 for unmatched data
client = boto3.client('s3')
# Variables
target_bucket = obj['source_and_destination_details']['s3_bucket_name']
subfolder = obj['source_and_destination_details']['s3_bucket_uri-new_folder_path'] + obj['source_and_destination_details']['folder_name_for_unmatched_column_data']
# Create subfolder (objects)
client.put_object(Bucket=target_bucket, Key=subfolder)
The folder is created successfully by the above code, but how do I write a CSV file into it?
Below is the code I have tried, but it is not working:
# Writing csv on AWS S3
df.reindex(idx).to_csv(obj['source_and_destination_details']['s3_bucket_uri-write'] + obj['source_and_destination_details']['folder_name_for_unmatched_column_data'] + obj['source_and_destination_details']['file_name_for_unmatched_column_data'], index=False)
An S3 bucket is not a file system.
I assume that the to_csv() method is supposed to write to some sort of file system, but that is not how it works with S3. While there are solutions to mount S3 buckets as file systems, this is not the preferred way.
Usually, you would interact with S3 via the AWS REST APIs, the AWS CLI, or a client library such as Boto, which you’re already using.
So, in order to store your content on S3, you first create the file locally, e.g. in the system’s /tmp folder. Then use Boto’s put_object() method to upload the file. Remove it from your local storage afterwards.
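A minimal sketch of that flow, using only the standard library for the local step (bucket and key names are placeholders you would take from your config object):

```python
import csv
import os
import tempfile

def write_csv_locally(rows, filename="unmatched.csv"):
    # Step 1: materialise the CSV in the system temp folder.
    path = os.path.join(tempfile.gettempdir(), filename)
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
    return path

def upload_and_cleanup(path, bucket, key):
    # Steps 2-3: upload the local file with put_object(), then remove it.
    import boto3  # imported here so the local step runs without AWS access
    with open(path, "rb") as f:
        boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=f)
    os.remove(path)
```

With a pandas DataFrame, the csv.writer step would simply be df.to_csv(path, index=False); the upload step stays the same.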
I am doing a Django project. I have hosted my static files on Amazon S3, and they were uploaded successfully. But the images are not loading when I run the server.
When I inspect the image field, it shows:
https://django-ecommerce-files.s3.amazonaws.com/images/logo.png%22%20id=%22image%22%20style=%22width:%2040px;%20height:40px%22%3E
When I double-click it, it shows this error:
<Error>
<Code>AccessDenied</Code>
<Message>Access Denied</Message>
<RequestId>07PX6KHYASHT3008</RequestId>
<HostId>pJCxChq1JHlw/GL0Zy/W+PvX1TevOf/C60Huyidi8+0GMAs8geYlXSrEgo6m9vllL0PouTn6NAA=
</HostId>
</Error>
When working with an S3 bucket, you may need to make your resources (files) publicly accessible. You can do that either programmatically at the point of uploading to S3 or from the AWS console. Please check how you can enable public access to your files here
Make sure that you have changed the public access settings for the S3 bucket, such that it allows files to be accessed by your app (with the right credentials).
Your requirement may vary, so take a look at their user manual.
Check the Permissions tab under the bucket.
Or, you can also take a look at the actions allowed on your S3 bucket; it must be configured to allow read/write. Refer to the docs for a few examples
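For reference, a typical bucket policy that grants public read access to every object looks like the following (the bucket name is taken from the URL above and is an assumption; the bucket's Block Public Access settings must also be turned off for this to take effect):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::django-ecommerce-files/*"
    }
  ]
}
```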
I'm not sure if this works, but can you try enabling static hosting on your S3 bucket?
Go to your S3 bucket.
Go to Properties and scroll down to the bottom.
Enable static website hosting.
A .png file on S3 would look like this (the link works, by the way):
https://aws-cicd-react.s3.ap-southeast-1.amazonaws.com/logo512.png
Addendum: if you want to see the URL of your file, click the file in S3, go to Properties, and look at the Object URL.
I have a local .exe file and I want to make it available by https so everyone can download it.
Example: "download my app here: https://look_how_downloadable_i_am.exe"
If I could update the file both with Python and manually through an interface, it would be perfect (the ability to automate the process, while keeping it simple when done manually)!
It may be possible with AWS S3 and/or Lambda.
The most straightforward way would be using an s3 bucket to enable downloads to the file.
Steps are:
Upload file to the bucket
Select the file after it gets uploaded, press Actions, and select Make public
This will make the file publicly downloadable through its unique link. In order to use your own custom domain and link, you will have to use CloudFront, as @jordanm suggested.
You can also use a Python script to update or download your file; you can find demo code and documentation in Reference 3.
Reference 1: How to create download link for an Amazon S3 bucket's object?
Reference 2: https://aws.amazon.com/premiumsupport/knowledge-center/read-access-objects-s3-bucket/
Reference 3: https://docs.aws.amazon.com/code-samples/latest/catalog/code-catalog-python-example_code-s3.html
You can use boto3 to programmatically upload a local file to a bucket, then just edit the bucket's permissions to allow public read. Or, instead of editing the bucket's permissions, edit the ACL when uploading the file: s3.upload_file(upload_path, "bucket-name", file_key, ExtraArgs={'ACL': "public-read"})
where upload_path is the local file path and file_key is the object name.
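Putting the pieces together, a small sketch (the bucket and key names here are hypothetical) that uploads or overwrites the .exe and returns its public link:

```python
def object_url(bucket, key):
    # The public URL an object gets once it is readable.
    return f"https://{bucket}.s3.amazonaws.com/{key}"

def publish_exe(local_path, bucket="my-downloads-bucket", key="myapp.exe"):
    # Upload (or overwrite) the installer and make it publicly readable.
    import boto3  # imported here so object_url stays usable without boto3
    boto3.client("s3").upload_file(
        local_path, bucket, key,
        ExtraArgs={"ACL": "public-read",
                   "ContentType": "application/octet-stream"},
    )
    return object_url(bucket, key)
```

Re-running publish_exe() with the same key overwrites the object in place, so the download link never changes.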
I have a Chalice application with a defined lambda_handler that is triggered by S3 event notifications. Every time an image is created in my S3 bucket, the lambda_handler function is invoked to create thumbnails. But when images are uploaded to S3 using presigned URLs, the uploaded file does not have a file extension.
The files on s3 look like this:
Now, when using Pillow, an “unknown file extension” error is thrown.
How should I go about this?
Do you have access to the function that is in charge of performing the image upload?
If you're using a pre-signed post to perform the image upload, you should also specify the file extension within the object_name parameter.
response = s3_client.generate_presigned_post(
    BUCKET_NAME,
    "5eafba9fa31dd3bcc190a52.jpg",
    Fields=fields,
    Conditions=conditions,
    ExpiresIn=expiration)
This will cause images to be uploaded with their proper extensions, therefore allowing any subsequent invocation to have the proper file extension.
If you only have access to your Chalice application, and there is any guarantee that the file extension will always be of a certain type, you can append the extension prior to using Pillow.
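A small sketch of that idea (the default .jpg extension is an assumption, and only safe if all uploads are guaranteed to be JPEGs):

```python
import os

def ensure_extension(key, default_ext=".jpg"):
    # Append a default extension when the S3 key has none,
    # so Pillow (and anything else) can infer the format.
    _, ext = os.path.splitext(key)
    return key if ext else key + default_ext
```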
I need to upload a CSS file and a JS file to S3 and use them as static resources. If I upload them via the web from the S3 page, it works. But if I upload them via a Python script, the files are uploaded, yet the CSS seems not to work at all.
Here is my python code,
s3 = boto3.resource('s3')
s3.meta.client.upload_file('sample.css', 'mybucket', 'sample_dir/sample.css', {'ACL': 'public-read'})
The notable condition here is that files uploaded through the console are correctly used by the browser, but files uploaded through the API are not.
The AWS/S3 console, by default, automatically sets the Content-Type: for many uploaded file types, (for CSS, this should probably be text/css)... but the API requires it be set by your code.
But, you don't appear to be setting it, so the browser may be refusing to use the css, even if it downloads successfully, because the Content-Type: response header contains an unexpected value.
(The browser dev tools/console should show an error or warning to confirm this).
Mentioning the file type solved this issue.
s3 = boto3.resource('s3')
s3.meta.client.upload_file('sample.css', 'mybucket', 'sample_dir/sample.css', {'ACL': 'public-read', 'ContentType': 'text/css'})