I'm trying to upload a .wav file to S3.
My current put_object code looks like this:
client.put_object(
    ACL='public-read',
    Bucket='test',
    Key='test_folder/' + file_name,
    ContentType='audio/x-wav',
    Body='test_folder/music/' + file_name,
    StorageClass='STANDARD_IA'
)
I'm uploading it successfully, but when I try to download and play it, I get an error:
Could not determine type of stream
What arguments do I need to pass to tell S3 that this is a .wav sound file? When I tried to upload it with boto.s3, using bucket.new_key(key_name) and set_contents_from_file, it worked perfectly, but I couldn't set the StorageClass with that function.
Thanks.
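The likely culprit: Body here is the path string itself, not the file's contents, so S3 stores the literal text "test_folder/music/..." as the object. A minimal sketch of passing the actual file instead (assuming it lives at that local path):

# Pass the file's bytes (or an open file object) as Body, not a path string.
with open('test_folder/music/' + file_name, 'rb') as wav:
    client.put_object(
        ACL='public-read',
        Bucket='test',
        Key='test_folder/' + file_name,
        ContentType='audio/x-wav',
        Body=wav,
        StorageClass='STANDARD_IA'
    )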
I was trying to open a file/image in Python/Django and upload it to S3, but I get different errors depending on what I try. It works when I send the image from the front-end HTML form, but not when I open the file on the back end; then I get errors such as "'bytes' object has no attribute 'file'". Any ideas how to open an image and upload it to S3? I wasn't sure if I was using the correct upload function, since it worked when I received the file from an HTML form but not when opening it directly.
image = open(fileURL, encoding="utf-8")

S3_BUCKET = settings.AWS_BUCKET

session = boto3.Session(
    aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
)
s3 = session.resource('s3')
s3.Bucket(S3_BUCKET).put_object(Key='folder/%s' % fileName, Body=image)
Thanks.
The open() call returns a file object, and here it is opened in text mode with encoding="utf-8"; an image is raw binary data, not UTF-8 text, so Body=image does not carry the actual contents of the object.
Since you want to upload an existing file, you could use:
Key = 'folder/' + fileName
s3.Object(S3_BUCKET, Key).upload_file(fileURL)
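Alternatively, if you want to keep put_object, open the file in binary mode yourself and pass the file object as Body (a sketch, assuming fileURL is a local path):

# Binary mode matters: an image is raw bytes, not UTF-8 text.
with open(fileURL, 'rb') as image:
    s3.Bucket(S3_BUCKET).put_object(Key='folder/%s' % fileName, Body=image)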
I am using Paramiko to access a remote SFTP folder, and I'm trying to write code that transfers files from an SFTP path (with simple logic that uses the file metadata to check its last-modified date) to an AWS S3 bucket.
I have set up the connection to S3 using Boto3, but I still can't seem to write working code that transfers the files without downloading them to a local directory first. Here is some code I tried using Paramiko's getfo() method, but it doesn't work.
for f in files:
    # Get the last-modified date from the file metadata.
    last_modified = sftp.stat(remote_path + f).st_mtime
    last_modified_date = datetime.fromtimestamp(last_modified).date()
    if last_modified_date > date_limit:  # check against the limit
        print('getting ' + f)
        full_path = f"{folder_path}{f}"
        fo = sftp.getfo(remote_path + f, f)
        s3_conn.put_object(Body=fo, Bucket=s3_bucket, Key=full_path)
Thank you!
Use Paramiko SFTPClient.open to get a file-like object that you can pass to Boto3 Client.put_object:
with sftp.open(remote_path + f, "r") as fl:
    fl.prefetch()
    s3_conn.put_object(Body=fl, Bucket=s3_bucket, Key=full_path)
For the purpose of the fl.prefetch() call, see Reading file opened with Python Paramiko SFTPClient.open method is slow.
For the opposite direction, see:
Transfer file from AWS S3 to SFTP using Boto 3
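Folded into the loop from the question (reusing its variable names), the whole transfer would look roughly like this:

# Sketch: stream each recent file from SFTP straight into S3.
for f in files:
    last_modified = sftp.stat(remote_path + f).st_mtime
    if datetime.fromtimestamp(last_modified).date() > date_limit:
        full_path = f"{folder_path}{f}"
        with sftp.open(remote_path + f, "r") as fl:
            fl.prefetch()
            s3_conn.put_object(Body=fl, Bucket=s3_bucket, Key=full_path)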
My code adds a watermark to a video and should upload that video directly to S3.
I am able to add the watermark to the video; uploading it directly to S3 still needs to be done.
import moviepy.editor as mp
import boto3

AWS_ACCESS_KEY_ID = "aws_key_id"
AWS_SECRET_ACCESS_KEY = "aws_secret_access_key"
s3_resource = boto3.resource("s3")
BUCKET = "bucket_name"

video = mp.VideoFileClip('path_of_video_file_stored')
logo = (mp.ImageClip('logo_file_stored')
        .set_duration(video.duration))
final = mp.CompositeVideoClip([video, logo])
final.subclip(0).write_videofile('localpath/filename.mp4')
Instead of writing to a local path with final.subclip(0).write_videofile('localpath/filename.mp4'),
I need to write the file directly to S3, rather than first writing it locally and then uploading. How can I write directly to S3 from the code above?
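One caveat: write_videofile() hands the output path to ffmpeg, so it needs a real filename rather than an S3 stream. A common workaround is to render to a temporary file and upload that; a minimal sketch, where 'watermarked/filename.mp4' is a placeholder key:

import os
import tempfile

# Render to a temp file (ffmpeg needs a real path), then upload it.
with tempfile.NamedTemporaryFile(suffix='.mp4', delete=False) as tmp:
    tmp_path = tmp.name
try:
    final.subclip(0).write_videofile(tmp_path)
    s3_resource.Bucket(BUCKET).upload_file(tmp_path, 'watermarked/filename.mp4')
finally:
    os.remove(tmp_path)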
In my Flask application, I am using a function to upload files to Amazon S3, using boto.
It works fine most of the time, but sometimes it uploads files as zero-byte files with no extension.
Why does it fail sometimes?
I am validating the user's image file in the form:
FileField('Your photo',validators=[FileAllowed(['jpg', 'png'], 'Images only!')])
My image upload function:
def upload_image_to_s3(image_from_form):
    # Upload pic to Amazon S3.
    source_file_name_photo = secure_filename(image_from_form.filename)
    source_extension = os.path.splitext(source_file_name_photo)[1]
    destination_file_name_photo = uuid4().hex + source_extension
    s3_file_name = destination_file_name_photo
    # Connect to S3 and upload the file.
    conn = boto.connect_s3('ASJHjgjkhSDJJHKJKLSDH', 'GKLJHASDJGFAKSJDGJHASDKJKJHbbvhjcKJHSD')
    b = conn.get_bucket('mybucket')
    sml = b.new_key("/".join(["myfolder", destination_file_name_photo]))
    sml.set_contents_from_string(image_from_form.read())
    sml.set_acl('public-read')
    return s3_file_name
How large are your assets? If the upload is too large, you may have to multipart/chunk it, otherwise it will time out.
bucketObject.initiate_multipart_upload('/local/object/as/file.ext')
This means you will not be using set_contents_from_string but rather storing and uploading the file in parts. You may have to use something to chunk the file, like FileChunkIO.
An example is here if this applies to you: http://www.bogotobogo.com/DevOps/AWS/aws_S3_uploading_large_file.php
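For reference, a rough sketch of what that looks like with boto's multipart API and FileChunkIO (the path is a placeholder, and b is the bucket from the code above):

import math
import os
from filechunkio import FileChunkIO

source_path = '/local/object/as/file.ext'  # placeholder
source_size = os.stat(source_path).st_size
chunk_size = 52428800  # 50 MB parts
chunk_count = int(math.ceil(source_size / float(chunk_size)))

# Upload each chunk as a numbered part, then complete the upload.
mp = b.initiate_multipart_upload(os.path.basename(source_path))
for i in range(chunk_count):
    offset = chunk_size * i
    nbytes = min(chunk_size, source_size - offset)
    with FileChunkIO(source_path, 'r', offset=offset, bytes=nbytes) as fp:
        mp.upload_part_from_file(fp, part_num=i + 1)
mp.complete_upload()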
Also, you may want to edit your post above and redact your AWS keys.
I'm attempting to save an image to S3 using boto. It does save a file, but it doesn't appear to save it correctly. If I try to open the file in S3, it just shows a broken image icon. Here's the code I'm using:
# Get and verify the file
file = request.FILES['file']
try:
    img = Image.open(file)
except:
    return api.error(400)

# Determine a filename
filename = file.name

# Upload to AWS and register
s3 = boto.connect_s3(aws_access_key_id=settings.AWS_KEY_ID,
                     aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY)
bucket = s3.get_bucket(settings.AWS_BUCKET)
f = bucket.new_key(filename)
f.set_contents_from_file(file)
I've also tried replacing the last line with:
f.set_contents_from_string(file.read())
But that didn't work either. Is there something obvious I'm missing here? I'm aware django-storages has a boto backend, but because of the complexity of this model, I don't want to use forms with django-storages.
In case you don't want to go with django-storages and just want to upload a few files to S3 rather than all files, here is the code:
import boto3

file = request.FILES['upload']
s3 = boto3.resource('s3',
                    aws_access_key_id=settings.AWS_ACCESS_KEY,
                    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY)
bucket = s3.Bucket('bucket-name')
bucket.put_object(Key=filename, Body=file)
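One more thing worth ruling out: Image.open(file) in the question's code consumes the upload stream, so a later read starts at the end of the file and stores nothing, which would explain the broken image. Rewinding first is cheap insurance:

# Rewind the stream in case something (e.g. Image.open) already read it.
file.seek(0)
bucket.put_object(Key=filename, Body=file)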
You should use django-storages, which uses boto internally.
You can either swap out the default FileSystemStorage, or create a new storage instance and manually save files. Based on your code example, I guess you really want to go with the first option.
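The first option is mostly configuration; a minimal sketch of the relevant settings, assuming django-storages with its boto3 backend is installed (the bucket name is a placeholder):

# settings.py -- sketch, assuming django-storages' S3 backend
import os

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_STORAGE_BUCKET_NAME = 'bucket-name'  # placeholder
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']

With that in place, any FileField/ImageField save goes to S3 without touching the view code.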
Please consider using Django's Form instead of accessing the request directly.