I'm attempting to save an image to S3 using boto. It does save a file, but it doesn't appear to save it correctly. If I try to open the file in S3, it just shows a broken image icon. Here's the code I'm using:
# Get and verify the file
file = request.FILES['file']
try:
    img = Image.open(file)
except:
    return api.error(400)
# Determine a filename
filename = file.name
# Upload to AWS and register
s3 = boto.connect_s3(aws_access_key_id=settings.AWS_KEY_ID,
                     aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY)
bucket = s3.get_bucket(settings.AWS_BUCKET)
f = bucket.new_key(filename)
f.set_contents_from_file(file)
I've also tried replacing the last line with:
f.set_contents_from_string(file.read())
But that didn't work either. Is there something obvious that I'm missing here? I'm aware django-storages has a boto backend, but because of complexity with this model, I do not want to use forms with django-storages.
In case you don't want to go for django-storages and just want to upload a few files to S3 rather than all files, below is the code:
import boto3
file = request.FILES['upload']
filename = file.name
s3 = boto3.resource('s3', aws_access_key_id=settings.AWS_ACCESS_KEY,
                    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY)
bucket = s3.Bucket('bucket-name')
bucket.put_object(Key=filename, Body=file)
You should use django-storages which uses boto internally.
You can either swap the default FileSystemStorage, or create a new storage instance and manually save files. Based on your code example I guess you really want to go with the first option.
Please consider using Django's Form instead of accessing the request directly.
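For the first option, the swap is mostly configuration. A minimal sketch, assuming the django-storages package with its boto3 backend is installed (all values are placeholders):

```python
# settings.py -- placeholder values; assumes django-storages is installed
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
AWS_ACCESS_KEY_ID = '...'
AWS_SECRET_ACCESS_KEY = '...'
AWS_STORAGE_BUCKET_NAME = 'my-bucket'
```

With this in place, every FileField and default_storage.save() call writes to S3 instead of the local filesystem, and no form is required.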
I was trying to open a file/image in Python/Django and upload it to S3, but I get different errors depending on what I try. I can get it to work when I send the image using the front-end HTML form, but not when opening the file on the back end. I get errors such as "'bytes' object has no attribute 'file'". Any ideas how to open an image and upload it to S3? I wasn't sure if I was using the correct upload function, but it worked when I received the file from an HTML form instead of opening it directly.
image = open(fileURL, encoding="utf-8")
S3_BUCKET = settings.AWS_BUCKET
session = boto3.Session(
    aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
)
s3 = session.resource('s3')
s3.Bucket(S3_BUCKET).put_object(Key='folder/%s' % fileName, Body=image)
Thanks.
The open command returns a file object, and with encoding="utf-8" it opens the file in text mode, which cannot correctly represent binary image data. Therefore Body=image does not contain the actual contents of the object.
Since you want to upload an existing file from disk, you could use:
Key = 'folder/' + fileName
s3.Object(S3_BUCKET, Key).upload_file(fileURL)
I have written code on my backend (hosted on Elastic Beanstalk) to retrieve a file from an S3 bucket and save it back to the bucket under a different name. I am using boto3 and have created an s3 client called 's3'.
bucketname is the name of the bucket and keyname is the name of the key. I am also using the tempfile module.
tmp = tempfile.NamedTemporaryFile()
with open(tmp.name, 'wb') as f:
    s3.download_fileobj(bucketname, keyname, f)
    s3.upload_file(tmp, bucketname, 'fake.jpg')
I was wondering if my understanding was off (still debugging why there is an error) - I created a tempfile and opened and saved within it the contents of the object with the keyname and bucketname. Then I uploaded that temp file to the bucket under a different name. Is my reasoning correct?
The upload_file() command is expecting a filename (as a string) in the first parameter, not a file object.
Instead, you should use upload_fileobj().
However, I would recommend something different...
If you simply wish to make a copy of an object, you can use copy_object:
response = client.copy_object(
    Bucket='destinationbucket',
    CopySource='/sourcebucket/HappyFace.jpg',
    Key='HappyFaceCopy.jpg',
)
I am currently uploading images to a local folder, like site/uploads.
After searching, I found that to upload images to Amazon S3 I have to do this:
import boto3
s3 = boto3.resource('s3')
# Get list of objects for indexing
images = [('image01.jpeg', 'Albert Einstein'),
          ('image02.jpeg', 'Candy'),
          ('image03.jpeg', 'Armstrong'),
          ('image04.jpeg', 'Ram'),
          ('image05.jpeg', 'Peter'),
          ('image06.jpeg', 'Shashank')]

# Iterate through list to upload objects to S3
for image in images:
    file = open(image[0], 'rb')
    object = s3.Object('rekognition-pictures', 'index/' + image[0])
    ret = object.put(Body=file,
                     Metadata={'FullName': image[1]})
Clarification
It's my code to send images and names to S3. But I don't know how to get the image in this line of code, images=[('image01.jpeg','Albert Einstein'), — how can I get this image in this code from /upload/image01.jpeg? And secondly, how can I get images from S3 and show them on my website's image page?
I know your question is specific to boto3, so you might not like my answer, but it will achieve the same outcome, and the aws-cli itself makes use of boto3.
See here: http://bigdatums.net/2016/09/17/copy-local-files-to-s3-aws-cli/
This example is from the site and could easily be used in a script:
#!/bin/bash
#copy all files in my-data-dir into the "data" directory located in my-s3-bucket
aws s3 cp my-data-dir/ s3://my-s3-bucket/data/ --recursive
First of all, the code snippet you are showing as a reference is not for your use case: I wrote that snippet for batch uploads with boto3, where you provide the image paths in your script along with metadata for each image, so the names in your snippet are metadata. From what I understand of your question, you want the files in a local folder to be uploaded with custom names provided before uploading, so this is how you can do that:
import os
import boto3

s3 = boto3.resource('s3')

directory_in_str = "E:\\streethack\\hold"
directory = os.fsencode(directory_in_str)

for file in os.listdir(directory):
    filename = os.fsdecode(file)
    if filename.endswith(".jpeg") or filename.endswith(".jpg") or filename.endswith(".png"):
        strg = directory_in_str + '\\' + filename
        print(strg)
        print("Enter name for your image : ")
        inp_val = input()
        strg2 = inp_val + '.jpeg'
        file = open(strg, 'rb')
        object = s3.Object('mausamrest', 'test/' + strg2)  # mausamrest is the bucket
        object.put(Body=file, ContentType='image/jpeg', ACL='public-read')
    else:
        continue
Programmatically, you have to provide the path of the folder, which is hard-coded in this example in the directory_in_str variable. The code then iterates over each file in the folder looking for images, asks you to input a custom name, and uploads the file.
Moreover, you want to show these images on your website, so public-read has been turned on for the images using an ACL, which lets you use the S3 links directly to embed images in your web pages, like this one:
https://s3.amazonaws.com/mausamrest/test/jkl.jpeg
The file above is the one I used to test this code snippet; your images will be available in the same way. Make sure you change the bucket name. :)
Using the Resource method:
# Iterate through list to upload objects to S3
bucket = s3.Bucket('rekognition-pictures')
for image in images:
    bucket.upload_file(Filename='/upload/' + image[0],
                       Key='index/' + image[0],
                       # custom metadata must be nested under 'Metadata' in ExtraArgs
                       ExtraArgs={'Metadata': {'FullName': image[1]}})
Using the client method:
import boto3
client = boto3.client('s3')
...
# Iterate through list to upload objects to S3
for image in images:
    client.upload_file(Filename='/upload/' + image[0],
                       Bucket='rekognition-pictures',
                       Key='index/' + image[0],
                       # custom metadata must be nested under 'Metadata' in ExtraArgs
                       ExtraArgs={'Metadata': {'FullName': image[1]}})
In my flask application, I am using a function to upload file to Amazon s3, using Boto.
It's working fine in most cases, but sometimes it uploads files as zero-byte files with no extension.
Why is it failing sometimes?
I am validating the user image file in the form:
FileField('Your photo',validators=[FileAllowed(['jpg', 'png'], 'Images only!')])
My image upload function.
def upload_image_to_s3(image_from_form):
    # upload pic to amazon
    source_file_name_photo = secure_filename(image_from_form.filename)
    source_extension = os.path.splitext(source_file_name_photo)[1]
    destination_file_name_photo = uuid4().hex + source_extension
    s3_file_name = destination_file_name_photo
    # Connect to S3 and upload file.
    conn = boto.connect_s3('ASJHjgjkhSDJJHKJKLSDH', 'GKLJHASDJGFAKSJDGJHASDKJKJHbbvhjcKJHSD')
    b = conn.get_bucket('mybucket')
    sml = b.new_key("/".join(["myfolder", destination_file_name_photo]))
    sml.set_contents_from_string(image_from_form.read())
    acl = 'public-read'
    sml.set_acl(acl)
    return s3_file_name
How large are your assets? If the upload is too large, you may have to multipart/chunk it, otherwise it will time out.
bucketObject.initiate_multipart_upload('/local/object/as/file.ext')
This means you will not be using set_contents_from_string, but rather storing and uploading in parts. You may have to use something to chunk the file, like FileChunkIO.
An example is here if this applies to you : http://www.bogotobogo.com/DevOps/AWS/aws_S3_uploading_large_file.php
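That page uses boto 2's initiate_multipart_upload together with FileChunkIO. If you are on boto3 instead, roughly the same flow with the low-level client looks like the sketch below (function names are mine; note that S3 requires every part except the last to be at least 5 MB):

```python
import math
import os

def plan_parts(total_size, part_size=50 * 1024 * 1024):
    """Split total_size bytes into (offset, length) chunks of at most part_size."""
    count = math.ceil(total_size / part_size)
    return [(i * part_size, min(part_size, total_size - i * part_size))
            for i in range(count)]

def multipart_upload(client, bucket, key, path, part_size=50 * 1024 * 1024):
    """Upload path to bucket/key in parts, using a boto3 low-level client."""
    upload = client.create_multipart_upload(Bucket=bucket, Key=key)
    parts = []
    with open(path, 'rb') as f:
        for num, (offset, length) in enumerate(
                plan_parts(os.path.getsize(path), part_size), start=1):
            f.seek(offset)
            resp = client.upload_part(Bucket=bucket, Key=key, PartNumber=num,
                                      UploadId=upload['UploadId'],
                                      Body=f.read(length))
            parts.append({'PartNumber': num, 'ETag': resp['ETag']})
    client.complete_multipart_upload(
        Bucket=bucket, Key=key, UploadId=upload['UploadId'],
        MultipartUpload={'Parts': parts})
```

boto3's higher-level upload_file() already splits large files into parts automatically, so the manual version is mainly useful when you need control over individual parts.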
Also, you may want to edit your post above and alter your AWS keys.
In my Django project I use django-storages to save media files to my Amazon S3.
I followed this tutorial (I also use Django REST framework). This works well for me: I can upload some images and I can see them in my S3 storage.
But if I try to remove an instance of my model (which contains an ImageField), this does not remove the corresponding file in S3. Is this correct? I need to also remove the resource in S3.
Deleting a record will not automatically delete the file in the S3 Bucket. In order to delete the S3 resource you need to call the following method on your file field:
model.filefield.delete(save=False) # delete file in S3 storage
You can perform this either in
The delete method of your model
A pre_delete signal
Here is an example of how you can achieve this in the delete model method:
def delete(self):
    self.filefield.delete(save=False)
    super().delete()
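The pre_delete variant could look like the following sketch (the receiver name is mine; the connect call is shown commented out because it needs your concrete model):

```python
def delete_file_on_pre_delete(sender, instance, **kwargs):
    """pre_delete receiver: remove the file from S3 before the row is deleted."""
    instance.filefield.delete(save=False)

# Wire it up, e.g. in your AppConfig.ready():
# from django.db.models.signals import pre_delete
# pre_delete.connect(delete_file_on_pre_delete, sender=MyModel)
```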
You can delete S3 files by providing their id (the filename/key in the S3 storage) using the following code:
import boto
from boto.s3.key import Key
from django.conf import settings
def s3_delete(id):
    s3conn = boto.connect_s3(settings.AWS_ACCESS_KEY,
                             settings.AWS_SECRET_ACCESS_KEY)
    bucket = s3conn.get_bucket(settings.S3_BUCKET)
    k = Key(bucket)
    k.key = str(id)
    k.delete()
Make sure that you set up the S3 variables correctly in settings.py, including AWS_ACCESS_KEY, AWS_SECRET_ACCESS_KEY, and S3_BUCKET.
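For example, the corresponding entries in settings.py (placeholder values):

```python
# settings.py -- placeholder values
AWS_ACCESS_KEY = '...'
AWS_SECRET_ACCESS_KEY = '...'
S3_BUCKET = 'my-bucket-name'
```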
This works for me in AWS S3, hope it helps:
import os
from django.db import models
from django.dispatch import receiver

@receiver(models.signals.post_delete, sender=YourModelName)
def auto_delete_file_on_delete(sender, instance, **kwargs):
    if instance.image:
        instance.image.delete(save=False)  # use for aws s3
        # if os.path.isfile(instance.image.path):  # use this in development
        #     os.remove(instance.image.path)
I ended up adding an action to the Django admin panel, since in my case I don't remove files frequently.
If you want to delete files using the API, you may write your own destroy() in your serializer.
BUCKET_NAME = os.environ.get("AWS_STORAGE_BUCKET_NAME")
s3 = boto3.client('s3')

class UserFileAdmin(admin.ModelAdmin):
    list_display = ('file',)  # note the trailing comma: list_display must be a tuple or list
    actions = ['delete_completely']

    def delete_completely(self, request, queryset):
        for filemodel in queryset:
            s3.delete_object(Bucket=BUCKET_NAME, Key=str(filemodel.file))
            filemodel.delete()
    delete_completely.short_description = 'Delete pointer and real file together'