I generate a file in Python and want to "upload" that file to the Django database. That way it is automatically placed inside the media folder and organized neatly with all the other files of my application.
Here is what I tried (type hints are used, since it's Python 3.6):
# forms.py
class UploadForm(forms.ModelForm):
    class Meta:
        model = UploadedFile
        fields = ('document',)

# models.py
class UploadedFile(models.Model):
    document = models.FileField(upload_to=get_upload_path)
    # mimetype is generated from the filename on save
    mimetype = models.CharField(max_length=255)
    # ... additional fields like temporary

def get_upload_path(instance: UploadedFile, filename):
    if instance.temporary:
        return "uploaded_files/temp/" + filename
    return "uploaded_files/" + filename

# views.py, file_out has been generated
with open(file_out, 'rb') as local_file:
    from django.core.files import File
    form = UploadForm(dict(), {'document': File(local_file)})
    print(form.errors)
    if form.is_valid():
        file = form.save(commit=False)
        # ... set additional fields
        file.save()
        form.save_m2m()
        return file
This is not the only thing I've tried. At first I set the FileField directly, but that caused save() to fail even though the mimetype field was set, because the original file sits outside the media folder and a SuspiciousFileOperation is triggered.
The form also gives some feedback about the "upload" through form.errors.
Depending on my approach, either save() fails as mentioned above, meaning the "upload" does not actually copy the file into the media folder, or the form returns the error that no file was transmitted and tells me to check the encoding type on the form.
My theory is that I would have to initialize my own instance of InMemoryUploadedFile, but I could not figure out how to do that myself, and I could not find any documentation for it.
It feels like I'm taking the wrong approach from the get-go. How would one do this properly?
Do you have get_upload_path defined? If not, that would explain the errors you're getting.
From what I can see you're on the right track. If you don't need a dynamic path for your uploads and just want them in media/uploads, you can pass a string value for upload_to (from the Django docs):
# file will be uploaded to MEDIA_ROOT/uploads
document = models.FileField(upload_to='uploads/')
First of all, thanks to Franey for pointing me at the storage documentation, which led me to the ContentFile documentation.
ContentFile actually solves the problem, because it is essentially the self-instantiated version of InMemoryUploadedFile that I was looking for: a Django File that is not stored on disk.
Here's the full solution:
# views.py, file_out has been generated
with open(file_out, 'rb') as local_file:
    from django.core.files.base import ContentFile
    # We need to provide a name. Otherwise the Storage.save
    # method receives a None parameter and breaks.
    form = UploadForm(dict(), {'document': ContentFile(local_file.read(), name=name)})
    if form.is_valid():
        file = form.save(commit=False)
        # ... set additional fields
        file.save()
        form.save_m2m()
        return file
I am creating and saving a PDF as such in my views:
views.py
@login_required(login_url="/login")
def PackingListView(request):
    if request.method == "POST":
        form = PackingListForm(request.POST)
        if form.is_valid():
            if 'preview' in request.POST:
                ...
            elif 'save' in request.POST:
                pdf_contents = form
                file = ContentFile(pdf_contents)
                item = PackingListDocuments.objects.get(pk=1)
                item.PackingListDocument.save('test.pdf', file)  # saving to the FileField on the model
                form.save()
                messages.success(request, "Success: Packing List Has Been Created!")
                return redirect('HomeView')
I see that the test.pdf is saved. I can see it in my file explorer as well as in the admin, but every time that I attempt to open it, the file seems to be corrupted. What do I need to add or subtract in my code to get this working?
Thanks!
UPDATE:
I've changed the line: file = ContentFile(pdf_contents) to file = File(pdf_contents)
But now I am receiving an attribute error that 'PackingListForm' object has no attribute 'read'
I believe the error has to do with this line:
file = ContentFile(pdf_contents)
Note that, from the docs:
The ContentFile class inherits from File, but unlike File it operates on string content (bytes also supported), rather than an actual file. For example:
So my guess is that you are not passing a string/bytes object as the argument to the ContentFile constructor.
Try checking its type with type(pdf_contents). You can also convert it to a string with str(pdf_contents).
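To make that concrete, here is a rough sketch of what the 'save' branch could look like once the form data is actually rendered to PDF bytes first; render_packing_list_pdf is a hypothetical helper standing in for whatever PDF library you use:
from django.core.files.base import ContentFile

# Hypothetical: render the cleaned form data to raw PDF bytes before wrapping them.
pdf_bytes = render_packing_list_pdf(form.cleaned_data)  # must return bytes
file = ContentFile(pdf_bytes)
item = PackingListDocuments.objects.get(pk=1)
# Now the stored file contains actual PDF data instead of a form object.
item.PackingListDocument.save('test.pdf', file)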
I'm having trouble uploading files to a FileField from a local path.
I have correctly configured a CDN backend in an S3 bucket and use it as PrivateMediaStorage for one of my model fields:
class MyModel(models.Model):
    some_file = models.FileField(storage=PrivateMediaStorage())
    ...
With this very simple configuration, whenever I create or update the model through the Django admin, it is saved and the file attached as some_file is correctly uploaded to the S3 bucket.
Yet if I try to create or update a model instance programmatically, say through a custom manage.py command, the model instance itself is created but the attachment is never uploaded to the CDN. Here's a simplified version of the code I'm using to upload files:
class Command(BaseCommand):
    help = 'Creates dummy instance for quicker configuration'

    def handle(self, *args, **options):
        some_file = os.path.join(os.path.dirname(__file__), '../../../temporary/some_image.png')
        if not os.path.exists(some_file):
            raise CommandError(f'File {some_file} does not exist')
        else:
            instance, created = MyModel.objects.get_or_create(defaults={'some_file': some_file}, ...)
What is missing in my implementation and what needs to be adjusted to allow file uploads from local storage?
You're passing a string (the result of os.path.join()) to your some_file field, but you need to pass it an actual File object.
The easiest way to save a file on a model directly is to use the FieldFile's save() method.
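A minimal sketch of that approach, assuming the same MyModel and the local path from the command above:
from pathlib import Path

from django.core.files import File

instance = MyModel.objects.first()  # or however you obtain the instance
with open(some_file, 'rb') as f:
    # FieldFile.save() copies the content into the configured storage (the S3 bucket here)
    # and saves the model row afterwards because save=True is the default.
    instance.some_file.save(Path(some_file).name, File(f), save=True)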
As a working solution for the case provided in the question, a valid way of creating a record would be:
instance = MyModel.objects.create(some_file=File(file=open(some_file, 'rb'), name='some_name.png'))
Or, even better, use pathlib to obtain the name dynamically:
from pathlib import Path
instance = MyModel.objects.create(some_file=File(file=open(some_file, 'rb'), name=Path(some_file).name))
Note that fetching a row based on the file is unlikely to work; AFAIK, doing a get_or_create() with the File instance as a lookup argument will probably create a new row each time. Better to put file fields into defaults:
with open(some_file, 'rb') as file:
    instance, created = MyModel.objects.get_or_create(
        some_other_field=...,
        defaults={'some_file': File(
            file=file,
            name=pathlib.Path(some_file).name
        )}
    )
You can also do something like this:
some_file = os.path.join(os.path.dirname(__file__), '../../../temporary/some_image.png')
instance.some_file.name = some_file
instance.save()
I have a Django model that saves the filename as "uuid4().pdf", where uuid4 generates a random UUID for each instance created. The file is also stored on the Amazon S3 server under the same name.
I am trying to add a custom content disposition for the filenames I upload to Amazon S3, because I want to see a custom name whenever I download a file, not the UUID one. At the same time, I want the files to be stored on S3 with the UUID filename.
I am using django-storages with Python 2.7. I have tried adding a content disposition in the settings like this:
AWS_CONTENT_DISPOSITION = 'core.utils.s3.get_file_name'
where get_file_name() returns the filename.
I have also tried adding this to the settings:
AWS_HEADERS = {
    'Content-Disposition': 'attachments; filename="%s"' % get_file_name(),
}
No luck!
Does anyone know how to implement this?
The current version of S3Boto3Storage from django-storages supports the AWS_S3_OBJECT_PARAMETERS settings variable, which allows modifying ContentDisposition as well. The problem is that it is applied as-is to every object uploaded to S3 and, moreover, affects all models working with that storage, which may not be the expected result.
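For reference, that global setting is declared roughly like this in settings.py; note that the same static header then applies to every upload made through the storage:
# settings.py
AWS_S3_OBJECT_PARAMETERS = {
    'ContentDisposition': 'attachment',  # applied to every object uploaded via this storage
}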
The following hack worked for me.
from storages.backends.s3boto3 import S3Boto3Storage


class DownloadableS3Boto3Storage(S3Boto3Storage):

    def _save_content(self, obj, content, parameters):
        """
        The method is called by the storage for every file being uploaded to S3.
        Below we take care of setting proper ContentDisposition header for
        the file.
        """
        filename = obj.key.split('/')[-1]
        parameters.update({'ContentDisposition': f'attachment; filename="{filename}"'})
        return super()._save_content(obj, content, parameters)
Here we override the storage's internal save method and make sure a proper content disposition is set on each file.
Of course, you need to feed this storage to the field you are working on:
my_file_filed = models.FileField(upload_to='mypath', storage=DownloadableS3Boto3Storage())
In case someone finds this, like I did: none of the solutions mentioned on SO worked for me in Django 3.0.
The docstring of S3Boto3Storage suggests overriding S3Boto3Storage.get_object_parameters; however, this method only receives the name of the uploaded file, which at that point has already been changed by upload_to and can differ from the original.
What worked is the following:
class S3Boto3CustomStorage(S3Boto3Storage):
    """Override some upload parameters, such as the ContentDisposition header."""

    def _get_write_parameters(self, name, content):
        """Set the ContentDisposition header using the original file name.

        While the docstring recommends overriding `get_object_parameters` for this purpose,
        `get_object_parameters` only gets a `name` which is not the original file name,
        but the result of `upload_to`.
        """
        params = super()._get_write_parameters(name, content)
        original_name = getattr(content, 'name', None)
        if original_name and name != original_name:
            content_disposition = f'attachment; filename="{original_name}"'
            params['ContentDisposition'] = content_disposition
        return params
and then using this storage in the file field, e.g.:
file_field = models.FileField(
    upload_to=some_func,
    storage=S3Boto3CustomStorage(),
)
Whatever solution you come up with, do not change file_field.storage.object_parameters directly (e.g. in the model's save(), as has been suggested in a similar question), because this will change the ContentDisposition header for subsequent file uploads on any field that uses the same storage, which is probably not what you want.
I guess you are using S3BotoStorage from django-storages, so while uploading the file to S3, override the save() method of the model, and set the header there.
I am giving an example below:
class ModelName(models.Model):
    sthree = S3BotoStorage()

    def file_name(self, filename):
        ext = filename.split('.')[-1]
        name = "%s/%s.%s" % ("downloads", uuid.uuid4(), ext)
        return name

    upload_file = models.FileField(upload_to=file_name, storage=sthree)

    def save(self):
        self.upload_file.storage.headers = {'Content-Disposition': 'attachment; filename="%s"' % self.upload_file.name}
        super(ModelName, self).save()
One way is to pass the ResponseContentDisposition parameter to the S3Boto3Storage.url() method. In that case you don't have to create a custom storage.
Example model:
class MyModel(models.Model):
    file = models.FileField(upload_to=generate_upload_path)
    original_filename = models.CharField(max_length=255)
Creating URL for your file:
# obj is an instance of MyModel
url = obj.file.storage.url(
    obj.file.name,
    parameters={
        'ResponseContentDisposition': f'inline; filename={obj.original_filename}',
    },
)
If you want to force browser to download the file, replace inline with attachment.
If you are using non-ASCII filenames, check how Django encodes the filename for the Content-Disposition header in FileResponse.
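As an illustration, here is a rough sketch of that kind of encoding (a percent-encoded filename* parameter, in the spirit of what FileResponse does); the exact behaviour may vary between Django versions:
from urllib.parse import quote

original_filename = 'отчёт.pdf'  # hypothetical non-ASCII name
try:
    original_filename.encode('ascii')
    disposition = f'inline; filename="{original_filename}"'
except UnicodeEncodeError:
    # RFC 5987 style: percent-encode the UTF-8 bytes of the name
    disposition = f"inline; filename*=utf-8''{quote(original_filename)}"

url = obj.file.storage.url(
    obj.file.name,
    parameters={'ResponseContentDisposition': disposition},
)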
I need to change the default file upload behavior in Django, and the documentation on the Django site is rather confusing.
I have a model with a field as follows:
class document(models.Model):
    name = models.CharField(max_length=200)
    file = models.FileField(null=True, upload_to='uploads/')
I need to create a .json file containing metadata when a file is uploaded. For example, if I upload a file mydocument.docx, I need to create a mydocument.json file within the uploads/ folder and add meta information about the document.
From what I can decipher from the documentation, I need to create a file upload handler as a subclass of django.core.files.uploadhandler.FileUploadHandler. It also says I can define this anywhere I want.
My questions: where is the best place to define my subclass? Also, from the documentation found here, https://docs.djangoproject.com/en/1.8/ref/files/uploads/#writing-custom-upload-handlers, it looks like the subclass would look like the following:
class FileUploadHandler(object):

    def handle_raw_input(self, input_data, META, content_length, boundary, encoding=None):
        # do the actual writing to disk
        ...

    def file_complete(self, file_size):
        # some logic to create json file
        ...
Does anyone have a working example of an upload handler class that works for Django version 1.8?
One option could be to do the .json file generation on the (model) form used to initially upload the file. Override the save() method of the ModelForm to generate the file immediately after the model has been saved.
class DocumentForm(forms.ModelForm):

    class Meta(object):
        model = Document
        fields = 'name', 'file'

    def save(self, commit=True):
        saved_document = super().save(commit)
        with open(saved_document.file.path + '.json', mode='w') as fh:
            fh.write(json.dumps({
                "size": saved_document.file.size,
                "uploaded": timezone.now().isoformat()
            }))
        return saved_document
I've tested this locally but YMMV if you are using custom storages for working with things like S3.
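If you are on a remote storage like S3, a hedged variant of the same idea would be to write the sidecar file through the storage API instead of the local filesystem path (file.path may not exist there). A sketch of the same save() override, reworked under that assumption:
import json

from django.core.files.base import ContentFile
from django.utils import timezone

def save(self, commit=True):
    saved_document = super().save(commit)
    metadata = json.dumps({
        "size": saved_document.file.size,
        "uploaded": timezone.now().isoformat(),
    })
    # Storage.save() works for both local and remote backends, unlike file.path.
    saved_document.file.storage.save(
        saved_document.file.name + '.json',
        ContentFile(metadata.encode()),
    )
    return saved_document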
I'm using Django to create a stock photo site. I have an ImageField in my model, and the problem is that when the user updates the image field, the original image file isn't deleted from the hard disk.
How can I delete the old images after an update?
Use django-cleanup
pip install django-cleanup
settings.py
INSTALLED_APPS = (
    ...
    'django_cleanup.apps.CleanupConfig',  # should be placed after your apps
)
You'll have to delete the old image manually.
The absolute path to the image is stored in your_image_field.path. So you'd do something like:
os.remove(your_image_field.path)
But, as a convenience, you can use the associated FieldFile object, which gives easy access to the underlying file, as well as providing a few convenience methods. See http://docs.djangoproject.com/en/dev/ref/models/fields/#filefield-and-fieldfile
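To make that concrete, here is a rough sketch of swapping an image and cleaning up the old file through the FieldFile convenience methods; the profile/photo names are made up for illustration:
from django.core.files import File

# `profile` is an existing model instance with an ImageField called `photo` (hypothetical names).
if profile.photo:
    # Remove the old file from storage without touching the database yet.
    profile.photo.delete(save=False)
with open('new_picture.jpg', 'rb') as f:
    # Store the replacement file and save the model (save=True is the default).
    profile.photo.save('new_picture.jpg', File(f))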
Use this custom save method in your model:
def save(self, *args, **kwargs):
    try:
        this = MyModelName.objects.get(id=self.id)
        if this.MyImageFieldName != self.MyImageFieldName:
            this.MyImageFieldName.delete()
    except:
        pass
    super(MyModelName, self).save(*args, **kwargs)
It works for me on my site. This problem was bothering me as well, and I didn't want to write a cleanup script instead of doing good bookkeeping in the first place. Let me know if there are any problems with it.
Before updating the model instance, you can use the delete method of the FieldFile object. For example, if the FileField or ImageField is named photo and your model instance is profile, then the following will remove the file from disk:
profile.photo.delete(False)
For more clarification, here is the Django doc:
https://docs.djangoproject.com/en/1.11/ref/models/fields/#django.db.models.fields.files.FieldFile.delete
You can define a pre_save receiver in your models:
@receiver(models.signals.pre_save, sender=UserAccount)
def delete_file_on_change_extension(sender, instance, **kwargs):
    if instance.pk:
        try:
            old_avatar = UserAccount.objects.get(pk=instance.pk).avatar
        except UserAccount.DoesNotExist:
            return
        else:
            new_avatar = instance.avatar
            if old_avatar and old_avatar.url != new_avatar.url:
                old_avatar.delete(save=False)
My avatars have a unique URL for each person, like "avatars/ceb47779-8833-4719-8711-6f4e5cabb2b2.png". If a user uploads a new image with a different extension like jpg, the delete_file_on_change_extension receiver removes the old image before the new one is saved with the URL "avatars/ceb47779-8833-4719-8711-6f4e5cabb2b2.jpg" (in this case). If a user uploads a new image with the same extension, Django overwrites the old image in storage, because the image paths are the same.
This works fine with AWS S3 and django-storages.
Here is an app that deletes orphan files by default: django-smartfields.
It will remove files whenever:
field value was replaced with a new one (either uploaded or set manually)
field is cleared through the form (in case that field is not required, of course)
the model instance itself containing the field is deleted.
It is possible to turn that cleanup feature off using an argument: ImageField(keep_orphans=True) on per field basis, or globally in settings SMARTFIELDS_KEEP_ORPHANS = True.
from django.db import models
from smartfields import fields


class MyModel(models.Model):
    image = fields.ImageField()
    document = fields.FileField()
Try this; it will work even if the old file has already been deleted:
def logo_file(instance, filename):
    try:
        this = business.objects.get(id=instance.id)
        if this.logo is not None:
            path = "%s" % this.logo
            os.remove(path)
    except Exception:
        pass
The code will work even without the try/except, but it will cause a problem if the file was accidentally deleted.
Changed: the model lookup was moved inside the try block so it will not throw an error at user signup.
Let me know if there are any problems.
Completing Chris Lawlor's answer, I tried this and it works:
from YOURAPP.settings import BASE_DIR

try:
    os.remove(BASE_DIR + user.userprofile.avatarURL)
except Exception as e:
    pass
The URL has a pattern of /media/mypicture.jpg
What I did is save the path to the old image, and if the form is valid I delete the old one:
if request.method == 'POST':
    old_image = ""
    if request.user.profile.profile_picture:
        old_image = request.user.profile.profile_picture.path
    form = UpdateProfileForm(request.POST, request.FILES, instance=profile)
    if form.is_valid():
        if os.path.exists(old_image):
            os.remove(old_image)
        form.save()
It is a little messy, but you don't have to install third-party packages or anything.