I'm having trouble trying to upload files to a FileField from a local path.
I have correctly configured a CDN backend with an S3 bucket and use it as PrivateMediaStorage for one of my model fields:
class MyModel(models.Model):
    some_file = models.FileField(storage=PrivateMediaStorage())
    ...
With this very simple configuration, whenever I create/update the model through django-admin, the instance is saved and the file attached as some_file is correctly uploaded to the S3 bucket.
Yet if I try to create/update a model instance programmatically, say through a custom manage.py command, the model instance itself is created but the attachment is never uploaded to the CDN. Here's a simplified version of the code I'm using to upload files:
import os

from django.core.management.base import BaseCommand, CommandError

class Command(BaseCommand):
    help = 'Creates dummy instance for quicker configuration'

    def handle(self, *args, **options):
        some_file = os.path.join(os.path.dirname(__file__), '../../../temporary/some_image.png')
        if not os.path.exists(some_file):
            raise CommandError(f'File {some_file} does not exist')
        else:
            instance, created = MyModel.objects.get_or_create(defaults={'some_file': some_file}, ...)
What is missing in my implementation and what needs to be adjusted to allow file uploads from local storage?
You're passing a string (the result of os.path.join()) to your some_file field, but you need to pass it an actual File object.
The easiest way to save a file on a model directly is to use the FieldFile's save() method.
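For instance, a minimal sketch using FieldFile.save() with the names from the question (an illustrative variant, not the asker's exact command):

from django.core.files import File

# `some_file` is the local path built in the question's handle() method.
# save=True also persists the model instance.
with open(some_file, 'rb') as f:
    instance = MyModel()
    instance.some_file.save('some_image.png', File(f), save=True)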
For the case provided in the question, a valid way of creating a record would be:
instance = MyModel.objects.create(some_file=File(file=open(some_file, 'rb'), name='some_name.png'))
Or even better, use pathlib to obtain the name dynamically:
from pathlib import Path
instance = MyModel.objects.create(some_file=File(file=open(some_file, 'rb'), name=Path(some_file).name))
Note that fetching a row based on the file is unlikely to work: AFAIK, doing a get_or_create() with a File instance as a lookup argument will probably create a new row each time you open the file. Better to put file fields into defaults:
with open(some_file, 'rb') as file:
    instance, created = MyModel.objects.get_or_create(
        some_other_field=...,
        defaults={'some_file': File(
            file=file,
            name=Path(some_file).name
        )}
    )
You can also do something like this:
some_file = os.path.join(os.path.dirname(__file__), '../../../temporary/some_image.png')
instance.some_file.name = some_file
instance.save()
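One caveat: .name is stored relative to the storage root (MEDIA_ROOT for the default storage), so assigning an absolute os.path.join(...) result may misbehave. A sketch of the relative variant, assuming the file already lives under MEDIA_ROOT:

import os
from django.conf import settings

# Hypothetical: the file already sits somewhere inside MEDIA_ROOT.
absolute_path = os.path.join(settings.MEDIA_ROOT, 'temporary/some_image.png')
instance.some_file.name = os.path.relpath(absolute_path, settings.MEDIA_ROOT)
instance.save()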
I generate a file in Python and want to "upload" that file to the Django database, so that it is automatically put inside the media folder and organized neatly with all the other files of my application.
Now here is what I tried (type hints are used, since it's Python 3.6):
# forms.py
class UploadForm(forms.ModelForm):
    class Meta:
        model = UploadedFile
        fields = ('document',)

# models.py
def get_upload_path(instance: 'UploadedFile', filename):
    if instance.temporary:
        return "uploaded_files/temp/" + filename
    return "uploaded_files/" + filename

class UploadedFile(models.Model):
    document = models.FileField(upload_to=get_upload_path)
    # mimetype is generated from the filename on save
    mimetype = models.CharField(max_length=255)
    # ... additional fields like temporary
# views.py, file_out has been generated
from django.core.files import File

with open(file_out, 'rb') as local_file:
    form = UploadForm(dict(), {'document': File(local_file)})
    print(form.errors)
    if form.is_valid():
        file = form.save(commit=False)
        # ... set additional fields
        file.save()
        form.save_m2m()
        return file
Now this is not the only thing I've tried. First I went with setting the FileField directly, but that resulted in the save() failing while the mimetype field was set, because the original file sits outside the media folder, which triggers a suspicious file operation error.
Also, the form gives some feedback about the "upload" through form.errors.
Depending on my approach, either the save() fails as mentioned above (meaning the "uploading" does not actually copy the file into the media folder), or the form returns the error that no file was transmitted and tells me to check the form protocol.
Now my theory is that I would have to initialize my own instance of InMemoryUploadedFile, but I could not figure out how to do that myself, and no documentation was available on the internet.
It feels like I'm taking the wrong approach from the get go. How would one do this properly?
Do you have get_upload_path defined? If not, that would explain the errors you're getting.
From what I can see, you're on the right track. If you don't need a dynamic path for your uploads and just want them in media/uploads, you can pass a string value for upload_to (from the Django docs):
# file will be uploaded to MEDIA_ROOT/uploads
document = models.FileField(upload_to='uploads/')
First of all, thanks to Franey for pointing me at the storage documentation, which led me to the ContentFile documentation.
ContentFile actually solves the problem, because it is basically the self-instantiated version of InMemoryUploadedFile that I was looking for: a Django File that is not stored on disk.
Here's the full solution:
# views.py, file_out has been generated
from django.core.files.base import ContentFile

with open(file_out, 'rb') as local_file:
    # We need to provide a name. Otherwise the Storage.save
    # method receives a None parameter and breaks.
    form = UploadForm(dict(), {'document': ContentFile(local_file.read(), name=name)})
    if form.is_valid():
        file = form.save(commit=False)
        # ... set additional fields
        file.save()
        form.save_m2m()
        return file
I have a Django model that saves the filename as "uuid4().pdf", where uuid4 generates a random UUID for each instance created. The file is stored on the Amazon S3 server with the same name.
I am trying to add a custom content disposition for the filenames I upload to Amazon S3, because I want to see a custom name whenever I download a file, not the UUID one. At the same time, I want the files to be stored on S3 with the UUID filename.
So, I am using django-storages with Python 2.7. I have tried adding a content disposition in settings like this:
AWS_CONTENT_DISPOSITION = 'core.utils.s3.get_file_name'
where get_file_name() returns the filename.
I have also tried adding this to the settings:
AWS_HEADERS = {
    'Content-Disposition': 'attachments; filename="%s"' % get_file_name(),
}
No luck! Does anyone know how to implement this?
The current version of S3Boto3Storage from django-storages supports the AWS_S3_OBJECT_PARAMETERS global settings variable, which allows modifying ContentDisposition too. But the problem is that it is applied as-is to all objects uploaded to S3 and, moreover, affects all models working with the storage, which may not be the expected result.
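For reference, the global variant is just a settings entry (a sketch; as noted, it affects every object uploaded through the storage):

# settings.py -- applied to *all* objects uploaded via S3Boto3Storage
AWS_S3_OBJECT_PARAMETERS = {
    'ContentDisposition': 'attachment',
}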
The following hack worked for me.
from storages.backends.s3boto3 import S3Boto3Storage

class DownloadableS3Boto3Storage(S3Boto3Storage):
    def _save_content(self, obj, content, parameters):
        """
        The method is called by the storage for every file being uploaded
        to S3. Below we take care of setting a proper ContentDisposition
        header for the file.
        """
        filename = obj.key.split('/')[-1]
        parameters.update({'ContentDisposition': f'attachment; filename="{filename}"'})
        return super()._save_content(obj, content, parameters)
Here we override the storage's internal save method and make sure a proper content disposition is set on each file.
Of course, you need to feed this storage to the field you're working on:
my_file_filed = models.FileField(upload_to='mypath', storage=DownloadableS3Boto3Storage())
In case someone finds this, like I did: none of the solutions mentioned on SO worked for me in Django 3.0.
The docstring of S3Boto3Storage suggests overriding S3Boto3Storage.get_object_parameters; however, this method only receives the name of the uploaded file, which at this point has been changed by upload_to and can differ from the original.
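For comparison, if the post-upload_to name were acceptable, the documented hook would look roughly like this (a sketch based on that docstring suggestion; get_object_parameters receives only the final object key):

from storages.backends.s3boto3 import S3Boto3Storage

class AttachmentS3Storage(S3Boto3Storage):
    def get_object_parameters(self, name):
        # `name` is the post-upload_to key, not the original file name.
        params = super().get_object_parameters(name)
        filename = name.rsplit('/', 1)[-1]
        params['ContentDisposition'] = f'attachment; filename="{filename}"'
        return params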
What worked is the following:
class S3Boto3CustomStorage(S3Boto3Storage):
    """Override some upload parameters, such as the ContentDisposition header."""

    def _get_write_parameters(self, name, content):
        """Set the ContentDisposition header using the original file name.

        While the docstring recommends overriding `get_object_parameters` for
        this purpose, `get_object_parameters` only gets a `name` which is not
        the original file name, but the result of `upload_to`.
        """
        params = super()._get_write_parameters(name, content)
        original_name = getattr(content, 'name', None)
        if original_name and name != original_name:
            content_disposition = f'attachment; filename="{original_name}"'
            params['ContentDisposition'] = content_disposition
        return params
and then using this storage in the file field, e.g.:
file_field = models.FileField(
    upload_to=some_func,
    storage=S3Boto3CustomStorage(),
)
Whatever solution you come up with, do not change file_field.storage.object_parameters directly (e.g. in the model's save(), as has been suggested in a similar question), because that will change the ContentDisposition header for subsequent file uploads of any field that uses the same storage, which is probably not what you want.
I guess you are using S3BotoStorage from django-storages, so while uploading the file to S3, override the save() method of the model, and set the header there.
I am giving an example below:
class ModelName(models.Model):
    sthree = S3BotoStorage()

    def file_name(self, filename):
        ext = filename.split('.')[-1]
        name = "%s/%s.%s" % ("downloads", uuid.uuid4(), ext)
        return name

    upload_file = models.FileField(upload_to=file_name, storage=sthree)

    def save(self, *args, **kwargs):
        self.upload_file.storage.headers = {'Content-Disposition': 'attachment; filename="%s"' % self.upload_file.name}
        super(ModelName, self).save(*args, **kwargs)
One way is to pass the ResponseContentDisposition parameter to the S3Boto3Storage.url() method. In this case you don't have to create a custom storage.
Example model:
class MyModel(models.Model):
    file = models.FileField(upload_to=generate_upload_path)
    original_filename = models.CharField(max_length=255)
Creating URL for your file:
# obj is an instance of MyModel
url = obj.file.storage.url(
    obj.file.name,
    parameters={
        'ResponseContentDisposition': f'inline; filename={obj.original_filename}',
    },
)
If you want to force the browser to download the file, replace inline with attachment.
If you are using non-ASCII filenames, check how Django encodes the filename for the Content-Disposition header in FileResponse.
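A sketch of that encoding, mirroring the ASCII/RFC 5987 fallback FileResponse applies (the helper name is illustrative):

from urllib.parse import quote

def content_disposition(filename, inline=True):
    # Plain filename= works for ASCII names; non-ASCII names need the
    # RFC 5987 filename*=UTF-8''... form, percent-encoded.
    disposition = 'inline' if inline else 'attachment'
    try:
        filename.encode('ascii')
        return f'{disposition}; filename="{filename}"'
    except UnicodeEncodeError:
        return f"{disposition}; filename*=UTF-8''{quote(filename)}"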
I need to change the default file upload behavior in Django, and the documentation on the Django site is rather confusing.
I have a model with a field as follows:
class Document(models.Model):
    name = models.CharField(max_length=200)
    file = models.FileField(null=True, upload_to='uploads/')
I need to create a .json file containing metadata when a file is uploaded. For example, if I upload a file mydocument.docx, I need to create a mydocument.json file within the uploads/ folder and add meta information about the document.
From what I can decipher from the documentation, I need to create a file upload handler as a subclass of django.core.files.uploadhandler.FileUploadHandler. It also says I can define this anywhere I want.
My questions: Where is the best place to define my subclass? Also, from the documentation found here https://docs.djangoproject.com/en/1.8/ref/files/uploads/#writing-custom-upload-handlers, it looks like the subclass would look like the following:
class FileUploadHandler(object):
    def handle_raw_input(self, input_data, META, content_length, boundary, encoding=None):
        # do the actual writing to disk
        pass

    def file_complete(self, file_size):
        # some logic to create the json file
        pass
Does anyone have a working example of an upload handler class that works for Django version 1.8?
One option could be to do the .json file generation on the (model) form used to initially upload the file. Override the save() method of the ModelForm to generate the file immediately after the model has been saved.
import json

from django import forms
from django.utils import timezone

class DocumentForm(forms.ModelForm):
    class Meta(object):
        model = Document
        fields = 'name', 'file'

    def save(self, commit=True):
        saved_document = super().save(commit)
        with open(saved_document.file.path + '.json', mode='w') as fh:
            fh.write(json.dumps({
                "size": saved_document.file.size,
                "uploaded": timezone.now().isoformat()
            }))
        return saved_document
I've tested this locally but YMMV if you are using custom storages for working with things like S3.
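For completeness, a hypothetical view wiring this form up (the view name and responses are illustrative):

from django.http import HttpResponse, HttpResponseBadRequest

def upload_document(request):
    # Pass both POST data and FILES to the ModelForm; saving the form
    # also writes the .json sidecar via the save() override above.
    form = DocumentForm(request.POST, request.FILES)
    if form.is_valid():
        document = form.save()
        return HttpResponse('Uploaded %s' % document.name)
    return HttpResponseBadRequest(form.errors.as_text())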
I am trying to work with file storage in Django. Everything is working fine except, I guess, for one thing in my save method. I have a model with a FileField:
download_url = models.FileField(verbose_name='Konfig', upload_to=file_path, storage=OverwriteStorage())
The file path is created in this method in my model:

def file_path(instance, filename):
    path = os.getcwd() + '/files'
    return os.path.join(path, str(instance.download_url), filename)
And the file storage class I use is outsourced to my storage.py, which I import in my models.py:

from django.core.files.storage import FileSystemStorage

class OverwriteStorage(FileSystemStorage):
    def _save(self, name, content):
        if self.exists(name):
            self.delete(name)
        return super(OverwriteStorage, self)._save(name, content)

    def get_available_name(self, name):
        return name
Now when I create a new file in the Django admin interface, it successfully uploads the file and makes a database entry with the correct filepath, but it fails to create the right path. When my filename is foo, the path looks like the following:
cwd/files/foo/foo
and if its name is bar.txt, it looks like this:
cwd/files/bar.txt/bar.txt
I don't want Django to create a subdirectory based on the filename. Can you guys help me out?
I'm pretty sure you have to rename the save function from "save" to "_save".
In the super call, you used ._save, which isn't the same function as the save function above.
You can read a lot about super here.
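Separately, regarding the duplicated directory itself: str(instance.download_url) evaluates to the stored file name, which is why the name shows up twice in the path. A sketch of an upload_to callback without that segment (upload_to return values are relative to the storage location, so the absolute os.getcwd() prefix is dropped as well):

import os

def file_path(instance, filename):
    # Return a path relative to the storage root; joining in
    # str(instance.download_url) re-inserts the file name as a directory.
    return os.path.join('files', filename)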
I have an existing file on disk (say /folder/file.txt) and a FileField model field in Django.
When I do
instance.field = File(open('/folder/file.txt'))
instance.save()
it re-saves the file as file_1.txt (the next time it's _2, etc.).
I understand why, but I don't want this behavior - I know the file I want the field to be associated with is really there waiting for me, and I just want Django to point to it.
How?
Just set instance.field.name to the path of your file, e.g.:
class Document(models.Model):
    file = FileField(upload_to=get_document_path)
    description = CharField(max_length=100)

>>> doc = Document()
>>> doc.file.name = 'path/to/file'  # must be relative to MEDIA_ROOT
>>> doc.file
<FieldFile: path/to/file>
If you want to do this permanently, you need to create your own FileStorage class:

import os
from django.conf import settings
from django.core.files.storage import FileSystemStorage

class MyFileStorage(FileSystemStorage):

    # This method is actually defined in Storage
    def get_available_name(self, name):
        if self.exists(name):
            os.remove(os.path.join(settings.MEDIA_ROOT, name))
        return name  # simply returns the name passed
Now, in your model, use your modified MyFileStorage:

from mystuff.customs import MyFileStorage

mfs = MyFileStorage()

class SomeModel(models.Model):
    my_file = models.FileField(storage=mfs)
Try this (doc):
instance.field.name = <PATH RELATIVE TO MEDIA_ROOT>
instance.save()
It's right to write your own storage class. However, get_available_name is not the right method to override.
get_available_name is called when Django sees a file with the same name and tries to get a new available file name; it is not the method that causes the rename. The method that causes it is _save. The comments in _save are pretty good, and you can easily find that it opens the file for writing with the os.O_EXCL flag, which throws an OSError if the same file name already exists. Django catches this error and then calls get_available_name to get a new name.
So I think the correct way is to override _save and call os.open() without the os.O_EXCL flag. The modification is quite simple, but the method is a little bit long, so I don't paste it here. Tell me if you need more help :)
I had exactly the same problem! Then I realized that my models were causing it. For example, I had my models like this:
class Tile(models.Model):
    image = models.ImageField()
Then I wanted to have more than one tile referencing the same file on disk! The way I found to solve it was to change my model structure to this:
class TileImage(models.Model):
    image = models.ImageField()

class Tile(models.Model):
    image = models.ForeignKey(TileImage, on_delete=models.CASCADE)
Which, once I realized it, makes more sense: if I want the same file to be referenced more than once in my DB, I have to create another table for it!
I guess you can solve your problem like that too, provided you can change the models!
EDIT
Also, I guess you could use a different storage, like this one for instance: SymlinkOrCopyStorage
http://code.welldev.org/django-storages/src/11bef0c2a410/storages/backends/symlinkorcopy.py
You should define your own storage, inheriting from FileSystemStorage, and override the OS_OPEN_FLAGS class attribute and the get_available_name() method:
Django Version: 3.1
Project/core/files/storages/backends/local.py
import os

from django.core.files.storage import FileSystemStorage

class OverwriteStorage(FileSystemStorage):
    """
    FileSystemStorage subclass that allows overwriting already existing
    files.

    Be careful using this class, as user-uploaded files will overwrite
    already existing files.
    """

    # The combination of flags that doesn't make os.open() raise OSError
    # if the file already exists before it's opened.
    OS_OPEN_FLAGS = os.O_WRONLY | os.O_TRUNC | os.O_CREAT | getattr(os, 'O_BINARY', 0)

    def get_available_name(self, name, max_length=None):
        """
        This method will be called before starting the save process.
        """
        return name
In your model, use your custom OverwriteStorage:
myapp/models.py
from django.db import models

from core.files.storages.backends.local import OverwriteStorage

class MyModel(models.Model):
    my_file = models.FileField(storage=OverwriteStorage())
The answers above work fine if you are using the app's filesystem to store your files. But if you are using boto3 to upload to something like AWS S3, and you want to set a file that already exists in an S3 bucket on your model's FileField, this is what you need.
We have a simple model class with a FileField:
class Image(models.Model):
    img = models.FileField()
    owner = models.ForeignKey(get_user_model(), on_delete=models.CASCADE, related_name='images')
    date_added = models.DateTimeField(editable=False)
    date_modified = models.DateTimeField(editable=True)
import os

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client(
    's3',
    aws_access_key_id=os.getenv("AWS_ACCESS_KEY_ID"),
    aws_secret_access_key=os.getenv("AWS_SECRET_ACCESS_KEY")
)

s3_key = S3_DIR + '/' + filename
bucket_name = os.getenv("AWS_STORAGE_BUCKET_NAME")

try:
    s3.upload_file(local_file_path, bucket_name, s3_key)
    # We want to store it in our db model called **Image** after the
    # s3 upload completes, so:
    image_data = Image()
    image_data.img.name = s3_key  # this does it !!
    image_data.owner = get_user_model().objects.get(id=owner_id)
    image_data.save()
except ClientError as e:
    print(f"failed uploading to s3 {e}")
Setting the S3 key as the name of the FileField does the trick. As far as I have tested, everything related works as expected, e.g. previewing the image file in the Django admin. Fetching the images from the db appends the root S3 bucket prefix (or the CloudFront CDN prefix) to the S3 keys of the files too. Of course, this assumes I already had a working boto and S3 setup in Django's settings.py.