Set file permissions to non-public with django-storages and S3Boto - python

I'm developing a Django 1.4 application with django-storages and S3Boto.
I have a model like this:
class MyModel(models.Model):
    image = models.ImageField(blank=True, null=True, upload_to='my_image')
I don't want the image to be public, but django-storages sets the file permission to public
(Grantee: Everyone Open/Download) automatically.
Could you tell me how to keep files from being made public automatically?
Thanks!

You can use the AWS_DEFAULT_ACL setting: if you set it to 'private', S3Boto will store the files as private.
I don't think this is documented, but you can see a complete list of django-storages S3 parameters in the source code of the backend's __init__ function.
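For reference, a minimal settings.py sketch, assuming the boto-based S3BotoStorage backend from that era of django-storages (the credentials and bucket name below are placeholders):
# settings.py
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = '...'               # placeholder
AWS_SECRET_ACCESS_KEY = '...'           # placeholder
AWS_STORAGE_BUCKET_NAME = 'my-bucket'   # placeholder
# Store uploads with a private ACL instead of the public-read default
AWS_DEFAULT_ACL = 'private'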

Related

How to upload files to AWS for only a few selected models in Django?

I have this code in the models.py of my Django app. I have the default file storage saving my files on the remote server, but it stores ALL the File objects/models on the remote server. Is there any option to upload to the remote AWS server only for the models that I want?
class Attachment(models.Model):
    file = models.FileField(upload_to=log_att_path)
    log_sender = models.ForeignKey(
        LogSender,
        related_name='attachments',
        on_delete=models.CASCADE
    )
    timestamp = models.DateTimeField(auto_now_add=True)
    attachment_url = models.TextField(default=False)
There is a really good and popular package, django-storages, on GitHub. You can use this package to send your uploaded files to AWS, using its S3Boto3Storage backend to handle the file uploads.
Is there any option to set the upload on the remote server AWS only in
the models that I want?
You can specify the storage class to be used on your file field if you only want to use AWS for specific models/fields, like this:
from storages.backends.s3boto3 import S3Boto3Storage

class Attachment(models.Model):
    # Pass a storage instance so only this field is stored on S3
    file = models.FileField(upload_to=log_att_path, storage=S3Boto3Storage())
    # other stuff
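The S3 backend still needs the usual django-storages settings; a minimal sketch, with placeholder credentials and bucket name:
# settings.py
AWS_ACCESS_KEY_ID = '...'               # placeholder
AWS_SECRET_ACCESS_KEY = '...'           # placeholder
AWS_STORAGE_BUCKET_NAME = 'my-bucket'   # placeholder
# DEFAULT_FILE_STORAGE is left unchanged, so other FileFields keep using local storage.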

How to upload image files to a shared host / shared folder using Django, Python?

I have two application servers, 10.1.xx.xx and 10.1.xx.yy, and between them I have a load balancer, 10.5.aa.bb. I have deployed my Django application on both servers successfully and am able to access the application too.
There is a shared folder between both servers where I have to upload the images so that both servers have access to all the images, but I don't have any idea how I should do it. Up till now I just upload the images into the project folder. I googled a lot and only found this blog,
http://www.bogotobogo.com/python/Django/Python_Django_Image_Files_Uploading_On_Shared_Host_Example.php
but it is not working either.
I tried with the following settings, but it is still uploading files into the project directory.
settings.py
MEDIA_URL = '/home/bynry-01/www/media/'
MEDIA_ROOT = 'http://192.168.1.109:3333/www/media/'
model.py
class Document(models.Model):
    description = models.CharField(max_length=255, blank=True)
    document = models.FileField(upload_to='documents/')
    uploaded_at = models.DateTimeField(auto_now_add=True)
views.py
document = Document()
document.document = request.FILES['imgfile']
document.save()

Django not uploading image files to Azure storage container

How do I link my Azure blob storage to my Django admin, such that it uploads the file to the blob storage account when saving a new record?
I have my image upload set up in the admin already. The admin interface acts like the image is attached before I click save, although I am aware that the image file is not actually stored in my SQLite3 database.
I can reference them successfully in the consumer-facing portion of my project when the images are manually uploaded to the Azure blob storage account. I don't want to manually upload them each time, for obvious reasons.
There has to be a simple solution for this, I just haven't had success in researching it. Thanks in advance!
models.py
class Image(models.Model):
    file = models.ImageField(upload_to='img/')

    def __unicode__(self):
        return u'%s' % self.file

class Product(models.Model):
    ...
    picture = models.ManyToManyField(Image)
    ...
settings.py
MEDIA_ROOT = path.join(PROJECT_ROOT, 'media').replace('\\', '/')
MEDIA_URL = 'https://my_site.blob.core.windows.net/'
Using Django 1.7, Python 2.7, SQLite3
Django-storages has support for an Azure blob backend which would allow any uploads you do to be automatically stored in your storage container.
http://django-storages.readthedocs.org/en/latest/backends/azure.html
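A minimal settings.py sketch for that backend (the account name, key, and container below are placeholders; the setting names come from the django-storages Azure documentation linked above):
# settings.py
DEFAULT_FILE_STORAGE = 'storages.backends.azure_storage.AzureStorage'
AZURE_ACCOUNT_NAME = 'my_site'   # placeholder
AZURE_ACCOUNT_KEY = '...'        # placeholder
AZURE_CONTAINER = 'media'        # placeholder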
I'm not aware of any built-in Django API that allows us to change the blob's content type. But from my experience, you can use the Azure SDK for Python to upload blobs: https://github.com/Azure/azure-sdk-for-python. The most important setting in your case is the content type. By default the content type is application/octet-stream, but you can change it via x_ms_blob_content_type. Please refer to https://azure.microsoft.com/en-us/documentation/articles/storage-python-how-to-use-blob-storage/ for a sample, and feel free to let us know if you have any further concerns.
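A rough sketch of that SDK approach, assuming the legacy azure-storage package described in the linked article (imports and method names differ in newer SDK versions; the account, container, blob, and file names are placeholders):
from azure.storage.blob import BlobService  # legacy SDK; newer versions use different classes

blob_service = BlobService(account_name='my_site', account_key='...')  # placeholders
blob_service.put_block_blob_from_path(
    'media',                              # container name (placeholder)
    'img/my_image.png',                   # blob name (placeholder)
    '/tmp/my_image.png',                  # local file to upload (placeholder)
    x_ms_blob_content_type='image/png',   # override the application/octet-stream default
)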
My Configuration
# Media
MEDIA_ROOT = '/home/<user>/media/'
MEDIA_URL = '/media/'
Remember that the folder needs read/write permission for the Apache user.
Example:
<img src="/media/img/my_image.png">
or
<img src="{{ obj.file.url }}">

Store files on a remote server in Django

I have a model with a FileField to upload some documents. I save the files to my filesystem. I want to know if it is possible to save the files on another server, using FTP or another method.
For example, my Django app server runs on the host 192.168.0.1 and I want to store the files on the host with the IP 192.168.0.2.
I can access the server where I want to store the files through FTP without any problem.
Models:
class Documentacion(models.Model):
    id_doc = models.AutoField(primary_key=True)
    id_proceso = models.ForeignKey(Proceso, db_column='id', verbose_name='Proceso')
    tipo_docu = models.CharField(max_length=100, null=False, blank=False, verbose_name='Tipo Doc.')
    fecha = models.DateField(auto_now=True)
    autor = models.CharField(max_length=50, blank=False, null=False)
    descripcion = models.CharField(max_length=250, blank=True, null=True)
    documento = models.FileField(upload_to='docs/')
My MEDIA settings:
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
MEDIA_URL = '/media/'
If you go through the docs, FileField's upload_to only defines a path on the local filesystem.
One thing you can do is let the user upload the content normally; then, once the file is in your filesystem, run a process to upload it to the other server, i.e. your FTP server. You could use the ftplib library to achieve this, as in the sketch below.
Maybe using Django Storages will ease your work.
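A rough sketch of that ftplib step, assuming the file has already been saved locally (the host, credentials, and remote directory are placeholders):
from ftplib import FTP

def push_to_file_server(local_path, remote_name):
    # Placeholder host and credentials for the 192.168.0.2 file server.
    with FTP('192.168.0.2') as ftp:
        ftp.login(user='ftpuser', passwd='secret')
        ftp.cwd('docs')                      # remote directory (placeholder)
        with open(local_path, 'rb') as fh:
            ftp.storbinary('STOR ' + remote_name, fh)

# e.g. after doc.save():
# push_to_file_server(doc.documento.path, doc.documento.name)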

When to programmatically create custom Django permissions?

The permission/authentication documentation for Django 1.4 provides the following snippet for creating custom permissions programmatically:
Edit: (I would like to employ this for permissions that are not necessarily linked to a specific model class, but more general permissions that span multiple types.)
from django.contrib.auth.models import Group, Permission
from django.contrib.contenttypes.models import ContentType

content_type = ContentType.objects.get(app_label='myapp', model='BlogPost')
permission = Permission.objects.create(codename='can_publish',
                                       name='Can Publish Posts',
                                       content_type=content_type)
Source
My question is where this code should be placed. Obviously these should only be created once, but I don't want to have to do it in the shell. It seems like this should be stored in a file somewhere, for documentation's sake.
Usually it is enough to just add the needed permissions to the corresponding model class using the permissions meta attribute.
This is from the official documentation:
class Task(models.Model):
    ...

    class Meta:
        permissions = (
            ("view_task", "Can see available tasks"),
            ("change_task_status", "Can change the status of tasks"),
            ("close_task", "Can remove a task by setting its status as closed"),
        )
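Once syncdb (or migrate, on newer Django versions) has created these permissions, they can be checked like any other permission; a quick usage example, assuming the app label is myapp:
# True if the user, or one of their groups, has been granted the permission
user.has_perm('myapp.close_task')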
