Store files on a remote server in Django - Python

I have a model with a FileField to upload some documents. I save the files to my filesystem. I want to know if it is possible to save the files on another server using FTP or some other method.
For example my django app server run in the host 192.168.0.1 and I want to store the files in the host with the IP 192.168.0.2.
I can access without any problem through ftp to the server where I want to store the files.
Models:
class Documentacion(models.Model):
    id_doc = models.AutoField(primary_key=True)
    id_proceso = models.ForeignKey(Proceso, db_column='id', verbose_name='Proceso')
    tipo_docu = models.CharField(max_length=100, null=False, blank=False, verbose_name='Tipo Doc.')
    fecha = models.DateField(auto_now=True)
    autor = models.CharField(max_length=50, blank=False, null=False)
    descripcion = models.CharField(max_length=250, blank=True, null=True)
    documento = models.FileField(upload_to='docs/')
My MEDIA settings:
MEDIA_ROOT = os.path.join(BASE_DIR,'media')
MEDIA_URL = '/media/'

If you go through the docs, FileField.upload_to only defines a path within the local filesystem.
One thing you can do is let the user upload the content normally; then, once the file is in your filesystem, run a process to upload it to the other server (your FTP server). You could use the ftplib library to achieve this.
Maybe using Django Storages will ease your work.
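As an illustration, a minimal post-save upload step with ftplib might look like the sketch below. The host, credentials, and remote directory are hypothetical placeholders, not values from the question:

```python
import ftplib
import os

def push_to_remote(local_path, host, user, password, remote_dir="docs"):
    """Upload an already-saved FileField document to a second server over FTP.

    All connection parameters are hypothetical placeholders.
    """
    name = os.path.basename(local_path)
    with ftplib.FTP(host, user, password) as ftp:  # connect and log in
        ftp.cwd(remote_dir)                        # change to the target directory
        with open(local_path, "rb") as fh:
            ftp.storbinary("STOR " + name, fh)     # binary-mode upload
    return remote_dir + "/" + name
```

You could call something like this from your view (or a post_save signal) after `documento` has been saved locally.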

Related

How to upload files to AWS for only a few selected models in Django?

I have this code in a models.py of my Django app. I got the default file storage saving my files on the remote server, but it stores ALL the File objects/models on the remote server. Is there any option to enable the upload to the remote AWS server only for the models that I want?
class Attachment(models.Model):
    file = models.FileField(upload_to=log_att_path)
    log_sender = models.ForeignKey(
        LogSender,
        related_name='attachments',
        on_delete=models.CASCADE
    )
    timestamp = models.DateTimeField(auto_now_add=True)
    attachment_url = models.TextField(default=False)
There is a really good and popular package, django-storages, on GitHub. You can use this package to upload your files to AWS; its S3Boto3Storage class handles the file uploads.
Is there any option to set the upload on the remote server AWS only in the models that I want?
You can specify the storage class to be used by a file field if you only want to use AWS for specific models/fields, like this:
from storages.backends.s3boto3 import S3Boto3Storage

class Attachment(models.Model):
    # pass a storage instance so only this field uses S3
    file = models.FileField(upload_to=log_att_path, storage=S3Boto3Storage())
    # other stuff

How to upload Image Files on Shared Host/ Shared Folder using Django, Python?

I have two application servers, 10.1.xx.xx and 10.1.xx.yy, with a load balancer, 10.5.aa.bb, in front of both. I have deployed my Django application on both servers successfully and am able to access the application too.
There is a shared folder between the two servers where I have to upload the images so that both servers have access to all of them, but I don't have any idea how to do it; up till now I have just uploaded the images into the project folder. I googled a lot and only found this blog,
http://www.bogotobogo.com/python/Django/Python_Django_Image_Files_Uploading_On_Shared_Host_Example.php
but it is not working either.
I tried the following settings, but files are still uploaded into the project directory.
settings.py
MEDIA_URL = '/home/bynry-01/www/media/'
MEDIA_ROOT='http://192.168.1.109:3333/www/media/'
model.py
class Document(models.Model):
    description = models.CharField(max_length=255, blank=True)
    document = models.FileField(upload_to='documents/')
    uploaded_at = models.DateTimeField(auto_now_add=True)
views.py
document=Document()
document.document = request.FILES['imgfile']
document.save()
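Note that the two media settings in the question appear swapped: MEDIA_ROOT should be a filesystem path and MEDIA_URL a URL prefix. Assuming the shared folder is mounted on both servers at a path like /mnt/shared/media (a hypothetical mount point, not taken from the question), the settings would look more like:

```python
# settings.py (sketch; /mnt/shared/media is an assumed mount point for the shared folder)
MEDIA_ROOT = '/mnt/shared/media'   # filesystem path where uploaded files land
MEDIA_URL = '/media/'              # URL prefix used to serve them
```

With both servers mounting the same shared directory at that path, uploads from either server become visible to both.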

Django not uploading image files to Azure storage container

How do I link my Azure blob storage to my Django admin, such that it uploads the file to the blob storage account when saving a new record.
I have my image upload set up in the admin already. The admin interface acts like the image is attached before I click save, although I am aware that the image file is not actually stored in my SQLite3 database.
I can reference them successfully in the consumer-facing portion of my project when the images are manually uploaded to the Azure blob storage account. I don't want to manually upload them each time, for obvious reasons.
There has to be a simple solution for this, I just haven't had success in researching it. Thanks in advance!
models.py
class Image(models.Model):
    file = models.ImageField(upload_to='img/')

    def __unicode__(self):
        return u'%s' % self.file

class Product(models.Model):
    ...
    picture = models.ManyToManyField(Image)
    ...
settings.py
MEDIA_ROOT = path.join(PROJECT_ROOT, 'media').replace('\\', '/')
MEDIA_URL = 'https://my_site.blob.core.windows.net/'
Using Django 1.7, Python 2.7, SQLite3
Django-storages has support for an Azure blob backend which would allow any uploads you do to be automatically stored in your storage container.
http://django-storages.readthedocs.org/en/latest/backends/azure.html
I'm not aware of any built-in Django API that allows us to change the blob's content type. But from my experience, you can use Azure SDK for Python to upload blobs: https://github.com/Azure/azure-sdk-for-python. The most important setting in your case is the content type. By default content type is application/octet-stream. However you can change it via x_ms_blob_content_type. Please refer to https://azure.microsoft.com/en-us/documentation/articles/storage-python-how-to-use-blob-storage/ for a sample and feel free to let us know if you have any further concerns.
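As a sketch, wiring up the django-storages Azure backend in settings.py looks roughly like this. The account name, key, and container values are placeholders, not details from the question:

```python
# settings.py (sketch) -- all values are placeholders
DEFAULT_FILE_STORAGE = 'storages.backends.azure_storage.AzureStorage'
AZURE_ACCOUNT_NAME = 'my_site'        # storage account name
AZURE_ACCOUNT_KEY = '<account-key>'   # access key from the Azure portal
AZURE_CONTAINER = 'media'             # target blob container
```

With this in place, every ImageField/FileField save goes to the blob container instead of MEDIA_ROOT, so uploads through the admin land in Azure automatically.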
My Configuration
# Media
MEDIA_ROOT = '/home/<user>/media/'
MEDIA_URL = '/media/'
Remember that the media folder needs read/write permissions for the Apache user.
For example:
<img src="/media/img/my_image.png">
or
<img src="{{obj.file.url}}">

Opening a data file from the media directory in Django

I have an application that allows for users to upload CSV files with data which is then graphed and displayed to the user. These files are saved as media within my media folder. In my graph view however I need to open the file and process it. My problem is that I can only open files that are within my project's current working directory and any time that I attempt to upload a file from somewhere outside of that directory I get this error:
File b'TEST.CSV' does not exist
I have attempted this, but without success:
file_upload_dir = os.path.join(settings.MEDIA_ROOT, 'Data_Files')
data_file = open(os.path.join(file_upload_dir, new_file), 'rb')
The variable new_file holds only the name of the file, saved in a session, not a path to that file. Data_Files is a directory within the media directory that contains the uploaded files.
My media settings for Django are
SETTINGS_DIR = os.path.dirname(__file__)
PROJECT_PATH = os.path.join(SETTINGS_DIR, os.pardir)
PROJECT_PATH = os.path.abspath(PROJECT_PATH)
MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(PROJECT_PATH, 'media')
Is there a way to reference the media files properly from a view?
Here is the output of file_upload_dir and the location of the new file.
>>> print(file_upload_dir)
C:\\Users\\vut46744\\Desktop\\graphite_project\\media\\Data_Files
>>> print(os.path.join(file_upload_dir, new_file))
C:\\Users\\vut46744\\Desktop\\graphite_project\\media\\Data_Files\\TEST.CSV
Normally, you should not access files using open() in a Django app. You should use the storage API. This allows your code to play well with Django settings, and potential third party apps that augment this API.
https://docs.djangoproject.com/en/1.7/topics/files/#file-storage
So here you should be doing something like
from django.core.files.storage import default_storage
f = default_storage.open(os.path.join('Data_Files', new_file), 'r')
data = f.read()
f.close()
print(data)
By the way, if you want it to be modular, it would be a good idea to have a custom storage class, allowing easy configuration and use of your app, should your requirements change. Also, that allows putting files outside of MEDIA_ROOT. This sample storage will put them in settings.UPLOADS_ROOT (and default to MEDIA_ROOT if the setting is not found).
# Put this in a storage.py file in your app
from django.conf import settings
from django.core.files.storage import FileSystemStorage, get_storage_class
from django.utils.functional import LazyObject

class UploadsStorage(FileSystemStorage):
    def __init__(self, location=None, base_url=None, *args, **kwargs):
        if location is None:
            location = getattr(settings, 'UPLOADS_ROOT', None)
        super(UploadsStorage, self).__init__(location, base_url, *args, **kwargs)
        self.base_url = None  # forbid any URL generation for uploads

class ConfiguredStorage(LazyObject):
    def _setup(self):
        storage = getattr(settings, 'UPLOADS_STORAGE', None)
        klass = UploadsStorage if storage is None else get_storage_class(storage)
        self._wrapped = klass()

uploads_storage = ConfiguredStorage()
We create a very simple storage here: it's just the regular filesystem storage, but it reads its files from another directory. Then we set up a lazy object that allows overriding that storage from settings.
So now your code becomes:
from myapp.storage import uploads_storage
f = uploads_storage.open(new_file, 'r')
And in your settings, you set UPLOADS_ROOT to whatever you like. Probably something outside your media directory. And if someday you decide to store uploads in a database instead, you can set UPLOADS_STORAGE to a database-backed storage, your code will happily use it.
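A corresponding settings fragment might look like this (the paths and dotted storage path are illustrative, not prescribed by the answer):

```python
# settings.py (sketch) -- values are illustrative
UPLOADS_ROOT = '/srv/uploads'  # anywhere outside MEDIA_ROOT

# Optional: swap in a different storage backend later without touching app code
# UPLOADS_STORAGE = 'myproject.storage.DatabaseStorage'
```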

Remote access of django models

I have a django 1.5 project using django models over mysql running on apache server.
class Person(models.Model):
first_name = models.CharField(max_length=100)
last_name = models.CharField(max_length=100)
birthdate = models.DateField()
class Book(models.Model):
name = models.CharField(max_length=100)
author = models.ForeignKey(Person)
I also have a python/django application (using django custom commands) running on a remote computer that must use these models.
Remote application shares the same model definitions with server
Remote application needs read only access to models
Remote application cannot have a full dump of server database, as server must return a queryset based on user rights
Remote application can only connect over http to server
Server can expose the models over REST API (json)
Is there any automated way to transfer models over http? I have tried to use django.core.serializers but I had the following issues:
I cannot serialize the related objects in a queryset
Remote application cannot work without local database
Remote application searches related objects on local db after deserialization (that does not exist)
Edit:
I managed to serialize models like this:
books = Book.objects.prefetch_related('author').all()
authors = [book.author for book in books]
data = authors + list(books.all())
serialized_data = django.core.serializers.serialize("json", data)
My problem is that the remote application cannot deserialize without having a local database.
I don't think you need to transfer models over HTTP; you just need to connect to the server's database.
In the remote app's settings, choose the database engine (MySQL in your case) and name, specify an appropriate user and password, and enter a valid host and port (if needed), i.e. the one your database server is running on.
As for the user: on the server, create a MySQL user with read-only rights to the database.
This will give you the ability to use the same database for both the server and the remote app.
Finally, I solved it by using SQLite running in RAM on the client side.
In settings.py I used this configuration:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:'
    }
}
And the code is like this:
from django.db import connections
from django.core.management.color import no_style
from django.core import serializers
from apps.my_models import Book, Person

connection = connections['default']
cursor = connection.cursor()
sql, references = connection.creation.sql_create_model(Book, no_style(), set())
cursor.execute(sql[0])
sql, references = connection.creation.sql_create_model(Person, no_style(), set())
cursor.execute(sql[0])

serialized_data = get_serialized_data()
for obj in serializers.deserialize("json", serialized_data):
    obj.save()

books = Book.objects.prefetch_related('author').all()
