Django with S3 staticfiles optimization - python

I am trying to optimize my static files using Django with S3. I am using django-compressor to compress and cache JS and CSS files.
Here are my settings:
AWS_ACCESS_KEY_ID = access_key
AWS_SECRET_ACCESS_KEY = secret_key
AWS_STORAGE_BUCKET_NAME = 'mybucketname'
AWS_QUERYSTRING_AUTH = False
S3_URL = 'https://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = S3_URL + "media/"
STATIC_URL = S3_URL + "static/"
ADMIN_MEDIA_PREFIX = STATIC_URL + "admin/"
STATICFILES_DIRS = (
    os.path.join(BASE_DIR, "static", "static_dirs"),
    # '/var/www/static/',
)
AWS_HEADERS = {
    'Cache-Control': 'public,max-age=86400',
}
STATIC_ROOT = os.path.join(BASE_DIR, "static", "static_root")
STATICFILES_STORAGE = 'lafabrique.settings.s3utils.CachedS3BotoStorage'
DEFAULT_FILE_STORAGE = 'lafabrique.settings.s3utils.MediaRootS3BotoStorage'
COMPRESS_STORAGE = 'lafabrique.settings.s3utils.CachedS3BotoStorage'
COMPRESS_URL = S3_URL
and in another file:
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage


class CachedS3BotoStorage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.GzipCompressorFileStorage")()

    def save(self, name, content):
        name = super(CachedS3BotoStorage, self).save(name, content)
        self.local_storage._save(name, content)
        return name
What I don't understand is that when I test my page on https://developers.google.com/speed/pagespeed/insights/, Google still tells me that I should gzip and cache my static files...
Also, in the HTTP response from Amazon I get Cache-Control: max-age=0 (the actual website is lafabrique.io, just in case).
Does somebody know what I did wrong?
Thanks a lot.

Are you using Django-storages? Try adding this to your settings:
AWS_IS_GZIPPED = True
GZIP_CONTENT_TYPES = (
    'text/css',
    'application/javascript',
    'application/x-javascript',
    'text/javascript',
)
It looks like you're using gzipped storage on your local machine, but not for the file that you upload to S3.
For the caching issue, try the solution here: Trouble setting cache-control header for Amazon S3 key using boto
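For reference, the gist of that solution is to rewrite each key's metadata with boto, copying the object onto itself with a Cache-Control header. A minimal sketch, assuming boto 2 and placeholder credentials, bucket name, and prefix (adapt them to your setup):

from boto.s3.connection import S3Connection

# Placeholders: use your real credentials and bucket name.
conn = S3Connection('your-access-key-id', 'your-secret-access-key')
bucket = conn.get_bucket('mybucketname')

for key in bucket.list(prefix='static/'):
    key = bucket.get_key(key.name)  # re-fetch to load the existing metadata
    metadata = key.metadata
    metadata.update({
        'Content-Type': key.content_type,
        'Cache-Control': 'public, max-age=86400',
    })
    # Copying a key onto itself with new metadata rewrites its headers in place.
    key.copy(bucket.name, key.name, metadata=metadata, preserve_acl=True)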

Related

Suspicious Operation trying to upload local image to S3 in Django

I'm developing a Django project where my Users have a profile picture saved as an ImageField, which I currently store in S3.
def get_user_pic_path(instance, filename):
    return os.path.join('user_pics', instance.user.username, filename)

class MyUser(models.Model):
    user = models.OneToOneField(User, unique=True)
    picture = models.ImageField(max_length=255, upload_to=get_user_pic_path)
with the relevant settings.py:
# Amazon S3 stuff
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
AWS_LOCATION = 'static'
MEDIAFILES_LOCATION = 'media'
# URL prefix for static files.
STATIC_ROOT = '/%s/' % AWS_LOCATION
STATIC_URL = '//%s/%s/' % (AWS_S3_CUSTOM_DOMAIN, AWS_LOCATION)
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
STATICFILES_DIRS = [
    os.path.join(BASE_DIR, 'static'),
]
# Media urls
MEDIA_ROOT = '/%s/' % MEDIAFILES_LOCATION
MEDIA_URL = '//%s/%s/' % (AWS_S3_CUSTOM_DOMAIN, MEDIAFILES_LOCATION)
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
The project works in production, where users can create accounts by taking a webcam photo that is saved to S3, but I get a SuspiciousOperation when trying to set up a local user for testing purposes. I have a management command, invoked with python manage.py create_test_user, which does:
def handle(self, *args, **options):
    pic_path = os.path.join(settings.BASE_DIR, 'static', 'img', 'test_img.jpg')
    user_list, pic_list = [], []
    for i in range(5):
        # create user objects associated with each MyUser
        u = User.objects.create_user(
            username='bickeree_' + str(i),
            email=str(i) + '@gmail.com',
            password='1234'
        )
        u.first_name = 'Test User'
        u.last_name = str(i)
        u.save()
        # we store the picture so we can properly close it
        pic_list.append(File(open(pic_path, 'rb')))
        user_list.append(MyUser(
            user=u,
            picture=pic_list[-1],
        ))
    # create all of the MyUser objects
    MyUser.objects.bulk_create(user_list)
    # clean up the images
    for pic in pic_list:
        pic.close()
Running this command results in:
django.core.exceptions.SuspiciousOperation: Attempted access to '/Users/username/Documents/projectname/static/img/test_img.jpg' denied.
How do I get around this error?

Using Cloudfront with Django S3Boto

I have successfully set up my app to use S3 for storing all static and media files. However, while I want to keep uploading to S3 (the current behaviour), I would like to serve the files from a CloudFront distribution I have set up. I have tried pointing the settings at the CloudFront URL, but it does not work. How can I upload to S3 and serve from CloudFront?
Settings
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
DEFAULT_FILE_STORAGE = 'app.custom_storages.MediaStorage'
STATICFILES_STORAGE = 'app.custom_storages.StaticStorage'
STATICFILES_LOCATION = 'static'
MEDIAFILES_LOCATION = 'media'
STATIC_URL = "https://s3-eu-west-1.amazonaws.com/app/%s/" % (STATICFILES_LOCATION)
MEDIA_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, MEDIAFILES_LOCATION)
custom_storages.py
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class StaticStorage(S3BotoStorage):
    location = settings.STATICFILES_LOCATION

class MediaStorage(S3BotoStorage):
    location = settings.MEDIAFILES_LOCATION
Your code is almost complete, except that you are not adding your CloudFront domain to STATIC_URL/MEDIA_URL and to your custom storages.
In detail, first install the dependencies:
pip install django-storages-redux boto
Add the required settings to your django settings file
INSTALLED_APPS = (
    ...
    'storages',
    ...
)
AWS_STORAGE_BUCKET_NAME = 'mybucketname'
AWS_CLOUDFRONT_DOMAIN = 'xxxxxxxx.cloudfront.net'
AWS_ACCESS_KEY_ID = get_secret("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = get_secret("AWS_SECRET_ACCESS_KEY")
MEDIAFILES_LOCATION = 'media'
MEDIA_ROOT = '/%s/' % MEDIAFILES_LOCATION
MEDIA_URL = '//%s/%s/' % (AWS_CLOUDFRONT_DOMAIN, MEDIAFILES_LOCATION)
DEFAULT_FILE_STORAGE = 'app.custom_storages.MediaStorage'
STATICFILES_LOCATION = 'static'
STATIC_ROOT = '/%s/' % STATICFILES_LOCATION
STATIC_URL = '//%s/%s/' % (AWS_CLOUDFRONT_DOMAIN, STATICFILES_LOCATION)
STATICFILES_STORAGE = 'app.custom_storages.StaticStorage'
Your custom storages need some modification to present the CloudFront domain for the resources instead of the S3 domain:
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class StaticStorage(S3BotoStorage):
    """Uploads to 'mybucket/static/', serves from 'cloudfront.net/static/'."""
    location = settings.STATICFILES_LOCATION

    def __init__(self, *args, **kwargs):
        kwargs['custom_domain'] = settings.AWS_CLOUDFRONT_DOMAIN
        super(StaticStorage, self).__init__(*args, **kwargs)

class MediaStorage(S3BotoStorage):
    """Uploads to 'mybucket/media/', serves from 'cloudfront.net/media/'."""
    location = settings.MEDIAFILES_LOCATION

    def __init__(self, *args, **kwargs):
        kwargs['custom_domain'] = settings.AWS_CLOUDFRONT_DOMAIN
        super(MediaStorage, self).__init__(*args, **kwargs)
And that is all you need, assuming your bucket and CloudFront distribution are correctly linked and your AWS access key has permission to access the bucket. Additionally, depending on your use case, you may want to make the objects in your S3 bucket publicly readable.
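If you do want objects to be publicly readable, django-storages has settings for that; a minimal sketch (assuming you are not using signed URLs):

AWS_DEFAULT_ACL = 'public-read'  # newly uploaded objects are readable by everyone
AWS_QUERYSTRING_AUTH = False     # build plain URLs without signed query strings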
I had a similar issue, and just setting AWS_S3_CUSTOM_DOMAIN to the CloudFront URL in Django's settings.py worked for me. You can check the code here.
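For example, a minimal sketch of that approach (xxxxxxxx.cloudfront.net stands in for your distribution's domain):

# settings.py
AWS_S3_CUSTOM_DOMAIN = 'xxxxxxxx.cloudfront.net'  # URLs are built from this domain instead of the bucket's S3 endpoint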

Django-Storages + Easy_Thumbnails: [Errno 30] Read-only file system

I have installed easy_thumbnails and am trying to deploy my solution onto S3. I'm using https://github.com/jamstooks/django-s3-folder-storage to separate my /media/ and /static/ folders, with media containing the uploaded content.
My settings file looks like this:
# static file config
DEFAULT_FILE_STORAGE = 's3_folder_storage.s3.DefaultStorage'
DEFAULT_S3_PATH = "media"
STATICFILES_STORAGE = 's3_folder_storage.s3.StaticStorage'
STATIC_S3_PATH = "static"
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
MEDIA_ROOT = '/media/'
MEDIA_URL = 'https://%s.s3.amazonaws.com/media/' % AWS_STORAGE_BUCKET_NAME
STATIC_ROOT = "/%s/" % STATIC_S3_PATH
STATIC_URL = '//%s.s3.amazonaws.com/static/' % AWS_STORAGE_BUCKET_NAME
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
AWS_PRELOAD_METADATA = True
CKEDITOR_UPLOAD_PATH = 'uploads'
AWS_DEFAULT_ACL = 'public-read'
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
But I'm encountering this error:
TemplateSyntaxError at /
Couldn't get the thumbnail teams/alumni/images/thumbs/alumni.png: [Errno 30] Read-only file system: '/media'
After doing a lot of research I discovered http://gibuloto.com/blog/easy-thumbnails-with-amazon-s3/, which should resolve any issues you may be having implementing easy_thumbnails with S3.
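In short, the fix described there comes down to pointing easy_thumbnails at the same S3-backed media storage instead of the local filesystem; a sketch, assuming the s3-folder-storage backend used above:

THUMBNAIL_DEFAULT_STORAGE = 's3_folder_storage.s3.DefaultStorage'  # generate and save thumbnails through S3, not the read-only '/media'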

Upload Media from Heroku to Amazon S3

I'm new to Heroku and Amazon S3, so bear with me. I uploaded my Django app to Heroku and am having a problem with user media uploads. The model is below:
# models.py
class Movie(models.Model):
    title = models.CharField(max_length=500)
    poster = models.ImageField(upload_to='storages.backends.s3boto')
    pub_date = models.DateTimeField(auto_now_add=True)
    author = models.ForeignKey(User)
The poster attribute is the one the image is uploaded to. It ran fine locally, but now on Heroku there is an error. So I added 'storages.backends.s3boto', as numerous other posts told me to (not sure if that's right).
My settings.py file looks like this right now, kind of a mess:
#settings.py
PROJECT_ROOT = os.path.abspath(os.path.dirname(__file__))
PROJECT_DIR = os.path.join(PROJECT_ROOT, '../qanda')
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = '****************'
AWS_SECRET_ACCESS_KEY = '************'
AWS_STORAGE_BUCKET_NAME = 'mrt-assets'
AWS_PRELOAD_METADATA = True
MEDIA_ROOT = os.path.join(PROJECT_ROOT, 'qanda/media/movie_posters/')
MEDIA_URL = '/media'
STATIC_ROOT = os.path.join(PROJECT_ROOT, 'staticfiles')
STATIC_URL = 'https://mrt-assets.s3.amazonaws.com/static/'
STATICFILES_DIRS = (os.path.join(PROJECT_DIR, 'static'),)
My bucket is called mrt-assets, and there are two folders in there: static (css, js, images) and media. I'm not too worried about the static files for now, as I've hardcoded the CSS/JS files into my HTML files*, but how do I get my user-uploaded media (images of any kind) into mrt-assets/media?
*Although if someone wanted to help with static files too, that would be great; the user-uploaded media is more urgent.
EDIT (per Yuji's comment):
I have tried a number of options, and none of them are working. I've gone back and reverted a lot of changes, and this is now my settings file:
#settings.py
PROJECT_ROOT = os.path.abspath(os.path.dirname(__file__))
MEDIA_ROOT = 'http://s3.amazonaws.com/mrt-assets/media/'
MEDIA_URL = '/media/'
STATIC_ROOT = 'http://s3.amazonaws.com/mrt-assets/static/'
STATIC_URL = '/static/'
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
TEMPLATE_DIRS = (os.path.join(PROJECT_ROOT, "templates"),)
# models.py
# same as before, but I have now changed the poster directory
poster = models.ImageField()
I'm not really sure what to do; I need to connect my Heroku app to S3 so user media uploads are saved there.
I have now changed my S3 bucket to this:
mrt-assets
    static
        css
        js
        images
    media
        (empty)
The trick to getting your media to upload into <bucket>/media and your static assets into <bucket>/static is to create two different storage backends for the two asset types, or to explicitly instantiate your model fields with a storage object that takes a location parameter.
Instantiating model field with custom storage
from storages.backends.s3boto import S3BotoStorage

class Movie(models.Model):
    title = models.CharField(max_length=500)
    poster = models.ImageField(storage=S3BotoStorage(location='media'))
    pub_date = models.DateTimeField(auto_now_add=True)
    author = models.ForeignKey(User)
Giving S3BotoStorage a location will prefix all uploads with its path.
Creating custom storage backends for media and static assets
This is almost the same as explicitly defining a storage backend with
location, but instead we'll be using settings.MEDIA_ROOT and
settings.STATIC_ROOT to apply a path prefix globally.
# settings.py
STATIC_ROOT = '/static/'
MEDIA_ROOT = '/media/'
DEFAULT_FILE_STORAGE = 'app.storage.S3MediaStorage'
STATICFILES_STORAGE = 'app.storage.S3StaticStorage'

# app/storage.py
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class S3MediaStorage(S3BotoStorage):
    def __init__(self, **kwargs):
        kwargs['location'] = kwargs.get('location',
                                        settings.MEDIA_ROOT.replace('/', ''))
        super(S3MediaStorage, self).__init__(**kwargs)

class S3StaticStorage(S3BotoStorage):
    def __init__(self, **kwargs):
        kwargs['location'] = kwargs.get('location',
                                        settings.STATIC_ROOT.replace('/', ''))
        super(S3StaticStorage, self).__init__(**kwargs)
Refining it
You can refine the above code to take advantage of
Heroku config vars
to make it more portable:
# settings.py
import os

STATIC_ROOT = os.environ.get('STATIC_ROOT',
                             os.path.join(os.path.dirname(__file__), 'static'))
MEDIA_ROOT = os.environ.get('MEDIA_ROOT',
                            os.path.join(os.path.dirname(__file__), 'media'))
DEFAULT_FILE_STORAGE = os.environ.get('DEFAULT_FILE_STORAGE',
                                      'django.core.files.storage.FileSystemStorage')
STATICFILES_STORAGE = os.environ.get('STATICFILES_STORAGE',
                                     'django.contrib.staticfiles.storage.StaticFilesStorage')
Couple the above settings with a .env file and you can use the
default storage backends locally for development and testing and when
deploying on Heroku you'll automatically switch to
app.storage.S3MediaStorage and app.storage.S3StaticStorage respectively:
# .env
STATIC_ROOT=static
MEDIA_ROOT=media
DEFAULT_FILE_STORAGE=app.storage.S3MediaStorage
STATICFILES_STORAGE=app.storage.S3StaticStorage

How to set-up a Django project with django-storages and Amazon S3, but with different folders for static files and media files?

I'm configuring a Django project that was using the server filesystem for storing the app's static files (STATIC_ROOT) and user-uploaded files (MEDIA_ROOT).
I now need to host all that content on Amazon S3, so I have created a bucket for it. Using django-storages with the boto storage backend, I managed to upload the collected static files to the S3 bucket:
MEDIA_ROOT = '/media/'
STATIC_ROOT = '/static/'
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'KEY_ID...'
AWS_SECRET_ACCESS_KEY = 'ACCESS_KEY...'
AWS_STORAGE_BUCKET_NAME = 'bucket-name'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
Then, I got a problem: the MEDIA_ROOT and STATIC_ROOT are not used within the bucket, so the bucket root contains both the static files and user uploaded paths.
So then I could set:
S3_URL = 'http://s3.amazonaws.com/%s' % AWS_STORAGE_BUCKET_NAME
STATIC_URL = S3_URL + STATIC_ROOT
MEDIA_URL = S3_URL + MEDIA_ROOT
And use those settings in the templates, but there is no distinction between static and media files when storing them in S3 with django-storages.
How can this be done?
Thanks!
I think the following should work, and be simpler than Mandx's method, although it's very similar:
Create an s3utils.py file:
from storages.backends.s3boto import S3BotoStorage
StaticRootS3BotoStorage = lambda: S3BotoStorage(location='static')
MediaRootS3BotoStorage = lambda: S3BotoStorage(location='media')
Then in your settings.py:
DEFAULT_FILE_STORAGE = 'myproject.s3utils.MediaRootS3BotoStorage'
STATICFILES_STORAGE = 'myproject.s3utils.StaticRootS3BotoStorage'
A different but related example (that I've actually tested) can be seen in the two example_ files here.
I'm currently using this code in a separate s3utils module:
from django.core.exceptions import SuspiciousOperation
from django.utils.encoding import force_unicode
from storages.backends.s3boto import S3BotoStorage

def safe_join(base, *paths):
    """
    A version of django.utils._os.safe_join for S3 paths.

    Joins one or more path components to the base path component intelligently.
    Returns a normalized version of the final path.

    The final path must be located inside of the base path component (otherwise
    a ValueError is raised).

    Paths outside the base path indicate a possible security sensitive operation.
    """
    from urlparse import urljoin
    base_path = force_unicode(base)
    paths = map(lambda p: force_unicode(p), paths)
    final_path = urljoin(base_path + ("/" if not base_path.endswith("/") else ""), *paths)
    # Ensure final_path starts with base_path and that the next character after
    # the final path is '/' (or nothing, in which case final_path must be
    # equal to base_path).
    base_path_len = len(base_path) - 1
    if not final_path.startswith(base_path) \
            or final_path[base_path_len:base_path_len + 1] not in ('', '/'):
        raise ValueError('the joined path is located outside of the base path'
                         ' component')
    return final_path

class StaticRootS3BotoStorage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        super(StaticRootS3BotoStorage, self).__init__(*args, **kwargs)
        self.location = kwargs.get('location', '')
        self.location = 'static/' + self.location.lstrip('/')

    def _normalize_name(self, name):
        try:
            return safe_join(self.location, name).lstrip('/')
        except ValueError:
            raise SuspiciousOperation("Attempted access to '%s' denied." % name)

class MediaRootS3BotoStorage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        super(MediaRootS3BotoStorage, self).__init__(*args, **kwargs)
        self.location = kwargs.get('location', '')
        self.location = 'media/' + self.location.lstrip('/')

    def _normalize_name(self, name):
        try:
            return safe_join(self.location, name).lstrip('/')
        except ValueError:
            raise SuspiciousOperation("Attempted access to '%s' denied." % name)
Then, in my settings module:
DEFAULT_FILE_STORAGE = 'myproyect.s3utils.MediaRootS3BotoStorage'
STATICFILES_STORAGE = 'myproyect.s3utils.StaticRootS3BotoStorage'
I had to redefine the _normalize_name() private method to use a "fixed" version of the safe_join() function, since the original code was giving me SuspiciousOperation exceptions for legal paths.
I'm posting this for consideration; if anyone can give a better answer or improve this one, it will be very welcome.
File: PROJECT_NAME/custom_storages.py
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class StaticStorage(S3BotoStorage):
    location = settings.STATICFILES_LOCATION

class MediaStorage(S3BotoStorage):
    location = settings.MEDIAFILES_LOCATION
File: PROJECT_NAME/settings.py
STATICFILES_LOCATION = 'static'
MEDIAFILES_LOCATION = 'media'

if not DEBUG:
    STATICFILES_STORAGE = 'PROJECT_NAME.custom_storages.StaticStorage'
    DEFAULT_FILE_STORAGE = 'PROJECT_NAME.custom_storages.MediaStorage'
    AWS_ACCESS_KEY_ID = 'KEY_XXXXXXX'
    AWS_SECRET_ACCESS_KEY = 'SECRET_XXXXXXXXX'
    AWS_STORAGE_BUCKET_NAME = 'BUCKET_NAME'
    AWS_HEADERS = {'Cache-Control': 'max-age=86400'}
    AWS_QUERYSTRING_AUTH = False
And run: python manage.py collectstatic
I think the answer is pretty simple and done by default. This is working for me on AWS Elastic Beanstalk with Django 1.6.5 and Boto 2.28.0:
STATICFILES_FINDERS = (
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
)
TEMPLATE_LOADERS = (
    'django.template.loaders.filesystem.Loader',
    'django.template.loaders.app_directories.Loader',
)
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_KEY']
The AWS keys are passed in from the container config file and I have no STATIC_ROOT or STATIC_URL set at all. Also, no need for the s3utils.py file. These details are handled by the storage system automatically. The trick here is that I needed to reference this unknown path in my templates correctly and dynamically. For example:
<link rel="icon" href="{% static "img/favicon.ico" %}">
That is how I address my favicon which lives locally (pre-deployment) in ~/Projects/my_app/project/my_app/static/img/favicon.ico.
Of course I have a separate local_settings.py file for accessing this stuff locally in the dev environment, and it does have STATIC and MEDIA settings. I had to do a lot of experimenting and reading to find this solution, and it works consistently with no errors.
I understand that you need the static and media separation, and considering that you can only provide one bucket, I would point out that this method takes all the folders in my local environment under ~/Projects/my_app/project/my_app/static/ and creates them in the bucket root (i.e. S3bucket/img/ as in the example above). So you do get separation of files. For example, you could have a media folder inside the static folder and access it via templating with this:
{% static "media/" %}
I hope this helps. I came here looking for the answer and pushed a bit harder to find a simpler solution than to extend the storage system. Instead, I read the documentation about the intended use of Boto and I found that a lot of what I needed was built-in by default. Cheers!
If you want to have subfolders even before the media or static separation, you can use AWS_LOCATION on top of Bradenm's answer.
Reference: https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#usage
AWS_STORAGE_BUCKET_NAME = 'bucket_name'
AWS_LOCATION = 'path1/path2/'
Bradenm's answer is outdated and doesn't work, so I updated it in March 2021.
Updated one:
Create an s3utils.py in the same folder as "settings.py":
from storages.backends.s3boto3 import S3Boto3Storage
StaticRootS3Boto3Storage = lambda: S3Boto3Storage(location='static')
MediaRootS3Boto3Storage = lambda: S3Boto3Storage(location='media')
Then, add 2 lines of code to settings.py and change "myproject" to your folder name:
DEFAULT_FILE_STORAGE = 'myproject.s3utils.MediaRootS3Boto3Storage'
STATICFILES_STORAGE = 'myproject.s3utils.StaticRootS3Boto3Storage'
The updated one has multiple "3s" as I emphasize below.
s3utils.py:
from storages.backends.s3boto"3" import S3Boto"3"Storage
StaticRootS3Boto"3"Storage = lambda: S3Boto"3"Storage(location='static')
MediaRootS3Boto"3"Storage = lambda: S3Boto"3"Storage(location='media')
settings.py:
DEFAULT_FILE_STORAGE = 'myproject.s3utils.MediaRootS3Boto"3"Storage'
STATICFILES_STORAGE = 'myproject.s3utils.StaticRootS3Boto"3"Storage'
Check and compare with Bradenm's (outdated) answer.
"I respect Bradenm's answer."
