I'm developing a Django project where my Users have a profile picture saved as an ImageField, which I currently store in S3.
def get_user_pic_path(instance, filename):
    return os.path.join('user_pics', instance.user.username, filename)

class MyUser(models.Model):
    user = models.OneToOneField(User, unique=True)
    picture = models.ImageField(max_length=255, upload_to=get_user_pic_path)
with the relevant parts of settings.py:
# Amazon S3 stuff
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
AWS_LOCATION = 'static'
MEDIAFILES_LOCATION = 'media'
# URL prefix for static files.
STATIC_ROOT = '/%s/' % AWS_LOCATION
STATIC_URL = '//%s/%s/' % (AWS_S3_CUSTOM_DOMAIN, AWS_LOCATION)
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
STATICFILES_DIRS = [
    os.path.join(BASE_DIR, 'static'),
]
# Media urls
MEDIA_ROOT = '/%s/' % MEDIAFILES_LOCATION
MEDIA_URL = '//%s/%s/' % (AWS_S3_CUSTOM_DOMAIN, MEDIAFILES_LOCATION)
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
The project works in production, where users create accounts by taking a webcam photo that is saved to S3, but I get a SuspiciousOperation when trying to set up local users for testing. I have a management command, invoked with python manage.py create_test_user, which does:
def handle(self, *args, **options):
    pic_path = os.path.join(settings.BASE_DIR, 'static', 'img', 'test_img.jpg')
    user_list, pic_list = [], []
    for i in range(5):
        # create user objects associated with each MyUser
        u = User.objects.create_user(
            username='bickeree_' + str(i),
            email=str(i) + '@gmail.com',
            password='1234'
        )
        u.first_name = 'Test User'
        u.last_name = str(i)
        u.save()
        # we store the picture so we can properly close it
        pic_list.append(File(open(pic_path, 'rb')))
        user_list.append(MyUser(
            user=u,
            picture=pic_list[-1],
        ))
    # create all of the MyUser objects
    MyUser.objects.bulk_create(user_list)
    # clean up the images
    for pic in pic_list:
        pic.close()
Running this command results in:
django.core.exceptions.SuspiciousOperation: Attempted access to '/Users/username/Documents/projectname/static/img/test_img.jpg' denied.
How can I get around this error?
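One possible workaround (a minimal sketch, assuming the same models and test image path as in the command above, not taken from the thread) is to save each picture through the field's storage instead of assigning the locally opened File and relying on bulk_create; that way the stored name is relative, not the absolute local path that triggers the SuspiciousOperation:
from django.core.files import File

# Sketch only: u and pic_path are the same objects as in create_test_user above.
with open(pic_path, 'rb') as f:
    profile = MyUser(user=u)
    # FieldFile.save() uploads through DEFAULT_FILE_STORAGE (S3 here) and
    # stores the relative name produced by get_user_pic_path.
    profile.picture.save('test_img.jpg', File(f), save=True)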
Related
I'm configuring my Django project to use S3 buckets to store static and media files, both for local and production settings.
My project tree is as follows:
src/
    blog/
        settings/
            __init__.py
            local.py
            production.py
            s3utils.py
        [..]
    [..]
My local.py:
access_key = "xx"
secret_key = "yy"
AWS_ACCESS_KEY_ID = access_key
AWS_SECRET_ACCESS_KEY = secret_key
AWS_STORAGE_BUCKET_NAME = 'zz'
STATICFILES_STORAGE = 'blog.s3utils.StaticRootS3BotoStorage'
DEFAULT_FILE_STORAGE = 'blog.s3utils.MediaRootS3BotoStorage'
S3DIRECT_REGION = 'us-west-2'
S3_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = 'http://%s.s3.amazonaws.com/media/' % AWS_STORAGE_BUCKET_NAME
MEDIA_ROOT = MEDIA_URL
STATIC_URL = S3_URL + "/static/"
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
import datetime
two_months = datetime.timedelta(days=61)
date_two_months_later = datetime.date.today() + two_months
expires = date_two_months_later.strftime("%A, %d %B %Y 20:00:00 GMT")
AWS_HEADERS = {
    'Expires': expires,
    'Cache-Control': 'max-age=%d' % (int(two_months.total_seconds()),),
}
my s3utils.py:
from storages.backends.s3boto import S3BotoStorage
StaticRootS3BotoStorage = lambda: S3BotoStorage(location='static')
MediaRootS3BotoStorage = lambda: S3BotoStorage(location='media')
When I run:
python manage.py collectstatic
only (django) admin static files are copied to my s3 bucket.
I thought the problem was a misconfigured permission on my IAM user, but I clearly do have permission to copy files to the S3 bucket, since the 'admin/' files make it there.
Thank you for any help you can provide.
Add STATICFILES_DIRS to your settings.
STATICFILES_DIRS = [
    "/path/to/your/static",
]
https://docs.djangoproject.com/en/1.10/ref/settings/#std:setting-STATICFILES_DIRS
I am trying to optimize my static files using Django with S3. I am using django-compressor to compress and cache JS and CSS files.
Here are my settings:
AWS_ACCESS_KEY_ID = access_key
AWS_SECRET_ACCESS_KEY = secret_key
AWS_STORAGE_BUCKET_NAME = 'mybucketname'
AWS_QUERYSTRING_AUTH = False
S3_URL = 'https://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = S3_URL + "media/"
STATIC_URL = S3_URL + "static/"
ADMIN_MEDIA_PREFIX = STATIC_URL + "admin/"
STATICFILES_DIRS = (
    os.path.join(BASE_DIR, "static", "static_dirs"),
    # '/var/www/static/',
)
AWS_HEADERS = {
    'Cache-Control': 'public,max-age=86400',
}
STATIC_ROOT = os.path.join(BASE_DIR, "static", "static_root")
STATICFILES_STORAGE = 'lafabrique.settings.s3utils.CachedS3BotoStorage'
DEFAULT_FILE_STORAGE = 'lafabrique.settings.s3utils.MediaRootS3BotoStorage'
COMPRESS_STORAGE = 'lafabrique.settings.s3utils.CachedS3BotoStorage'
COMPRESS_URL = S3_URL
and in another file:
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage

class CachedS3BotoStorage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        super(CachedS3BotoStorage, self).__init__(*args, **kwargs)
        self.local_storage = get_storage_class(
            "compressor.storage.GzipCompressorFileStorage")()

    def save(self, name, content):
        name = super(CachedS3BotoStorage, self).save(name, content)
        self.local_storage._save(name, content)
        return name
What I don't understand is that when I test my page on https://developers.google.com/speed/pagespeed/insights/, Google still tells me that I should gzip and cache my static files.
Also, in the HTTP response from Amazon I get Cache-Control: max-age=0 ... (the actual website is lafabrique.io, just in case).
Does anybody know what I did wrong?
Thanks a lot
Are you using Django-storages? Try adding this to your settings:
AWS_IS_GZIPPED = True
GZIP_CONTENT_TYPES = (
    'text/css',
    'application/javascript',
    'application/x-javascript',
    'text/javascript',
)
It looks like you're using gzipped storage on your local machine, but not for the files that you upload to S3.
For the caching issue, try the solution here: Trouble setting cache-cotrol header for Amazon S3 key using boto
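In case that link changes, here is a rough sketch of the idea it describes, assuming boto 2 (the key name and local path below are placeholders):
import boto
from boto.s3.key import Key

# Upload a key with an explicit Cache-Control header.
conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
bucket = conn.get_bucket(AWS_STORAGE_BUCKET_NAME)
key = Key(bucket, 'static/js/app.js')  # placeholder key name
key.set_contents_from_filename(
    'static/static_root/js/app.js',  # placeholder local path
    headers={'Cache-Control': 'public,max-age=86400'},
)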
I have successfully set up my app to use S3 for storing all static and media files. However, I would like to keep uploading to S3 (the current behaviour) but serve the files from a CloudFront distribution I have set up. I have tried pointing the settings at the CloudFront URL, but it does not work. How can I upload to S3 and serve from CloudFront, please?
Settings
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
DEFAULT_FILE_STORAGE = 'app.custom_storages.MediaStorage'
STATICFILES_STORAGE = 'app.custom_storages.StaticStorage'
STATICFILES_LOCATION = 'static'
MEDIAFILES_LOCATION = 'media'
STATIC_URL = "https://s3-eu-west-1.amazonaws.com/app/%s/" % (STATICFILES_LOCATION)
MEDIA_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, MEDIAFILES_LOCATION)
custom_storages.py
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class StaticStorage(S3BotoStorage):
    location = settings.STATICFILES_LOCATION

class MediaStorage(S3BotoStorage):
    location = settings.MEDIAFILES_LOCATION
Your code is almost complete, except that you are not adding your CloudFront domain to STATIC_URL/MEDIA_URL and to your custom storages.
In detail, first install the dependencies:
pip install django-storages-redux boto
Then add the required settings to your Django settings file:
INSTALLED_APPS = (
    ...
    'storages',
    ...
)
AWS_STORAGE_BUCKET_NAME = 'mybucketname'
AWS_CLOUDFRONT_DOMAIN = 'xxxxxxxx.cloudfront.net'
AWS_ACCESS_KEY_ID = get_secret("AWS_ACCESS_KEY_ID")
AWS_SECRET_ACCESS_KEY = get_secret("AWS_SECRET_ACCESS_KEY")
MEDIAFILES_LOCATION = 'media'
MEDIA_ROOT = '/%s/' % MEDIAFILES_LOCATION
MEDIA_URL = '//%s/%s/' % (AWS_CLOUDFRONT_DOMAIN, MEDIAFILES_LOCATION)
DEFAULT_FILE_STORAGE = 'app.custom_storages.MediaStorage'
STATICFILES_LOCATION = 'static'
STATIC_ROOT = '/%s/' % STATICFILES_LOCATION
STATIC_URL = '//%s/%s/' % (AWS_CLOUDFRONT_DOMAIN, STATICFILES_LOCATION)
STATICFILES_STORAGE = 'app.custom_storages.StaticStorage'
Your custom storages need a small modification so that they present the CloudFront domain for the resources instead of the S3 domain:
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class StaticStorage(S3BotoStorage):
    """Uploads to 'mybucket/static/', serves from 'cloudfront.net/static/'."""
    location = settings.STATICFILES_LOCATION

    def __init__(self, *args, **kwargs):
        kwargs['custom_domain'] = settings.AWS_CLOUDFRONT_DOMAIN
        super(StaticStorage, self).__init__(*args, **kwargs)

class MediaStorage(S3BotoStorage):
    """Uploads to 'mybucket/media/', serves from 'cloudfront.net/media/'."""
    location = settings.MEDIAFILES_LOCATION

    def __init__(self, *args, **kwargs):
        kwargs['custom_domain'] = settings.AWS_CLOUDFRONT_DOMAIN
        super(MediaStorage, self).__init__(*args, **kwargs)
And that is all you need, assuming your bucket and CloudFront distribution are correctly linked and the user's AWS access key has permission to access your bucket. Additionally, depending on your use case, you may want to make your S3 bucket's items read-only accessible to everyone.
I had a similar issue, and just setting AWS_S3_CUSTOM_DOMAIN to the CloudFront URL in Django's settings.py worked for me. You can check the code here.
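In other words, a minimal sketch of that approach (the CloudFront domain below is a placeholder):
# django-storages reads AWS_S3_CUSTOM_DOMAIN and uses it when building URLs,
# so generated links point at CloudFront instead of the bucket's S3 endpoint.
AWS_S3_CUSTOM_DOMAIN = 'xxxxxxxx.cloudfront.net'
STATIC_URL = 'https://%s/%s/' % (AWS_S3_CUSTOM_DOMAIN, STATICFILES_LOCATION)
MEDIA_URL = 'https://%s/%s/' % (AWS_S3_CUSTOM_DOMAIN, MEDIAFILES_LOCATION)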
I have installed easy_thumbnails and am trying to deploy my solution onto S3. I'm using https://github.com/jamstooks/django-s3-folder-storage to separate my /media/ and /static/ folders, with media containing uploaded content.
My settings file looks like this:
# static file config
DEFAULT_FILE_STORAGE = 's3_folder_storage.s3.DefaultStorage'
DEFAULT_S3_PATH = "media"
STATICFILES_STORAGE = 's3_folder_storage.s3.StaticStorage'
STATIC_S3_PATH = "static"
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
MEDIA_ROOT = '/media/'
MEDIA_URL = 'https://%s.s3.amazonaws.com/media/' % AWS_STORAGE_BUCKET_NAME
STATIC_ROOT = "/%s/" % STATIC_S3_PATH
STATIC_URL = '//%s.s3.amazonaws.com/static/' % AWS_STORAGE_BUCKET_NAME
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
AWS_PRELOAD_METADATA = True
CKEDITOR_UPLOAD_PATH = 'uploads'
AWS_DEFAULT_ACL = 'public-read'
But I'm encountering this error:
TemplateSyntaxError at /
Couldn't get the thumbnail teams/alumni/images/thumbs/alumni.png: [Errno 30] Read-only file system: '/media'
After doing a lot of research I discovered http://gibuloto.com/blog/easy-thumbnails-with-amazon-s3/. It should resolve any issues one may have implementing easy_thumbnails with S3.
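The gist of that approach, as I understand it, is to point easy_thumbnails at the same S3-backed media storage instead of the local filesystem; a minimal sketch, assuming the s3_folder_storage backend from the question:
# easy_thumbnails setting: write thumbnails through the S3 media storage
# rather than the (read-only) local filesystem.
THUMBNAIL_DEFAULT_STORAGE = 's3_folder_storage.s3.DefaultStorage'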
I'm new to Heroku and Amazon S3, so bear with me. I uploaded my Django app to Heroku and am having a problem with user media uploads. The model is below:
#models.py
class Movie(models.Model):
    title = models.CharField(max_length=500)
    poster = models.ImageField(upload_to='storages.backends.s3boto')
    pub_date = models.DateTimeField(auto_now_add=True)
    author = models.ForeignKey(User)
The poster attribute is the one the image is uploaded to. I had it running fine locally, but now on Heroku there is an error. So I added 'storages.backends.s3boto', as numerous other posts told me to (not sure if that's right).
My settings.py file looks like this right now; it's kind of a mess:
#settings.py
PROJECT_ROOT = os.path.abspath(os.path.dirname(__file__))
PROJECT_DIR = os.path.join(PROJECT_ROOT, '../qanda')
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = '****************'
AWS_SECRET_ACCESS_KEY = '************'
AWS_STORAGE_BUCKET_NAME = 'mrt-assets'
AWS_PRELOAD_METADATA = True
MEDIA_ROOT = os.path.join(PROJECT_ROOT, 'qanda/media/movie_posters/')
MEDIA_URL = '/media'
STATIC_ROOT = os.path.join(PROJECT_ROOT, 'staticfiles')
STATIC_URL = 'https://mrt-assets.s3.amazonaws.com/static/'
STATICFILES_DIRS = (os.path.join(PROJECT_DIR, 'static'),)
My bucket is called mrt-assets, and there are two folders in there: static (css, js, images) and media. I'm not too worried about the static files for now, as I've hardcoded the CSS/JS links into my HTML files*, but how do I get my user-uploaded media (images of any kind) into mrt-assets/media?
*Although if someone wanted to help with static files too, that would be great; user-uploaded media is more urgent.
EDIT (per Yuji's comment):
I have tried a number of options, and none of them work. I've gone back and reverted a lot of changes, and this is now my settings file:
#settings.py
PROJECT_ROOT = os.path.abspath(os.path.dirname(__file__))
MEDIA_ROOT = 'http://s3.amazonaws.com/mrt-assets/media/'
MEDIA_URL = '/media/'
STATIC_ROOT = 'http://s3.amazonaws.com/mrt-assets/static/'
STATIC_URL = '/static/'
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
TEMPLATE_DIRS = (os.path.join(PROJECT_ROOT, "templates"),)
#models.py
#same as before, but I have now changed the poster field
poster = models.ImageField()
I'm not really sure what to do; I need to connect my Heroku app to S3 so that user media uploads are saved there.
I have now changed my S3 bucket to this:
mrt-assets/
    static/
        css
        js
        images
    media/
        (empty)
The trick to getting your media uploaded into <bucket>/media and your static assets into <bucket>/static is either to create two different storage backends, one for each asset type, or to explicitly instantiate your model fields with a storage object that takes a location parameter.
Instantiating model field with custom storage
from storages.backends.s3boto import S3BotoStorage

class Movie(models.Model):
    title = models.CharField(max_length=500)
    poster = models.ImageField(storage=S3BotoStorage(location='media'))
    pub_date = models.DateTimeField(auto_now_add=True)
    author = models.ForeignKey(User)
Giving S3BotoStorage a location will prefix all uploads with its path.
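As a hypothetical usage example (the user and local file below are placeholders), saving a Movie now writes the poster under media/ in the bucket:
from django.core.files import File

movie = Movie(title='Example', author=some_user)  # some_user: placeholder User
with open('poster.jpg', 'rb') as f:               # placeholder local file
    # location='media' prefixes the stored name, so this ends up at media/poster.jpg
    movie.poster.save('poster.jpg', File(f), save=True)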
Creating custom storage backends for media and static assets
This is almost the same as explicitly defining a storage backend with
location, but instead we'll be using settings.MEDIA_ROOT and
settings.STATIC_ROOT to apply a path prefix globally.
# settings.py
STATIC_ROOT = '/static/'
MEDIA_ROOT = '/media/'
DEFAULT_FILE_STORAGE = 'app.storage.S3MediaStorage'
STATICFILES_STORAGE = 'app.storage.S3StaticStorage'
# app/storage.py
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class S3MediaStorage(S3BotoStorage):
    def __init__(self, **kwargs):
        kwargs['location'] = kwargs.get('location',
                                        settings.MEDIA_ROOT.replace('/', ''))
        super(S3MediaStorage, self).__init__(**kwargs)

class S3StaticStorage(S3BotoStorage):
    def __init__(self, **kwargs):
        kwargs['location'] = kwargs.get('location',
                                        settings.STATIC_ROOT.replace('/', ''))
        super(S3StaticStorage, self).__init__(**kwargs)
Refining it
You can refine the above code to take advantage of
Heroku config vars
to make it more portable:
# settings.py
import os

STATIC_ROOT = os.environ.get('STATIC_ROOT',
                             os.path.join(os.path.dirname(__file__), 'static'))
MEDIA_ROOT = os.environ.get('MEDIA_ROOT',
                            os.path.join(os.path.dirname(__file__), 'media'))
DEFAULT_FILE_STORAGE = os.environ.get('DEFAULT_FILE_STORAGE',
                                      'django.core.files.storage.FileSystemStorage')
STATICFILES_STORAGE = os.environ.get('STATICFILES_STORAGE',
                                     'django.contrib.staticfiles.storage.StaticFilesStorage')
Couple the above settings with a .env file and you can use the default storage backends locally for development and testing, while when deploying on Heroku you'll automatically switch to app.storage.S3MediaStorage and app.storage.S3StaticStorage, respectively:
# .env
STATIC_ROOT=static
MEDIA_ROOT=media
DEFAULT_FILE_STORAGE=app.storage.S3MediaStorage
STATICFILES_STORAGE=app.storage.S3StaticStorage