I am not using Appengine.
I have a plain vanilla Django application running on a VM. I want to use Google Cloud Storage for serving my staticfiles, and also for uploading/serving my media files.
I have a bucket.
How do I link my Django application to my bucket? I've tried django-storages. That may work, but what do I have to do to prepare my bucket for use by my Django application? And what baseline configuration do I need in my Django settings?
Current settings:
# Google Cloud Storage
# http://django-storages.readthedocs.org/en/latest/backends/apache_libcloud.html
LIBCLOUD_PROVIDERS = {
'google': {
'type' : 'libcloud.storage.types.Provider.GOOGLE_STORAGE',
'user' : <I have no idea>,
'key' : <ditto above>,
'bucket': <my bucket name>,
}
}
DEFAULT_LIBCLOUD_PROVIDER = 'google'
DEFAULT_FILE_STORAGE = 'storages.backends.apache_libcloud.LibCloudStorage'
STATICFILES_STORAGE = 'storages.backends.apache_libcloud.LibCloudStorage'
Django-storages has a backend for Google Cloud Storage; it isn't documented, as I realised by looking in the repo. I got it working with this setup:
DEFAULT_FILE_STORAGE = 'storages.backends.gs.GSBotoStorage'
GS_ACCESS_KEY_ID = 'YourID'
GS_SECRET_ACCESS_KEY = 'YourKEY'
GS_BUCKET_NAME = 'YourBucket'
STATICFILES_STORAGE = 'storages.backends.gs.GSBotoStorage'
To get YourKEY and YourID, you should create interoperability keys in the Settings tab.
Hope it helps and you don't have to learn it the hard way :)
Ah, in case you haven't installed them yet, the dependencies are:
pip install django-storages
pip install boto
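Note: the gs.GSBotoStorage backend relies on the legacy boto library. Newer django-storages releases ship a native GCS backend (used in the answers below); a minimal sketch of the equivalent settings, assuming a bucket name and a service-account JSON file (both placeholders here), would be:
# settings.py (sketch, requires pip install django-storages[google])
from google.oauth2 import service_account
DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
STATICFILES_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
GS_BUCKET_NAME = 'your-bucket-name'  # placeholder
GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
    '/path/to/service-account.json'  # placeholder
)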
Django-storages is, in fact, a viable alternative. You must be careful with its Google Cloud backend, though, as the url() method it provides causes unnecessary HTTP calls to Google. (Django calls .url() when rendering static files, for example.)
https://github.com/jschneier/django-storages/issues/491
settings.py
DEFAULT_FILE_STORAGE = 'config.storage_backends.GoogleCloudMediaStorage'
STATICFILES_STORAGE = 'config.storage_backends.GoogleCloudStaticStorage'
GS_PROJECT_ID = '<google-cloud-project-id>'
GS_MEDIA_BUCKET_NAME = '<name-of-media-bucket>'
GS_STATIC_BUCKET_NAME = '<name-of-static-bucket>'
STATIC_URL = 'https://storage.googleapis.com/{}/'.format(GS_STATIC_BUCKET_NAME)
MEDIA_URL = 'https://storage.googleapis.com/{}/'.format(GS_MEDIA_BUCKET_NAME)
storage_backends.py
"""
GoogleCloudStorage extensions suitable for handling Django's
Static and Media files.
Requires following settings:
MEDIA_URL, GS_MEDIA_BUCKET_NAME
STATIC_URL, GS_STATIC_BUCKET_NAME
In addition to
https://django-storages.readthedocs.io/en/latest/backends/gcloud.html
"""
from django.conf import settings
from storages.backends.gcloud import GoogleCloudStorage
from storages.utils import setting
from urllib.parse import urljoin
class GoogleCloudMediaStorage(GoogleCloudStorage):
"""GoogleCloudStorage suitable for Django's Media files."""
def __init__(self, *args, **kwargs):
if not settings.MEDIA_URL:
raise Exception('MEDIA_URL has not been configured')
kwargs['bucket_name'] = setting('GS_MEDIA_BUCKET_NAME', strict=True)
super(GoogleCloudMediaStorage, self).__init__(*args, **kwargs)
def url(self, name):
""".url that doesn't call Google."""
return urljoin(settings.MEDIA_URL, name)
class GoogleCloudStaticStorage(GoogleCloudStorage):
"""GoogleCloudStorage suitable for Django's Static files"""
def __init__(self, *args, **kwargs):
if not settings.STATIC_URL:
raise Exception('STATIC_URL has not been configured')
kwargs['bucket_name'] = setting('GS_STATIC_BUCKET_NAME', strict=True)
super(GoogleCloudStaticStorage, self).__init__(*args, **kwargs)
def url(self, name):
""".url that doesn't call Google."""
return urljoin(settings.STATIC_URL, name)
Note: authentication is handled by default via the GOOGLE_APPLICATION_CREDENTIALS environment variable.
https://cloud.google.com/docs/authentication/production#setting_the_environment_variable
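For example, on a plain VM you might export that variable in the shell (or service definition) that runs Django; the path below is a placeholder:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"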
May, 2022 Update:
With these instructions, you can connect your Django app to your bucket on GCS (Google Cloud Storage) and serve your static files as well as serve, upload and delete your media files.
For example, you have the bucket "my-django-bucket" on GCS:
And you have the service account "my-django-bucket-sa", so you need to copy (Ctrl+C) its email "my-django-bucket-sa@myproject-347313.iam.gserviceaccount.com":
Next, in Bucket details of the bucket "my-django-bucket", click on "PERMISSIONS" then "ADD":
Then, to give the service account "my-django-bucket-sa" full control of GCS resources, paste (Ctrl+V) the email "my-django-bucket-sa@myproject-347313.iam.gserviceaccount.com" into "New principals", choose the role "Storage Admin", then click "SAVE". *Check IAM roles for Cloud Storage and choose a different role if you don't want "Storage Admin", which grants full control of GCS resources:
Next, to allow all users to view (read) files, type "allUsers" into "New principals", choose the role "Storage Legacy Object Reader", then click "SAVE":
Then, you will be asked to confirm as shown below, so click "ALLOW PUBLIC ACCESS":
Finally, you have added the role "Storage Admin" to the service account "my-django-bucket-sa" and the role "Storage Legacy Object Reader" to "allUsers":
Next, you need to download the private key of the service account "my-django-bucket-sa" as JSON, so click "Manage keys" from the three-dot menu "⋮":
Then, click "Create new key" from "ADD KEY":
Then, choose "JSON" and click "CREATE":
Finally, the private key of the service account "my-django-bucket-sa" is downloaded as JSON ("myproject-347313-020754294843.json"):
Now, you have a Django project with a settings folder "core", which has "static/core/core.js" and "settings.py", and an application folder "myapp", which has "static/myapp/myapp.css", as shown below:
Next, you need to put "myproject-347313-020754294843.json" into the root Django project directory, where "db.sqlite3" and "manage.py" are:
Then, it is better to rename "myproject-347313-020754294843.json" to a shorter and more reasonable name such as "gcpCredentials.json":
Next, you need to install "django-storages[google]" to connect to and communicate with "my-django-bucket" on GCS (Google Cloud Storage):
pip install django-storages[google]
By installing "django-storages[google]", you can get "django-storages" and other necessary packages as shown below:
"requirements.txt"
django-storages==1.12.3
cachetools==4.2.4
google-api-core==2.7.2
google-auth==2.6.5
google-cloud-core==2.3.0
google-cloud-storage==2.0.0
google-crc32c==1.3.0
google-resumable-media==2.3.2
googleapis-common-protos==1.56.0
protobuf==3.19.4
pyasn1==0.4.8
pyasn1-modules==0.2.8
Be careful: if you install "django-storages" without "[google]" as shown below:
pip install django-storages
You will only get "django-storages" itself, as shown below.
"requirements.txt"
django-storages==1.12.3
Next, create "gcsUtils.py" in "core" folder where "settings.py" is:
Then, put this code below to "gcsUtils.py" to define "Static" and "Media" variables which each have a "GoogleCloudStorage" class object:
# "core/gcsUtils.py"
from storages.backends.gcloud import GoogleCloudStorage
Static = lambda: GoogleCloudStorage(location='static')
Media = lambda: GoogleCloudStorage(location='media')
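Note: STATICFILES_STORAGE and DEFAULT_FILE_STORAGE take a dotted path that Django imports and then calls with no arguments, so these zero-argument lambdas behave just like a storage class would here.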
Next, add the code below to "settings.py". *"STATICFILES_STORAGE" is like the combination of "STATIC_ROOT" and "STATIC_URL", and "DEFAULT_FILE_STORAGE" is like the combination of "MEDIA_ROOT" and "MEDIA_URL":
# "core/settings.py"
from google.oauth2 import service_account
# Set "static" folder
STATICFILES_STORAGE = 'core.gcsUtils.Static'
# Set "media" folder
DEFAULT_FILE_STORAGE = 'core.gcsUtils.Media'
GS_BUCKET_NAME = 'my-django-bucket'
# Add an unique ID to a file name if same file name exists
GS_FILE_OVERWRITE = False
GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
os.path.join(BASE_DIR, 'gcpCredentials.json'),
)
Then, run this command below:
python manage.py collectstatic
Now, "static" folder is created in "my-django-bucket":
And static files are collected from "admin" and "application" folders to "static" folder in "my-django-bucket":
And this is "myapp.css" in "myapp" folder:
But as you can see, static files are not collected from the settings folder "core" to "static" folder in "my-django-bucket":
Because "STATICFILES_STORAGE" can only collect static files from "admin" and "application" folders but not from other folders like the settings folder "core":
# "core/settings.py"
STATICFILES_STORAGE = 'core.gcsUtils.Static'
So, to collect static files from the settings folder "core" into the "static" folder in "my-django-bucket", you need to add "STATICFILES_DIRS" to "settings.py" as shown below:
# "core/settings.py"
# Collect static files from the settings folder "core",
# which is not an app folder like "admin" or "myapp"
STATICFILES_DIRS = [
    os.path.join(BASE_DIR, 'core/static'),
]
Then, this is the full code of "settings.py":
# "core/settings.py"
from google.oauth2 import service_account
# Collect static files from the settings folder
# "core" which is not "admin" and "application" folder
STATICFILES_DIRS = [
os.path.join(BASE_DIR, 'core/static'),
]
# Set "static" folder
STATICFILES_STORAGE = 'core.gcsUtils.Static'
# Set "media" folder
DEFAULT_FILE_STORAGE = 'core.gcsUtils.Media'
GS_BUCKET_NAME = 'my-django-bucket'
# Add an unique ID to a file name if same file name exists
GS_FILE_OVERWRITE = False
GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
os.path.join(BASE_DIR, 'gcpCredentials.json'),
)
Then again, run this command below:
python manage.py collectstatic
Then, static files are collected from the settings folder "core" into the "static" folder in "my-django-bucket":
And this is "core.js" in the "core" folder:
Next, this is the code for "myapp/models.py":
# "myapp/models.py"
from django.db import models
class Image(models.Model):
image = models.ImageField(upload_to='images/fruits')
def __str__(self):
return str(self.image)
And this is the code for "myapp/admin.py":
# "myapp/admin.py"
from django.contrib import admin
from .models import Image
admin.site.register(Image)
Then, upload "orange.jpg":
Now, "media" folder is created in "my-django-bucket":
And "orange.jpg" is uploaded in "media/images/fruits":
And because "GS_FILE_OVERWRITE = False" is set in "settings.py":
# "core/settings.py"
# Add a unique ID to a file name if the same file name already exists
GS_FILE_OVERWRITE = False
If you upload a file with the same name, "orange.jpg", again:
Then the unique ID "_VPJxGBW" is added to "orange.jpg" to prevent overwriting, and "orange_VPJxGBW.jpg" is uploaded as shown below:
Next, if there is "orange.jpg" in "media/images/fruits":
Then, update(change) "orange.jpg" to "apple.jpg" being uploaded:
Then, "apple.jpg" is uploaded in "media/images/fruits" but "orange.jpg" is still in "media/images/fruits" without deleted:
And if there is "orange.jpg" in "media/images/fruits":
Then, delete "orange.jpg":
But "orange.jpg" is still in "media/images/fruits" without deleted:
So, to have uploaded files deleted when they are updated (changed) or deleted, you need to install "django-cleanup":
pip install django-cleanup
Then, add it to the bottom of "INSTALLED_APPS" in "settings.py":
# "core/settings.py"
INSTALLED_APPS = (
...,
'django_cleanup.apps.CleanupConfig', # Here
)
Then, if there is "orange.jpg" in "media/images/fruits":
Then, update(change) "orange.jpg" to "apple.jpg" being uploaded:
Then, "apple.jpg" is uploaded in "media/images/fruits" and "orange.jpg" is deleted from "media/images/fruits":
And if there is "orange.jpg" in "media/images/fruits":
Then, delete "orange.jpg":
Then, "orange.jpg" is deleted from "media/images/fruits":
Lastly, the GCS Bucket Settings which you have just set in "settings.py" as shown below work in both "DEBUG = True" and "DEBUG = False":
# "core/settings.py"
from google.oauth2 import service_account
# Collect static files from the settings folder
# "core" which is not "admin" and "application" folder
STATICFILES_DIRS = [
os.path.join(BASE_DIR, 'core/static'),
]
# Set "static" folder
STATICFILES_STORAGE = 'core.gcsUtils.Static'
# Set "media" folder
DEFAULT_FILE_STORAGE = 'core.gcsUtils.Media'
GS_BUCKET_NAME = 'my-django-bucket'
# Add an unique ID to a file name if same file name exists
GS_FILE_OVERWRITE = False
GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
os.path.join(BASE_DIR, 'gcpCredentials.json'),
)
And there may also be "STATIC_ROOT", "STATIC_URL", "MEDIA_ROOT" and "MEDIA_URL" alongside the GCS Bucket Settings in "settings.py", as shown below. In this case, "STATIC_ROOT", "STATIC_URL", "MEDIA_ROOT" and "MEDIA_URL" don't take effect, while the GCS Bucket Settings do, communicating with "my-django-bucket" on GCS:
# "core/settings.py"
from google.oauth2 import service_account
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
STATIC_URL = '/static/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
MEDIA_URL = '/media/'
'''GCS Bucket Settings Start'''
# Collect static files from the settings folder
# "core" which is not "admin" and "application" folder
STATICFILES_DIRS = [
os.path.join(BASE_DIR, 'core/static'),
]
# Set "static" folder
STATICFILES_STORAGE = 'core.gcsUtils.Static'
# Set "media" folder
DEFAULT_FILE_STORAGE = 'core.gcsUtils.Media'
GS_BUCKET_NAME = 'my-django-bucket'
# Add an unique ID to a file name if same file name exists
GS_FILE_OVERWRITE = False
GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
os.path.join(BASE_DIR, 'gcpCredentials.json'),
)
'''GCS Bucket Settings End'''
So, if you want "STATIC_ROOT", "STATIC_URL", "MEDIA_ROOT" and "MEDIA_URL" to work, just comment out the GCS Bucket Settings and set "STATICFILES_DIRS" as shown below:
# "core/settings.py"
from google.oauth2 import service_account
# Collect static files from the settings folder
# "core" which is not "admin" and "application" folder
STATICFILES_DIRS = [
os.path.join(BASE_DIR, 'core/static'),
]
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
STATIC_URL = '/static/'
MEDIA_ROOT = os.path.join(BASE_DIR, 'media')
MEDIA_URL = '/media/'
"""
'''GCS Bucket Settings Start'''
# Collect static files from the settings folder
# "core" which is not "admin" and "application" folder
STATICFILES_DIRS = [
os.path.join(BASE_DIR, 'core/static'),
]
# Set "static" folder
STATICFILES_STORAGE = 'core.gcsUtils.Static'
# Set "media" folder
DEFAULT_FILE_STORAGE = 'core.gcsUtils.Media'
GS_BUCKET_NAME = 'my-django-bucket'
# Add an unique ID to a file name if same file name exists
GS_FILE_OVERWRITE = False
GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
os.path.join(BASE_DIR, 'gcpCredentials.json'),
)
'''GCS Bucket Settings End'''
"""
So this basically will work (with this library and these settings).
The trick to making it work is knowing where to get the 'user' and 'key' parameters for libcloud.
In the Google Cloud Console, under Storage, click Settings, then the right-hand tab called Interoperability. On that panel is a lone button that says something like Enable Interoperability. Click it.
Voilà! You now have a username and key.
Note: Do not use django-storages from pypi. It has not been updated, and doesn't work with recent releases of Django.
Use this version:
pip install -e 'git+https://github.com/jschneier/django-storages.git#egg=django-storages'
Edit: If you want to use a reverse proxy, then you may consider my slightly modified version.
https://github.com/jschneier/django-storages/compare/master...halfnibble:master
Description:
Under certain circumstances, it may be necessary to load files using a reverse proxy. This could be used to alleviate cross-origin request errors.
This small PR allows the developer to set an optional LIBCLOUD_PROXY_URL in settings.py.
Example Usage
# Apache VirtualHost conf
ProxyPass /foo http://storage.googleapis.com
ProxyPassReverse /foo http://storage.googleapis.com
# settings.py
LIBCLOUD_PROXY_URL = '/foo/'
In the latest version, the access key and key ID have been replaced by a service account file. We also want to use a bucket with two folders, static and media, like a local server. Below are the updated configs:
Create a file like gcloud_storages.py:
"""
Modify django-storages for GCloud to set static, media folder in a bucket
"""
from django.conf import settings
from storages.backends.gcloud import GoogleCloudStorage
class GoogleCloudMediaStorage(GoogleCloudStorage):
"""
GoogleCloudStorage suitable for Django's Media files.
"""
def __init__(self, *args, **kwargs):
kwargs['location'] = 'media'
super(GoogleCloudMediaStorage, self).__init__(*args, **kwargs)
class GoogleCloudStaticStorage(GoogleCloudStorage):
"""
GoogleCloudStorage suitable for Django's Static files
"""
def __init__(self, *args, **kwargs):
kwargs['location'] = 'static'
super(GoogleCloudStaticStorage, self).__init__(*args, **kwargs)
Use the location argument to set the location of static and media files in the bucket.
In settings.py
from google.oauth2 import service_account
...
GOOGLE_APPLICATION_CREDENTIALS = '/path/service-account.json'
DEFAULT_FILE_STORAGE = 'app.gcloud_storages.GoogleCloudMediaStorage'
STATICFILES_STORAGE = 'app.gcloud_storages.GoogleCloudStaticStorage'
GS_BUCKET_NAME = 'name-of-bucket'
GS_PROJECT_ID = 'project-id'
GS_DEFAULT_ACL = 'publicRead'
GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
GOOGLE_APPLICATION_CREDENTIALS
)
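With those settings in place, uploading the static files into the static/ prefix of the bucket is the usual command (media files are handled automatically on upload through DEFAULT_FILE_STORAGE):
python manage.py collectstatic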
Since I cannot comment on Alan Wagner's answer, here is an addition.
If you are using Python 3, you may get this error:
...
ImportError: No module named 'google_compute_engine'
If so, you will need to install google-compute-engine. The /etc/boto.cfg file tells Python to use the 2.7 version of the library. You will have to run this next line to regenerate /etc/boto.cfg:
python3 -c "from google_compute_engine.boto.boto_config import BotoConfig; BotoConfig()"
Another error you may hit,
...
File "/app/venv/lib/python3.4/site-packages/boto/gs/connection.py", line 95, in create_bucket
data=get_utf8_value(data))
File "/app/venv/lib/python3.4/site-packages/boto/s3/connection.py", line 656, in make_request
auth_path = self.calling_format.build_auth_path(bucket, key)
File "/app/venv/lib/python3.4/site-packages/boto/s3/connection.py", line 94, in build_auth_path
path = '/' + bucket
TypeError: Can't convert 'bytes' object to str implicitly
I made a pull request to fix this. You may use my repo as a pip dependency if you wish until it gets merged.
I will try to keep this repo up to date. I have set the default develop branch as protected. I am the only one who can commit/approve merge requests. I have also only made one commit.
You will have to install google-compute-engine and run the line above before you can install/build my boto repo.
I have detailed my step-by-step process in another thread:
Serve Static files from Google Cloud Storage Bucket (for Django App hosted on GCE)
Here are my main references:
https://django-storages.readthedocs.io/en/latest/backends/gcloud.html
https://medium.com/@umeshsaruk/upload-to-google-cloud-storage-using-django-storages-72ddec2f0d05
I used the following packages:
pip3 install django-storages # https://pypi.org/project/django-storages/
pip3 install google-cloud-storage # https://pypi.org/project/google-cloud-storage/
Related
I am trying to use an S3 bucket with Django (I have done this twice before), but this time, after installing boto3 and django-storages and assigning correct values to the necessary variables in settings.py, python manage.py collectstatic is still collecting static files into a local directory on my computer instead of the S3 bucket. Below is my settings.py...
settings.py
INSTALLED_APPS = [
...
"storages",
]
AWS_ACCESS_KEY_ID = "*****"
AWS_SECRET_ACCESS_KEY = "******"
AWS_STORAGE_BUCKET_NAME = "****"
AWS_S3_CUSTOM_DOMAIN = "%s.s3.amazonaws.com" % AWS_STORAGE_BUCKET_NAME
AWS_S3_OBJECT_PARAMETERS = {"CacheControl": "max-age=86400"}
AWS_DEFAULT_ACL = None
AWS_LOCATION = 'static'
STATICFILES_DIRS = [
BASE_DIR / "build/static", #this is the correct path by the way!
]
STATIC_URL = "https://%s/%s/" % (AWS_S3_CUSTOM_DOMAIN, AWS_LOCATION)
STATICFILES_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
terminal
(env) C:\Users\LENOVO\Desktop\project> python manage.py collectstatic
You have requested to collect static files at the destination
location as specified in your settings:
C:\Users\LENOVO\Desktop\project\staticfiles
This will overwrite existing files!
Are you sure you want to do this?
According to all the tutorials and my expectation, collectstatic is supposed to collect my static files into my S3 bucket. Am I missing something?
Thanks for your time!
I tend to define my own subclass of S3Boto3Storage and use that:
# ./storage_backends.py - lives in the same dir as the config file.
from storages.backends.s3boto3 import S3Boto3Storage
from django.conf import settings
class StaticRootS3BotoStorage(S3Boto3Storage):
location = settings.STATIC_DIRECTORY
Then
# config.py
STATIC_DIRECTORY = BASE_DIR + "/build/static" # You will need to change this to your path
STATICFILES_STORAGE = 'config.settings.storage_backends.StaticRootS3BotoStorage' # you also need to change this to the path to the file we created above.
STATIC_URL = 'https://%s/%s/' % (AWS_S3_CUSTOM_DOMAIN, STATIC_DIRECTORY)
Now you can override methods for debugging and see where the problem might be.
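As one sketch of such a debugging hook (the exists() override and print call are mine, not part of the original setup), you could log whether the S3 backend is actually being consulted during collectstatic:
# ./storage_backends.py - debugging sketch
from django.conf import settings
from storages.backends.s3boto3 import S3Boto3Storage

class StaticRootS3BotoStorage(S3Boto3Storage):
    location = settings.STATIC_DIRECTORY

    def exists(self, name):
        result = super().exists(name)
        print("S3 exists(%r) -> %s" % (name, result))  # shows whether collectstatic is talking to S3
        return result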
I'm trying to serve the static files for my Django app from a Cloud Storage bucket but don't know the exact process. Can someone please suggest a proper way to do so?
Steps I did:
Uploaded all the static files to the Google Cloud Storage bucket (www.example.com) using the gsutil command.
Edited /etc/apache2/sites-available/default-ssl.conf File.
File Content:
<VirtualHost *:443>
ServerName example.com
ServerAdmin admin@example.com
# Alias /static /opt/projects/example-google/example_static
Alias /static https://storage.googleapis.com/www.example.com/static
<Directory /opt/projects/example-google/example_static>
Require all granted
</Directory>
<Directory /opt/projects/example-google/example/example>
<Files wsgi.py>
Require all granted
</Files>
</Directory>
WSGIDaemonProcess example python-path=/opt/projects/example-google/example:/opt/projects/example-google/venv/lib/python2.7/site-packages
WSGIProcessGroup example
WSGIApplicationGroup %{GLOBAL}
WSGIScriptAlias / /opt/projects/example-google/example/example/wsgi.py
SSLEngine on
SSLCertificateFile /etc/apache2/ssl/example.com.crt
SSLCertificateKeyFile /etc/apache2/ssl/example.com.key
SSLCertificateChainFile /etc/apache2/ssl/intermediate.crt
</VirtualHost>
settings.py File:
# Static files (CSS, JavaScript, Images)
STATIC_URL = '/static/'
# STATIC_URL = 'https://storage.googleapis.com/www.example.com/static/'
STATIC_ROOT = os.path.join(BASE_DIR, '../example_static')
MEDIA_URL = '/media/'
MEDIA_ROOT = os.path.join(BASE_DIR, '../example_media')
STATICFILES_DIRS = (os.path.join(BASE_DIR, 'static'), MEDIA_ROOT,)
Any suggestions on what additional changes are required for this task?
Thanks,
Main references:
https://django-storages.readthedocs.io/en/latest/backends/gcloud.html
https://medium.com/@umeshsaruk/upload-to-google-cloud-storage-using-django-storages-72ddec2f0d05
Prerequisite steps
Go to GCP: Cloud Storage (GCS) and click on CREATE BUCKET (fill-up as needed)
Once created, you can make it public if you want it to act like a CDN of your website (storage of your static files such as css, images, videos, etc.)
Go to your newly created bucket
Go to Permissions and then click Add members
Add a new member "allUsers" with role "Cloud Storage - Storage Object Viewer"
Reference: https://cloud.google.com/storage/docs/quickstart-console
Method 1 (easier and faster, but requires constant manual copying of files to the GCS)
Configure your Django's static file settings in your settings.py
# Tell Django about the different locations to where the static files used by the project can be found
STATICFILES_DIRS = [
os.path.join(BASE_DIR, 'templates'),
os.path.join(BASE_DIR, "yourapp1", "templates"),
os.path.join(BASE_DIR, "yourapp2", "static"),
os.path.join(BASE_DIR, "watever"),
"/home/me/Music/TaylorSwift/",
"/home/me/Videos/notNsfw/",
]
# If the command "collectstatic" is invoked, tell Django where to place all the collected static
# files from all the directories included in STATICFILES_DIRS. Be aware that configuring it with a
# path outside your /home/me means that you need to have permissions to write to that folder later
# on when you invoke "collectstatic", so you might need to login as root first or run it as sudo.
STATIC_ROOT = "/var/www/mywebsite/"
# Tell Django the base url to access the static files. Think of this as the "prefix" of the URL
# to where your static files are. Note that if you browse through your bucket and happen to see a
# URL such as "https://storage.cloud.google.com/<your_bucket_name>/someFileYouHaveUploaded", such
# URL requires that whoever accesses it should be currently logged-in with their Google accounts. If
# you want your static files to be publicly accessible by anyone whether they are logged-in or not,
# use the link "https://storage.googleapis.com/<your_bucket_name>/someFileYouHaveUploaded" instead.
STATIC_URL = "https://storage.googleapis.com/<your_bucket_name>/"
# References:
# https://docs.djangoproject.com/en/3.0/howto/static-files/
# https://docs.djangoproject.com/en/3.0/howto/static-files/deployment/
# https://docs.djangoproject.com/en/3.0/ref/settings/
If you have HTML files or CSS files that access other static files, make sure that they reference those other static files with this updated STATIC_URL setting.
In your home.html
{% load static %}
<link rel="stylesheet" type="text/css" href="{% static 'home/css/home.css' %}">
Then in your home.css file
background-image: url("../assets/img/myHandsomeImage.jpg");
The home.css link now would translate to:
https://storage.googleapis.com/[your_bucket_name]/home/css/home.css
While the myHandsomeImage.jpg would be:
https://storage.googleapis.com/[your_bucket_name]/home/assets/img/myHandsomeImage.jpg
Of course if you wish, you could just put the absolute path (complete URL), but such configuration would always require you to update the used URLs manually, like if you switched to development mode and wanted to just access the static files locally instead of from GCS.
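One common way to avoid that manual switching (a sketch, not something the original setup requires) is to key STATIC_URL off DEBUG:
# settings.py sketch - bucket name is a placeholder
if DEBUG:
    STATIC_URL = '/static/'
else:
    STATIC_URL = 'https://storage.googleapis.com/<your_bucket_name>/'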
Run the command below. This will copy all files from each directory in STATICFILES_DIRS to the STATIC_ROOT directory.
python3 manage.py collectstatic
# or if your STATIC_ROOT folder requires permissions to write to it then:
# sudo python3 manage.py collectstatic
Go to the STATIC_ROOT folder and upload its contents to GCS, either manually through the GCS Console GUI or through the Google-provided tool "gsutil" along with its rsync command, as sketched below.
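A gsutil sketch (bucket name is a placeholder; -m parallelizes the transfer and -r recurses into folders):
gsutil -m rsync -r /var/www/mywebsite gs://<your_bucket_name>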
Now, your GCS bucket already contains your static files, with your Django project configured to directly access those files through the configured STATIC_URL.
Method 2 (longer, but does not require manual copying afterwards)
[OPTIONAL STEP] Prepare your python virtual environment
python3 -m venv path/to/the/target/location/for/the/virtual/environment
source path/to/the/target/location/for/the/virtual/environment/bin/activate
Install the necessary packages to be able to access and store directly to your GCS
pip3 install django-storages # https://pypi.org/project/django-storages/
pip3 install google-cloud-storage # https://pypi.org/project/google-cloud-storage/
[MANDATORY STEP if you are on a computer outside the Google Infrastructure] Go to GCP: IAM, Service Accounts, and click on CREATE SERVICE ACCOUNT
Name: SomeName
ID / email: somename
Role: Project - Owner
CREATE KEY, select JSON
Store the downloaded JSON file. This generated JSON key will be used later for authentication purposes once we start accessing and storing to GCS.
Reference: https://cloud.google.com/docs/authentication/getting-started
Configure your Django's static file settings in your settings.py
STATICFILES_DIRS = ['same_values_as_in_method_1_above']
DEFAULT_FILE_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
GS_BUCKET_NAME = 'your_bucket_name'
STATICFILES_STORAGE = 'storages.backends.gcloud.GoogleCloudStorage'
STATIC_URL = 'https://storage.googleapis.com/<your_bucket_name>/'
from google.oauth2 import service_account
GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
'path/to/the/downloaded/json/key/credentials.json' # see step 3
)
# There are 2 ways to authenticate, you could either do 1 of the following
# 1. Define the variable GS_CREDENTIALS in the settings.py (as done above), or just
# 2. write the command "export GOOGLE_APPLICATION_CREDENTIALS='path/to/credentials.json'" in the shell where you would run the "collectstatic" command
Run the command below. This will copy all files from each directory in STATICFILES_DIRS directly to your GCS bucket. This might take a while.
python3 manage.py collectstatic
Now, your GCS bucket already contains your static files, with your Django project configured to directly access those files through the configured STATIC_URL.
Basically you need to:
create a Cloud Storage bucket and set it to publicly readable.
collect static files locally
copy the files to Cloud Storage
set STATIC_URL
Check steps 1-4 here (a rough command-line sketch of those steps follows the link):
https://cloud.google.com/python/django/container-engine#deploy_the_app_to_container_engine
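Roughly, those four steps look like this on the command line (bucket name and paths are placeholders; see the linked guide for the authoritative version):
gsutil mb gs://<your-gcs-bucket>                      # 1. create the bucket
gsutil defacl set public-read gs://<your-gcs-bucket>  #    make new objects publicly readable
python manage.py collectstatic                        # 2. collect static files locally
gsutil -m rsync -r ./static gs://<your-gcs-bucket>/static  # 3. copy them to the bucket
# 4. in settings.py:
# STATIC_URL = 'https://storage.googleapis.com/<your-gcs-bucket>/static/'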
I am trying to use django-storages with s3boto in my app and trying to serve media and static files from s3.
I have the following settings in my settings file:
AWS_STORAGE_BUCKET_NAME = '<bucket_name>'
AWS_S3_ACCESS_KEY_ID = '<access_key>'
AWS_S3_SECRET_ACCESS_KEY = '<secret>'
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
STATICFILES_LOCATION = 'static'
STATICFILES_STORAGE = '<custom_storage_static>'
MEDIAFILES_LOCATION = 'media'
DEFAULT_FILE_STORAGE = '<custom_storage_media>'
And my custom_storages.py is
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage
class StaticStorage(S3BotoStorage):
location = settings.STATICFILES_LOCATION
class MediaStorage(S3BotoStorage):
location = settings.MEDIAFILES_LOCATION
When I create an image in Django, instead of getting the relative path to my image, starting with
image.url
'/media/image/<rest_of_the_path>.jpg'
I am getting the absolute URL, which is something like
image.url
'https://<s3_bucket_name>.s3.amazonaws.com/media/image/original/'
When I use local storage instead of s3boto, it works as expected and gives me the relative path. Am I missing something here?
I hit the same issue when attempting to use the Imgix CDN for my S3 media (I suspect we're both using the same tutorial, based on your use of the custom_storages.py override).
Here is an abridged version of the S3BotoStorage class in the django-storages framework. This excerpt highlights the important properties and methods for this issue, namely the custom_domain property.
class S3BotoStorage(Storage):
location = setting('AWS_LOCATION', '')
custom_domain = setting('AWS_S3_CUSTOM_DOMAIN')
def url(self, name, headers=None, response_headers=None, expire=None):
# Preserve the trailing slash after normalizing the path.
name = self._normalize_name(self._clean_name(name))
if self.custom_domain:
return "%s//%s/%s" % (self.url_protocol, self.custom_domain, filepath_to_uri(name))
As you can see in the url method, a URL is generated to override the STATIC_URL and MEDIA_URL Django settings. Currently the domain of the URL is created with the AWS_S3_CUSTOM_DOMAIN setting, which is why you continue to see the static S3 URL for media files.
So first, in your Django settings file, add a setting describing your CDN's domain.
IMGIX_DOMAIN = 'example.imgix.net'
Then, similar to the override of the location property, add an override to the custom_domain property in your MediaStorage class.
class MediaStorage(S3BotoStorage):
location = settings.MEDIAFILES_LOCATION
custom_domain = settings.IMGIX_DOMAIN
Now the final URL to your media files should begin with your CDN's domain, followed by the relative path to your file on the S3 bucket.
If you are serving static media from an S3 bucket, you must use an absolute URL, since the media is being served from a wholly different server.
I'm developing a Django website with a media folder where users can upload files. This folder is in my root folder, and when I run the server locally (with ./manage.py runserver) it works fine and puts the files in MyApp/media/.
The problem is that I have a production Apache server with the website running via mod_wsgi. The folder of my project is /var/www/MyApp/, and it is creating my media folder in /var/www/media instead of /var/www/MyApp/media.
In my settings I have
STATIC_URL = 'static/'
MEDIA_URL = 'media/'
And the way I'm creating the path for my uploaded files is this:
def generate_path(self, filename):
url = "media/files/users/%s/%s" % (self.user.username, filename)
return url
Any idea why in production it is changing the directory?
Thanks in advance
Configure your MEDIA_ROOT:
# Project root is intended to be used when building paths,
# e.g. ``os.path.join(PROJECT_ROOT, 'relative/path')``.
PROJECT_ROOT = os.path.abspath(os.path.dirname(__file__))
# Absolute path to the directory that will hold uploaded files.
#
# For more information on ``MEDIA_ROOT``, visit
# https://docs.djangoproject.com/en/1.8/ref/settings/#media-root
MEDIA_ROOT = os.path.join(PROJECT_ROOT, 'uploads/')
Then, use upload_to to specify path relative to your MEDIA_ROOT.
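For reference, a minimal sketch of the callable form of upload_to, which mirrors the generate_path() in the question (model and path names here are illustrative); the returned path is always joined onto MEDIA_ROOT:
# models.py sketch
from django.conf import settings
from django.db import models

def user_directory_path(instance, filename):
    # stored under MEDIA_ROOT/files/users/<username>/<filename>
    return 'files/users/%s/%s' % (instance.user.username, filename)

class UserFile(models.Model):
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    file = models.FileField(upload_to=user_directory_path)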
I'm configuring a Django project that were using the server filesystem for storing the apps static files (STATIC_ROOT) and user uploaded files (MEDIA_ROOT).
I need now to host all that content on Amazon's S3, so I have created a bucket for this. Using django-storages with the boto storage backend, I managed to upload collected statics to the S3 bucket:
MEDIA_ROOT = '/media/'
STATIC_ROOT = '/static/'
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'KEY_ID...'
AWS_SECRET_ACCESS_KEY = 'ACCESS_KEY...'
AWS_STORAGE_BUCKET_NAME = 'bucket-name'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
Then, I got a problem: the MEDIA_ROOT and STATIC_ROOT are not used within the bucket, so the bucket root contains both the static files and user uploaded paths.
So then I could set:
S3_URL = 'http://s3.amazonaws.com/%s' % AWS_STORAGE_BUCKET_NAME
STATIC_URL = S3_URL + STATIC_ROOT
MEDIA_URL = S3_URL + MEDIA_ROOT
And use those settings in the templates, but there is no distinction of static/media files when storing in S3 with django-storages.
How this can be done?
Thanks!
I think the following should work, and be simpler than Mandx's method, although it's very similar:
Create a s3utils.py file:
from storages.backends.s3boto import S3BotoStorage
StaticRootS3BotoStorage = lambda: S3BotoStorage(location='static')
MediaRootS3BotoStorage = lambda: S3BotoStorage(location='media')
Then in your settings.py:
DEFAULT_FILE_STORAGE = 'myproject.s3utils.MediaRootS3BotoStorage'
STATICFILES_STORAGE = 'myproject.s3utils.StaticRootS3BotoStorage'
A different but related example (that I've actually tested) can be seen in the two example_ files here.
I'm currently using this code in a separated s3utils module:
from django.core.exceptions import SuspiciousOperation
from django.utils.encoding import force_unicode
from storages.backends.s3boto import S3BotoStorage
def safe_join(base, *paths):
"""
A version of django.utils._os.safe_join for S3 paths.
Joins one or more path components to the base path component intelligently.
Returns a normalized version of the final path.
The final path must be located inside of the base path component (otherwise
a ValueError is raised).
Paths outside the base path indicate a possible security sensitive operation.
"""
from urlparse import urljoin
base_path = force_unicode(base)
paths = map(lambda p: force_unicode(p), paths)
final_path = urljoin(base_path + ("/" if not base_path.endswith("/") else ""), *paths)
# Ensure final_path starts with base_path and that the next character after
# the final path is '/' (or nothing, in which case final_path must be
# equal to base_path).
base_path_len = len(base_path) - 1
if not final_path.startswith(base_path) \
or final_path[base_path_len:base_path_len + 1] not in ('', '/'):
raise ValueError('the joined path is located outside of the base path'
' component')
return final_path
class StaticRootS3BotoStorage(S3BotoStorage):
def __init__(self, *args, **kwargs):
super(StaticRootS3BotoStorage, self).__init__(*args, **kwargs)
self.location = kwargs.get('location', '')
self.location = 'static/' + self.location.lstrip('/')
def _normalize_name(self, name):
try:
return safe_join(self.location, name).lstrip('/')
except ValueError:
raise SuspiciousOperation("Attempted access to '%s' denied." % name)
class MediaRootS3BotoStorage(S3BotoStorage):
def __init__(self, *args, **kwargs):
super(MediaRootS3BotoStorage, self).__init__(*args, **kwargs)
self.location = kwargs.get('location', '')
self.location = 'media/' + self.location.lstrip('/')
def _normalize_name(self, name):
try:
return safe_join(self.location, name).lstrip('/')
except ValueError:
raise SuspiciousOperation("Attempted access to '%s' denied." % name)
Then, in my settings module:
DEFAULT_FILE_STORAGE = 'myproyect.s3utils.MediaRootS3BotoStorage'
STATICFILES_STORAGE = 'myproyect.s3utils.StaticRootS3BotoStorage'
I had to redefine the _normalize_name() private method to use a "fixed" version of the safe_join() function, since the original code was giving me SuspiciousOperation exceptions for legal paths.
I'm posting this for consideration; if anyone can give a better answer or improve this one, it will be very welcome.
File: PROJECT_NAME/custom_storages.py
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage
class StaticStorage(S3BotoStorage):
location = settings.STATICFILES_LOCATION
class MediaStorage(S3BotoStorage):
location = settings.MEDIAFILES_LOCATION
File: PROJECT_NAME/settings.py
STATICFILES_LOCATION = 'static'
MEDIAFILES_LOCATION = 'media'
if not DEBUG:
STATICFILES_STORAGE = 'PROJECT_NAME.custom_storages.StaticStorage'
DEFAULT_FILE_STORAGE = 'PROJECT_NAME.custom_storages.MediaStorage'
AWS_ACCESS_KEY_ID = 'KEY_XXXXXXX'
AWS_SECRET_ACCESS_KEY = 'SECRET_XXXXXXXXX'
AWS_STORAGE_BUCKET_NAME = 'BUCKET_NAME'
AWS_HEADERS = {'Cache-Control': 'max-age=86400',}
AWS_QUERYSTRING_AUTH = False
And run: python manage.py collectstatic
I think the answer is pretty simple and done by default. This is working for me on AWS Elastic Beanstalk with Django 1.6.5 and Boto 2.28.0:
STATICFILES_FINDERS = (
'django.contrib.staticfiles.finders.FileSystemFinder',
'django.contrib.staticfiles.finders.AppDirectoriesFinder',
)
TEMPLATE_LOADERS = (
'django.template.loaders.filesystem.Loader',
'django.template.loaders.app_directories.Loader',
)
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_KEY']
The AWS keys are passed in from the container config file and I have no STATIC_ROOT or STATIC_URL set at all. Also, no need for the s3utils.py file. These details are handled by the storage system automatically. The trick here is that I needed to reference this unknown path in my templates correctly and dynamically. For example:
<link rel="icon" href="{% static "img/favicon.ico" %}">
That is how I address my favicon which lives locally (pre-deployment) in ~/Projects/my_app/project/my_app/static/img/favicon.ico.
Of course I have a separate local_settings.py file for accessing this stuff locally in dev environment and it does have STATIC and MEDIA settings. I had to do a lot of experimenting and reading to find this solution and it works consistently with no errors.
I understand that you need the static and root separation, and considering that you can only provide one bucket, I would point out that this method takes all the folders in my local environment under ~/Projects/my_app/project/my_app/static/ and creates a folder in the bucket root (i.e. S3bucket/img/ as in the example above). So you do get separation of files. For example, you could have a media folder in the static folder and access it via templating with this:
{% static "media/" %}
I hope this helps. I came here looking for the answer and pushed a bit harder to find a simpler solution than to extend the storage system. Instead, I read the documentation about the intended use of Boto and I found that a lot of what I needed was built-in by default. Cheers!
If you want to have subfolders even before the media or static separation, you can use AWS_LOCATION on top of Bradenm's answer.
Reference: https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html#usage
AWS_STORAGE_BUCKET_NAME = 'bucket_name'
AWS_LOCATION = 'path1/path2/'
Bradenm's answer is outdated and doesn't work, so I updated it in March 2021.
Updated one:
Create an s3utils.py in the same folder as "settings.py":
from storages.backends.s3boto3 import S3Boto3Storage
StaticRootS3Boto3Storage = lambda: S3Boto3Storage(location='static')
MediaRootS3Boto3Storage = lambda: S3Boto3Storage(location='media')
Then, add 2 lines of code to settings.py and change "myproject" to your folder name:
DEFAULT_FILE_STORAGE = 'myproject.s3utils.MediaRootS3Boto3Storage'
STATICFILES_STORAGE = 'myproject.s3utils.StaticRootS3Boto3Storage'
The updated one has multiple "3"s, as I emphasize below.
s3utils.py:
from storages.backends.s3boto"3" import S3Boto"3"Storage
StaticRootS3Boto"3"Storage = lambda: S3Boto"3"Storage(location='static')
MediaRootS3Boto"3"Storage = lambda: S3Boto"3"Storage(location='media')
settings.py:
DEFAULT_FILE_STORAGE = 'myproject.s3utils.MediaRootS3Boto"3"Storage'
STATICFILES_STORAGE = 'myproject.s3utils.StaticRootS3Boto"3"Storage'
Check and compare with Bradenm's (outdated) answer.
"I respect Bradenm's answer."