Serving CHANGING static django files in production - python

I need to serve a static file that has a changing name. This works fine with DEBUG = True, but immediately breaks with DEBUG = False. I'm currently using Whitenoise to serve files in production, and I tried executing collectstatic after the filename changes, but that didn't help. Any advice is appreciated.
Edit: This is how the filename is changed
import os, random

# current_dir and filepath are defined elsewhere in the view
newname = 'myfile' + str(random.randint(0, 999999)) + '.css'
newpath = os.path.join(current_dir, '../staticfiles/stuff/css/' + newname)
os.rename(filepath, newpath)
I can't quite give full context for why I need this file's name to change
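For reference, collectstatic can also be triggered from code with Django's call_command; a minimal sketch (illustrative only — whether the renamed file is picked up still depends on the configured static files storage):
from django.core.management import call_command

# re-collect static files after the rename so the static files storage
# (e.g. WhiteNoise's manifest storage) sees the new filename
call_command('collectstatic', interactive=False, verbosity=0)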

Related

creating/deleting folders in runtime using heroku/django

I have developed a Django app where I am uploading a file and doing some processing using a project folder named media.
Process:
The user uploads a CSV file, and the Python code processes the CSV data by creating temp folders in the media folder. After processing is complete, these temp folders are deleted and the processed data is downloaded through the browser.
I am using the lines of code below to make and delete the temp folder after processing:
import os, shutil

temp = 'media/temp3'
os.mkdir(temp)
shutil.copyfile('media/' + file_name, temp + '/' + file_name)  # file_name comes from the uploaded file
shutil.rmtree(temp, ignore_errors=True)
To set the media root, I used the lines below in settings.py; I am sure they are not used in other parts of the code.
MEDIA_ROOT = os.path.join(BASE_DIR, 'media/')
MEDIA_URL = "/media/"
Everything works fine when I run the app on localhost, but as soon as I deployed it to Heroku, it seems these folders were not created or not found.
I am looking for:
either a solution to create, read and delete folders/files at runtime on Heroku,
or
a better way to manage files/folders at runtime.

Python with open doesn't create new file after deployed at Heroku

I'm working on a Python project in which I need to create a new JSON file. It works locally, but when I deploy my app to Heroku the file creation doesn't work.
Here's what I have tried:
From settings.py
APP_ROOT = os.path.dirname(os.path.abspath(__file__)) # refers to application_top
APP_FINALIZED = os.path.join(APP_ROOT, 'finalized')
From app.py
HOME = os.path.join(APP_FINALIZED)
print(HOME)
with open(HOME + '/description_' + str(fid) + '.json', 'w', encoding="utf-8") as f:
    f.write(json.dumps(data, indent=4, ensure_ascii=False))
Updated: can we write this file directly to the S3 bucket, anyway?
It works fine locally, but when I deploy it to Heroku the file isn't created, and no error is shown.
I'll add this as an answer as well in case someone else needs help.
Heroku's dyno filesystem is ephemeral: you can write to it at runtime, but anything you write is lost when the dyno restarts or the app is redeployed, so it can't be relied on for files that need to persist.
Please check this answer.
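On the updated part of the question: yes, the file can be written straight to S3 instead. A minimal boto3 sketch (the bucket name is a placeholder, credentials are assumed to come from the environment, and fid and data are the variables from the question):
import json
import boto3

s3 = boto3.client('s3')
s3.put_object(
    Bucket='my-bucket',  # placeholder bucket name
    Key='finalized/description_%s.json' % fid,  # mirrors the local filename scheme
    Body=json.dumps(data, indent=4, ensure_ascii=False).encode('utf-8'),
    ContentType='application/json',
)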

Django-storages not detecting changed static files

I'm using django-storages and amazon s3 for my static files. Following the documentation, I put these settings in my settings.py
STATIC_URL = 'https://mybucket.s3.amazonaws.com/'
ADMIN_MEDIA_PREFIX = 'https://mybucket.s3.amazonaws.com/admin/'
INSTALLED_APPS += (
'storages',
)
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'mybucket_key_id'
AWS_SECRET_ACCESS_KEY = 'mybucket_access_key'
AWS_STORAGE_BUCKET_NAME = 'mybucket'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
The first time I ran collectstatic everything worked correctly and my static files were uploaded to my S3 bucket.
However, after making changes to my static files and running python manage.py collectstatic, the following is output even though the static files were modified:
-----> Collecting static files
0 static files copied, 81 unmodified.
However, if I rename the changed static file, it is correctly copied to my S3 bucket.
Why isn't django-storages uploading my changed static files? Is there a configuration problem or is the problem deeper?
collectstatic skips files if the "target" file is "younger" than the source file. It seems like the Amazon S3 storage returns the wrong date for your file.
You could investigate the storage backend's code and debug the server responses. Maybe there is a problem with time zones.
Or you could just pass the --clear argument to collectstatic so that all files are deleted from S3 before collecting.
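For example (--noinput just skips the confirmation prompt):
python manage.py collectstatic --clear --noinput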
https://github.com/antonagestam/collectfast
From the README: a custom management command that compares the locally computed MD5 sum with the ETag from S3 and skips the file copy if the two are the same. This makes running collectstatic MUCH faster if you are using git as a source control system, which updates timestamps.
Create a settings file just for collectstatic sync, with this config:
TIME_ZONE = 'UTC'
Run collectstatic with that settings module using this line:
python manage.py collectstatic --settings=settings.collectstatic
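A minimal sketch of that settings module, assuming the project's settings live in a settings package with a base module (adjust the import to match your layout):
# settings/collectstatic.py
from .base import *  # assumption: the regular settings live in settings/base.py

# collectstatic compares local modification times against S3, which reports UTC,
# so force UTC for this run only
TIME_ZONE = 'UTC'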
This question is a little old but in case it helps someone in the future, I figured I'd share my experience. Following advice found in other threads I confirmed that, for me, this was indeed caused by a difference in time zone. My django time wasn't incorrect but was set to EST and S3 was set to GMT. In testing, I reverted to django-storages 1.1.5 which did seem to get collectstatic working. Partially due to personal preference, I was unwilling to a) roll back three versions of django-storages and lose any potential bug fixes or b) alter time zones for components of my project for what essentially boils down to a convenience function (albeit an important one).
I wrote a short script to do the same job as collectstatic without the aforementioned alterations. It will need a little modifying for your app, but should work for standard cases if it is placed at the app level and 'static_dirs' is replaced with the names of your project's apps. It is run from the terminal with 'python whatever_you_call_it.py -e environment_name' (the environment name selects the AWS bucket to push to).
import sys, os, subprocess
import boto3
import botocore
from boto3.session import Session
import argparse
import os.path, time
from datetime import datetime, timedelta
import pytz
utc = pytz.UTC
DEV_BUCKET_NAME = 'dev-homfield-media-root'
PROD_BUCKET_NAME = 'homfield-media-root'
static_dirs = ['accounts', 'messaging', 'payments', 'search', 'sitewide']
def main():
    try:
        parser = argparse.ArgumentParser(description='Homfield Collectstatic. Our version of collectstatic to fix django-storages bug.\n')
        parser.add_argument('-e', '--environment', type=str, required=True, help='Name of environment (dev/prod)')
        args = parser.parse_args()
        vargs = vars(args)
        if vargs['environment'] == 'dev':
            selected_bucket = DEV_BUCKET_NAME
            print "\nAre you sure? You're about to push to the DEV bucket. (Y/n)"
        elif vargs['environment'] == 'prod':
            selected_bucket = PROD_BUCKET_NAME
            print "Are you sure? You're about to push to the PROD bucket. (Y/n)"
        else:
            raise ValueError
        acceptable = ['Y', 'y', 'N', 'n']
        confirmation = raw_input().strip()
        while confirmation not in acceptable:
            print "That's an invalid response. (Y/n)"
            confirmation = raw_input().strip()
        if confirmation == 'Y' or confirmation == 'y':
            run(selected_bucket)
        else:
            print "Collectstatic aborted."
    except Exception as e:
        print type(e)
        print "An error occurred. S3 staticfiles may not have been updated."

def run(bucket_name):
    # open a session with S3
    session = Session(aws_access_key_id='{aws_access_key_id}',
                      aws_secret_access_key='{aws_secret_access_key}',
                      region_name='us-east-1')
    s3 = session.resource('s3')
    bucket = s3.Bucket(bucket_name)
    # loop through static directories
    for directory in static_dirs:
        rootDir = './' + directory + "/static"
        print('Checking directory: %s' % rootDir)
        # loop through subdirectories
        for dirName, subdirList, fileList in os.walk(rootDir):
            # loop through all files in the subdirectory
            for fname in fileList:
                try:
                    if fname == '.DS_Store':
                        continue
                    # find the file's last modified time and convert it to an aware UTC datetime
                    # (the +5h offset converts local EST mtimes to UTC)
                    full_path = dirName + "/" + fname
                    last_mod_string = time.ctime(os.path.getmtime(full_path))
                    file_last_mod = datetime.strptime(last_mod_string, "%a %b %d %H:%M:%S %Y") + timedelta(hours=5)
                    file_last_mod = utc.localize(file_last_mod)
                    # truncate the path for S3, find the object, and delete/re-upload it if it has been updated
                    s3_path = full_path[full_path.find('static'):]
                    found = False
                    for key in bucket.objects.all():
                        if key.key == s3_path:
                            found = True
                            last_mode_date = key.last_modified
                            if last_mode_date < file_last_mod:
                                key.delete()
                                s3.Object(bucket_name, s3_path).put(Body=open(full_path, 'r'), ContentType=get_mime_type(full_path))
                                print "\tUpdated : " + full_path
                    if not found:
                        # if the file was not found in S3 it is new, so send it up
                        print "\tFound a new file. Uploading : " + full_path
                        s3.Object(bucket_name, s3_path).put(Body=open(full_path, 'r'), ContentType=get_mime_type(full_path))
                except:
                    print "ALERT: Big time problems with: " + full_path + ". I'm bowin' out dawg, this shitz on u."

def get_mime_type(full_path):
    try:
        last_index = full_path.rfind('.')
        if last_index < 0:
            return 'application/octet-stream'
        extension = full_path[last_index:]
        return {
            '.js' : 'application/javascript',
            '.css' : 'text/css',
            '.txt' : 'text/plain',
            '.png' : 'image/png',
            '.jpg' : 'image/jpeg',
            '.jpeg' : 'image/jpeg',
            '.eot' : 'application/vnd.ms-fontobject',
            '.svg' : 'image/svg+xml',
            '.ttf' : 'application/octet-stream',
            '.woff' : 'application/x-font-woff',
            '.woff2' : 'application/octet-stream'
        }[extension]
    except:
        print 'ALERT: Couldn\'t match mime type for ' + full_path + '. Sending to S3 as application/octet-stream.'
        return 'application/octet-stream'

if __name__ == '__main__':
    main()

Pyramid: How can I make a static view to some absolute path, and then let users upload files to that path?

In my view callable, I want users to be able to create a new file called filename like so:
@view_config(route_name='home_page', renderer='templates/edit.pt')
def home_page(request):
    if 'form.submitted' in request.params:
        name = request.params['name']
        input_file = request.POST['stl'].filename
        vertices, normals = [], []
        for line in input_file:
            parts = line.split()
            if parts[0] == 'vertex':
                vertices.append(map(float, parts[1:4]))
            elif parts[0] == 'facet':
                normals.append(map(float, parts[2:5]))
        ordering = []
        N = len(normals)
        ...parsing data...
        data = [vertices, ordering]
        jsdata = json.dumps(data)
        renderer_dict = dict(name=name, data=jsdata)
        app_dir = request.registry.settings['upload_dir']
        filename = "%s/%s" % (app_dir, name)
        html_string = render('tutorial:templates/view.pt', renderer_dict, request=request)
        with open(filename, 'w') as file:
            file.write(new_comment)
        return HTTPFound(location=request.static_url('tutorial:pages/%(pagename)s.html' % {'pagename': name}))
    return {}
Right now, when I attempt to upload a file, I get this error message: IOError: [Errno 2] No such file or directory: u'/path/pages/one' (one is the name variable). I believe this is because I am incorrectly defining the app_dir variable. I want filename to be the URL of the new file that is being created with the name variable defined above (so that it can be accessed at www.domain.com/pages/name). Here is the file structure of my app:
env
    tutorial
        tutorial
            templates
                home.pt
            static
            pages
                (name1)
                (name2)
                (name3)
                ....
            views.py
            __init__.py
In my __init__.py I have:
config.add_static_view(name='path/pages/', path=config.registry.settings['upload_dir'])
In my development.ini file I have
[app:main]
use = egg:tutorial
upload_dir = /path/pages
Edit: If anyone has an idea on why this question isn't getting much attention, I would love to hear it.
While I feel like you probably have a misunderstanding of how to serve up user-generated content, I will show you a way to do what you're asking. Generally, user-generated content would not be uploaded into your source tree; you'd provide some configurable spot outside it to place the content, as I show below.
Make the path configurable via your INI file:
[app:main]
use = egg:tutorial
upload_dir = /path/to/writable/upload/directory
Add a static view that can serve up files under that directory.
config.add_static_view(name='/url/to/user_uploads', path=config.registry.settings['upload_dir'])
In your upload view you can get your app_dir via
app_dir = request.registry.settings['upload_dir']
Copy the data there, and from then on it'll be available at /url/to/user_uploads/filename.
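A rough sketch of that write step, reusing the question's variables (creating the directory first also avoids the question's IOError if upload_dir doesn't exist yet):
import os

app_dir = request.registry.settings['upload_dir']
if not os.path.exists(app_dir):
    os.makedirs(app_dir)  # make sure the upload directory exists
filename = os.path.join(app_dir, name)
with open(filename, 'w') as f:
    f.write(html_string)  # the page rendered from view.pt in the question
# the file is now served by the static view at /url/to/user_uploads/<name>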

matplotlib with Django fails on print_figure throwing [Errno 2] No such file or directory

I am trying to create a png via matplotlib but I get:
[Errno 2] No such file or directory
The same code works in unit tests. The print_figure call is in:
# creates a canvas from figure
canvas = FigureCanvasAgg(fig)
filename = "directory" + os.sep + name_prefix + ".png"
# saves figure to filesystem in png format
canvas.print_figure(filename)
I am thinking it could be a permission issue, but it seems weird to me that the same code works via manage.py test.
Thanks
My recommendation is to use fully qualified path names. For example: you could determine the MEDIA_ROOT from your django settings, write a snippet of code that ensures that a subdirectory for the graphs exists, then save the images there.
Your current code seems to rely on finding a subdirectory with the appropriate name in the "current working directory". The "current working directory" is a finicky thing - it will be different in testing, dev, production...
# import settings
from django.conf import settings
...
# ensure that a fully qualified subdirectory with the appropriate name exists
directory = os.path.join(settings.MEDIA_ROOT, directory)
if not os.path.exists(directory):
    os.makedirs(directory)
# save the plots
canvas = FigureCanvasAgg(fig)
filename = os.path.join(directory, name_prefix + ".png")
# saves figure to filesystem in png format
canvas.print_figure(filename)
...
The actual location that you will save at should be determined by your needs. The key points are to use fully qualified paths and to check for the existence of the directory / subdirectory before attempting to save the image.
