Serving static directories in CherryPy - python

I am trying to serve static files using CherryPy, but I am unable to get it working. I have looked at the tutorials, but setting it up as they describe does not work properly either.
All this is using Python 3.4
Config
config = {
    '/ws': {
        'tools.websocket.on': True,
        'tools.websocket.handler_cls': ChatWebSocketHandler,
        'tools.websocket.protocols': ['toto', 'mytest', 'hithere']
    },
    '/assets': {
        'tools.staticdir.on': True,
        'tools.staticdir.dir': constants.TEMPLATE_PATH
    },
}
I am starting CherryPy like this:
app_root = Root(args.host, args.port, args.ssl, ssl_port=args.ssl_port)
cherrypy.quickstart(app_root, '', config=config)
The constant path is:
TEMPLATE_PATH = os.path.join(os.path.dirname(os.path.realpath(__file__)),"assets/")
I have also tried paths like assets/ and /assets/ instead of the above constant.
None of them are recognized, and every request always gives a 404 error.

I too have had difficulty setting this up. I have a rather complicated setup, complete with multiple subdomains, that has evolved through several early versions of CherryPy, so parts of it may no longer be the recommended approach, and I haven't verified that this works with the simpler quickstart configuration you have here. However, the key to a setup that actually works for me is to put the config lines below in the web service object that you mount: I place the config dict that defines the static dir in the class definition, before any resources. It looks to me like you've defined your static dir in the configuration dict for a specific path rather than on the object itself. So perhaps try this in your hosted service object:
class WebService(object):
    _cp_config = {
        'tools.staticdir.on': True,
        'tools.staticdir.dir': '/path/to/serve/static/files/from'
    }

    @cherrypy.expose
    def index(self):
        [ ...additional resource definitions, etc ...]
Then later on:
my_cp_app = cherrypy.tree.mount(subDomain.WebService(),
                                '/subdomainFileLocation',
                                subdomainConfigDict)
cherrypy.quickstart(config=domainConfig)
I know you're working with Python 3; the above works for me on Python 2.7 + CherryPy 8.1.2. I hope this is helpful.
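For comparison, here is a minimal, self-contained quickstart sketch that serves a static directory with tools.staticdir on Python 3; the Root class and the assets folder next to the script are placeholders, not the asker's actual app:

# Minimal sketch: serve ./assets at /assets with tools.staticdir.
import os
import cherrypy

# staticdir.dir must be an absolute path unless tools.staticdir.root is set.
STATIC_ROOT = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'assets')

class Root(object):
    @cherrypy.expose
    def index(self):
        return 'Static files are served under /assets'

config = {
    '/assets': {
        'tools.staticdir.on': True,
        'tools.staticdir.dir': STATIC_ROOT,
    },
}

if __name__ == '__main__':
    cherrypy.quickstart(Root(), '/', config=config)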

Related

Python to Java script SendMailRequest with SourceArn and FromArn

I have here part of a Python script that sends an email through AWS SES:
response = boto3.client('ses').send_raw_email(
    FromArn='arn:aws:ses:us-east-1:123456789012:identity/example.com',
    SourceArn='arn:aws:ses:us-east-1:123456789012:identity/example.com',
    RawMessage={
        'Data': msg
    },
)
This works as expected. My problem is that I also need to do the same thing in my Java code, but I'm confused about how to incorporate it; I've been trying, but it doesn't seem to work. This is the existing Java part below:
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
message.writeTo(outputStream);
RawMessage rawMessage = new RawMessage(ByteBuffer.wrap(outputStream.toByteArray()));
SendRawEmailRequest rawEmailRequest = new SendRawEmailRequest(rawMessage);
client.sendRawEmail(rawEmailRequest);
I think the FromArn and SourceArn should be incorporated into the rawMessage or rawEmailRequest, but I couldn't make it work. At the top of the code, there are values declared like this:
public class SESEMail {
    static final String FROM = "example@web.com";
    static final String key = Config.key;
    static final String privatekey = Config.privateKey;
    static Logger logger = Logger.getLogger(SESEMail.class);
    public static Variables variables;
I've been reading this, but I'm still confused about how to do it in Java: http://javadox.com/com.amazonaws/aws-java-sdk-ses/1.10.29/com/amazonaws/services/simpleemail/model/SendRawEmailRequest.html#getSourceArn()

How to use variables from backend in .scss files

I have a site built with Django, Python and Wagtail.
What I want is to be able to set some style values in the backend and then use them in my frontend's .scss files.
For example, I want to be able to set a primary color to #fff in the backend and then use it in my .scss file as:
$g-color-primary: primary_color_from_backend;
$g-font-size-default: primary_font_size_from_backend;
I do not have any idea how I can do that, or whether it's possible at all.
Thanks for the help.
Unfortunately, it is not possible directly. You can instead define different classes in the CSS file and then apply them in your HTML template depending on the Django template variables there.
This would require writing out the Sass variables (in SCSS syntax) to a physical .scss file, which depends on your development environment, and then importing that file into your other .scss files so it gets compiled and output through a frontend build tool like Gulp or Webpack.
For example, Webpack's sass-loader provides an option to prepend Sass code at frontend build/compile time:
https://github.com/webpack-contrib/sass-loader
module.exports = {
  module: {
    rules: [
      {
        test: /\.s[ac]ss$/i,
        use: [
          'style-loader',
          'css-loader',
          {
            loader: 'sass-loader',
            options: {
              prependData: '$env: ' + process.env.NODE_ENV + ';',
            },
          },
        ],
      },
    ],
  },
};
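If you go the first route (writing the backend values out to a physical .scss file that the build then imports), a rough Django-side sketch could look like the following; the field values, the BASE_DIR-relative output path, and the variable names are assumptions about your project, not Wagtail APIs:

# Rough sketch: dump backend-controlled values into an SCSS partial that the
# frontend build imports; value sources and output path are assumptions.
import os
from django.conf import settings

def export_scss_variables(primary_color='#fff', default_font_size='16px'):
    out_path = os.path.join(settings.BASE_DIR, 'assets', 'scss', '_backend_variables.scss')
    content = (
        '$g-color-primary: {color};\n'
        '$g-font-size-default: {size};\n'
    ).format(color=primary_color, size=default_font_size)
    with open(out_path, 'w') as fh:
        fh.write(content)

You could call something like this from a management command or a post_save signal on whatever settings model holds the values, then @import the partial in your main .scss and rebuild.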

django-pipeline with s3 storage is not compressing my js

I'm using django-pipeline with S3. I'm successfully using collectstatic to combine my JavaScript files and store them in my S3 bucket, but they are not getting compressed for some reason (verified by looking at the file, its size, and its Content-Encoding). Otherwise, things work correctly with the combined scripts.js that is produced.
Here are the changes I made to use django-pipeline:
Added pipeline to installed apps.
Added 'pipeline.finders.PipelineFinder' to STATICFILES_FINDERS.
Set STATICFILES_STORAGE = 'mysite.custom_storages.S3PipelineManifestStorage' where this class is as defined in the documentation, as seen below.
Set PIPELINE_JS as seen below, which works but just isn't compressed.
PIPELINE_ENABLED = True since DEBUG = True and I'm running locally.
PIPELINE_JS_COMPRESSOR = 'pipeline.compressors.yuglify.YuglifyCompressor' even though this should be default.
Installed the Yuglify Compressor with npm -g install yuglify.
PIPELINE_YUGLIFY_BINARY = '/usr/local/bin/yuglify' even though the default with env should work.
Using {% load pipeline %} and {% javascript 'scripts' %}, which work.
More detail:
PIPELINE_JS = {
    'scripts': {
        'source_filenames': (
            'lib/jquery-1.11.1.min.js',
            ...
        ),
        'output_filename': 'lib/scripts.js',
    }
}
class S3PipelineManifestStorage(PipelineMixin, ManifestFilesMixin, S3BotoStorage):
    location = settings.STATICFILES_LOCATION
As mentioned, collectstatic does produce scripts.js, just not compressed. The output of that command includes:
Post-processed 'lib/scripts.js' as 'lib/scripts.js'
I'm using Django 1.8, django-pipeline 1.5.2, and django-storages 1.1.8.
Similar questions:
django-pipeline not compressing
django pipeline with S3 storage not compressing
The missing step was to also extend GZIPMixin, and it has to be first in the list of parents:
from pipeline.storage import GZIPMixin

class S3PipelineManifestStorage(GZIPMixin, PipelineMixin, ManifestFilesMixin, S3BotoStorage):
    location = settings.STATICFILES_LOCATION
Now collectstatic produces a .gz version of each file as well, but my templates still weren't referencing the .gz version.
To address this, the author says:
To make it work with S3, you would need to change the staticfiles storage url method to return .gz URLs (and the staticfiles/pipeline template tags, depending on whether you care about clients that don't support gzip). Also don't forget to set up the proper header on S3 to serve these assets as being gzipped.
I adapted an example he provided elsewhere, which overrides the url method:
from django.conf import settings
from django.contrib.staticfiles.storage import ManifestFilesMixin
from django.contrib.staticfiles.utils import matches_patterns
from pipeline.storage import GZIPMixin, PipelineMixin
from storages.backends.s3boto import S3BotoStorage

class S3PipelineManifestStorage(GZIPMixin, PipelineMixin, ManifestFilesMixin, S3BotoStorage):
    location = settings.STATICFILES_LOCATION

    def url(self, name, force=False):
        # Add *.css if you are compressing those as well.
        gzip_patterns = ("*.js",)
        url = super(GZIPMixin, self).url(name, force)
        if matches_patterns(name, gzip_patterns):
            return "{0}.gz".format(url)
        return url
This still doesn't handle setting the Content-Encoding header.
A simpler alternative is to use the django-storages S3Boto option AWS_IS_GZIPPED, which performs the gzipping and sets the appropriate Content-Encoding header.
More is required to support clients without gzip, however.
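For reference, a settings.py sketch of that alternative (setting names per django-storages' S3Boto backend; adjust the content types to whatever you actually serve):

# Let S3BotoStorage gzip matching files on upload and set
# Content-Encoding: gzip on them.
AWS_IS_GZIPPED = True
GZIP_CONTENT_TYPES = (
    'text/css',
    'application/javascript',
    'application/x-javascript',
)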
Also useful are these instructions from Amazon on serving compressed files from S3.

How do you connect to AWS Elastic Transcoder?

I'm trying to transcode some videos, but something is wrong with the way I am connecting.
Here's my code:
transcode = layer1.ElasticTranscoderConnection()
transcode.DefaultRegionEndpoint = 'elastictranscoder.us-west-2.amazonaws.com'
transcode.DefaultRegionName = 'us-west-2'
transcode.create_job(pipelineId, transInput, transOutput)
Here's the exception:
{u'message': u'The specified pipeline was not found: account=xxxxxx, pipelineId=xxxxxx.'}
To connect to a specific region in boto, you can use:
import boto.elastictranscoder
transcode = boto.elastictranscoder.connect_to_region('us-west-2')
transcode.create_job(...)
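If the "pipeline was not found" error persists, a quick sanity check (a sketch, assuming boto 2.x's list_pipelines call) is to list the pipelines this connection can actually see; if yours is missing, the connection is pointed at the wrong region or account:

# The pipeline id from the error message should appear in this response.
import boto.elastictranscoder

transcode = boto.elastictranscoder.connect_to_region('us-west-2')
print(transcode.list_pipelines())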
I just started using boto the other day, but the previous answer didn't work for me - don't know if the API changed or what (seems a little weird if it did, but anyway). This is how I did it.
#!/usr/bin/env python

# Boto
import boto

# Debug
boto.set_stream_logger('boto')

# Pipeline Id
pipeline_id = 'lotsofcharacters-393824'

# The input object
input_object = {
    'Key': 'foo.webm',
    'Container': 'webm',
    'AspectRatio': 'auto',
    'FrameRate': 'auto',
    'Resolution': 'auto',
    'Interlaced': 'auto'
}

# The object (or objects) that will be created by the transcoding job;
# note that this is a list of dictionaries.
output_objects = [
    {
        'Key': 'bar.mp4',
        'PresetId': '1351620000001-000010',
        'Rotate': 'auto',
        'ThumbnailPattern': '',
    }
]

# Phone home
# - Har har.
et = boto.connect_elastictranscoder()

# Create the job
# - If successful, this will execute immediately.
et.create_job(pipeline_id, input_name=input_object, outputs=output_objects)
Obviously, this is a contrived example and just runs from a standalone Python script; it assumes you have a .boto file somewhere with your credentials in it.
Another thing to note is the PresetIds; you can find these in the AWS Management Console for Elastic Transcoder, under Presets. Finally, the values that can be stuffed into the dictionaries are lifted verbatim from the following link - as far as I can tell, they are just interpolated into a REST call (case sensitive, obviously).
AWS Create Job API

Google App Engine w/ Django won't render 960GS

I'm trying to create a website using GAE (Google App Engine) as the server and having pages rendered with the GAE Django API. The CSS framework I'd like to use is the 960 Grid System, specifically the adaptive version found here.
My page is being rendered with Django in GAE as usual:
class MainPage(webapp.RequestHandler):
    def get(self):
        featuredPic = "img/featuredPic.jpg"
        values = {
            'featuredPic': featuredPic,
        }
        self.response.out.write(template.render('index.html', values))

application = webapp.WSGIApplication([('/', MainPage)], debug=True)
And my index.html file also includes the code necessary for the adaptive grid system:
<script src="/js/adapt.js"></script>
<script>
// Edit to suit your needs.
var ADAPT_CONFIG = {
// Where is your CSS?
path : '/css/',
// false = Only run once, when page first loads.
// true = Change on window resize and page tilt.
dynamic : true,
// First range entry is the minimum.
// Last range entry is the maximum.
// Separate ranges by "to" keyword.
range : [ '0px to 760px = mobile.min.css',
'760px to 980px = 720.min.css',
'980px to 1280px = 960.min.css',
'1280px to 1600px = 1200.min.css',
'1600px to 1940px = 1560.min.css',
'1940px to 2540px = 1920.min.css',
'2540px = 2520.min.css' ]
};
</script>
I am also including the css, js, img, and other folders in app.yaml, yet despite all this the resulting HTML does not follow the 960 Grid System classes I have set on the divs. Does GAE disable JS, or am I making some other mistake?
Actually, I am not that good at GAE, but I can help you, I guess.
I've also made a page on GAE, and it also has a template HTML file that uses .js and .css files.
In the template HTML file, I've written the script tag like this:
<script type="application/x-javascript" src="iui/iui.js"></script>
and I put the .js file at the path below:
<app_name>\iui\iui.js
<app_name> contains models.py, urls.py, views.py, etc.
In addition, I've added the following statements to my app.yaml, where "muchart" is <app_name>:
app.yaml
...
handlers:
- url: /muchart/js
  static_dir: muchart/js
- url: /muchart/iui
  static_dir: muchart/iui
This doesn't exactly answer my question, but I instead switched to the successor of 960.gs, Unsemantic, available here. It works really well in my project, as it is pretty much what I was looking for in the adaptive version of 960.gs. Also, I had no issues setting it up.
