I am looking at two articles on how to Dockerize a Pyramid app. I am not that familiar with Python, but I am fairly certain that a Pyramid app needs to be served through WSGI.
This article uses WSGI:
https://medium.com/@greut/minimal-python-deployment-on-docker-with-uwsgi-bc5aa89b3d35
This one just runs the python executable directly:
https://runnable.com/docker/python/dockerize-your-pyramid-application
It seems unlikely to me that you can run python directly without incorporating WSGI. Can anyone explain why the runnable.com article's Docker solution would work?
Per the scripts in the second link:
from wsgiref.simple_server import make_server
from pyramid.config import Configurator
from pyramid.response import Response
~snip~
if __name__ == '__main__':
    config = Configurator()
    config.add_route('hello', '/')
    config.add_view(hello_world, route_name='hello')
    app = config.make_wsgi_app()                # The WSGI application is built here
    server = make_server('0.0.0.0', 6543, app)  # and the WSGI server here
This explains why the WSGI server is built inside the if __name__ == "__main__" block: the app is still a WSGI app, it is just served by Python's built-in wsgiref server instead of by uWSGI.
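To make that concrete, here is a minimal, self-contained sketch of what the runnable.com script amounts to (the hello_world view is reconstructed here because the original was snipped): the WSGI server is wsgiref.simple_server from the standard library, so running the script directly with the python executable is enough.

# Minimal Pyramid app served by the stdlib WSGI server (wsgiref).
from wsgiref.simple_server import make_server
from pyramid.config import Configurator
from pyramid.response import Response

def hello_world(request):
    # Reconstructed view; the article's version was snipped above.
    return Response('Hello World!')

if __name__ == '__main__':
    config = Configurator()
    config.add_route('hello', '/')
    config.add_view(hello_world, route_name='hello')
    app = config.make_wsgi_app()                # Pyramid hands back a WSGI application
    server = make_server('0.0.0.0', 6543, app)  # wsgiref provides the WSGI server
    server.serve_forever()                      # so running the script still speaks WSGI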
Related
I have this Tornado application that wraps a Django application as a WSGI app (running on Windows):
import tornado.web
import tornado.wsgi
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from tornado.web import Application
from django.core.wsgi import get_wsgi_application
from django.conf import settings
from waitress import serve

settings.configure()
wsgi_app = tornado.wsgi.WSGIContainer(get_wsgi_application())

# PHandler, EHandler, ELASTIC_URL, LISTEN_PORT and options are defined elsewhere
def tornado_app():
    url = [(r"/models//predict", PHandler),
           (r"/models//explain", EHandler),
           ('.*', tornado.web.FallbackHandler, dict(fallback=wsgi_app))]
    return Application(url, db=ELASTIC_URL, debug=options.debug, autoreload=options.debug)

if __name__ == "__main__":
    application = tornado_app()
    http_server = HTTPServer(application)
    http_server.listen(LISTEN_PORT)
    IOLoop.current().start()
I'm not sure how to use waitress here. To serve with waitress I tried http_server = serve(application); the server starts, but I'm not sure that is correct, and I get an error when hitting the endpoint.
waitress is a WSGI server; Tornado is not based on or compatible with WSGI. You cannot use waitress to serve the Tornado portions of your application.
To serve both Tornado and WSGI applications in one thread, you need to use Tornado's HTTPServer as you've done in the original example. For better scalability, I'd recommend splitting the Tornado and Django portions of your application into separate processes and putting a proxy like nginx or haproxy in front of them.
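As a rough sketch of that split (the ports and file names are illustrative; PHandler and EHandler are the handlers from the question), one script serves only the Django/WSGI part with waitress:

from django.core.wsgi import get_wsgi_application
from django.conf import settings
from waitress import serve

settings.configure()

if __name__ == "__main__":
    serve(get_wsgi_application(), host="127.0.0.1", port=8001)

and a second script serves only the async Tornado handlers:

from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from tornado.web import Application

if __name__ == "__main__":
    # PHandler and EHandler are the question's handlers, imported from wherever they live.
    app = Application([(r"/models//predict", PHandler),
                       (r"/models//explain", EHandler)])
    HTTPServer(app).listen(8002)
    IOLoop.current().start()

nginx or haproxy then routes /models/* to port 8002 and everything else to port 8001, so each framework runs under the server it was designed for.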
How do I run my Flask app, which uses SSL keys, with waitress? The SSL context is specified in Flask's run() as in
app.run(ssl_context=('cert.pem', 'key.pem'))
But app.run() is not used when using waitress as in the code below. So, where do I specify the keys? Thanks for the help.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello World!"

if __name__ == '__main__':
    # app.run(ssl_context=('../cert.pem', '../key.pem'))
    from waitress import serve
    serve(app, host="0.0.0.0", port=5000)
At the current version (1.4.3), Waitress does not natively support TLS.
See TLS support in https://github.com/Pylons/waitress/blob/36240c88b1c292d293de25fecaae1f1d0ad9cc22/docs/reverse-proxy.rst
You either need a reverse proxy in front to handle the TLS/SSL part, or use another WSGI server (CherryPy, Tornado...).
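For the second option, here is a minimal sketch of serving the same Flask app through Tornado with TLS enabled (it assumes the cert.pem/key.pem files from the question are in the working directory; Tornado's WSGIContainer gives up Tornado's async advantages, but it does terminate TLS):

import tornado.httpserver
import tornado.ioloop
import tornado.wsgi
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello World!"

if __name__ == "__main__":
    container = tornado.wsgi.WSGIContainer(app)
    # ssl_options makes Tornado's HTTPServer speak HTTPS directly
    http_server = tornado.httpserver.HTTPServer(
        container,
        ssl_options={"certfile": "cert.pem", "keyfile": "key.pem"},
    )
    http_server.listen(5000)
    tornado.ioloop.IOLoop.current().start()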
I have a Dash (plotly) app set up using Flask as the server and can serve it on our Windows server on port 8041 using waitress. My code to launch waitress is below:
#!/usr/bin/env python3
from waitress import serve
from src.pacedash.app import server as application

if __name__ == "__main__":
    serve(application, threads=100, port=8041)
Everything works great if I run python run_waitress.py, except that when someone on our network navigates to servername:8041 there is a "Not Secure" warning next to the URL. Our IT vendor was able to get a cert file and key, but I'm not sure how to bring those into my current setup.
I have been trying to use nginx, but I can't find a guide to setting it up with waitress, and I'm not too familiar with web apps or WSGI because I primarily work as the lone data person here.
I have been working on this same issue and have a solution. The nginx .conf file needs to have a location defined like this:
location /myapp/ {
# Define the location of the proxy server to send the request to
proxy_pass http://localhost:8041/myapp/;
# standard proxy_set_header stuff below...
}
Then in your Dash application set the url_base_pathname to the same value:
app = dash.Dash(__name__, url_base_pathname='/myapp/')
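For reference, a minimal sketch of the Python side that matches that nginx location (the layout and the Dash 2.x-style imports here are illustrative, not the original app):

# run_waitress.py -- Dash app mounted under /myapp/ and served by waitress on port 8041
from dash import Dash, html
from waitress import serve

app = Dash(__name__, url_base_pathname='/myapp/')
app.layout = html.Div("Hello from behind nginx")

if __name__ == "__main__":
    # nginx terminates TLS (using the cert and key from your IT vendor)
    # and proxies https://servername/myapp/ to this port.
    serve(app.server, host="127.0.0.1", port=8041)

The certificate and key go into the nginx server block, not into waitress, which matches the earlier answer about waitress not handling TLS itself.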
I would use ngrok to expose your web app. It's amazingly simple:
Read this: https://ngrok.com/
I could be misinterpreting what you need, because I am not familiar with waitress (why not serve the app locally just using flask?), but if you need to test the live app, ngrok is what you should use.
I am starting out with web development. I am trying to develop a webapp using the Instagram API and Django. I noticed that a lot of people use the Tornado Web Server for real-time subscriptions. I am using Webfaction as a host and found this code so I can wrap my Django project with the "WSGI Container" that the Tornado Web Server provides:
import os
import tornado.httpserver
import tornado.ioloop
import tornado.wsgi
import tornado.web
import sys
import django.core.handlers.wsgi

sys.path.append('/path/to/project')

class HelloHandler(tornado.web.RequestHandler):
    def get(self):
        self.write('Hello from tornado')

def main():
    os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'  # path to your settings module
    wsgi_app = tornado.wsgi.WSGIContainer(django.core.handlers.wsgi.WSGIHandler())
    tornado_app = tornado.web.Application(
        [
            ('/hello-tornado', HelloHandler),
            ('.*', tornado.web.FallbackHandler, dict(fallback=wsgi_app)),
        ]
    )
    http_server = tornado.httpserver.HTTPServer(tornado_app)
    http_server.listen(8080)
    tornado.ioloop.IOLoop.instance().start()

if __name__ == "__main__":
    main()
So I run this Python script on my Webfaction server, and every time I try to access "http://mywebsite.com/hello-tornado/" it does not seem to work. I know I am running the Tornado web server on that port, but I do not know how to access it from the browser. What am I doing wrong here? Thanks for your help and patience. Will cyber high-five for every answer.
EDIT: What I am really trying to do is receive all the calls from the subscriptions that I make with the Instagram Real-Time Subscription API through Tornado. For that I have a callback URL "http://mysite.com/sub", and I want to be able to receive those calls through Tornado.
You are starting the server on port 8080, and web browsers use port 80 by default. Try using: http://mywebsite.com:8080/hello-tornado
If you want to use port 80 and you already have a web server running on the box, you can try following Ali-Akber Saifee's suggestion, or run the WSGI application directly from the server using something like mod_python (http://www.modpython.org). You will lose the ability to run Tornado code, but Django will work.
You have to create a custom app (listening on a port), note the port that is assigned to your app, then configure Tornado to serve on that port: http_server.listen(my_port)
You can also avoid Tornado and start directly by installing a Django app on Webfaction.
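To tie that to the question's script, a minimal sketch of serving only the Tornado part on the assigned port (the WEBFACTION_APP_PORT variable name and the fallback value are hypothetical, used purely for illustration):

import os
import tornado.httpserver
import tornado.ioloop
import tornado.web

class HelloHandler(tornado.web.RequestHandler):
    def get(self):
        self.write('Hello from tornado')

if __name__ == "__main__":
    # Use whatever port Webfaction assigned to the "custom app (listening on port)".
    assigned_port = int(os.environ.get("WEBFACTION_APP_PORT", "8080"))
    app = tornado.web.Application([('/hello-tornado', HelloHandler)])
    tornado.httpserver.HTTPServer(app).listen(assigned_port)
    tornado.ioloop.IOLoop.instance().start()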
Pyramid uses its own Waitress web server for development purposes, but I want to serve my WSGI app under Tornado. I think I should configure it using the pserve .ini files, but I can't get it to work.
The Pyramid application can be loaded from the INI files easily. From there you just pass the wsgi app into Tornado's WSGIContainer.
import tornado.wsgi
from pyramid.paster import get_app

app = get_app('development.ini')
container = tornado.wsgi.WSGIContainer(app)
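To actually serve it, a minimal sketch (assuming the development.ini above) wraps that container in Tornado's HTTP server, the same pattern as the docs example quoted below:

import tornado.httpserver
import tornado.ioloop
import tornado.wsgi
from pyramid.paster import get_app

# Load the Pyramid WSGI app from the INI file and hand it to Tornado.
app = get_app('development.ini')
container = tornado.wsgi.WSGIContainer(app)

http_server = tornado.httpserver.HTTPServer(container)
http_server.listen(8888)
tornado.ioloop.IOLoop.current().start()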
Again, not really recommending running WSGI under Tornado, since it gives you none of the advantages of Tornado.
Should you still want to do it for some reason, the second example of the docs seems to be what you are looking for: http://www.tornadoweb.org/documentation/wsgi.html
def simple_app(environ, start_response):
    status = "200 OK"
    response_headers = [("Content-type", "text/plain")]
    start_response(status, response_headers)
    return ["Hello world!\n"]

container = tornado.wsgi.WSGIContainer(simple_app)
http_server = tornado.httpserver.HTTPServer(container)
http_server.listen(8888)
tornado.ioloop.IOLoop.instance().start()