Very new to this.
I am trying to follow along with this tutorial.
The main difference in my setup is that I am not using a domain; I would rather just access the Flask site (something for personal use only) through the IP address.
This step:
Test the sample Flask app (check yourself for typos if you want):
uwsgi --socket 0.0.0.0:5000 --protocol=http -w wsgi
... works, and I can access the Flask app at ipaddress:5000.
In /etc/httpd/conf/httpd.conf, ServerName is configured with the IP address rather than flaskdemo.com.
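For reference, that line looks like this (the address below is just a placeholder, not my real IP):

# httpd.conf - ServerName set to the machine's IP instead of a domain name
ServerName 192.0.2.10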
The problem is that once I start the Apache service with
sudo systemctl start httpd
... I get a 503 at the IP address.
Thanks in advance for any support.
I am working on a FastAPI web application and using the Uvicorn ASGI server. Now I want to configure server stats in Uvicorn, but I have not found any reference for this.
For example, the way the uWSGI Stats Server provides stats:
uwsgi --socket :3031 --stats :1717 --module welcome
So my question is: does Uvicorn support a stats server mechanism? Or is there any other way to achieve this?
No, there is an open issue on uvicorn for this. See this comment for details: https://github.com/encode/uvicorn/issues/610#issuecomment-611987371
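As a rough workaround (this is just a sketch of an assumed approach, not a built-in uvicorn feature), you can collect a few counters inside the application itself, for example with a FastAPI middleware, and expose them on an endpoint:

# Hypothetical sketch: count requests in the app and expose them, since
# uvicorn itself has no stats server; endpoint and counter names are made up.
from fastapi import FastAPI, Request

app = FastAPI()
stats = {"requests_served": 0}

@app.middleware("http")
async def count_requests(request: Request, call_next):
    # Increment the counter for every request that passes through the app
    stats["requests_served"] += 1
    return await call_next(request)

@app.get("/stats")
async def read_stats():
    # Report the collected counters, loosely mirroring a stats server's output
    return stats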
I've started developing a new site using Django. For realistic testing I want to run it on a Synology DS212J NAS.
Following the official Synology guides I installed ipkg and with it the mod_wsgi package.
As the next step, following the standard tutorial, I created a virtualenv and installed Django in it, opened a new project, and adjusted the settings according to: https://www.digitalocean.com/community/tutorials/how-to-serve-django-applications-with-apache-and-mod_wsgi-on-ubuntu-16-04
I'm able to reach the Django "Hello World" site using manage.py.
As suggested, I want to swap manage.py's development server for the Apache server on the NAS. So I think I should edit the Apache config files to, e.g., define a virtual host...
However, I can't locate those files; it seems they were moved in DSM 6 (which I use) compared to what other guides describe.
Where do I need to enter the values from the tutorial on the Synology?
As I'm quite new to the topic: do I need to explicitly load the mod_wsgi module for Apache, and if so, where?
Is it a good idea to use mod_wsgi's basic (embedded) mode instead of daemon mode? I'm not sure which Django modules will be used later in development...
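For context, what I think I need to end up with, adapted from the tutorial above, is something like the sketch below (all paths are guesses/placeholders; the actual DSM 6 locations are exactly what I can't find):

# Sketch of the virtual host I believe I need (untested, paths are placeholders)
# Load mod_wsgi installed via ipkg - the real .so location on DSM 6 is what I'm missing
LoadModule wsgi_module /opt/libexec/mod_wsgi.so

<VirtualHost *:80>
    ServerName diskstation.local

    # Daemon mode, pointing at the virtualenv and the Django project
    WSGIDaemonProcess myproject python-home=/volume1/web/myproject/venv python-path=/volume1/web/myproject
    WSGIProcessGroup myproject
    WSGIScriptAlias / /volume1/web/myproject/myproject/wsgi.py

    <Directory /volume1/web/myproject/myproject>
        <Files wsgi.py>
            Require all granted
        </Files>
    </Directory>
</VirtualHost>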
Thanks for the support!
Activate the Python 3 package and Web Station.
In Web Station > General Settings, set the main HTTP (back-end) server to nginx.
In Control Panel > Network > DSM Settings, enable Custom Domain: "test"
(which allows us to access the NAS by entering test.local and simplifies the task later).
Enable the SSH connection in Control Panel > Terminal & SNMP.
We use Synology's DDNS service for external access, in our case "test.synology.me".
In Control Panel > Security > Certificate, we generate our SSL certificate with Let's Encrypt.
Connect to the NAS over SSH.
Take root rights: sudo -i
Install virtualenv: easy_install virtualenv
We set up our virtual environment: virtualenv -p python3 flasktest
Flask and gunicorn are installed:
pip install flask gunicorn
We create our web application, file: __init__.py
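A minimal sketch of what that file can contain (assuming, as the gunicorn command below does with "app:app", that it is importable as a module named app exposing a callable named app):

# __init__.py (or app.py) - minimal Flask application; the callable must be
# named "app" so the gunicorn invocation "app:app" below can find it
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from the Synology NAS!"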
We launch our web application with gunicorn:
gunicorn --certfile /usr/syno/etc/certificate/system/default/cert.pem --keyfile /usr/syno/etc/certificate/system/default/privkey.pem -b 127.0.0.1:5000 app:app
In /etc/nginx/sites-enabled we create a server configuration file; we will use nginx as a reverse proxy. In our case the file will be flasktest.conf.
flasktest.conf file:
server {
    listen 443 ssl http2;
    listen [::]:443 ssl http2;
    gzip on;
    server_name test.synology.me;

    location / {
        proxy_pass https://127.0.0.1:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    error_log /volume1/projects/flasktest/logs/error.log;
    access_log /volume1/projects/flasktest/logs/acess.log;
}
Open the port in Control Panel > External Access > Router Configuration > Create > Built-in application: enable the checkbox for Web Station and apply.
We check our server configuration file; for that we run: nginx -t
We restart nginx: synoservicecfg --restart nginx
You now have access to your Python web application from outside over HTTPS at https://test.synology.me
A little more information...
To keep your application accessible permanently (across reboots, crashes, etc.), you can create a script that restarts gunicorn; otherwise Web Station takes over. Also, since we did not modify the main configuration file /etc/nginx/nginx.conf, if you enter the NAS IP locally you will not see your Python web app but the default index.html page of Web Station.
example:
cd /volume1/projects/flasktest
source bin/activate
gunicorn --certfile /usr/syno/etc/certificate/system/default/cert.pem --keyfile /usr/syno/etc/certificate/system/default/privkey.pem -b 127.0.0.1:5000 app:app </dev/null 2>&1 &
This method works with other Python frameworks as well.
I'm getting started with WSGI, and so far, with a little help from some tutorials, I'm running some tests with Flask behind uWSGI, since Flask by itself is not a good option for production environments (it doesn't scale well and, by default, answers one request at a time - http://flask.pocoo.org/docs/0.12/deploying/), while uWSGI gives flexibility and more reliability by spawning workers and processes. Am I wrong?
Most of the tutorials I've seen so far describe setups with Nginx in front of uWSGI, but is that really necessary? What I'm trying to do is just provide a scalable way to deliver requests to my Flask application, something with more performance and scalability.
So I have this basic setup:
hello.py
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello World!"

if __name__ == "__main__":
    app.run(host='0.0.0.0', port=8080)
wsgi.py
from hello import app

if __name__ == "__main__":
    app.run()
Running uWSGI:
uwsgi --socket 0.0.0.0:8080 --plugin python --wsgi-file wsgi.py --callable app --master --processes 4 --threads 2 &
When I perform a curl against the loopback address, I receive an empty reply:
curl http://127.0.0.1:8080
invalid request block size: 21573 (max 4096)...skip
curl: (52) Empty reply from server
Forgive me, but I can't see what I'm missing. Could anyone more experienced with WSGI point out where this setup fails? Any help would be much appreciated.
Reference documents:
https://www.digitalocean.com/community/tutorials/how-to-serve-flask-applications-with-uwsgi-and-nginx-on-ubuntu-16-04
http://uwsgi-docs.readthedocs.io/en/latest/WSGIquickstart.html
Your option --socket should be used when you combine uwsgi with a web server (nginx, for example). Otherwise you should use --http, so
uwsgi --http 0.0.0.0:8080 --plugin python --wsgi-file wsgi.py --callable app --master --processes 4 --threads 2
will work.
production environments (it doesn't scale well and, by default, answers one request at a time - http://flask.pocoo.org/docs/0.12/deploying/), while uWSGI gives flexibility and more reliability by spawning workers and processes. Am I wrong?
You are right.
Most of the tutorials I've seen so far describe setups with Nginx in front of uWSGI, but is that really necessary? What I'm trying to do is just provide a scalable way to deliver requests to my Flask application, something with more performance and scalability.
Well, nginx is designed to sit in front, and having it there is much better than running only the application server (uwsgi). Specialisation is the key: let your application server focus on business processing and Python.
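For illustration, if you keep your --socket 0.0.0.0:8080 invocation unchanged, the nginx side is roughly this (a minimal sketch, ports assumed from your command, not a complete config):

server {
    listen 80;

    location / {
        # nginx speaks the uwsgi protocol to the socket uWSGI is listening on
        include uwsgi_params;
        uwsgi_pass 127.0.0.1:8080;
    }
}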
I just started to learn Falcon (http://falcon.readthedocs.org/en/latest/user/quickstart.html)
but it needs a WSGI server to run, and the docs suggest using uwsgi or gunicorn.
They have mentioned how to use it with gunicorn:
$ pip install gunicorn #install
$ gunicorn things:app #and run app through gunicorn.
But I want to run this sample app with uwsgi, and I have no clue how.
I have installed it with pip install uwsgi, and also gevent, as suggested here: http://falcon.readthedocs.org/en/latest/user/install.html
But what now? Could somebody guide me?
You'll probably find your answer on the uWSGI documentation site, specifically try this page:
http://uwsgi-docs.readthedocs.org/en/latest/WSGIquickstart.html
I've never used Falcon or uWSGI, but it looks like you can probably get away with:
uwsgi --wsgi-file things.py --callable app
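where things.py is the quickstart sample, roughly something like this (an untested sketch based on the Falcon quickstart linked in the question; older Falcon versions use falcon.API() instead of falcon.App()):

# things.py - minimal Falcon app exposing a single resource (sketch)
import falcon

class ThingsResource:
    def on_get(self, req, resp):
        # Respond to GET /things with a small JSON payload
        resp.media = {"message": "Hello from Falcon"}

app = falcon.App()  # falcon.API() on Falcon < 3
app.add_route("/things", ThingsResource())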
You can use a uwsgi.ini file to store the configuration and run it easily. It is also a good way to set up uwsgi as a service. Configuration with a virtualenv:
[uwsgi]
http = :8000
chdir = /home/user/www/uwsgi-ini
virtualenv = /home/user/www/uwsgi-ini/venv/
wsgi-file = main.py
callable = app
processes = 4
threads = 2
stats = 127.0.0.1:9191
and run in app folder:
uwsgi uwsgi.ini
The equivalent command with the virtualenv:
uwsgi --http :8000 --wsgi-file main.py --callable app -H $(pwd)/venv/
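For completeness, a minimal sketch of the main.py that the wsgi-file/callable settings above assume (any WSGI app exposing a callable named app works, e.g. the Falcon app from the question; the bare WSGI version below is just for illustration):

# main.py - minimal WSGI callable matching "wsgi-file = main.py" / "callable = app"
def app(environ, start_response):
    # Bare-bones WSGI app, just enough to prove the uwsgi.ini wiring works
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"served by uwsgi via uwsgi.ini\n"]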
Here's a data flow:
http <--> nginx <--> uWSGI <--> python webapp
I guess there's an HTTP-to-uwsgi translation in nginx, and a uwsgi-to-HTTP translation in uWSGI.
What if I want to directly call uWSGI to test an API in a webapp?
Actually, I'm using Pyramid. I just configure [uwsgi] in the .ini and run uWSGI. But I want to test whether uWSGI serves the web app normally, and the uWSGI socket is not directly reachable over HTTP.
Try using uwsgi_curl:
$ pip install uwsgi-tools
$ uwsgi_curl 10.0.0.1:3030 /path
or, if you need to make more requests, try uwsgi_proxy from the same package:
$ uwsgi_proxy 10.0.0.1:3030
Proxying remote uWSGI server 10.0.0.1:3030 "" to local HTTP server 127.0.0.1:3030...
so you can browse it locally at http://127.0.0.1:3030/.
If your application only allows a certain Host header, you can specify the host name as well:
$ uwsgi_curl 10.0.0.1:3030 host.name/path
$ uwsgi_proxy 10.0.0.1:3030 -n host.name
If the application has static files, you can redirect such requests to your front server using the -s argument. You can also specify a different local port if needed.
From your question I'm assuming you want to run your WSGI-compliant app directly with uWSGI and open an HTTP socket. You can do so by configuring your uwsgi.ini (or whatever the filename is) with
http=127.0.0.1:8080
uWSGI will now open an HTTP socket that listens on port 8080 for incoming connections from localhost (see the documentation: http://uwsgi-docs.readthedocs.org/en/latest/HTTP.html).
Alternatively, you can start your process directly from the command line with the http parameter:
$ uwsgi --http=127.0.0.1:8080 --module=yourapp:wsgi_entry_point
If you use Unix sockets to configure uwsgi, nginx is able to communicate with that socket via the uwsgi protocol (http://uwsgi-docs.readthedocs.org/en/latest/Protocol.html).
Keep in mind that if you usually serve static content (CSS, JavaScript, images) through nginx, you will need to set that up too when running uwsgi directly. But if you only want to test a REST API, this should work for you.
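As an illustration of the Unix-socket variant (a sketch with made-up paths on both sides):

# uwsgi.ini: expose a uwsgi-protocol Unix socket instead of an HTTP port
socket = /tmp/myapp.sock
chmod-socket = 664

# nginx: forward requests to that socket using the uwsgi protocol
location / {
    include uwsgi_params;
    uwsgi_pass unix:/tmp/myapp.sock;
}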
First, consider these questions:
On which port is uWSGI running?
Is uWSGI running on your machine or on a remote one?
If it's running on a remote machine, is the port accessible from your computer? (iptables rules might forbid external access)
If you've made sure you have access, you can just call http://hostname:port/path/to/uWSGI for direct API access.
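For example (host, port and path below are placeholders, and this assumes uWSGI exposes an HTTP listener rather than only a uwsgi-protocol socket):

curl http://192.0.2.10:8080/api/things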
I know this is an old question, but I just needed this and found that this Docker + nginx solution works best for me:
cat > /tmp/nginx.conf << EOF
events {}
http {
    server {
        listen 8000;

        location / {
            include uwsgi_params;
            uwsgi_pass 127.0.0.1:3031;
        }
    }
}
EOF
docker run -it --network=host --rm --name uswgi-proxy -v /tmp/nginx.conf:/etc/nginx/nginx.conf:ro nginx