I am creating a RESTful API using Flask. I have some data operations that need to run before the server starts, so that calling the API won't load the data again and again.
However, the data is also updated by a cronjob. Since the data is loaded into a variable at startup, that variable remains stale for as long as the Flask app runs.
I am aware that the Flask app reloads on code changes, but is there a way to make it reload periodically?
One possible, though maybe not the best, solution could be:
run your Flask app via Supervisor (http://supervisord.org/)
after the cronjob finishes, kill your Flask app
Supervisor will automatically restart your Flask app
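A minimal supervisord program entry for this might look like the sketch below (the program name and paths are illustrative, not from the original answer); your cronjob can then finish with supervisorctl restart flaskapp instead of killing the process by hand.

; /etc/supervisor/conf.d/flaskapp.conf -- illustrative paths
[program:flaskapp]
command=python /srv/flaskapp/app.py
directory=/srv/flaskapp
autostart=true
autorestart=true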
I'm building a web app using flask. I've made a completely separate python program and I want to run it in the backend of the web app. It is basically a program that takes in inputs and makes some calculations about scheduling and sends the user an email.
I don't know where or how to implement it into the web app.
Normally one would implement this with a message queue that passes jobs to a backend worker, or with a database that persists these calculation jobs for another backend program to pick up.
If you want to stay with Python, there is a popular task queue that integrates well with Flask, called Celery.
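As a rough sketch of how that might look with Celery, assuming a Redis broker on localhost (the task name and arguments are illustrative):

# tasks.py -- a minimal Celery sketch; assumes a Redis broker on localhost
from celery import Celery

celery_app = Celery('tasks', broker='redis://localhost:6379/0')

@celery_app.task
def schedule_and_email(inputs):
    # do the scheduling calculations and send the user an email here
    ...

A Flask view would then only enqueue the job with schedule_and_email.delay(inputs) and return immediately, while a separate Celery worker process does the slow work.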
I use an API to synchronise a lot of data to my app every night. The API itself uses a callback system, so I send a request to their API (including a webhook URL) and when the data is ready they will send a notification to my webhook to tell me to query the API again for the data.
The problem is, these callbacks flood in at a high rate (thousands per minute) to my webhook, which puts an awful lot of strain on my Flask web app (hosted on a Heroku dyno) and causes errors for end users. The webhook itself has been reduced down to forwarding the message on to a RabbitMQ queue (running on separate dynos), which then works through them progressively at its own pace. Unfortunately, this does not seem to be enough.
Is there something else I can do to manage this load? Is it possible to run a particular URL (or set of URLs) on a separate dyno from the public facing parts of the app? i.e. having two web dynos?
Thanks
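For reference, a forwarding-only webhook like the one described might be sketched like this with the pika client (the broker address and queue name are placeholders):

import pika
from flask import Flask, request

app = Flask(__name__)

@app.route('/webhook', methods=['POST'])
def webhook():
    # hand the raw payload straight to RabbitMQ and return at once
    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue='callbacks', durable=True)
    channel.basic_publish(exchange='', routing_key='callbacks', body=request.get_data())
    connection.close()
    return '', 202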
You can deploy your application, with the SAME code, on more than one dyno using the free tier. For example, your application is called rob1 and hosted at https://rob1.herokuapp.com, with source code accessible from https://git.heroku.com/rob1.git. You can create an application rob2, accessible from https://rob2.herokuapp.com and with source code hosted at https://git.heroku.com/rob2.git.
Then you can push your code to the 2nd application:
$ cd projects/rob1
$ git remote add heroku2 https://git.heroku.com/rob2.git
$ git push heroku2 master
As a result, you have a single repo on your machine, and 2 identical Heroku applications running your project's code. You will probably need to copy the environment parameters of the 1st app to the 2nd one.
But either way, you'll have 2 identical apps on the free tier.
Later, once you have obtained a domain name, for example robsapp.example.org, you can give it two CNAME DNS records pointing to your Heroku apps to get load balancing, like this:
rob1.herokuapp.com
rob2.herokuapp.com
As a result, your application's webhooks are available on robsapp.example.org, and DNS automatically balances requests between the rob1 and rob2 apps.
I have a simple Python script that gets the local weather forecast and sends an email with the data.
I want to run this script daily. I found out that cron is used for this purpose, but online cron jobs require a URL.
I wanted to ask how to host my Python scripts so that they run online through a URL, if that is possible...
I would recommend using Heroku with a Python buildpack as a starting point. Using the Flask library, you can very minimally start a web app and expose an endpoint online, which can then be queried by your cron service. Heroku also provides a free account, which should ideally fit your needs.
As a peek into how easy it is to set up Flask, well..
from flask import Flask

app = Flask(__name__)

@app.route('/cron-me')
def cron_me():
    call_my_function_here()  # replace with the function your script already has
    return 'Success'
.. and you're done ¯\_(ツ)_/¯
Try setting up a simple Flask app at www.pythonanywhere.com; I think it will do the job for you even with the free account.
EDIT: And for sending e-mails, you can use Mailgun; with the free version you can send e-mails to a small number of addresses that need to be validated on the recipient side (to avoid people using it for spam :-))
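A minimal sketch of sending mail through Mailgun's HTTP API with the requests library (the domain, API key, and addresses are placeholders you get from your own Mailgun dashboard):

import requests

def send_forecast(api_key, domain, recipient, forecast_text):
    # POST one message to Mailgun's v3 messages endpoint
    return requests.post(
        f'https://api.mailgun.net/v3/{domain}/messages',
        auth=('api', api_key),
        data={
            'from': f'weather@{domain}',
            'to': [recipient],
            'subject': 'Daily weather forecast',
            'text': forecast_text,
        },
    )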
I've been building a little system to monitor a Flask app and other things (a Postgres database, a Linux server, etc.) with Prometheus. Everything is going well, but I would like to monitor my Flask app without modifying the code.
For example, to monitor methods of my app I did:
from flask import Flask, render_template
from prometheus_client import Summary

app = Flask(__name__)

# Create a metric to track time spent and requests made.
REQUEST_TIME = Summary('request_processing_seconds', 'Time spent processing request')

@app.route('/')
@REQUEST_TIME.time()
def index():
    myUser = User.query.all()  # User is the app's own model
    return render_template('add_user.html', myUser=myUser)
I used this Python library.
I also used another library to monitor a Flask app:
monitor(app, port=9999)
Unfortunately, both require modifying my code. I want to monitor my Flask app without modifying its code. Is that possible?
It really isn't clear what you're asking. However, if you just need info about which requests are called and how long they take, you could run your Flask app with New Relic: they offer a free tier (yay!) and using it you will gain lots of insight into your app. However, to use it you'll need to launch your app through their client (typically their newrelic-admin wrapper), but no code changes are required.
more info on what you'll get can be found here:
https://newrelic.com/python/flask
I'm new to this, and I misunderstand how Gunicorn + Flask works.
When I run Gunicorn with 4 workers, does it create 4 instances of my Flask app, or does it create 4 processes that handle web requests from Nginx and a single instance of the Flask app?
If I implement a simple in-memory cache (a dictionary, for example) in my app, will Gunicorn create more than one instance of the app, and therefore more than one instance of the cache?
It will create 4 Gunicorn workers to handle the one Flask app. If you spin up 4 instances of a Flask app (with Docker, for example), you will need to run Gunicorn 4 times. Then, to handle all those Flask instances, you will need an Nginx server in front of them acting as a load balancer.
For example, if one user is going through a registration routine that takes a lot of time due to multiple queries to the database, you still have other workers available to send requests to the Flask instance.
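On the cache question: each worker is a separate OS process with its own copy of any module-level state, so a dictionary cache like the one you describe ends up duplicated once per worker. A small sketch to see this, assuming you run it with gunicorn -w 4 app:app and refresh a few times:

# app.py -- each Gunicorn worker process gets its own copy of this dict
import os

from flask import Flask

app = Flask(__name__)
cache = {'hits': 0}  # one copy of this exists per worker process

@app.route('/')
def index():
    cache['hits'] += 1
    # different workers will report different PIDs and hit counts
    return f"pid={os.getpid()} hits={cache['hits']}"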
I get your point, but Flask's built-in server is not meant for production; WSGI is the standard, and Gunicorn plays the production WSGI server role so you get more reliability instead of using the development Werkzeug server that comes with Flask. In other words, Gunicorn is just a wrapper around your Flask object. It just handles the requests and lets Flask do its thing.