I have a simple Python script that gets the local weather forecast and sends an email with the data.
I want to run this script daily. I found out that cron is used for this purpose, but online cron jobs require a URL.
I wanted to ask how to host my Python script so that it runs online through a URL, if that is possible...
I would recommend using Heroku with a Python buildpack as a starting point. Using the Flask library, you can very minimally start a web container and expose an endpoint online, which can then be queried from your cron service. Heroku also provides a free account, which should ideally fit your needs.
As a peek into how easy it is to set up Flask:
from flask import Flask

app = Flask(__name__)

@app.route('/cron-me')
def cron_me():
    call_my_function_here()  # your weather/e-mail logic
    return 'Success'
.. and you're done ¯\_(ツ)_/¯
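Once it's deployed, your online cron service just needs to request that endpoint on a schedule; assuming a hypothetical app name, that would be something like:
$ curl https://your-app.herokuapp.com/cron-me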
Try setting up a simple Flask app at www.pythonanywhere.com; I think it will do the job for you even with the free account.
EDIT: And for sending e-mails, you can use Mailgun. With the free version you can send e-mails to a small number of addresses that need to be validated on the recipient side (to avoid people using it for spam :-))
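As a rough sketch of what the Mailgun call could look like with the requests library (the domain, API key, and addresses below are placeholders you'd replace with values from your Mailgun dashboard):

import requests

def send_forecast_email(forecast_text):
    # Placeholder domain and API key -- take the real values from your Mailgun account
    return requests.post(
        'https://api.mailgun.net/v3/YOUR_DOMAIN_NAME/messages',
        auth=('api', 'YOUR_API_KEY'),
        data={
            'from': 'Weather Bot <bot@YOUR_DOMAIN_NAME>',
            'to': ['you@example.com'],  # must be a validated recipient on the free plan
            'subject': 'Daily weather forecast',
            'text': forecast_text,
        },
    )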
I use an API to synchronise a lot of data to my app every night. The API itself uses a callback system, so I send a request to their API (including a webhook URL) and when the data is ready they will send a notification to my webhook to tell me to query the API again for the data.
The problem is, these callbacks flood in at a high rate (thousands per minute) to my webhook, which puts an awful lot of strain on my Flask web app (hosted on a Heroku dyno) and causes errors for end users. The webhook itself has been reduced down to forwarding the message on to a RabbitMQ queue (running on separate dynos), which then works through them progressively at its own pace. Unfortunately, this does not seem to be enough.
Is there something else I can do to manage this load? Is it possible to run a particular URL (or set of URLs) on a separate dyno from the public facing parts of the app? i.e. having two web dynos?
Thanks
You can deploy your application with the SAME code on more than one dyno using the free tier. For example, your application is called rob1 and hosted at https://rob1.herokuapp.com, with source code accessible from https://git.heroku.com/rob1.git. You can create an application rob2, accessible from https://rob2.herokuapp.com, with source code hosted at https://git.heroku.com/rob2.git.
Then you can push your code to the 2nd application:
$ cd projects/rob1
$ git remote add heroku2 https://git.heroku.com/rob2.git
$ git push heroku2 master
As a result, you have a single repo on your machine and two identical Heroku applications running your project's code. You'll probably need to copy the environment parameters of the 1st app to the 2nd one.
But either way, you'll have two identical apps on the free tier.
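Copying the config can be done with the Heroku CLI; a rough sketch (KEY/value are placeholders for whatever the first command lists):
$ heroku config --app rob1                # show the 1st app's config vars
$ heroku config:set KEY=value --app rob2  # set each one on the 2nd app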
Later, if you have obtained a domain name, for example robsapp.example.org, you can give it two CNAME DNS records pointing to your Heroku apps to get load balancing, like this:
rob1.herokuapp.com
rob2.herokuapp.com
As a result, your application's webhooks are available on robsapp.example.org, and requests are automatically load balanced between the rob1 and rob2 apps.
I have a Flask Web Application that is periodically receiving JSON information from another application via HTTP POST.
My Flask Web Application is running on a CentOS 7 Server with Python 2.7.X.
I am able to parse the fields from this received JSON in the Flask Web Application and get some of the information that interests me. For example: I get some JSON input and extract an "ID":"7" field from it.
What I want to do now is run a perl script from within this Flask Web Application by using this "ID":"7".
Running 'perl my_perl_script.pl 7' manually on the command line works fine. What I want is for the Flask Web Application to perform this automatically whenever it receives an HTTP POST, by using the specific ID number found in this POST.
How can I do that in Flask?
Is it a good idea to do it with a subprocess call or should I consider implementing queues with Celery/rq? Or maybe some other solution?
I think the perl script should be invoked as a separate Linux process, independent of the Flask Web Application.
Thank you in advance :)
Sub,
I vote yes on subprocess; here's a post on SO about it. Control remains with Flask that way. An alternative might be to code a Perl script that watches for a trigger event, depending on your needs, but that would put more of the process control on the Perl side of things and make less efficient use of resources.
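A minimal sketch of the subprocess approach, assuming the POST body is JSON with an "ID" field as described (the route name is illustrative):

import subprocess
from flask import Flask, request

app = Flask(__name__)

@app.route('/receive', methods=['POST'])  # illustrative route name
def receive():
    data = request.get_json()
    user_id = data['ID']  # e.g. "7"
    # Popen launches the Perl script as a separate Linux process and returns
    # immediately, so the request handler never blocks on the script.
    subprocess.Popen(['perl', 'my_perl_script.pl', str(user_id)])
    return 'OK'

If you ever need the script's exit status, subprocess.check_call would do the same but block until the script finishes.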
I have a JSON containing users' information:
[
    {
        "id": 1,
        "password": "abc",
        "email": "abc@def.com"
    },
    ...
]
and a function:
foo(user_id)
I would like to write a Python script to run on my GCE server, which can send emails to all of my users containing a link each.
When e.g. user with id = 2 clicks his/her link, my server will run
foo(2)
What is the simplest way of generating the URLs and making them trigger the function on my server?
I don't know how to use HTTP but would love to take this opportunity to learn; I just have no idea where to start, as most Python HTTP tutorials I found dealt with the client side.
Rather than using a GCE server, I recommend looking at Google App Engine, since Google will manage a lot more for you. This will let you focus on just the server code, rather than the server configuration.
Check out this tutorial that shows how to deploy a simple python app to GCP and do something similar to what you're asking about.
If you must use GCE, you can set up a simple HTTP server in Python with Flask or Bottle to run your code.
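A minimal Flask sketch of the idea (the route and host are illustrative; foo is your existing function):

from flask import Flask

app = Flask(__name__)

def foo(user_id):
    ...  # your existing function goes here

@app.route('/run/<int:user_id>')  # illustrative route
def run_foo(user_id):
    foo(user_id)
    return 'Done'

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8080)

The link you e-mail to user 2 would then look something like http://your-server-ip:8080/run/2 (hypothetical host), and clicking it calls foo(2) on the server. Note that anyone who guesses a URL can trigger the function, so in practice you'd want a random per-user token in the link rather than the raw id.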
We have created an app for a production facility that is very simple, using Django and Python. But throughout prototyping we used the runserver command and localhost. The problem is this: we want to deploy the app without using localhost and the command line every time, since the people using it won't be able to do that. It will be implemented on one computer, so it shouldn't be that challenging. The app pulls data from one database and stores data in another. It would be nice to have our own URL. Do we need to do it through WSGI? Apache? I know the problem is simple, but there seem to be so many ways to deploy, and many of them are overcomplicated for our needs.
Follow-up question: I read that just using localhost isn't the best for this type of thing. Is this true?
Any help would be great
It sounds like you want to deploy the app live. So I'd recommend using a dynamic hosting service like AWS/Azure/Firebase etc. If you want your own URL, purchase a domain, and in the configuration for the domain set up a CNAME record as well so you can redirect your domain to the live instance on the cloud.
Localhost is better used for testing and making changes without affecting the client; you then deploy/push to the cloud instance for production.
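Since you asked about WSGI: whichever host you pick, the usual replacement for runserver in production is a WSGI server such as Gunicorn pointed at your project's wsgi.py (myproject below is a placeholder for your actual project name):
$ pip install gunicorn
$ gunicorn myproject.wsgi --bind 0.0.0.0:8000
Apache with mod_wsgi is the equivalent route if you'd rather serve it through Apache.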
I've been making a little system to monitor a Flask app and other things (a Postgres database, a Linux server, etc.) with Prometheus. Everything is going well, but I would like to monitor my Flask app without modifying its code.
For example, to monitor methods of my app I did:
from flask import Flask, render_template
from prometheus_client import Summary

app = Flask(__name__)

# Create a metric to track time spent and requests made.
REQUEST_TIME = Summary('request_processing_seconds', 'Time spent processing request')

@app.route('/')
@REQUEST_TIME.time()
def index():
    myUser = User.query.all()  # User is the app's existing model
    return render_template('add_user.html', myUser=myUser)
I used this python library.
Also, I used another library to monitor the Flask app:
monitor(app, port=9999)
Unfortunately, both approaches modify my code. I want to monitor my Flask app without modifying its code. Is that possible?
It really isn't clear what you're asking. However, if you just need info about which requests are called and how long they take, you could run your Flask app with New Relic: they offer a free tier (yay!) and using it you will gain lots of insight into your app. However, to use it you'll need to run your app with their client (but no code changes are required).
more info on what you'll get can be found here:
https://newrelic.com/python/flask
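For reference, running under their agent without touching your code looks roughly like this (app.py and the config file name are assumptions based on their standard setup):
$ pip install newrelic
$ newrelic-admin generate-config YOUR_LICENSE_KEY newrelic.ini
$ NEW_RELIC_CONFIG_FILE=newrelic.ini newrelic-admin run-program python app.py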