How to run Python in the backend of a Flask web app?

I'm building a web app using Flask. I've made a completely separate Python program and I want to run it in the backend of the web app. It is basically a program that takes inputs, makes some calculations about scheduling, and sends the user an email.
I don't know where or how to implement it into the web app.

Normally one would implement something like this with a message queue that hands the job off to a backend worker, or with a database that persists these calculation jobs for another backend program to pick up.
If you want to stay in Python, a common companion to Flask for this is Celery, a distributed task queue.
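As a rough illustration (not part of the original answer), a minimal Flask + Celery setup could look like the sketch below. It assumes a Redis broker at its default URL, and compute_schedule() / send_email() are hypothetical placeholders for your existing program:

from celery import Celery
from flask import Flask, request, jsonify

app = Flask(__name__)
celery = Celery(app.name, broker="redis://localhost:6379/0")

@celery.task
def schedule_and_notify(user_email, form_data):
    # Placeholders for your existing calculation and email code.
    result = compute_schedule(form_data)   # hypothetical helper
    send_email(user_email, result)         # hypothetical helper

@app.route("/schedule", methods=["POST"])
def schedule():
    payload = request.get_json()
    # Queue the work and return immediately instead of blocking the request.
    schedule_and_notify.delay(payload["email"], payload["data"])
    return jsonify({"status": "queued"}), 202

A separate Celery worker process then picks the job up outside the request/response cycle.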

Related

How to trigger python script with Hasura event

I'm currently building a self-hosted Vue.js web app (with account logins).
The web app needs to be a GUI for a Python web scraper, where my user has control over the Python scraper.
So for example, the user fills in an endpoint, starts the scraper, views results, triggers a new, more in-depth scrape, etc.
I have the Python scripts for scraping.
And I have decided to go with Vue.js + AWS Cognito + Hasura for the frontend.
I have difficulty understanding how to trigger the Python scripts, dump the results into the database, and show them to the frontend.
I do like the 3 factor approach:
The data from my scrapers can be many DB entries, so I'd rather not insert them into the database via mutations.
Do I have to make Flask endpoints to let Hasura trigger these webhooks?
I'm not familiar with serverless.
How do I make my Python scraper scripts serverless?
Or can I just use SQLalchemy to dump the scraper results into the database?
But how do I notify my frontend user that the data is ready?
There are a lot of questions in this one and the answer(s) will be somewhat opinionated so it might not be the greatest fit for StackOverflow.
That being said, after reading through your post I'd recommend that for your first attempt at this you use SQLalchemy to store the results of your scraper jobs directly into the Database. It sounds like you have the most familiarity with this approach.
With Hasura, you can simply have a subscription to the results of the job that you query in your front end so the UI will automatically update on the Vue side as soon as the results become available.
You'll have to decide how you want to kick off the scraping jobs; you have a few different options:
Expose an API endpoint from your Python app and let the UI trigger it
Use Hasura Actions
Build a GQL server in Python and attach it to Hasura using Remote Schemas
Allow your app to put a record into the database using a GraphQL mutation that includes information about the scrape job, and then let Hasura trigger a webhook endpoint in your Python app using Hasura Event Triggers (a sketch of this option appears below)
Hasura doesn't care how the data gets into the database; it provides a ton of functionality and value even if you're using a different database access layer in another part of your stack.
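As a non-authoritative sketch of the last option, a Hasura Event Trigger on an insert into a jobs table could POST to a small Flask webhook like the one below. The route name and the column names (id, endpoint) are made up for illustration, and run_scraper() stands in for your existing scraping code:

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/hasura/scrape-job", methods=["POST"])
def scrape_job_webhook():
    payload = request.get_json()
    # Hasura event trigger payloads carry the inserted row under event.data.new.
    new_row = payload["event"]["data"]["new"]
    # run_scraper() is a placeholder; it could write its results back
    # with SQLalchemy as suggested above.
    run_scraper(job_id=new_row["id"], endpoint=new_row["endpoint"])
    return jsonify({"ok": True})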

Run a perl script from within Flask Web Application

I have a Flask Web Application that is periodically receiving JSON information from another application via HTTP POST.
My Flask Web Application is running on a CentOS 7 Server with Python 2.7.X.
I am able to parse the fields from this received JSON in the Flask Web Application and get some of the information that interests me. For example: I get some JSON input and extract an "ID":"7" field from it.
What I want to do now is run a perl script from within this Flask Web Application by using this "ID":"7".
Running 'perl my_perl_script.pl 7' manually on the command line works fine. What I want is for the Flask Web Application to perform this automatically whenever it receives an HTTP POST, by using the specific ID number found in this POST.
How can I do that in Flask?
Is it a good idea to do it with a subprocess call or should I consider implementing queues with Celery/rq? Or maybe some other solution?
I think the perl script should be invoked as a separate Linux process, independent of the Flask Web Application.
Thank you in advance :)
Sub,
I vote yes on subprocess; here's a post on SO about it. Control remains with Flask that way. An alternative might be to code a Perl script that watches for a trigger event, depending on your needs, but that would put more of the process control on the Perl side of things and make less efficient use of resources.
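For illustration only, a minimal sketch of the subprocess approach might look like this. The route name is a placeholder, the JSON field is taken from the question, and Popen is used so the Perl script runs as a separate Linux process without blocking the request:

import subprocess
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/receive", methods=["POST"])   # placeholder route for the incoming POST
def receive():
    data = request.get_json()
    record_id = str(data["ID"])            # e.g. "7"
    # Popen returns immediately; use subprocess.call(...) instead if you
    # need to wait for the Perl script to finish.
    subprocess.Popen(["perl", "my_perl_script.pl", record_id])
    return jsonify({"status": "started", "id": record_id})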

Python web service with React/Node Application

I have the bulk of my web application in React (front-end) and Node (server), and am trying to use Python for certain computations. My intent is to send data from my Node application to a Python web service in JSON format, do the calculations in my Python web service, and send the data back to my Node application.
Flask looks like a good option, but I do not intend to have any front-end usage for my Python web service. Would appreciate any thoughts on how to do this.
In terms of thoughts:
1) You can build a REST interface to your Python code using Flask and make REST calls to it from your Node.js server (a minimal sketch follows below).
2) You have to decide whether your client will wait synchronously for the result. If the computation takes a relatively long time, you can use a webhook as a callback for the result.
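A bare-bones sketch of option 1, assuming the Node server POSTs JSON to the Flask service; compute() is a placeholder for your actual calculation:

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/api/compute", methods=["POST"])
def compute_endpoint():
    data = request.get_json()
    result = compute(data)      # placeholder for your computation code
    return jsonify({"result": result})

if __name__ == "__main__":
    # Node would POST to http://localhost:5000/api/compute
    app.run(port=5000)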

Dynamically updating a web interface from a Python daemon

I'll briefly explain what I'm trying to achieve: We have a lot of servers behind ipvsadm VIPs (LVS load balancing) and we regularly move servers in/out of VIPs manually. To reduce risk (junior ops make mistakes...) I'd like to abstract it to a web interface.
I have a Python daemon which repeatedly runs "ipvsadm -l" to get a list of servers and statistics, then creates JSON from this output. What I'd now like to do is serve this JSON and have a web interface that can pass commands. For example, selecting a server in a web UI and pressing remove triggers an ipvsadm -d <server>... command. I'd also like the web UI to update every 10 seconds or so with the statistics from the list command.
My current Python daemon just outputs to a file. Should I somehow have this daemon also be a web server and serve its file and accept POST requests with command identifiers/arguments? Or a second daemon for the web UI? My only front end experience is with basic Bootstrap and jQuery usually backed by Laravel, so I'm not sure if there's a better way of doing this with sockets and some fancy JS modern-ism.
If there is a more appropriate place for this post, please move it if possible or let me know where to re-post.
You don't need a fancy JS application. To take the path of least resistance, I would create a small extra application; if you like Python, I recommend Flask for this job. If you prefer PHP, then how about Slim?
In your web application, if you want to make it fast and easy, you can even implement an AJAX mechanism that fetches results on an interval, refreshing the servers' data every 10 seconds. It will fetch the JSON served by the independent, already-existing daemon.
Running the commands clicked in the web UI can also be handled by your web application.
The web application is something extra, and I find it nice to keep it separate from the daemon that fetches data about the servers and saves it as JSON. You can take the page down at any time, and all the statistics will still be collected and available to console users in JSON format.
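A rough sketch of that extra Flask application, under the assumption that the daemon writes its JSON to a known path and that removal maps to an "ipvsadm -d -t <vip> -r <server>" call (both the path and the exact arguments are illustrative guesses):

import json
import subprocess
from flask import Flask, jsonify, request

app = Flask(__name__)
STATS_FILE = "/var/run/ipvs_stats.json"    # wherever your daemon writes its output

@app.route("/api/stats")
def stats():
    # The front end polls this endpoint every ~10 seconds via AJAX.
    with open(STATS_FILE) as f:
        return jsonify(json.load(f))

@app.route("/api/remove", methods=["POST"])
def remove_server():
    body = request.get_json()
    vip, server = body["vip"], body["server"]
    # Roughly equivalent to running "ipvsadm -d -t <vip> -r <server>" by hand.
    subprocess.check_call(["ipvsadm", "-d", "-t", vip, "-r", server])
    return jsonify({"removed": server})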

choosing an application framework to handle offline analysis with web requests

I am trying to design a web based app at the moment, that involves requests being made by users to trigger analysis of their previously entered data. The background analysis could be done on the same machine as the web server or be run on remote machines, and should not significantly impede the performance of the website, so that other users can also make analysis requests while the background analysis is being done. The requests should go into some form of queueing system, and once an analysis is finished, the results should be returned and viewable by the user in their account.
Please could someone advise me of the most efficient framework to handle this project? I am currently working on Linux, the analysis software is written in Python, and I have previously designed dynamic sites using Django. Is there something compatible with this that could work?
Given your background and the analysis code already being written in Python, Django + Celery seems like an obvious candidate here. We're currently using this solution for a very processing-heavy app with one front-end Django server, one dedicated database server, and two distinct Celery servers for the background processing. Having the Celery processes on distinct servers keeps the Django front end responsive whatever the load on the Celery servers (and we can add new Celery servers if required).
So well, I don't know if it's "the most efficient" solution but it does work.
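For what it's worth, a minimal sketch of that split could look like the following; run_analysis() and AnalysisResult are hypothetical stand-ins for the analysis code and a results model:

from celery import shared_task
from django.http import JsonResponse

@shared_task
def analyse_user_data(user_id):
    # Heavy lifting runs on a Celery worker, possibly on another machine.
    result = run_analysis(user_id)                     # placeholder analysis code
    AnalysisResult.objects.create(user_id=user_id, data=result)

def request_analysis(request):
    # The view only queues the job, so the site stays responsive for other users.
    analyse_user_data.delay(request.user.id)
    return JsonResponse({"status": "queued"})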
