I've created a page that allows a user to upload an Excel file, which is then parsed into its columns, and then the rows are inserted into the database 500 rows at a time.
This isn't a terribly long process - between 25 and 90 seconds - but it's long enough that I would like to give the user some feedback to let them know it's actually still working, in the form of status messages and/or progress bars.
My app is written in Flask like this:
app.py
from flask import Flask, render_template, request
from tqdm import tqdm
import pandas

app = Flask(__name__)

@app.route('/', methods=['GET', 'POST'])
def fun():
    if request.method == 'GET':
        return render_template('index.html')
    else:
        filename = request.form['filename']
        print('opening file')  # change from console print to webpage
        df = pandas.read_excel(filename)
        print('File read.. processing data\n')  # change from console to webpage
        processData()
        print('connecting to db....\n')  # change to webpage print
        db.connect()
        print('connected to db! inserting rows')  # change to webpage print
        bulk_inserts = rows // 500  # integer division: number of 500-row batches
        for i in tqdm(range(bulk_inserts)):  # wrapping tqdm around range makes a progress bar
            insert500rows()
        db.commit()
        db.disconnect()
        return 'Complete. ' + str(rows) + ' inserted.'  # this gets sent as a post to the page

app.run()
I know you can only send one response to a POST request, but how can I give the user the status of the process if I can only send one response? Maybe I'm going about this the wrong way, but I think this is a pretty common use case. How else should I set this up if this way won't work?
For some reason this was marked as a duplicate of this question. That question asks how to print a continuous stream of values to the screen. Here I am asking how to send a message at certain points of execution. I think the comments provided about Flask-SocketIO suggest a different approach to a different problem.
The "one response to one request" is a matter of how HTTP protocol works: the client sends a query and some data (the POST request), and the server responds with some other data (your "one response"). While you could technically get the server to send back pieces of the response in chunks, that is not how it works in practice; for one, browsers don't handle that too well.
You need to do this a different way. For instance, create a "side channel" with SocketIO, as the commenters helpfully suggest. Then you can send updates to the client through this side channel - instead of your prints, you would use socketio.emit.
On the client side, you would first subscribe to a SocketIO channel when the page loads. Then you would submit the file through an AJAX call (or in a separate iframe), and keep the SocketIO connection open on the page to display the updates.
This way the POST request is separated from your "page loading". The JavaScript on the page remains active and can read and display progress updates, and the upload (with the associated wait time) happens in the background.
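A minimal sketch of the server side of that idea with Flask-SocketIO (the flask_socketio package is assumed; the /upload route, the 'status' event name and the report() helper are illustrative, not part of the original code):

from flask import Flask, request
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app)

def report(message):
    # push a status line to every connected client over the SocketIO channel
    socketio.emit('status', {'text': message})

@app.route('/upload', methods=['POST'])
def upload():
    report('opening file')
    # ... read the Excel file, process it, insert rows, calling report() at each step ...
    report('inserting rows')
    return 'Complete.'

if __name__ == '__main__':
    socketio.run(app)

The page's JavaScript subscribes to the 'status' event and appends each message to the page, while the upload itself goes out as an AJAX POST.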
I would also do it like @matejcik explained in their answer, but there is also another way. Websockets push the data back to the browser when there is an update; there is also the pull method.
You can send queries to the server periodically and the server will give you the update. You still have to use AJAX to send requests, and JavaScript's setTimeout function to wait between queries, but what you do is basically refresh the page without showing it to the user. It is easier for beginners to understand, since the technology used is still a plain GET. Instead of printing your new log lines, you add them to a string (or an array), and when the GET request comes in you return this array; the client clears its text output and writes out the new array, with both the old and new info.
This method is far less efficient than websockets, but for prototyping it can be faster.
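A rough sketch of that polling setup, assuming a module-level status_messages list and a /progress route (both names are made up for the example):

from flask import Flask, jsonify

app = Flask(__name__)
status_messages = []  # the long-running upload handler appends to this

def report(message):
    # the upload code calls this instead of print()
    status_messages.append(message)

@app.route('/progress')
def progress():
    # the page polls this with AJAX (e.g. every second via setTimeout)
    # and redraws its status area with whatever is in the list
    return jsonify(messages=status_messages)

The upload itself still has to run somewhere that doesn't block these polling requests, for example in a background thread or a worker process.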
Related
My use case is that I have a Flask app running.
The idea is that it'll just display a static page (currently using PySimpleGUI), and upon receiving the information sent from the other computer (such as a name, for example) it'll show that name as a QR code (so perhaps a constant refresh on the GUI, changing upon receiving the data).
The code is currently set up this way:
import threading

import qrcode
from flask import Flask, request

app = Flask(__name__)

@app.route('/qrcode', methods=['GET'])
def displayQrCode():
    args = request.args
    name = args.get('name')  # name passed as a query parameter
    try:
        img = qrcode.make(name)
        img.save('checkin-qrcode.png')
        # Display QR Code
        qrWindowThread = threading.Thread(target=qrCodeWindowShow)
        qrWindowThread.start()
        print(name)
        return 'Success'
    except RuntimeError:
        return 'Error in GET'
However, being new to Flask, I am not sure how to send a string from a separate computer running a Jupyter notebook to this Flask app. Online tutorials show things like having a form field page, but I don't want any user input.
So for instance, if the Jupyter notebook code sends a request to ipaddress:5000, it should ideally hit (correct me here) ipaddress:5000/qrcode?name=JohnDoe,
so the Flask app would retrieve this JohnDoe and display it as a QR code. I can hardcode it or manually access this page, but how would I have it 'listen' on /qrcode and then get the value when it's sent? Or am I misunderstanding it all and need to do it another way?
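For reference, from the notebook side a request like that can be sent with the requests library (the address and parameter name simply mirror the ones in the question):

import requests

# hits http://ipaddress:5000/qrcode?name=JohnDoe on the machine running Flask
response = requests.get('http://ipaddress:5000/qrcode', params={'name': 'JohnDoe'})
print(response.text)  # 'Success' or 'Error in GET'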
EDIT:
Alright, third time lucky: I think I understand what you want now. So your use case is a first computer that is displaying a GUI and running a Flask server with a route that can take in a string.
There's a second computer that can send an HTTP request to that route with a name, and when that happens you want the GUI on the first computer to refresh and display a QR code of the name.
I'd solve this by having the qrcode route write that name to persistent storage. It could be anything: a SQLite db, an environment variable, a string in a file.
The PySimpleGUI interface on the first computer polls this persistent storage for changes, and if it sees a change it renders a new QR code for that name and displays it.
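A minimal sketch of that idea, using a plain text file as the persistent storage (the file name, polling interval and helper names are all just illustrative):

import time

import qrcode
from flask import Flask, request

app = Flask(__name__)

# Flask side: the route only records the latest name
@app.route('/qrcode', methods=['GET'])
def display_qr_code():
    name = request.args.get('name', '')
    with open('latest_name.txt', 'w') as f:
        f.write(name)
    return 'Success'

# GUI side (separate thread/process): poll the file and regenerate the QR image on change
def watch_for_names():
    last_seen = None
    while True:
        try:
            with open('latest_name.txt') as f:
                name = f.read().strip()
        except FileNotFoundError:
            name = None
        if name and name != last_seen:
            qrcode.make(name).save('checkin-qrcode.png')
            # ... tell the PySimpleGUI window to reload 'checkin-qrcode.png' here ...
            last_seen = name
        time.sleep(1)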
There are multiple ways available to do that:
REST API integration
Real-time data transmission using Tornado or ??
Through a base origin for data transmission
Hope you get help with this.
I was playing with mobti, and I noticed that when I add a user, what I'm really doing is sending a JSON payload with that user. See the following example.
JSON in the request:
{"type":"mob:update","mob":[{"id":"v63ghnghgn8","name":"test2"},{"id":"qlxy16bl9q","name":"test 1"},{"id":"zxu28bb3ar","name":"test4"},{"id":"03kys4hqrjmm","name":"test 5"},{"id":"w9osegszlzm","name":"test 3"},{"id":"od6hattxo3","name":"test6"},{"id":"d972agvwux","name":"test 7"},{"id":"y65txgk19p","name":"example stackoverflow"}]}
So I was able to write a script in Python to send a long update with a lot of users.
url = "wss://mobti.me/example"
ws = create_connection(url)
ws.send(json.dumps(data))
result = ws.recv()
print (result)
ws.close()
Here data is a big JSON object with a lot of names.
So now, here's the question: if another user sends a request with new data, could I intercept that request, and how? I'm interested in intercepting the request and sending a new one (it doesn't matter if I can't reject their request). Note that this website isn't mine, so I can't intercept the request from the inside (and that's not what I'm looking for).
Thanks in advance!
I'm working on a Python/Flask application and I have my logging handled on a different server. The way I have currently set it up is to have a function which sends a request to the external server whenever somebody visits a webpage.
This, of course, extends my TTB because execution only continues after the request to the external server has completed. I've heard about threading, but read that that also takes a little extra time.
Summary of current code:
import os

import requests
from flask import Flask

app = Flask(__name__)
log_auth_token = os.environ["log_auth"]

def send_log(data):
    post_data = {
        "data": data,
        "auth": log_auth_token
    }
    r = requests.post("https://example.com/log", data=post_data)

@app.route('/log')
def log():
    send_log("/log was just accessed")
    return "OK"
In short:
Intended behavior: User requests webpage -> User receives response -> Request is logged.
Current behavior: User requests webpage -> Request is logged -> User receives response.
What would be the fastest way to achieve my intended behavior?
Log locally and periodically send the log files to a separate server. More specifically, you need to create rotating log files and archive them so you don't end up with one huge file. To do this you need to configure your reverse proxy (like NGINX).
Or log locally and create an application that allows you to read the log files remotely.
Sending a log entry to a separate server on every request simply isn't efficient unless you have another process do it. Users shouldn't have to wait for your log action to complete.
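For the application-level part of that, Python's standard logging module already handles rotation; a minimal sketch of the /log route from the question rewritten to log locally (the file name and size limits are arbitrary, and app is the Flask app from the snippet above):

import logging
from logging.handlers import RotatingFileHandler

handler = RotatingFileHandler('app.log', maxBytes=1_000_000, backupCount=5)
handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))

logger = logging.getLogger('myapp')
logger.setLevel(logging.INFO)
logger.addHandler(handler)

@app.route('/log')
def log():
    logger.info('/log was just accessed')  # local write, no network round trip
    return "OK"

A cron job or a log shipper can then push the rotated files to the logging server out of band.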
I'm working on a Flask app which retrieves the user's XML from the myanimelist.net API (sample), processes it, and returns some data. The data returned can be different depending on the Flask page being viewed by the user, but the initial process (retrieve the XML, create a User object, etc.) done before each request is always the same.
Currently, retrieving the XML from myanimelist.net is the bottleneck for my app's performance and adds a good 500-1000 ms to each request. Since all of the app's requests go to the myanimelist server, I'd like to know if there's a way to persist the HTTP connection so that once the first request is made, subsequent requests will not take as long to load. I don't want to cache the entire XML because the data is subject to frequent change.
Here's the general overview of my app:
from flask import Flask
from functools import wraps
import requests

app = Flask(__name__)

def get_xml(f):
    @wraps(f)
    def wrap():
        # Get the XML before each app function
        r = requests.get('page_from_MAL')  # Current bottleneck
        user = User(data_from_r)  # User object
        response = f(user)
        return response
    return wrap

@app.route('/one')
@get_xml
def page_one(user_object):
    return 'some data from user_object'

@app.route('/two')
@get_xml
def page_two(user_object):
    return 'some other data from user_object'

if __name__ == '__main__':
    app.run()
So is there a way to persist the connection like I mentioned? Please let me know if I'm approaching this from the right direction.
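For what it's worth, the requests library can reuse the underlying connection (HTTP keep-alive) if the calls go through a single Session object instead of requests.get; a minimal variant of the decorator above, keeping the same placeholders from the question:

from functools import wraps

import requests

session = requests.Session()  # pools and reuses connections to the same host

def get_xml(f):
    @wraps(f)
    def wrap():
        r = session.get('page_from_MAL')  # later calls can skip the TCP/TLS handshake
        user = User(data_from_r)  # User object, as in the question
        return f(user)
    return wrap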
I think you aren't approaching this from the right direction, because you are placing your app too much in the role of a proxy for myanimelist.net.
What happens when you have 2000 users? Your app ends up doing tons of requests to myanimelist.net, and a malicious user could definitely DoS your app (or use it to DoS myanimelist.net).
This is a much cleaner way IMHO:
Server side:
Create a websocket server (ex: https://github.com/aaugustin/websockets/blob/master/example/server.py)
When a user connects to the websocket server, add the client to a list; remove it from the list on disconnect.
For every connected user, frequently check myanimelist.net to get the associated XML (maybe lower the frequency the more online users you have).
For every XML document, make a diff against your server's local version, and send that diff to the client over the websocket channel (assuming there is a diff).
Client side:
On receiving a diff: update the local XML with the differences.
Disconnect from the websocket after n seconds of inactivity; when disconnected, add a button on the interface to reconnect.
I doubt you can do anything much better assuming myanimelist.net doesn't provide a "push" API.
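A very rough sketch of that server loop using the websockets package linked above (the 30-second interval, the username handshake, and sending the whole document instead of a diff are all simplifications I'm assuming here):

import asyncio

import requests
import websockets  # the library from the linked example

connected = {}  # websocket -> MAL username

async def handler(ws):
    # assume the client sends its username right after connecting
    # (recent versions of websockets pass only the connection to the handler)
    username = await ws.recv()
    connected[ws] = username
    try:
        await ws.wait_closed()
    finally:
        del connected[ws]

async def poll_mal():
    cache = {}
    while True:
        for ws, username in list(connected.items()):
            # blocking fetch kept simple for the sketch; use an async client in real code
            xml = requests.get('page_from_MAL_for/' + username).text
            if cache.get(username) != xml:
                await ws.send(xml)  # a real version would compute and send only the diff
                cache[username] = xml
        await asyncio.sleep(30)

async def main():
    async with websockets.serve(handler, 'localhost', 8765):
        await poll_mal()

asyncio.run(main())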
I am new to Python. I am using Flask to create a web service which makes lots of API calls to LinkedIn. The problem with this is that getting the final result set takes a lot of time, and the frontend remains idle during this time. I was thinking of returning the partial results found up to that point and continuing the API calls on the server side. Is there any way to do this in Python? Thanks.
Flask has the ability to stream data back to the client. Sometimes this requires JavaScript modifications to do what you want, but it is possible to send content to a user in chunks using Flask and Jinja2. It requires some wrangling, but it's doable.
A view that uses a generator to break up the content could look like this (though the linked SO answer is much more comprehensive):
from flask import Response

@app.route('/image')
def generate_large_image():
    def generate():
        while True:
            if not processing_finished():
                yield ""
            else:
                yield get_image()
                return  # stop the generator once the image has been sent
    return Response(generate(), mimetype='image/jpeg')
There are a few ways to do this. The simplest would be to return the initial response via Flask immediately and then use JavaScript on the returned page to make an additional request to another URL, loading the result when it comes back - maybe displaying a loading indicator or something in the meantime.
The additional URL would look like this:
#app.route("/linkedin-data")
def linkedin():
# make some call to the linked in api which returns "data", probably in json
return flask.jsonify(**data)
Fundamentally, no: you can't return a partial response. So you have to break your requests up into smaller units. You can stream data using websockets, but you would still be sending back an initial response, which would then create a websocket connection using JavaScript, which would then start streaming data back to the user.
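If websockets are overkill, the same "smaller units" idea can also be done with the generator-response pattern shown earlier, streaming partial results as each API call finishes (fetch_page, the route name and the page count are made up for the sketch):

from flask import Response

@app.route('/linkedin-stream')
def linkedin_stream():
    def generate():
        for page in range(10):  # hypothetical: one LinkedIn API call per chunk
            chunk = fetch_page(page)  # placeholder for the real API call
            yield chunk + '\n'  # the client receives each chunk as soon as it is ready
    return Response(generate(), mimetype='text/plain')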