Using Flask-SQLAlchemy Event API to broadcast to Flask-SocketIO?

I'm a novice developing a simple multi-user game (think Minesweeper) using Flask for the API backend and AngularJS for the frontend. I've followed tutorials to structure the Angular/Flask app and I've coded up a RESTful API using Flask-Restless.
Now I'd like to push events to all the clients when game data is changed in the database (as it is by a POST to one of the Restless endpoints). I was looking at using the SQLAlchemy event.listen API to call the Flask-SocketIO emit function to broadcast the data to clients. Is this an appropriate method to accomplish what I'm trying to do? Are there drawbacks to this approach?

@CESCO's reply works great if all of your computation is being done in the same process. You could also use this syntax (see full source code here):
import flask_sqlalchemy as sa  # the package exposes the models_committed signal

@sa.models_committed.connect_via(app)
def on_models_committed(sender, changes):
    for obj, change in changes:
        print('SQLALCHEMY - %s %s' % (change, obj))
Read on if you're interested in subscribing to all updates to a database ...
That won't work if your database is being updated from another process, however.
models_committed only works in the same process where the commit comes from (it's not a DB-level notification, it's sqlalchemy after committing to the DB)
https://github.com/mitsuhiko/flask-sqlalchemy/issues/369#issuecomment-170272020
I wrote a little demo app showing how to use any of Redis, ZeroMQ or socketIO_client to communicate real-time updates to your server. It might be helpful for anyone else trying to deal with outside database access.
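For instance, here is a minimal sketch of the Redis pub/sub variant, assuming a local Redis server and a channel name I made up (socketio is the same object as in the listener example below):

import json
import redis

r = redis.StrictRedis(host='localhost', port=6379)

# Publisher side: run in whatever process writes to the database
r.publish('game-updates', json.dumps({'board': 'changed'}))

# Subscriber side: run in the Flask-SocketIO server process
pubsub = r.pubsub()
pubsub.subscribe('game-updates')
for message in pubsub.listen():
    if message['type'] == 'message':
        socketio.emit('update', json.loads(message['data']),
                      namespace='/game')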
Also, you could look into subscribing to postgres events:
https://blog.andyet.com/2015/04/06/postgres-pubsub-with-json/
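As a rough sketch of that approach with psycopg2 (the DSN and channel name are assumptions; a trigger on your game table would issue the NOTIFY):

import select
import psycopg2
import psycopg2.extensions

conn = psycopg2.connect('dbname=game user=postgres')
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
cur = conn.cursor()
cur.execute('LISTEN game_updates;')

while True:
    # Block for up to 5 seconds waiting for a notification
    if select.select([conn], [], [], 5) != ([], [], []):
        conn.poll()
        while conn.notifies:
            notify = conn.notifies.pop(0)
            print(notify.payload)  # e.g. hand this to socketio.emit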

That's the bare-bones version of the code you want. I would start testing from there. You will have to figure out what you mean by "change", so you can attach the right SQLAlchemy event to the type of change you are making.
Listening for the after_insert event on the User model:
from sqlalchemy import event
from app import socketio
from app.models import User  # assumed location of the User model

def after_insert_listener(mapper, connection, target):
    # `target` is the freshly inserted User instance
    socketio.emit('namespace response', {'data': target.id_user},
                  namespace='/namespace')
    print(target.id_user)

event.listen(User, 'after_insert', after_insert_listener)

Related

Send data from Django to another server

I have an existing Django app. I would like to add a system that sends data from my Django application to another Python application hosted on another server, so that the Python application receives data from the Django app, ideally in JSON format.
So, for example, I would need to create a view that every few seconds sends the data from a DB table to this application, or that sends the data to this external application whenever a form is submitted.
How can I do this? Is there an example for this particular matter? I don't know what tools I'd need to use to create this system; I only know that I would need to use Celery to perform asynchronous tasks, but nothing else. Should I use webhooks maybe? Or Django Channels?
Edit: adding some more context:
I have my Django client. Then I have one or two Python applications running on another server. On my Django client I have some forms. Once a form is submitted, the data is saved to the DB, but I also want this data to be sent instantly to my Python applications. The Python applications should receive the data from Django in JSON format and perform some tasks according to the values submitted by users. Then the application should send a response back to Django.
Let's go! I'll call your Django app here "DjangoApp" and your Python apps, in Flask or another framework, "OtherApp".
First, as you predicted, you will need a framework that is capable of performing background tasks. The new **Django 3.0 allows this, but I haven't used it yet**, so I will pass on to you an approach that I am using and that is fully functional with Django 2.x and Python 3.8.
On your DjangoApp server you will need to structure the communication well with Celery; let's leave the tasks to it. You can read the Celery docs and this post; they fit this architecture well.
Regardless of how your form or Django app looks, when you want it to trigger a task in Celery, it is basically a function call that transmits the data, but in the background.
from .tasks import send_data
...
form.save()
# Create a method on the form to serialize the data the way you want it,
# or build the payload however you prefer.
values = form.new_function_serializedata()
send_data.delay(values)  # queue the Celery task in the background
...
Also read the Celery docs on calling tasks.
In all your other applications you will need a POST route to receive and deserialize this data; you can do this with a lightweight framework like Pyramid.
This way, every time a form is submitted, the data is sent to the other server by the send_data task.
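A rough sketch of what that task could look like (the URL and route are my own placeholders; the receiving app just needs a matching POST endpoint):

# tasks.py
import requests
from celery import shared_task

@shared_task
def send_data(values):
    # POST the serialized form data to the other application as JSON
    response = requests.post('http://otherapp.example.com/receive',
                             json=values)
    response.raise_for_status()
    return response.json()  # the other app's reply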
In my experience, without knowing much about your problem, I would use a similar architecture, but driven by Celery Beat.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'send_data': {
        'task': 'your_app.tasks.send_data',
        'schedule': crontab(),  # configure your cron schedule here
    },
}
It is not just a matter of adding the code above; the idea is something like this:
Within your models I would add a boolean field such as sent. Then, every 2 seconds, 10 seconds, or however often you wish, I would filter all objects with sent=False and pass them to the send_data task, as sketched below.
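A sketch of that periodic version, assuming a hypothetical FormData model with a boolean sent field:

# tasks.py -- runs on the schedule configured above
import requests
from celery import shared_task
from .models import FormData  # hypothetical model with a `sent` flag

@shared_task
def send_data():
    for obj in FormData.objects.filter(sent=False):
        response = requests.post('http://otherapp.example.com/receive',
                                 json={'id': obj.id})
        if response.ok:
            obj.sent = True
            obj.save()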
I hope this isn't confusing; it's a lot to explain. But I hope I could help and answer your questions.
import requests
from django import http

def view(request):
    url = 'http://python.app.com'  # replace with the other Python app's URL or IP
    request_data = {'key': 'value'}  # replace with the data to be sent
    response = requests.post(url, json=request_data)
    response_data = response.json()  # data returned by the other app
    return http.JsonResponse(response_data)
This is an example of a function-based view that uses the requests library to hit an external service. The requests library takes care of encoding/decoding your data to/from JSON.
Yeah, a webhook would be one of the options, but there are other options available too.
You can use REST APIs to send data from one app to another, but in that case you need to think about synchronization. That depends on your requirements: if you don't need the data delivered synchronously, you can use RabbitMQ or other async tools. Just push your REST API request onto RabbitMQ and let RabbitMQ handle the delivery.
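A minimal sketch of pushing such a request onto RabbitMQ with pika (the broker location and queue name are assumptions):

import json
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.queue_declare(queue='django_events')

# Publish the payload; a consumer in the other app reads from this queue
channel.basic_publish(exchange='',
                      routing_key='django_events',
                      body=json.dumps({'key': 'value'}))
connection.close()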

Flask with Peewee getting new data only on restart

I'm facing a really strange issue with my Flask + Peewee app.
I have a webapp that inserts data into a MySQL database.
My Flask app also connects to that database with Peewee.
The issue is, when I insert something with the webapp and then make a SELECT from the database within the Flask app, it returns the data that was available when the Flask app started. To get new data with the same SELECT, I need to restart the Flask server service.
Does anyone knows what's happening?
UPDATE
I found the issue. It wasn't directly related with peewee but with Flask.
I have something like this:
def some_method(id, user_id, date_from=datetime.now(), limit=50):
It seems that when I do this, date_from is set to the datetime at function definition time (default argument values are evaluated once, when the module is loaded). After that, all requests that make use of it always use that same datetime.
I changed it to this:
def some_method(id, user_id, date_from=None, limit=50):
    if date_from is None:
        date_from = datetime.now()
And it started working.
I'm sharing this for anyone else who might run into the same issue.
That's the expected behavior for web applications. HTTP is a stateless protocol, which means your webapp's frontend can't know whether your backend's state has changed unless it makes a new request. Therefore, your backend changes are visible only after you restart the Flask app. Depending on your code, you may also be able to see the changes by refreshing the browser.
If you want to see the changes immediately without a restart/refresh, learn how to use JavaScript to query the changes and update the frontend DOM. You can get started by learning jQuery, or a modern framework like React/Vue/Angular.

Django websockets implementation

I'm new to Python and Django and I'm trying to implement websockets in Django.
I'm following the steps described in the websockets documentation.
The problem is that the server-side command described there has to be run in a console. When I run it from a console it works, but I want to run it inside a Django view, asynchronously, on a GET request. When I try that, the server raises an exception like this: RuntimeError: There is no current event loop in thread 'Thread-2'.
To be more specific, I want to use this technology to show real-time logs. For example, an Oracle procedure performs an insert and the server pushes it to a page over websockets.
Am I on the wrong path, or can anyone suggest a right/better solution?
I'm on Django 1.9, running on both Django's development server and uWSGI behind Nginx, with Python 3.5.2 on Red Hat Enterprise Server release 6.7.
UPDATE
This is the exact code from the above URL, which I put in the view:
import asyncio
import datetime
import random

import websockets
from django.shortcuts import render

def ws(request):
    async def time(websocket, path):
        while True:
            now = datetime.datetime.utcnow().isoformat() + 'Z'
            await websocket.send(now)
            await asyncio.sleep(random.random() * 3)

    start_server = websockets.serve(time, '192.168.4.177', 9876)
    asyncio.get_event_loop().run_until_complete(start_server)
    asyncio.get_event_loop().run_forever()
    return render(request, "ws.html")
When the URL is handled by this view, the above-mentioned error occurs.
My ws.html is an exact copy of the example in the websockets documentation mentioned above.
Django's request/response cycle is strictly synchronous. What you are trying to do is not possible in a normal Django view.
You might be interested in Django Channels, a project that aims to remove this limitation.
You can't really do this. I can't say why you're getting the exact errors you're getting, but a GET request to a Django view needs to return a response after some finite time, not run forever, otherwise the browser (or other parts in between like Nginx) will see the non-response as a timeout. If you want to run a websocket server, do it in a separate process outside of Django's.
There is much ongoing work to add async functionality and websockets to Django, in the form of Channels. I think the docs at http://channels.readthedocs.io/en/latest/ describe the latest version of the code you can currently use; hopefully it will be part of Django 1.10. The current version should be usable as a Django app that will allow you to use websockets in Django, but it's not as easy as your attempt above.
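For orientation, here is a minimal sketch of an echo consumer in the Channels 1.x API of that era (module layout and names are illustrative):

# consumers.py -- send every received websocket message back to the client
def ws_message(message):
    message.reply_channel.send({'text': message.content['text']})

# routing.py -- wire the consumer to the websocket.receive channel
from channels.routing import route
from myapp.consumers import ws_message

channel_routing = [
    route('websocket.receive', ws_message),
]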

Do I authenticate at database level, at Flask User level, or both?

I have an MS SQL database deployed on AWS RDS that I'm writing a Flask front end for.
I've been following some intro Flask tutorials, all of which seem to pass the DB credentials in the connection string URI. I'm following the tutorial here:
https://medium.com/@rodkey/deploying-a-flask-application-on-aws-a72daba6bb80#.e6b4mzs1l
For deployment, do I prompt for the DB login info and add to the connection string? If so, where? Using SQLAlchemy, I don't see any calls to create_engine (using the code in the tutorial), I just see an initialization using config.from_object, referencing the config.py where the SQLALCHEMY_DATABASE_URI is stored, which points to the DB location. Trying to call config.update(dict(UID='****', PASSWORD='******')) from my application has no effect, and looking in the config dict doesn't seem to have any applicable entries to set for this purpose. What am I doing wrong?
Or should I be authenticating using Flask-User, and then get rid of the DB level authentication? I'd prefer authenticating at the DB layer, for ease of use.
The tutorial you are using relies on Flask-SQLAlchemy to abstract the database setup; that's why you don't see any calls to create_engine or engine.connect().
Frameworks like Flask-Sqlalchemy are designed around the idea that you create a connection pool to the database on launch, and share that pool amongst your various worker threads. You will not be able to use that for what you are doing... it takes care of initializing the session and things early in the process.
Because of your requirements, I don't know that you'll be able to make any use of things like connection pooling. Instead, you'll have to handle that yourself. The actual connection isn't too hard...
from sqlalchemy import create_engine

engine = create_engine('dialect://username:password@host/db')
connection = engine.connect()
result = connection.execute("SOME SQL QUERY")
for row in result:
    pass  # do something with each row
connection.close()
The issue is that you're going to have to do that in every endpoint. A database connection isn't something you can store in the session; you'll have to store the credentials there and do a connect/disconnect cycle in every endpoint you write. Worse, you'll have to figure out either encrypted sessions or server-side sessions (without a DB connection!) to keep those credentials in the session from becoming a horrible security leak.
I promise you, it will be easier both now and in the long run to figure out a simple way to authenticate users so that they can share a connection pool that is abstracted out of your app endpoints. But if you HAVE to do it this way, this is how you will do it. (make sure you are closing those connections every time!)
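For completeness, a sketch of that per-request connect/disconnect pattern, assuming the credentials were stashed in an encrypted session at login (all names are illustrative):

from flask import Flask, jsonify, session
from sqlalchemy import create_engine

app = Flask(__name__)
app.secret_key = 'change-me'  # required for session support

@app.route('/rows')
def rows():
    # Rebuild the engine from per-user credentials on every request
    engine = create_engine('mssql+pyodbc://%s:%s@host/db'
                           % (session['db_user'], session['db_password']))
    connection = engine.connect()
    try:
        result = connection.execute("SOME SQL QUERY")
        return jsonify([dict(row) for row in result])
    finally:
        connection.close()  # close the connection every time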

Testing Flask REST server

I have a tiny Flask server that is supposed to load data from a file and run a function on it. This function will return a DataFrame, and I return the JSON version of it. Much to my surprise, this all works nicely. However, how would I test this? I have included some attempts below, but I don't understand Flask (nor REST) well enough yet:
#!/home/thomas/python
from flask import Flask
from flask.ext.restful import Resource, Api  # old-style import; newer installs use flask_restful

app = Flask(__name__)
api = Api(app)

class UniverseAPI(Resource):
    def get(self):
        import pandas as pd
        frame = pd.read_csv("//datasrv10//data$//AQ//test.csv", index_col=0, header=0)
        return frame.to_json()

api.add_resource(UniverseAPI, '/data/universe')
I am happy to include a few of my attempts here... I appreciate any hints. I have read the official documentation.
I should specify what I mean by testing. I can run this on my Linux server and can extract all the required information with the requests package. However, I want to create a unit test that doesn't require starting the server on localhost. I think I have managed this with the Flask test client. However, the problem now is that the requests response object and the Flask response object treat the underlying JSON strings rather differently. So I guess my problem is more related to JSON string issues than to Flask. Thanks for all your helpful feedback though.
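For what it's worth, here is a minimal sketch of such a test with the Flask test client (the module name universe_app is an assumption):

import json
import unittest

from universe_app import app  # hypothetical module holding the app above

class UniverseAPITestCase(unittest.TestCase):
    def setUp(self):
        self.client = app.test_client()

    def test_get_universe(self):
        response = self.client.get('/data/universe')
        self.assertEqual(response.status_code, 200)
        # Flask-RESTful serializes the return value again, so the body is
        # a JSON-encoded string containing the DataFrame's JSON
        payload = json.loads(response.get_data(as_text=True))
        data = json.loads(payload)
        self.assertIsInstance(data, dict)

if __name__ == '__main__':
    unittest.main()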
Well, the basics of writing a REST API are essentially a set of design principles. My understanding of it is based on this article by Miguel Grinberg, http://blog.miguelgrinberg.com/post/designing-a-restful-api-with-python-and-flask .
In it, he talks about how a REST API is:
"Stateless" - All interactions with the service can happen using the information from one request.
Built upon accessing "resources" from URIs using HTTP requests like GET, PUT, and POST. A resource could be an order in a store, a task in a web app, or whatever you like.
There's also a bunch of stuff about how the server should standardize all forms of communication between itself and the client, indicate whether it can do caching, and other things like that. From an initial design standpoint, though, this is "the point," as he put it:
"The task of designing a web service or API that adheres to the REST guidelines then becomes an exercise in identifying the resources that will be exposed and how they will be affected by the different request methods."
If you're looking for an interesting example of a REST API that might be suited to your interests (I know it is to mine), reddit's is open source. It's a relatable example to see how they try and structure the interactions behind requests: http://www.reddit.com/dev/api
