I'm facing a really strange issue with my Flask + Peewee app.
I have a webapp that inserts data into a MySQL database.
My Flask app also connects to that database with Peewee.
The issue is: when I insert something with the webapp and then run a SELECT from the Flask app, it returns only the data that was available when the Flask app started. To get the new data with the same SELECT, I have to restart the Flask server service.
Does anyone know what's happening?
UPDATE
I found the issue. It wasn't directly related with peewee but with Flask.
I have something like this:
def some_method(id, user_id, date_from=datetime.now(), limit=50):
It turns out that with this signature, date_from is set to the datetime at function definition time: default argument values are evaluated once, when the def statement runs, not on each call. After that, every request that relies on the default reuses that same datetime.
Changed to this:
def some_method(id, user_id, date_from=None, limit=50):
    if date_from is None:
        date_from = datetime.now()
And it started working.
I'm sharing this for anyone else who runs into the same issue.
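For anyone who wants to see the gotcha in isolation, here is a minimal standalone sketch: the default value is evaluated once, when the def statement runs, not on every call.
from datetime import datetime
import time

def stamped(when=datetime.now()):  # default evaluated once, at definition time
    return when

def stamped_fixed(when=None):  # default resolved on every call
    if when is None:
        when = datetime.now()
    return when

print(stamped())        # some timestamp...
time.sleep(1)
print(stamped())        # ...one second later, still the exact same timestamp
print(stamped_fixed())  # a fresh timestamp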
That's the expected behavior for web applications. HTTP is a stateless protocol, which means your webapp's frontend can't know whether your backend's state has changed unless it makes a new request. Therefore, your backend changes become visible only after you restart the Flask app. Depending on your code, you can probably also see the changes by refreshing the browser.
If you want to see the changes immediately without a restart/refresh, learn how to use JavaScript to query for the changes and update the frontend DOM. You can get started by learning jQuery, or a modern framework like React/Vue/Angular.
Related
I have an existing Django app. I would like to add a system that sends data from my Django application to another Python application hosted on another server, so that the Python application receives data from the Django app, ideally in JSON format.
So, for example, I would need to create a view that sends the data from a DB table to this application every few seconds, or sends the data to this external application whenever a form is submitted.
How can I do this? Is there an example for this particular problem? I don't know what tools I'd need to build this system; I only know that I would need Celery to perform asynchronous tasks, but nothing else. Should I use webhooks, maybe? Or Django Channels?
Edit: adding some more context:
I have my Django client. Then I have one or two Python applications running on another server. On my Django client I have some forms. Once a form is submitted, the data is saved to the DB, but I also want this data to be sent instantly to my Python applications. The Python applications should receive the data from Django in JSON format and perform some tasks according to the values submitted by users. Then each application should send a response back to Django.
Let's go! I'll call your Django app "DjangoApp" here, and your other Python apps, in Flask or another framework, "OtherApp".
First, as you predicted, you will need something capable of performing background tasks. The new **Django 3.0 allows this, but I haven't used it yet... I'll pass on to you an approach that I am using and that is fully functional with Django 2.8 and Python 3.8**.
On your DjangoApp server you will need to structure the communication well with Celery; let's leave the tasks to it. You can read the Celery docs and this post; they're very helpful for building this architecture.
Regardless of what your form or Django app looks like, when you want it to trigger a task in Celery, you basically call the function that transmits the data, but it runs in the background.
from .tasks import send_data
...
form.save()
# Create a function within the form to get the data the way you want it,
# or do it however you prefer.
values = form.new_function_serializedata()
send_data.delay(values)  # call the Celery task asynchronously (see CALL CELERY TASKS)
...
Also read CALL CELERY TASKS in the Celery docs.
In all your other applications you will need a POST route to receive and deserialize this data; you can do this with a lightweight framework like Pyramid.
This way, every time a form is submitted, this data will be sent to the other server through the send_data function, sketched below.
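As a rough sketch of what the task itself might look like (the OtherApp URL and endpoint here are assumptions, not something from the question), it can simply POST the serialized values as JSON:
# your_app/tasks.py -- a minimal sketch; the OtherApp URL is an assumption
import requests
from celery import shared_task

@shared_task
def send_data(values):
    # POST the serialized form data to OtherApp as JSON, in the background
    response = requests.post('http://otherapp.example.com/receive', json=values)
    response.raise_for_status()
    return response.json()  # OtherApp's JSON reply, if you need it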
In my experience, and without knowing much about your problem, I would use a similar architecture, but with Celery Beat.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'send_data': {
        'task': 'your_app.tasks.send_data',
        'schedule': crontab(),  # CONFIGURE YOUR CRON
    },
}
It's not only the code above that needs to be added, but the setup is something along those lines.
Within your models I would create a sent field. Then, every 2 seconds, 10 seconds, or however often you like, I would filter all objects with sent = False and pass them to the send_data task, as sketched below.
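A hedged sketch of that periodic task; MyModel and its fields are hypothetical stand-ins for whatever your form saves:
# your_app/tasks.py -- sketch of the Celery Beat variant; MyModel is hypothetical
import requests
from celery import shared_task
from .models import MyModel  # assumed model with a `sent` BooleanField

@shared_task
def send_data():
    # Pick up everything not yet delivered; mark it sent only on success
    for obj in MyModel.objects.filter(sent=False):
        resp = requests.post('http://otherapp.example.com/receive',
                             json={'id': obj.pk, 'data': obj.data})  # `data` field is assumed
        if resp.ok:
            obj.sent = True
            obj.save(update_fields=['sent'])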
I don't know if that was confusing; it's a lot to explain. But I hope it helps and answers your questions.
import requests
from django import http

def view(request):
    url = 'http://python.app.com'  # replace with the other python app's url or ip
    request_data = {'key': 'value'}  # replace with the data to be sent to the other app
    response = requests.post(url, json=request_data)
    response_data = response.json()  # data returned by the other app
    return http.JsonResponse(response_data)
This is an example of a function-based view that uses the requests library to hit an external service. The requests library takes care of encoding/decoding your data to/from JSON.
Yeah, a webhook would be one of the options, but there are other options available too.
-> You can use REST APIs to send data from one app to another, but in that case you need to think about synchronization. That depends on your requirements: if you don't need the data delivered synchronously, you can use RabbitMQ or other async tools. Just push your REST API request into RabbitMQ and RabbitMQ will handle it, as in the sketch below.
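For illustration, a minimal sketch of pushing such a request onto RabbitMQ with the pika client (the queue name and payload are assumptions):
import json
import pika

# Connect to RabbitMQ and declare a durable queue (names are illustrative)
connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
channel = connection.channel()
channel.queue_declare(queue='api_requests', durable=True)

# Push the API request as JSON; a consumer on the other side handles delivery
channel.basic_publish(exchange='', routing_key='api_requests',
                      body=json.dumps({'key': 'value'}))
connection.close()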
I'm a novice developing a simple multi-user game (think Minesweeper) using Flask for the API backend and AngularJS for the frontend. I've followed tutorials to structure the Angular/Flask app and I've coded up a RESTful API using Flask-Restless.
Now I'd like to push events to all the clients when game data is changed in the database (as it is by a POST to one of the Restless endpoints). I was looking at using the SQLAlchemy event.listen API to call the Flask-SocketIO emit function to broadcast the data to clients. Is this an appropriate way to accomplish what I'm trying to do? Are there drawbacks to this approach?
@CESCO's reply works great if all of your computation is being done in the same process. You could also use this syntax (see the full source code here):
import flask_sqlalchemy as sa

@sa.models_committed.connect_via(app)
def on_models_committed(sender, changes):
    for obj, change in changes:
        print('SQLALCHEMY - %s %s' % (change, obj))
Read on if you're interested in subscribing to all updates to a database ...
That won't work if your database is being updated from another process, however.
models_committed only works in the same process the commit comes from (it's not a DB-level notification; it's SQLAlchemy firing after committing to the DB):
https://github.com/mitsuhiko/flask-sqlalchemy/issues/369#issuecomment-170272020
I wrote a little demo app showing how to use any of Redis, ZeroMQ or socketIO_client to communicate real-time updates to your server. It might be helpful for anyone else trying to deal with outside database access.
Also, you could look into subscribing to postgres events:
https://blog.andyet.com/2015/04/06/postgres-pubsub-with-json/
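For reference, here is a minimal psycopg2 listener sketch (the channel name and connection parameters are assumptions; the database side needs a trigger that calls pg_notify, as the linked post describes):
import select
import psycopg2
import psycopg2.extensions

conn = psycopg2.connect('dbname=mydb')  # connection params are assumptions
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
cur = conn.cursor()
cur.execute('LISTEN table_changed;')  # channel name is an assumption

while True:
    # Wait until the connection's socket is readable, then collect notifications
    if select.select([conn], [], [], 5) == ([], [], []):
        continue  # timeout; loop and wait again
    conn.poll()
    while conn.notifies:
        notify = conn.notifies.pop(0)
        print(notify.channel, notify.payload)  # e.g. hand off to socketio.emit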
That's the barebones version of the code you want. I would start testing from there. You will have to figure out what you mean by "change" so you can attach the right SQLAlchemy event to the kind of change you are making.
Listen for the event after an insertion into the User table:
from sqlalchemy import event
from app import socketio

def after_insert_listener(mapper, connection, target):
    # target is the newly inserted User instance
    socketio.emit('namespace response', {'data': target.id_user},
                  namespace='/namespace')
    print(target.id_user)

event.listen(User, 'after_insert', after_insert_listener)
SocketIO
I have a PostgreSQL schema that resides in a schema.sql file that gets run each time a database connection is initiated in Python. It looks something like:
CREATE TABLE IF NOT EXISTS users (
    id SERIAL PRIMARY KEY,
    facebook_id TEXT NOT NULL,
    name TEXT NOT NULL,
    access_token TEXT,
    created TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT NOW()
);
The app is deployed on Heroku, using their PostgreSQL and everything works as expected.
Now, what if I want to change the structure of my users table a bit? What's the easiest and best way to do this? I thought of writing an ALTER... line in schema.sql for each change I want to make to the database, but I don't think that's the best approach, since after a while the schema file will be full of ALTERs and it will slow down my app.
What's the recommended way to deploy changes made to a database?
Running a hard-coded script on each connection is not a great way to handle schema management.
You need to either manage the schema manually, or use a full-fledged tool that keeps a schema version identifier in the database, checks it, and applies scripts to upgrade to the next schema version if it's behind the latest one. Rails calls this "migrations", and it works reasonably well. If you're using Django, it has schema management too.
If you're not using a framework like that, I suggest writing your own schema upgrade scripts. Add a "schema_version" table with a single row. SELECT it when the app first starts after a redeploy, and if it's lower than the current version the app knows about, apply the update script(s) in order, e.g. schema_1_to_2, schema_2_to_3, etc. A sketch of that idea follows below.
I don't recommend doing this on connect; do it on app start, or better, as a special maintenance command. If you do it on every connection, you'll have multiple connections trying to make the same changes and you'll end up with duplicated columns and all sorts of other mess.
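A minimal sketch of that idea, assuming Postgres via psycopg2; the table name, scripts, and version numbers are illustrative:
# Run once at app start (or as a maintenance command), not on every connection
import psycopg2

UPGRADES = {
    1: "ALTER TABLE users ADD COLUMN email TEXT;",             # schema_1_to_2
    2: "ALTER TABLE users ADD COLUMN last_seen TIMESTAMPTZ;",  # schema_2_to_3
}
CURRENT_VERSION = 3  # the version this build of the app expects

def upgrade_schema(conn):
    with conn.cursor() as cur:
        cur.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER NOT NULL)")
        cur.execute("SELECT version FROM schema_version")
        row = cur.fetchone()
        if row is None:
            cur.execute("INSERT INTO schema_version VALUES (1)")
            version = 1
        else:
            version = row[0]
        while version < CURRENT_VERSION:
            cur.execute(UPGRADES[version])  # apply schema_N_to_N+1
            version += 1
            cur.execute("UPDATE schema_version SET version = %s", (version,))
    conn.commit()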
I support several Django apps on Heroku with Postgres. I just connect via pgAdmin and run my scripts when changes are required. I don't see any need to run a script every time a connection is made.
I recently began a project to migrate our web app from Apache + mod_python to just CherryPy.
There is still a good deal of stuff I still need to do, but for now, it is CherryPy's sessions that are giving me a bit of a headache.
My first question is: how do they work?
In mod_python, we do something like this:
...
from mod_python import Session

sess = Session.Session(req, timeout=60*60, lock=0)
# req is the request page object.
Judging from the CherryPy documentation, all I need to do to start a session is modify the config by adding something like the following:
cherrypy.config.update({
    'tools.sessions.on': True,
    'tools.sessions.storage_type': 'ram'})
The above defaults to a timeout of 60 minutes (though you can set your own), but what if I want to destroy that session and make a new one? Do I call cherrypy.lib.sessions.expire() in any arbitrary file and then do the cherrypy.config.update thing again? Or will CherryPy make a new session by itself? What if I want to make a new session with a different expiry time?
Note: when I say arbitrary file, I mean a file that is not running CherryPy (my "config" file imports and gets HTML back from our other pages, much like the standard Publisher that comes with mod_python).
I tried making a quick little test file:
import cherrypy
from cherrypy.lib import sessions

def index(sid=0, secret=None, timeout=30, lock=1):
    cherrypy.session['test'] = 'test'
    cherrypy.lib.sessions.expire()
    return cherrypy.session.get('test', 'None')
The end result is that 'test' is still displayed on the screen. Is this happening because the client-side session is expired but the local one still has its data? In that case, how can I check whether a session has expired or not?
Sorry for the confusing question, but I am confused.
Thanks for all your help!
Try this to end a session.
sess = cherrypy.session
sess['_cp_username'] = None
and try this to create a session...
cherrypy.session.regenerate()
cherrypy.session['_cp_username'] = cherrypy.request.login
I used this example to handle most of my session activity.
http://tools.cherrypy.org/wiki/AuthenticationAndAccessRestrictions
Hope this helps,
Andrew
To my surprise, I haven't found this question asked elsewhere. Short version: I'm writing an app that I plan to deploy to the cloud (probably using Heroku), which will do various web scraping and data collection. The reason it'll be in the cloud is so that it can run on its own every day and pull the data into its database without my computer being on, and so the rest of the team can access the data.
I used to use AWS's SimpleDB and DynamoDB, but I found SDB's storage limits too small and DDB's poor querying ability a problem, so I'm looking for a database system (SQL or NoSQL) that can store arbitrary-length values (and ideally arbitrary data structures) and that can be queried on any field.
I've found many database solutions for Heroku, such as ClearDB, but all of the information I've seen shows how to set up Django to access the database. Since this is intended to be a script and not a site, I'd really prefer not to dive into Django if I don't have to.
Is there any kind of database that I can hook up to in Heroku with Python without using Django?
You can get a database provided by Heroku without requiring your app to use Django. To do so:
heroku addons:add heroku-postgresql:dev
If you need a larger, more dedicated database, you can examine the plans at Heroku Postgres.
Within your requirements.txt you'll want to add:
psycopg2
Then you can connect/interact with it similar to the following:
import os
import urlparse

import psycopg2

urlparse.uses_netloc.append('postgres')
url = urlparse.urlparse(os.environ['DATABASE_URL'])
conn = psycopg2.connect(
    "dbname=%s user=%s password=%s host=%s port=%s" %
    (url.path[1:], url.username, url.password, url.hostname, url.port))
cur = conn.cursor()
query = "SELECT ...."
cur.execute(query)
I'd use MongoDB. Heroku has support for it, so I think it will be really easy to start and scale out: https://addons.heroku.com/mongohq
About Python: MongoDB is a really easy database to use. The schema is flexible and maps really well onto Python dictionaries, which is really nice.
You can use PyMongo:
from pymongo import Connection

connection = Connection()

# Get your DB
db = connection.my_database

# Get your collection
cars = db.cars

# Create some objects
import datetime
car = {"brand": "Ford",
       "model": "Mustang",
       "date": datetime.datetime.utcnow()}

# Insert it
cars.insert(car)
Pretty simple, huh?
Hope it helps.
EDIT:
As Endophage mentioned, another good option for interfacing with Mongo is MongoEngine. If you have lots of data to store, you should take a look at it; see the sketch below.
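A hedged sketch of the same car example with MongoEngine (the document class and fields here simply mirror the dictionary above; the database name is an assumption):
import datetime
import mongoengine

mongoengine.connect('my_database')  # database name is an assumption

class Car(mongoengine.Document):
    brand = mongoengine.StringField(required=True)
    model = mongoengine.StringField()
    date = mongoengine.DateTimeField(default=datetime.datetime.utcnow)

Car(brand='Ford', model='Mustang').save()
print(Car.objects(brand='Ford').count())  # query on any declared field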
I did this recently with Flask. (https://github.com/HexIce/flask-heroku-sqlalchemy).
There are a couple of gotchas:
1. If you don't use Django you may have to set up your database yourself by doing:
heroku addons:add shared-database
(Or whichever database you want to use; the others cost money.)
2. The database URL is stored in Heroku in the "DATABASE_URL" environment variable.
In Python you can get it by doing:
dburl = os.environ['DATABASE_URL']
What you do to connect to the database from there is up to you; one option is SQLAlchemy, as in the sketch below.
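A minimal sketch of that option, assuming SQLAlchemy (plus psycopg2) is in your requirements.txt:
import os
from sqlalchemy import create_engine, text

# Heroku injects DATABASE_URL; note that newer SQLAlchemy wants a
# 'postgresql://' scheme, while older Heroku URLs say 'postgres://'
url = os.environ['DATABASE_URL'].replace('postgres://', 'postgresql://', 1)
engine = create_engine(url)
with engine.connect() as conn:
    print(conn.execute(text('SELECT version()')).fetchone())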
Create a standalone Heroku Postgres database. http://postgres.heroku.com