I have a program that stores my database data in a QTableWidget.
I would like to add an event listener so that whenever the user edits any column in a row,
the data on the server is updated immediately.
I am trying to think of a way to do this.
The idea is to send an UPDATE query to the server's database, but I'm stuck on finding a way to detect the change and update immediately.
Alternatively, I could update when a button is clicked after many rows have been edited, but then I would have to store all the changes, so I think the first option is better.
Any advice would be great!
Thanks ahead!
I agree with you; I think the first option is the better one. Here is a way you could achieve it: wait for the user to make the change (the table should be editable by default) and, when the user presses Enter, process the event. To do that, take a look at BeetDemGuise's answer.
After the Enter key has been pressed, a signal will be emitted, and you can connect it to a slot function that will look at the current cell data and update it in the database, e.g. signal.connect(handle_signal). In your handle_signal(), you can get the current text (mytable.currentItem().text()) and then make the change in the database. If you're using SQLAlchemy, it will look something like this:
table = self.values.update()
self.engine.execute(
    table.values(value="[the current text]").where(self.values.c.id == id)
)
Of course, this will vary depending on what ORM you're using.
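For the Qt side of the wiring, a minimal sketch (with hypothetical names, assuming a QTableWidget called table whose first column holds the row's primary key) could look like this:

def on_item_changed(item):
    # column 0 is assumed to hold the row's primary key
    row_id = table.item(item.row(), 0).text()
    column_name = table.horizontalHeaderItem(item.column()).text()
    new_value = item.text()
    update_database(row_id, column_name, new_value)  # hypothetical: runs the UPDATE shown above

table.itemChanged.connect(on_item_changed)

Note that itemChanged also fires while you populate the table programmatically, so you may want to call table.blockSignals(True) while filling it and table.blockSignals(False) afterwards.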
Related
I am developing a GUI app that will supposedly be used by multiple users. In my app, I use a QAbstractTableModel to display an MS Access database (stored on a local server, accessed by several PCs) in a QTableView. I have developed everything I needed for single-user interaction, but now I'm moving to the step where I need to think about multi-user interaction.
For example, if user A changes a specific line, the instance of the app on user B's PC needs to update the changed line. Another example: if user A is modifying a specific line and user B also wants to modify it, user B needs to be notified that it is "already being modified, please wait", and once the modification from user A is done, user B needs to see this modification before any further interaction.
Today, because of the local nature of the MS Access database, I have to refresh the table view many times, based on user interaction, in order not to miss any database modification from other potential users. It is quite greedy in terms of performance and resources.
I was thinking about using Django to make the different app instances communicate with each other, but maybe I'm overthinking it and there are other solutions.
I don't know if that's clear; I'm available for more information!
Perhaps you could simply store a "lastUpdated" timestamp on the row. With each update, you update that timestamp.
Now, when you submit an update, you include that timestamp, and if the timestamps don't match, you let the user know, and handle the conflict on the frontend (Perhaps a simple "overwrite local copy, or force update and overwrite server copy" option).
That's a simple and robust solution, but if you don't want users wasting time writing updates for old rows, you could use WebSockets to communicate from a server to any clients with that row open for editing, and let them know that the row has been updated.
If you want to "lock" rows while a row is being edited, you could simply store an "inUse" boolean and have users check its value before continuing.
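As a rough sketch of the timestamp check (hypothetical table and column names, plain DB-API style), the UPDATE only goes through if the row still carries the timestamp the client last saw:

cursor.execute(
    "UPDATE my_table "
    "SET value = ?, lastUpdated = ? "
    "WHERE id = ? AND lastUpdated = ?",
    (new_value, now, row_id, last_seen_timestamp),
)
if cursor.rowcount == 0:
    # someone else updated the row first: let the user choose how to resolve it
    handle_conflict(row_id)
else:
    connection.commit()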
Usually, when using an MVC pattern (which is what QAbstractTableModel + QTableView is), the responsibility for updating the view should lie with the model itself, i.e. it's the model that should notify the view that something changed.
It seems that QAbstractTableModel has a dataChanged signal that gets emitted on data changes.
I suggest connecting it to your view's refresh slot as done here.
In this way you avoid the need for another moving part/infrastructure component (Django).
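As a minimal sketch (hypothetical names, PyQt-style), connecting that signal might look like this:

def on_data_changed(top_left, bottom_right, roles=()):
    # react to the change, e.g. refresh the affected rows in the view
    for row in range(top_left.row(), bottom_right.row() + 1):
        refresh_row(row)   # hypothetical refresh slot

model.dataChanged.connect(on_data_changed)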
I don't have much knowledge of databases, but I wanted to know if there is any technique by which, when I update or insert a specific entry in a table, my Python application gets notified, so I can then see what was updated and update that particular row in the data stored in the session or some temporary storage.
I need to handle data filter and sort calls again and again, so I don't want to fetch the whole data set from SQL every time; I decided to keep it local and process it from there. But I was worried that in the meantime the DB might be updated, and I could be passing the same old data to the filter requests.
Any suggestions?
The RDBMS will only be updated through your own program's methods or functions, so you can simply print to the console or log inside those.
If you want to track what was updated, modified, or deleted, you would have to build another program that is able to track the logs of the RDBMS.
Thanks.
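If all writes really do go through your own code, a minimal sketch of that idea (hypothetical names, plain DB-API style) is to funnel every UPDATE through one function that also notifies the rest of the application:

def update_row(conn, row_id, new_values, listeners):
    cur = conn.cursor()
    cur.execute(
        "UPDATE items SET name = ?, price = ? WHERE id = ?",
        (new_values["name"], new_values["price"], row_id),
    )
    conn.commit()
    # tell interested parts of the app (e.g. the session cache) what changed
    for listener in listeners:
        listener(row_id, new_values)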
I'm looking for a way to constantly check my database (MySQL) for new entries. Once a new entry is committed, I want to output it on a web page using Flask.
Since the process takes time to finish, I would like to give users the impression that it took only a few seconds to retrieve the data.
For now I'm waiting for the whole process to finish before giving the user the whole result, but I would prefer to update the result page every time a new entry is added to the DB. So, for example, the first entry is added to the DB and the user can immediately see it on the web page; then a second entry is added and the user can now see both the first and the second entries, and so on. I don't know if this has to come from Flask or some other way.
Any idea?
You can set MySQL to log all statements to the General Query Log and monitor that file for changes (for example via Watchdog or PyNotify). Once the file changes, you can parse the new log entries and get your signal. This way you avoid polling for changes.
The better way would of course be to send the signal yourself while storing the data in the database.
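A minimal sketch of the file-watching side with Watchdog (the log path is an assumption, so check your MySQL configuration, and handle_new_log_entries is a hypothetical parser):

from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

LOG_PATH = "/var/log/mysql/mysql.log"   # assumption: where the general query log lives

class QueryLogHandler(FileSystemEventHandler):
    def on_modified(self, event):
        if event.src_path == LOG_PATH:
            handle_new_log_entries()    # hypothetical: read and parse the appended lines

observer = Observer()
observer.schedule(QueryLogHandler(), path="/var/log/mysql")
observer.start()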
I want all registered users to have the ability to change existing data.
But to make sure they don't mess with it, I want the ability to check the new data before committing it to my DB.
How can I do this as nicely as possible, so that I and all the admins only need one click to reject/accept updated data?
This job can be implemented with workflow systems such as viewflow or GoFlow (among others); this way, added/changed data is saved in the database but waits to be confirmed by the workflow's master actors.
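If you'd rather not pull in a full workflow engine, the underlying pattern can be sketched with a pending-edits table (hypothetical names, plain DB-API style): edits are stored as pending, and an admin applies or discards them with one action.

def submit_edit(conn, user_id, row_id, proposed_value):
    # the user's change is recorded but not applied yet
    cur = conn.cursor()
    cur.execute(
        "INSERT INTO pending_edits (user_id, row_id, proposed_value, status) "
        "VALUES (?, ?, ?, 'pending')",
        (user_id, row_id, proposed_value),
    )
    conn.commit()

def review_edit(conn, edit_id, accept):
    # one admin click either applies the proposed value or discards it
    cur = conn.cursor()
    if accept:
        cur.execute(
            "UPDATE main_table SET value = (SELECT proposed_value FROM pending_edits WHERE id = ?) "
            "WHERE id = (SELECT row_id FROM pending_edits WHERE id = ?)",
            (edit_id, edit_id),
        )
    cur.execute(
        "UPDATE pending_edits SET status = ? WHERE id = ?",
        ("accepted" if accept else "rejected", edit_id),
    )
    conn.commit()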
Google App Engine erases everything in my table when I use the put statement. I don't want it to do that; it makes for more code to have to re-put everything back into the table every time something is added.
Basically the issue is that the put statement erases everything. Is there a way to keep what I don't want to update?
Here is the code ((Python, web2py)):
biography2 = bayside(key_name='bayside', Biography=form_biography.vars.one)
biography2.put()
redirect(URL("b1", "bayside"))
The put statement will update the biography under the bayside table, but it erases everything else in that table (genre, songs, etc.). I want it to keep the other table elements and only update the biography. Is that possible? Right now I have had to resort to a hack that updates all table elements when I really just want to update one. It is very frustrating and makes for a ton of extra code.
You need to get the entity from the datastore first. Then, you can modify the entity and put it back into the datastore.
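A minimal sketch of that get-then-update pattern for the code above (assuming bayside is a db.Model and the entity was stored under that key name):

biography2 = bayside.get_by_key_name('bayside')   # fetch the existing entity
if biography2 is None:
    biography2 = bayside(key_name='bayside')      # first save: nothing to preserve yet
biography2.Biography = form_biography.vars.one    # change only the Biography property
biography2.put()                                  # the other properties are left untouched
redirect(URL("b1", "bayside"))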
To me it looks like you are overwriting an existing entity instead of getting and updating the properties of the existing one.
You should take a look at the docs:
https://developers.google.com/appengine/docs/python/datastore/entities#Updating_an_Entity