Google App Engine erases everything in my table when I use the put statement. I don't want it to do that; it forces me to write extra code to put everything back into the table every time something is added.
Basically, the issue is that the put statement erases everything. Is there a way to preserve the fields I don't want to update?
Here is the code (Python, web2py):
biography2 = bayside(key_name='bayside', Biography=form_biography.vars.one)
biography2.put()
redirect(URL("b1", "bayside"))
The put statement updates the biography under the bayside table, but it erases everything else in that table (genre, songs, etc.). I want it to keep the other fields and only update the biography. Is that possible? Right now I have resorted to a hack that updates every field when I really just want to update one. It is very frustrating and makes for a ton of extra code.
You need to get the entity from the datastore first. Then, you can modify the entity and put it back into the datastore.
It looks to me like you are overwriting the existing entity instead of fetching it and updating its properties.
You should take a look at the docs:
https://developers.google.com/appengine/docs/python/datastore/entities#Updating_an_Entity
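For example, here is a minimal sketch of that get-then-update pattern, assuming bayside is a db.Model subclass as the question's code suggests:

biography2 = bayside.get_by_key_name('bayside')  # fetch the stored entity
if biography2 is None:
    biography2 = bayside(key_name='bayside')     # create it only if missing

biography2.Biography = form_biography.vars.one   # change just this one property
biography2.put()                                 # the other properties are kept
redirect(URL("b1", "bayside"))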
I am looking for a more efficient way to update a bunch of model objects. Every night I have background jobs creating NCAABGame objects from an API once the scores are final.
In the morning I have to update all the fields in the model with the stats that the API did not provide.
As of right now, I get the stats formatted from an Excel file, and I copy and paste each update and run it like this:
NCAABGame.objects.filter(
    name__name='San Francisco', updated=False
).update(
    field_goals=38,
    field_goal_attempts=55,
    three_points=11,
    three_point_attempts=24,
    ...
)
The other day there were 183 games, and most days there are between 20 and 30, so doing it this way is very time-consuming. I've looked into bulk_update and a few other things, but I can't really find a solution. I'm sure there is something simple that I'm just not seeing.
I appreciate any ideas or solutions you can offer.
If you need to update each object that gets created via the API manually anyway, I would not even bother going through Django. Load your games from the API directly into Excel, make your edits there, and save the result as a CSV file. Then add the CSV directly into the database table, unless there is a specific reason the objects must be created via Django. You can of course do that with something like the code below, which could also be adapted to your current update-based method, but then you would first need to retrieve the correct pk of the object you want to update.
import csv

with open("my_data.csv", 'r') as my_data_file:
    reader = csv.reader(my_data_file)
    for row in reader:
        # get_or_create returns a tuple. 'created' is a boolean that indicates
        # whether a new object was created, with game holding the object that
        # was either retrieved or created.
        game, created = NCAABGame.objects.get_or_create(
            name=row[0],
            field_goals=row[1],
            field_goal_attempts=row[2],
            ...,
        )
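Since the question mentions bulk_update, here is a hedged sketch of adapting this to the update case with Django's bulk_update (available since Django 2.2). The CSV layout, the name__name lookup, and the field names are assumptions carried over from the question:

import csv

from myapp.models import NCAABGame  # hypothetical import path

games = []
with open("my_data.csv", 'r') as my_data_file:
    for row in csv.reader(my_data_file):
        # Assumes each row starts with the team name used in the question's
        # filter, and that exactly one un-updated game matches it.
        game = NCAABGame.objects.get(name__name=row[0], updated=False)
        game.field_goals = row[1]
        game.field_goal_attempts = row[2]
        game.updated = True
        games.append(game)

# One batched UPDATE instead of one query per game.
NCAABGame.objects.bulk_update(
    games, ['field_goals', 'field_goal_attempts', 'updated']
)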
In our DynamoDB database, we have a table that usually contains thousands of junk items created by test data, and we clean it up once in a while.
But there is one specific item that we don't want to delete, and when we do a select-all-and-delete cleanup, that one gets deleted as well.
Is there a way, in the table, to mark that ID and stop it from getting deleted? Or, if someone comes and wants to delete everything, to have it delete everything except that one?
I can think of two options:
Add a policy, to anyone (or any role) who might perform this action, that denies permission to delete that item. You can accomplish this as described in "Specifying Conditions: Using Condition Keys", using the dynamodb:LeadingKeys condition key (a sketch follows after this list).
Add a stream handler to your table, and any time the record is deleted you can automatically add it back (also sketched below).
The first option is probably best, but you would need to be sure it is always attached to the appropriate users/roles. You also need to handle the error you will get when you try to delete the record you aren't allowed to delete.
The second option removes that worry, but it comes with the overhead of a Lambda running every time you create, update, or delete a record in the table (with some batching, so not on EVERY change). It also opens up a brief window during which the record is deleted, so if it's important that the record NEVER be missing, this isn't a viable option.
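A hedged sketch of the first option, attaching an inline deny policy with boto3; the account ID, table name, role name, and protected partition key value are all placeholders, not values from the question:

import json
import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Action": "dynamodb:DeleteItem",
        "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/MyTable",
        "Condition": {
            # Deny any delete whose partition key matches the protected item.
            "ForAllValues:StringEquals": {
                "dynamodb:LeadingKeys": ["PROTECTED_ID"]
            }
        }
    }],
}

boto3.client("iam").put_role_policy(
    RoleName="cleanup-role",
    PolicyName="deny-delete-protected-item",
    PolicyDocument=json.dumps(policy),
)

And a minimal sketch of the second option, assuming the table's stream is configured to emit old images and is wired to this Lambda; the table name, the 'id' attribute, and the protected key are again placeholders:

import boto3
from boto3.dynamodb.types import TypeDeserializer

TABLE_NAME = "MyTable"
PROTECTED_ID = "special-item"

deserializer = TypeDeserializer()
table = boto3.resource("dynamodb").Table(TABLE_NAME)

def handler(event, context):
    for record in event["Records"]:
        if record["eventName"] != "REMOVE":
            continue
        old_image = record["dynamodb"].get("OldImage", {})
        if old_image.get("id", {}).get("S") == PROTECTED_ID:
            # Rebuild the plain item from the stream image and restore it.
            item = {k: deserializer.deserialize(v) for k, v in old_image.items()}
            table.put_item(Item=item)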
The only way I can find to add new data to a TinyDB table is with the table.insert() method. However, this appends the entry to the end of the table, while I would like to maintain the sequence of entries and sometimes need to insert at an arbitrary index in the middle of the table. Is there no way to do this?
There is no way to do exactly what you are asking. Normally, the default index tracks insertion order, so when you add data it goes at the end. If you need to maintain a certain order, you can add a new property to handle that case and retrieve with a sort on that property.
If you truly want to insert at a specific ID, you would need to add some logic to cascade the documents down. The logic would flow as:
Insert a new record that is equal to the last record.
Then go backwards and cascade the records down into the newly opened location.
Stop when you reach the location you need, and update that record with what you want to insert, using its ID.
The performance would drag, since you have to shift the records down. There are other ways to maintain the list; it is similar to inserting a record in the middle of an array, and similar methods would apply here. A sketch of the sort-key approach follows. Good luck!
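A minimal sketch combining both ideas, assuming every record carries an explicit seq field; insert_at and the field names are illustrative, not part of TinyDB's API:

from tinydb import TinyDB

db = TinyDB('db.json')
table = db.table('items')

def insert_at(table, seq, record):
    # Shift every record at or after the target slot down by one.
    for doc in table.all():
        if doc['seq'] >= seq:
            table.update({'seq': doc['seq'] + 1}, doc_ids=[doc.doc_id])
    record['seq'] = seq
    table.insert(record)

insert_at(table, 2, {'name': 'new entry'})

# Read back in logical order, regardless of physical insertion order.
ordered = sorted(table.all(), key=lambda doc: doc['seq'])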
I don't have much knowledge of databases, but I wanted to know whether there is any technique by which, when I update or insert a specific entry in a table, my Python application gets notified, so that it can see what changed and update that particular row in the data stored in the session or some temporary storage.
I need to run filter and sort calls on the data again and again, so I don't want to fetch the whole dataset from SQL each time; I decided to keep it locally and process it from there. But I am worried that in the meantime the database might be updated, and I would be passing stale data to the filter requests.
Any suggestions?
An RDBMS will only be updated through your own program's methods or functions, so you can just print to the console or log inside of them.
If you want to track what was updated, modified, or deleted by anything else, you have to build another program that is able to track the logs for the RDBMS.
Thanks.
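As a rough sketch of that first suggestion, assuming every write goes through your own code: funnel each UPDATE through one function that also notifies the rest of the application, so the locally cached copy can be refreshed. The sqlite3 backend, table name, and callback scheme are all illustrative:

import sqlite3

listeners = []  # callbacks interested in row changes

def on_change(callback):
    listeners.append(callback)

def update_row(conn, row_id, **fields):
    assignments = ', '.join(f'{column} = ?' for column in fields)
    conn.execute(
        f'UPDATE mytable SET {assignments} WHERE id = ?',
        (*fields.values(), row_id),
    )
    conn.commit()
    # Tell the application which row changed so it can refresh its copy.
    for callback in listeners:
        callback(row_id, fields)

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE mytable (id INTEGER PRIMARY KEY, name TEXT)')
conn.execute("INSERT INTO mytable VALUES (1, 'old')")
on_change(lambda row_id, fields: print('row', row_id, 'changed:', fields))
update_row(conn, 1, name='new')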
I have a program that stores my database data in a QTableWidget.
I would like to add an event listener so that whenever the user edits any column in a row,
the data on the server is updated immediately.
I am trying to think of a way of doing this.
The idea is to send an UPDATE query to the server's database, but I am stuck on finding a way to detect the change and update immediately.
Alternatively, I could update when a button is clicked after many rows have been edited. But then I would have to store all the changes, so I think the first option is better.
Any advice would be great!
Thanks ahead!
I agree with you; I think the first option is the better one. Here is a way you could achieve it: wait for the user to make the change (the table by default should be editable) and, when the user presses Enter, process the event. To do that, take a look at BeetDemGuise's answer.
After the Enter key has been pressed, a signal will be emitted, and you can connect it to a slot function that looks at the current cell data and updates it in the database, e.g. signal.connect(handle_signal). In your handle_signal(), you can get the current text (mytable.currentItem().text()) and then make the change in the database. If you're using SQLAlchemy, it will look something like:
stmt = self.values.update()
self.engine.execute(
    stmt.values(value="[the current text]").where(self.values.c.id == id)
)
Of course, this will vary depending on what ORM you're using.
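On the Qt side, here is a minimal PyQt5 sketch of the first option, assuming the first column holds each row's database id; the column names, the people table, and the sqlite3 backend are illustrative:

import sqlite3
from PyQt5.QtWidgets import QApplication, QTableWidget, QTableWidgetItem

COLUMNS = ['id', 'name', 'email']  # assumed schema

app = QApplication([])
conn = sqlite3.connect('app.db')
table = QTableWidget(0, len(COLUMNS))
table.setHorizontalHeaderLabels(COLUMNS)

def handle_item_changed(item):
    # Fires after the user finishes editing a cell (e.g. presses Enter).
    row_id = table.item(item.row(), 0).text()
    column = COLUMNS[item.column()]
    conn.execute(
        f'UPDATE people SET {column} = ? WHERE id = ?',
        (item.text(), row_id),
    )
    conn.commit()

table.itemChanged.connect(handle_item_changed)

# Block signals while populating so programmatic inserts don't fire UPDATEs.
table.blockSignals(True)
table.insertRow(0)
for col, text in enumerate(['1', 'Alice', 'alice@example.com']):
    table.setItem(0, col, QTableWidgetItem(text))
table.blockSignals(False)

table.show()
app.exec_()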