Tracking changes in SQLAlchemy model and getting json_body user_id - python

I am using code like the below, from the post Tracking model changes in SQLAlchemy. The only issue I currently have is that my app gets the end user's username from a token, and I need to pass that username into this function so it is logged with the changes. Is there a better method, or do I just need to add a last_modified_by column to my tables and scrape that with this listener on update?
@event.listens_for(cls, 'before_update')
def before_update(mapper, connection, target):
    state = db.inspect(target)
    changes = {}
    for attr in state.attrs:
        hist = state.get_history(attr.key, True)
        if not hist.has_changes():
            continue
        # hist.deleted holds the old value, hist.added holds the new value
        changes[attr.key] = hist.added
    # now `changes` maps attribute keys to new values
Or is there some way to pass this information in from my view?

After sleeping on it, I came up with a mixin, similar to what is on this page: http://francesco.pischedda.info/posts/logging-model-changes-with-sqlalchemy-listeners.html
I pass in my request object so that my target object carries the request with it.
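Roughly, the idea looks like this (a sketch; the names are illustrative, not the exact code from the linked post):
from sqlalchemy import event

class AuditableMixin(object):
    """Lets a view stash the current request on an instance before flush."""
    def set_request(self, request):
        # underscore prefix so SQLAlchemy doesn't treat this as a mapped column
        self._request = request

@event.listens_for(SomeModel, 'before_update')  # SomeModel is a stand-in for your mapped class
def before_update(mapper, connection, target):
    request = getattr(target, '_request', None)
    username = getattr(request, 'username', None) if request else None
    # ...collect `changes` as in the listener above, and log `username` alongside them
In the view, something like server.set_request(request) before the commit lets the listener read the username off the target.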

Related

pre_save signal using tastypie api not allowing me to access "instance" field

I'm working on a small Django project in which I use pre_save signals to update a table that stores the cumulative value of a certain quantity: whenever a Transaction is created or modified, the corresponding value in the table is updated. If I add a transaction manually from the admin page everything works fine, but today I tried to create a new Transaction through a POST request using the tastypie-generated API. The problem is that when my update_total_if_changed function is called by the signal, the instance parameter is /api/v1/transaction/ instead of the actual Python object, so I get "Transaction has no FieldName." since the instance points to the tastypie endpoint instead of the newly created object.
Below you can see the code of my signal
@receiver(pre_save, sender=Transaction)
def update_total_if_changed(sender, instance, **kwargs):
    try:
        obj = sender.objects.get(pk=instance.pk)
    except sender.DoesNotExist:  # new transaction
        tw, new = TotalWaste.objects.get_or_create(depot=instance.depot, waste=instance.waste)
        tw.total += instance.quantity
        tw.save()
    else:
        if obj.quantity != instance.quantity:  # field has changed
            tw, new = TotalWaste.objects.get_or_create(depot=instance.depot, waste=instance.waste)
            tw.total = tw.total + instance.quantity - obj.quantity
            tw.save()
I figured out the problem: I forgot to define the specific Resource, so the endpoint was not seen as a set of objects but just as a link that didn't point to anything, and therefore I could not access the needed field.
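For reference, the missing piece was roughly this (a sketch; the import path and names are mine, not from the question):
from tastypie.resources import ModelResource
from myapp.models import Transaction  # hypothetical app path

class TransactionResource(ModelResource):
    class Meta:
        queryset = Transaction.objects.all()
        resource_name = 'transaction'
With the resource defined and registered (e.g. via Api().register(TransactionResource())), tastypie deserializes the POST body into a real Transaction instance, so the pre_save receiver gets the object rather than the endpoint URL.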

Data Model persistence between requests

I'm building an application using Flask and Flask-SQLAlchemy.
In the application I use database models written in the SQLAlchemy declarative style; let's say I have a table called Server.
The application, by design, asks the user (via WTForms) to set values for the fields of the Server table across different pages (views), and I need to save the instance to the database in the last view.
My problem is: I have two 'circular' views and would like to store the instances created in the first view directly in the database session, so I can query the session in the second view and commit the result only in the last view (the end of the loop), as the pseudocode below shows (very simplified; it makes no sense in this form, but it illustrates the concept):
def first_view():
    form = FormOne()  # prompt the first form
    if form.validate_on_submit():
        # the form validates; create the instance in SQLAlchemy
        server = Server()  # database model
        server.hostname = form.hostname.data
        loop_count = form.repetition.data
        session['loop'] = loop_count
        db.session.add(server)  # add the object to the session
        # but don't commit: the next view needs to know about this new
        # object, and a third view commits based on the user input
        return redirect(url_for('second_view'))
    return render_template("first_view.html", form=form)

def second_view():
    form = FormTwo()  # prompt the second form
    if form.validate_on_submit():
        hostname_to_search = form.hostname.data  # get some other input
        # use the input to query the session (different Server instances
        # can appear here, depending on the input from the first view)
        rslt = db.session.query(Server).filter(Server.hostname == hostname_to_search).all()
        # the session is empty and doesn't contain the instance created
        # in the previous view... <-- :(
        if session['loop'] <= 0:
            # end the loop
            return redirect(url_for('commit_view'))
        else:
            loop_count = session.pop('loop', 1)
            session['loop'] = loop_count - 1
            # go back to the first page to add another server instance
            return redirect(url_for('first_view'))
    return render_template("second_view.html", form=form)

def commit_view():
    # loop finished; now commit the final instances
    db.session.commit()  # <-- here I'd save the data, but db.session is empty
    return 'DONE!'
But it SEEMS that the session in Flask-SQLAlchemy is local to the request, so between one view and the next db.session is reset/empty.
The first solution that came to mind is to store the server objects' values in flask.session (in JSON format), but that means jsonifying and parsing flask.session in every view to rebuild the objects: I lose the query power of the database and have to, for example, manually check whether the hostname input by the user is already present among the previously created server objects (this fallback is sketched below).
My question is: is it possible, and good practice, to keep the db session 'open' between different views?
How can I implement this?
Is it convenient?
Is it thread-safe?
Or am I using the wrong approach to solve the problem? (Of course I could do a single page, but the real case is much more complex and needs to be structured across different pages.)
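For concreteness, the flask.session fallback mentioned above could look like this (a sketch reusing the names from the pseudocode; only hostname is carried over):
def first_view():
    form = FormOne()
    if form.validate_on_submit():
        servers = session.get('servers', [])
        servers.append({'hostname': form.hostname.data})  # plain, JSON-safe dict
        session['servers'] = servers  # reassign so Flask marks the session modified
        return redirect(url_for('second_view'))
    return render_template("first_view.html", form=form)

def second_view():
    form = FormTwo()
    if form.validate_on_submit():
        # a plain scan over the stored dicts replaces the DB query
        matches = [s for s in session.get('servers', [])
                   if s['hostname'] == form.hostname.data]
        ...

def commit_view():
    # build and persist the real objects within a single request
    for data in session.pop('servers', []):
        db.session.add(Server(**data))
    db.session.commit()
    return 'DONE!'
This trades SQL querying for list scans, as noted above, but it is thread-safe because nothing is shared between requests.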

Appengine ndb: differentiating between creating and editing?

I have the following code in models.py:
class Order(ndb.Model):
    created_at = ndb.DateTimeProperty(auto_now_add=True)
    updated_at = ndb.DateTimeProperty(auto_now=True)
    name = ndb.StringProperty()
    due_dates = ndb.DateProperty(repeated=True)

class Task(ndb.Model):
    created_at = ndb.DateTimeProperty(auto_now_add=True)
    updated_at = ndb.DateTimeProperty(auto_now=True)
    order = ndb.KeyProperty(required=True)
    order_updated_at = ndb.DateTimeProperty(required=True)
    ...
When an order is created, 6 tasks will be created. Currently, I have the following method:
def _post_put_hook(self, future):
    # delete old tasks
    tbd = Task.query(Task.order == self.key).fetch(keys_only=True)
    ndb.delete_multi(tbd)
    # generate new tasks
    for entry in self.entries:
        pt = entry.producetype.get()
        # now create Tasks and store them in the datastore
        Task(order=self.key,
             order_updated_at=self.updated_at,
             order_entry_serial=entry.serial,
             date=dt_sowing,
             action=TaskAction.SOWING).put()
Now I am changing the way Order and Task are created.
I want to create Tasks when an Order is created, and I want to delete an Order's tasks when the order is modified.
Unfortunately, ndb's API states:
The Datastore API does not distinguish between creating a new entity
and updating an existing one. If the object's key represents an entity
that already exists, the put() method overwrites the existing entity.
You can use a transaction to test whether an entity with a given key
exists before creating one. See also the Model.get_or_insert() method.
I don't really understand how Model.get_or_insert can be applied in my scenario.
Note that I can't use _pre_put_hook because my Tasks need to reference their Order via its key.
Ignore get_or_insert(); it returns an entity in any case and doesn't help you here. You need to check whether the tasks exist in the datastore. I would wrap a get() or get_multi() call in a try/except: if the entities exist, delete them, else create 6 new Task entities with put_multi() (sketched below).
Edit: you need timestamps to check for pre-existence. Look at DateTimeProperty and the auto_now_add/auto_now options.
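A minimal sketch of that check, using only the fields defined in the question (the real Task has more):
def _post_put_hook(self, future):
    existing = Task.query(Task.order == self.key).fetch(keys_only=True)
    if existing:
        # this put() modified an existing Order: drop its old tasks
        ndb.delete_multi(existing)
    else:
        # first put() for this Order: create its six tasks in one batch
        ndb.put_multi([Task(order=self.key, order_updated_at=self.updated_at)
                       for _ in range(6)])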

Model save update only specific fields

I'm trying to write a webservice which performs inserts or updates.
The request is a POST whose headers contain the table name, the column names, and the value to set for each column; I parse the request headers and build a parameter dict:
def handel_request(request):
    if request.method == "POST":
        param_dict = formParmDict(request)
        ## if insert: param_dict["Model"] is {'pk': 1, 'field1': 100, 'field2': 200}
        ## if update: param_dict["Model"] is {'pk': 1, 'field1': 100}
        Model(**param_dict["Model"]).save()  ## if update, this sets field2 to null
        return HttpResponse()
    else:
        return HttpResponseBadRequest()
This works fine when .save() performs an insert.
But in the case of an update, i.e. when param_dict["Model"] contains {'pk': 1, 'field1': somevalue}, it sets all the fields other than the ones specified in param_dict["Model"] to null. Why is that? Am I doing something wrong? Isn't save supposed to update only the specified fields?
This is not how you're supposed to update.
Model(**param_dict["Model"]).save()
You're creating a new instance with the same id. Instead, you should get the instance, and then update it appropriately.
m = Model.objects.get(id=param_dict['id'])
m.field = param_dict['some_field']
m.save()
Or, you can use the Manager update method:
Model.objects.filter(id=param_dict['id']).update(**param_dict['Model'])
There's also the get_or_create method if you're not sure whether or not the record already exists.
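For example (assuming the dict shape from the question):
params = dict(param_dict['Model'])
pk = params.pop('pk', None)
obj, created = Model.objects.get_or_create(pk=pk, defaults=params)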
You can try using a REST framework, like tasty-pie or django-rest-framework, which might alleviate some problems you're having.
Edit:
A brief summary of how save works in Django, which is what I meant about whether an INSERT or an UPDATE is happening. Unless your post_data dict contains values for all of the fields, this overwriting will happen; read the documentation on how save works for a more thorough understanding of how Django behaves.
So, what is happening in your case is this:
dict = {'id': 1, 'field1': 'my_value'}
m = Model(**dict)
m.id      # 1
m.field1  # 'my_value'
m.field2  # None (because you haven't set it, it defaults to None)
m.save()  # UPDATEs the existing instance with id 1 with ALL of the values of `m`
So, you're saving an instance that contains None values. That's why I'm suggesting you do a get, so that all the correct values are filled, before saving to the database.
Maybe you should use a function like this:
def insert_or_update(param_dict):
    pk = param_dict.get('pk', None)
    if pk:
        Model.objects.filter(pk=pk).update(**param_dict)
    else:
        Model(**param_dict).save()
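For example, with the dict shapes from the question (hypothetical values):
insert_or_update({'pk': 1, 'field1': 100})        # updates only field1 on row 1
insert_or_update({'field1': 100, 'field2': 200})  # inserts a new row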

Keeping an Audit Trail of any/all Python Database objects in GAE

I'm new to Python. I'm trying to figure out how to emulate an existing application I've coded in PHP and MS-SQL, and re-create its basic back-end functionality on Google App Engine.
One of the things I'm trying to do is emulate the current behavior of certain tables I have in MS-SQL: an Insert/Delete/Update trigger that inserts a copy of the current (pre-change) record into an audit table and stamps it with a date and time. I can then query this audit table later to examine the history of changes a record went through.
I've found the following code here on stackoverflow:
class HistoryEventFieldLevel(db.Model):
    # parent: you don't have to define this
    date = db.DateProperty()
    model = db.StringProperty()
    property = db.StringProperty()  # name of the changed property
    action = db.StringProperty(choices=(['insert', 'update', 'delete']))
    old = db.StringProperty()  # old value for the field, empty on insert
    new = db.StringProperty()  # new value for the field, empty on delete
However, I'm unsure how this code can be applied to all objects in my new database.
Should I create get() and put() functions for each of my objects, and then in the put() function I create a child object of this class, and set its particular properties?
This is certainly possible, albeit somewhat tricky. Here are a few tips to get you started:
Overriding the class's put() method isn't sufficient, since entities can also be stored by calling db.put(), which won't call any methods on the class being written.
You can get around this by monkeypatching the SDK to call pre/post call hooks, as documented in my blog post here.
Alternately, you can do this at a lower level by implementing RPC hooks, documented in another blog post here.
Storing the audit record as a child entity of the modified entity is a good idea, and means you can do it transactionally, though that would require further, more difficult changes.
You don't need a record per field. Entities have a natural serialization format, Protocol Buffers, and you can simply store the entity as an encoded Protocol Buffer in the audit record. If you're operating at the model level, use model_to_protobuf to convert a model into a Protocol Buffer.
All of the above are far more easily applied to storing the record after it's modified, rather than before it was changed. This shouldn't be an issue, though - if you need the record before it was modified, you can just go back one entry in the audit log.
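As a rough sketch of the Protocol Buffer approach (the AuditRecord model and write_audit helper are illustrative names, not part of the SDK):
from google.appengine.ext import db
from google.appengine.datastore import entity_pb

class AuditRecord(db.Model):
    date = db.DateTimeProperty(auto_now_add=True)
    data = db.BlobProperty()  # the audited entity as an encoded Protocol Buffer

def write_audit(entity):
    # child of the modified entity, so it can be written in the same transaction
    AuditRecord(parent=entity.key(),
                data=db.Blob(db.model_to_protobuf(entity).Encode())).put()

# Later, to reconstruct a historical version:
# record = AuditRecord.all().ancestor(entity.key()).order('-date').get()
# old_entity = db.model_from_protobuf(entity_pb.EntityProto(record.data))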
I am a bit out of touch with GAE and don't have the SDK with me to test, so here are some guidelines to give you a hint of what you might do:
Create a metaclass AuditMeta which you set on any model you want audited.
While creating a new model class, AuditMeta should copy the class under a new name with "_audit" appended, and should copy the attributes too, which is a bit tricky on GAE because the attributes are themselves descriptors.
Add a put method to each such class; on put, create an audit object for that class and save it. That way, for each row in tableA you will have a history in tableA_audit.
For example, a plain Python example (without GAE):
import new

class AuditedModel(object):
    def put(self):
        print "saving", self, self.date
        audit = self._audit_class()
        audit.date = self.date
        print "saving audit", audit, audit.date

class AuditMeta(type):
    def __new__(self, name, baseclasses, _dict):
        # create the model class, derived from AuditedModel
        klass = type.__new__(self, name, (AuditedModel,) + baseclasses, _dict)
        # create an audit class, a copy of klass
        # (we should copy the attributes properly instead of just passing _dict like this)
        auditKlass = new.classobj(name + "_audit", baseclasses, _dict)
        klass._audit_class = auditKlass
        return klass

class MyModel(object):
    __metaclass__ = AuditMeta
    date = "XXX"

# create an object
a = MyModel()
a.put()
output:
saving <__main__.MyModel object at 0x957aaec> XXX
saving audit <__main__.MyModel_audit object at 0x957ab8c> XXX
Read the audit trail code (only 200 lines) to see how they do it for Django.
