I see lots of apps using Form objects to validate data and then passing the data to the model, while putting absolutely no validation in the model. I feel it's better to put core validation in the model itself (e.g., no users under the age of 18, ever) to be run regardless of the context. In other words, I don't care how the user is being created (whether through web ui or command line), the core rules should always apply.
I'm using SQLAlchemy (within a Pyramid application), and I would like to define my core validation rules within the model in a way that my forms (WTForms) always respect the core rules defined in the model so that all data is consistent.
Is anybody else already doing this, or something similar?
Something similar to this PHP solution.
SQLAlchemy allows you to register listeners, which are invoked when certain events occur; for example, you can register a listener to be triggered when a model's field is modified. From the SQLAlchemy documentation:
Listeners have the option to return a possibly modified version of the
value, when the retval=True flag is passed to listen():
def validate_phone(target, value, oldvalue, initiator):
    "Strip non-numeric characters from a phone number"
    return re.sub(r'\D', '', value)

# setup listener on UserContact.phone attribute, instructing
# it to use the return value
listen(UserContact.phone, 'set', validate_phone, retval=True)
A validation function like the above can also raise an exception such
as ValueError to halt the operation.
So, as you see, you can either modify or reject values for certain fields at the model level.
I'm not sure whether your form library integrates with this feature, but it definitely shouldn't be too difficult to roll your own solution.
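As a sketch of rolling your own model-level rule, SQLAlchemy's `validates` decorator is a slightly higher-level way to express the same thing as a `set` listener. The `User`/`age` model below is illustrative (not from the question), and it encodes the "no users under 18, ever" rule from the original post:

```python
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base, validates

Base = declarative_base()

class User(Base):
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    age = Column(Integer)

    @validates("age")
    def check_age(self, key, value):
        # core rule enforced at the model level: no users under 18,
        # regardless of whether the value comes from a web form or a CLI
        if value is not None and value < 18:
            raise ValueError("users must be at least 18")
        return value
```

Because the validator fires on attribute assignment, it applies no matter how the object is created or modified, which is exactly the "core rules always apply" property the question asks for.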
Related
I want to create a new model that uses the https://developer.microsoft.com/en-us/graph/graph-explorer API as a data source, because I want to have additional info on the user.
Using a computed property on the model does not work, as it would query the API for each instance in the set.
So I want the model to relate to a new model that has the API as its data source.
I could not find anything on this topic, besides maybe abusing the from_db() method, if that even works.
It appears that what you're trying to do is cache data from an external API that relates to, and enriches, your user model data. If so, you can use a custom user model (instead of Django's default; this is a highly recommended practice anyway) and store the external API data in serialized form in a TextField attribute of that model (let's call it user_extras). For convenient access in your views, you can write model methods that serialize and deserialize this field.
The key challenge then is keeping user_extras fresh without doing something terrible performance-wise or hitting a constraint like an API call limit. As you said, we can't do API queries in computed properties (at least not synchronously). One option is a batch job/background task that regularly goes through your user database and updates user_extras in a controlled, predictable fashion.
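A minimal sketch of the serialize/deserialize convenience methods described above. The names (`user_extras`, `set_extras`, `get_extras`) are made up for illustration, and the plain class attribute stands in for what would be a `models.TextField(blank=True, default="")` on a real custom user model:

```python
import json

class UserExtrasMixin:
    """Mixin for a custom user model that stores external API data
    serialized in a text field named user_extras."""
    user_extras = ""  # stand-in for models.TextField(blank=True, default="")

    def set_extras(self, data):
        # serialize the dict fetched from the external API
        self.user_extras = json.dumps(data)

    def get_extras(self):
        # deserialize for convenient access in views/templates
        return json.loads(self.user_extras) if self.user_extras else {}
```

A background task can then call `set_extras()` on each user in batches, while views simply read `get_extras()` with no API call on the request path.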
In my Django application I have the following two models:
class Event(models.Model):
    capacity = models.PositiveSmallIntegerField()

    def get_number_of_registered_tickets(self):
        return EventRegistration.objects.filter(event__exact=self).aggregate(
            total=Coalesce(Sum('number_tickets'), 0))['total']

class EventRegistration(models.Model):
    time = models.DateTimeField(auto_now_add=True)
    event = models.ForeignKey(Event, on_delete=models.CASCADE)
    number_tickets = models.PositiveSmallIntegerField(validators=[MinValueValidator(1)])
I need the method get_number_of_registered_tickets() in several places in my application (e.g. template rendering), so I thought it made sense to put it on the model, also because I often heard it's good to have "fat models and lightweight views".
My problem now:
In order to prevent two people from registering for the event in parallel, I have to use locking. Example: let's say there's one ticket left to register for. Now two people are on my website and click "Register" simultaneously. Under unfortunate circumstances, it could happen that both requests are considered valid and I end up with more registrations than capacity.
I'm relatively new to Django, but reading through the docs, I thought that select_for_update() should be the solution, am I right here (I use PostgreSQL, so that should be supported)?
However, the docs also say that using select_for_update() is only valid within a transaction:
Evaluating a queryset with select_for_update() in autocommit mode on backends which support SELECT ... FOR UPDATE is a TransactionManagementError because the rows are not locked in that case. If allowed, this would facilitate data corruption and could easily be caused by calling code that expects to be run in a transaction outside of one.
My idea was to change my model method get_number_of_registered_tickets() to add select_for_update():

def get_number_of_registered_tickets(self):
    return EventRegistration.objects.select_for_update().filter(event__exact=self).aggregate(
        total=Coalesce(Sum('number_tickets'), 0))['total']
Different questions now:
Is using select_for_update() the right solution to my problem?
Does it mean that I cannot use the method get_number_of_registered_tickets() in different views/templates now, given that it seems to only work within a transaction? Do I have to violate DRY here and copy and paste the query with select_for_update() to another place in my code?
I tested it locally and Django does not raise the TransactionManagementError while in autocommit mode (not using any transactions). What could be the reason, or do I misunderstand something?
Doing select_for_update() on an EventRegistration queryset isn't the way to go. That locks the specified rows, but presumably the conflict you're trying to prevent involves creating new EventRegistrations. Your lock won't prevent that.
Instead you can acquire a lock on the Event. Something like:
class Event(models.Model):
    ...

    @transaction.atomic
    def reserve_tickets(self, number_tickets):
        # lock this Event row until the transaction commits
        list(Event.objects.filter(id=self.id).select_for_update())  # force evaluation
        if self.get_number_of_registered_tickets() + number_tickets <= self.capacity:
            # create the EventRegistration
            EventRegistration.objects.create(event=self, number_tickets=number_tickets)
        else:
            # handle the error, e.g. raise an exception
            raise ValueError("not enough tickets left")
Note that this uses the transaction.atomic decorator to make sure you are running inside a transaction.
Note that in a multiple-database environment, atomic and select_for_update must target the same database, otherwise it won't work:

with transaction.atomic(using='dbwrite'):
    Model.objects.using('dbwrite').select_for_update()...
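To see why the capacity check and the insert must both happen under the lock, here is a library-free sketch of the same pattern, with a `threading.Lock` standing in for the row lock that `select_for_update()` takes (the class and names are illustrative, not Django code):

```python
import threading

class Event:
    def __init__(self, capacity):
        self.capacity = capacity
        self.registrations = []          # each entry is a ticket count
        self._lock = threading.Lock()    # stands in for select_for_update()

    def reserve_tickets(self, n):
        with self._lock:
            # check and insert both happen while holding the lock, so two
            # concurrent requests can never both pass the capacity check
            if sum(self.registrations) + n <= self.capacity:
                self.registrations.append(n)
                return True
            return False
```

If the check ran outside the lock, two threads could both observe "one ticket left" and both append, which is exactly the overselling scenario from the question.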
select_for_update is a database feature (SELECT ... FOR UPDATE) that Django exposes on its querysets; the database (in your case Postgres) takes care of running the transactions reliably, in line with the ACID properties.
To me your approach seems correct. As for your last question, you can test it with a time.sleep delay: evaluate a queryset with select_for_update(), then put a time.sleep(10) on the next line; while that sleeps, hit the API to start another transaction, and you will be able to observe the TransactionManagementError.
I am attempting to combine Flask-Admin and a Flask S3 tool suite written by someone to store images to Amazon S3 simple storage. The tool I am specifically trying to use is the S3Saver functionality, and in the linked article the author says that to incorporate this tool, I would ideally use it on a Flask-Admin callback.
However, I cannot find a list of Flask-Admin callbacks anywhere. I have done my homework and have checked the entire docs and source code for these callbacks (Flask-Admin documentation).
In flask_admin.contrib.sqla I found some methods that somewhat resembled a callback, but it's not what I am looking for. I imagine it is something close to Rails's before_action and after_action. Can someone please point me in the right direction?
By the way, this is the actual quote from the article:
If you wanted to handle deleting files in the admin as well, you could (for example) use s3-saver, and hook it in to one of the Flask-Admin event callbacks
What callbacks are those?
Flask-Admin Callback Functions (as far as I know)
after_model_change(form, model, is_created)
# Called from create_model after successful database commit.

after_model_delete(model)
# Called from delete_model after successful database commit (if it has any meaning for a store backend).

on_form_prefill(form, id)
# Called from edit_view, if the current action is rendering the form rather than receiving client side input, after default pre-filling has been performed.

on_model_change(form, model, is_created)
# Called from create_model and update_model in the same transaction (if it has any meaning for a store backend).

on_model_delete(model)
# Called from delete_model in the same transaction (if it has any meaning for a store backend).
More info at http://flask-admin.readthedocs.org/en/latest/api/mod_model/ (search for "Called from")
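A self-contained sketch of how one of these hooks gets used. Here `ModelView` is a tiny stand-in for `flask_admin.contrib.sqla.ModelView` (which calls `on_model_delete` before deleting the row), and the S3 cleanup is reduced to recording the path; in a real app you would call your s3-saver delete routine there:

```python
class ModelView:
    """Stand-in for flask_admin.contrib.sqla.ModelView; the real class
    invokes the on_model_delete hook before deleting the row."""
    def delete_model(self, model):
        self.on_model_delete(model)
        # ... the real Flask-Admin deletes the row and commits here ...

    def on_model_delete(self, model):
        pass  # by default does nothing

class ImageView(ModelView):
    def __init__(self):
        self.removed = []

    def on_model_delete(self, model):
        # hook point: delete the stored file (e.g. via s3-saver) in the
        # same transaction as the row deletion
        self.removed.append(model["image_path"])
```

Overriding the hook on your own view subclass is the whole integration; Flask-Admin calls it for you whenever a delete goes through the admin.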
Here is the documentation for Flask-Admin events:
https://flask-admin.readthedocs.org/en/latest/api/mod_model/#flask_admin.model.BaseModelView.on_form_prefill
on_form_prefill(form, id) Perform additional actions to pre-fill the edit form.
Called from edit_view, if the current action is rendering the form
rather than receiving client side input, after default pre-filling has
been performed.
By default does nothing.
You only need to override this if you have added custom fields that
depend on the database contents in a way that Flask-admin can’t figure
out by itself. Fields that were added by name of a normal column or
relationship should work out of the box.
Parameters:
form – Form instance
id – id of the object that is going to be edited
on_model_change(form, model, is_created) Perform some actions before a model is created or updated.
Called from create_model and update_model in the same transaction (if
it has any meaning for a store backend).
By default does nothing.
Parameters:
form – Form used to create/update model
model – Model that will be created/updated
is_created – Will be set to True if model was created and to False if edited
on_model_delete(model) Perform some actions before a model is deleted.
Called from delete_model in the same transaction (if it has any
meaning for a store backend).
By default does nothing.
I want to do some custom actions before a user is created, and I thought of using the pre_save signal for that — and, in case one of those actions raises an exception, to stop the transaction, abort creating the user, etc.
Is this the way to go? Would it abort the save if something fails in this step (this is the required behavior)? I suspect so, but I couldn't find docs about it.
What is the best practice for getting the future user.id? From my understanding it doesn't exist yet in pre_save, but I need it as input for some of the extra custom actions.
Thanks
From the docs:
There's no way to tell what the value of an ID will be before you call
save(), because the value is determined by your database, not by
Django.
So if your pre-save processing requires the user.id, I am afraid this is not possible.
Here is the two-part answer:
Yes, raising an exception in the pre_save() signal will abort the call to the save() method. For example, I did this in my pre_save():
if SOME_TEST:
    raise exceptions.ParseError(detail="I reject your data. Here is why.")
Note: This was using the DRF exceptions, but I will assume you know how to raise whatever exception you prefer.
You could use the post_save() signal to get the id AFTER the save() method. But if that will not work for you, then below is my original explanation of some tactics to do what you want in the pre_save():
In the pre_save you can access User.objects.all() (if you import the User model, of course). If you are using the standard User model, then the User.id value is AUTO INCREMENT. So if you NEED to know what the value WILL be for the new user, just get the highest User.id currently in the database (i.e. the id of the last user created). Here is a simple query to do this:
last_user = User.objects.latest('id')
Then of course, last_user.id will give you the value of the id for the last user created and if you add one to that you will have the next user id. Of course, if you have a busy site you may have some problems with simultaneous user creation. The risk is that this value is received by two (or more) User creation attempts and one (or more) of them ends up wrong.
Of course, you can always set another field with primary_key=True, and it will replace the id field on the model. Then you can devise any sort of indexing mechanism you can dream up, such as a hash of some unique characteristic. For example, User.username has a unique constraint, so you could hash or hex-encode it into a numeric ID to pre-determine the User.id. Or you could leave the id field in place and set it manually in the pre_save by assigning a value to obj.id; it could be anything you want, even a hash value!
I'm not sure I can easily do this, but I need to automatically add a filter on a field for every query related to my model. I've added a boolean property "active" to my model, called Node.
For example
Node.query()
should return every node with the field Node.active set to True and ignore nodes with the active field set to False, without any other instructions.
Is it possible to override the function in some way, or something similar? I'm not really good with either Python or App Engine, so I'm not sure I can actually do this.
You should create a class method on your Node class that does the query for you, then always use that. E.g.

class Node(db.Model):
    # some properties and stuff

    @classmethod
    def active_nodes(cls):
        return cls.all().filter('active =', True)
Then always use Node.active_nodes().
This then forms part of your formal API that you always use. I use this approach extensively, rather than writing the same (and often more complex) query all over the place.
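The same "canned query" idea works regardless of ORM. Here is a library-free illustration of routing every call site through one classmethod so the `active == True` filter lives in a single place (the in-memory `_store` is just a stand-in for the datastore):

```python
class Node:
    _store = []  # stand-in for the datastore

    def __init__(self, name, active=True):
        self.name = name
        self.active = active
        Node._store.append(self)

    @classmethod
    def active_nodes(cls):
        # the "active == True" filter is defined in exactly one place;
        # every caller goes through this method instead of raw queries
        return [n for n in cls._store if n.active]
```

If the filtering rule ever changes (say, "active and not archived"), only this one method needs updating.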