Assuming I have a class called Customer, defined in SQLAlchemy to represent the customer table, I want to write a search method so that ...
results = Customer.search(query)
will return the results based on the method. Do I want to do this as a @classmethod?
@classmethod
def search(cls, query):
    # do some stuff
    return results
Can I use cls in place of DBSession.query?
DBSession.query(Customer).filter(...)
cls.query(Customer).filter(...)
To use
cls.query
you have to assign a query_property to your model classes!
You probably want to use this in other model classes as well, so you might want to do that in your model Base class somewhere in your model/__init__.py:
Base.query = Session.query_property()
Then you can simply write:
cls.query.filter(...)
Note that you don't specify the Object to query for anymore, that is automatically done by using the query_property mechanism.
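Putting the pieces together, here is a minimal, self-contained sketch of the whole pattern (the name column and the LIKE filter are placeholders I've assumed, not part of the question):
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import scoped_session, sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Session = scoped_session(sessionmaker())
Base = declarative_base()
Base.query = Session.query_property()

class Customer(Base):
    __tablename__ = 'customer'
    id = Column(Integer, primary_key=True)
    name = Column(String)

    @classmethod
    def search(cls, query):
        # cls.query is already bound to Customer,
        # so the class doesn't need to be named again
        return cls.query.filter(cls.name.like('%' + query + '%')).all()

results = Customer.search('smith')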
I just recently wrote some similar code, following this reference.
class Users(Base):
    __tablename__ = 'users'

    @classmethod
    def by_id(cls, userid):
        return Session.query(Users).filter(Users.id == userid).first()
So the answer to your first question appears to be "yes". Not sure about the second, as I didn't substitute cls for DBSession.
I have a model A and want to make subclasses of it.
class A(models.Model):
type = models.ForeignKey(Type)
data = models.JSONField()
def compute():
pass
class B(A):
def compute():
df = self.go_get_data()
self.data = self.process(df)
class C(A):
def compute():
df = self.go_get_other_data()
self.data = self.process_another_way(df)
# ... other subclasses of A
B and C should not have their own tables, so I decided to use the proxy attribute of Meta. However, I want there to be a table of all the implemented proxies.
In particular, I want to keep a record of the name and description of each subclass.
For example, for B, the name would be "B" and the description would be the docstring for B.
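For reference, the proxy declaration itself might look like this (a sketch; the docstring is what would feed desc):
class B(A):
    """Fetches data from source X and processes it."""

    class Meta:
        proxy = True  # B shares A's table, no table of its own

    def compute(self):
        df = self.go_get_data()
        self.data = self.process(df)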
So I made another model:
class Type(models.Model):
    # The name of the class
    name = models.CharField(max_length=100)
    # The docstring of the class
    desc = models.TextField()
    # A unique identifier, different from the Django ID,
    # that allows for smoothly changing the name of the class
    identifier = models.IntegerField(unique=True)
Now, I want it so when I create an A, I can only choose between the different subclasses of A.
Hence the Type table should always be up-to-date.
For example, if I want to unit-test the behavior of B, I'll need to use the corresponding Type instance to create an instance of B, so that Type instance already needs to be in the database.
Looking over on the Django website, I see two ways to achieve this: fixtures and data migrations.
Fixtures aren't dynamic enough for my use case, since the attributes literally come from the code. That leaves me with data migrations.
I tried writing one that goes something like this:
def update_results(apps, schema_editor):
    A = apps.get_model("app", "A")
    Type = apps.get_model("app", "Type")
    subclasses = get_all_subclasses(A)
    for cls in subclasses:
        id = cls.get_identifier()
        Type.objects.update_or_create(
            identifier=id,
            defaults=dict(name=cls.__name__, desc=cls.__doc__),
        )

class Migration(migrations.Migration):
    operations = [
        migrations.RunPython(update_results),
    ]
    # ... other stuff
The problem is, I don't see how to store the identifier within the class, so that the Django Model instance can recover it.
So far, here is what I have tried:
I have tried using the fairly new __init_subclass__ construct of Python. So my code now looks like:
class A:
    def __init_subclass__(cls, identifier=None, **kwargs):
        super().__init_subclass__(**kwargs)
        if identifier is None:
            raise ValueError()
        cls.identifier = identifier
        Type.objects.update_or_create(
            identifier=identifier,
            defaults=dict(name=cls.__name__, desc=cls.__doc__),
        )
    # ... the rest of A

# The identifier should never change, so that even if the
# name of the class changes, we still know which subclass is referred to
class B(A, identifier=3):
    # ... the rest of B
    pass
But this update_or_create fails when the database is new (e.g. during unit tests), because the Type table does not exist.
When I have this problem in development (we're still in early stages, so deleting the DB is still sensible), I have to comment out the update_or_create in __init_subclass__, migrate, and then put it back in.
Of course, this solution is also not great because __init_subclass__ is run way more than necessary. Ideally this machinery would only happen at migration.
So there you have it! I hope the problem statement makes sense.
Thanks for reading this far and I look forward to hearing from you; even if you have other things to do, I wish you a good rest of your day :)
With a little help from Django-expert friends, I solved this with the post_migrate signal.
I removed the update_or_create in __init_subclass__, and in project/app/apps.py I added:
from django.apps import AppConfig
from django.db.models.signals import post_migrate

def get_all_subclasses(cls):
    """Get all subclasses of a class, recursively.

    Used to get a list of all the implemented As.
    """
    all_subclasses = []
    for subclass in cls.__subclasses__():
        all_subclasses.append(subclass)
        all_subclasses.extend(get_all_subclasses(subclass))
    return all_subclasses

def update_As(sender=None, **kwargs):
    """Get a list of all implemented As and write them in the database.

    More precisely, each model is used to instantiate a Type, which will be used to identify As.
    """
    from app.models import A, Type

    subclasses = get_all_subclasses(A)
    for cls in subclasses:
        id = cls.identifier
        Type.objects.update_or_create(
            identifier=id,
            defaults=dict(name=cls.__name__, desc=cls.__doc__),
        )

class MyAppConfig(AppConfig):
    default_auto_field = "django.db.models.BigAutoField"
    name = "app"

    def ready(self):
        post_migrate.connect(update_As, sender=self)
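One note on testing: Django's test runner builds the test database by running migrations, so post_migrate fires there too and the Type rows exist before any test runs. If a test setup bypasses migrations, update_As can also be called by hand, e.g.:
from django.test import TestCase

from app.apps import update_As

class BTests(TestCase):
    def setUp(self):
        update_As()  # make sure the Type rows exist before instantiating Bs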
Hope this is helpful for future Django coders in need!
Flask-SQLAlchemy: I have a model with columns:
class MyModel(db.Model):
    a = Column(String(), primary_key=True)

    def my_method1(self, arg1):
        pass
Now I have a function which accepts a Column as argument to retrieve some information from it:
def get_column_info(column):
    if column.primary_key:
        return True
    else:
        return False
Note that this is just an example; in reality get_column_info does much more than that.
Now I want to be able to access the originating model in my get_column_info function, that is, I want to be able to call my_method1() from within get_column_info.
Is there a way I can retrieve the originating model from a column instance?
There is no proper out-of-the-box method for doing this. The Column object has a table attribute, which returns the __table__ attribute of the model, but you can't get the actual model class from it. However (as this answer suggested) you can use the get_class_by_table function from the sqlalchemy_utils package:
from sqlalchemy_utils.functions import get_class_by_table

def get_model(column):
    # column.table is the Table object the column belongs to
    return get_class_by_table(db.Model, column.table)
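With that helper, get_column_info can reach back to the model class; a quick sketch of the idea:
def get_column_info(column):
    model = get_model(column)  # the mapped class, e.g. MyModel
    # class-level attributes and methods are now reachable,
    # so my_method1 can be called on an instance of model
    print(model.__name__, model.my_method1)
    return column.primary_key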
Is there anything wrong with inheritance in which the child class is only used to present the parent's values in a different way?
Example:
class Parent(db.Model):
    __tablename__ = u'parent'
    parent_entry_id = db.Column(db.Integer, primary_key=True)
    parent_entry_value = db.Column(db.BigInteger)

class Child(Parent):
    __tablename__ = u'child'

    @property
    def extra_value(self):
        return unicode(self.parent_entry_id) + unicode(self.parent_entry_value)
No new values will be added to the Child class, so Joined Table, Single Table, or Concrete Table Inheritance is, as far as I can tell, not needed.
If you're simply changing how you display the data from the class, I'm pretty sure you don't need a __tablename__.
Additionally, though I don't know your exact problem domain, I would simply add the property to the original class. You could argue that you're adding some extra behavior to your original class, but that seems like a flimsy argument in this case.
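In other words, something like this might be all that's needed (the same property from the question, hung directly on Parent):
class Parent(db.Model):
    __tablename__ = u'parent'
    parent_entry_id = db.Column(db.Integer, primary_key=True)
    parent_entry_value = db.Column(db.BigInteger)

    @property
    def extra_value(self):
        # same presentation logic, no child class or second table needed
        return unicode(self.parent_entry_id) + unicode(self.parent_entry_value)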
I have a Django model where a lot of fields are choices. So I had to write a lot of "is_something" properties of the class to check whether the instance value is equal to some choice value. Something along the lines of:
class MyModel(models.Model):
    some_choicefield = models.IntegerField(choices=SOME_CHOICES)

    @property
    def is_some_value(self):
        return self.some_choicefield == SOME_CHOICES.SOME_CHOICE_VALUE

    # a lot of these...
In order to automate this and spare me a lot of redundant code, I thought about patching the instance at creation, with a function that adds a bunch of methods that do the checks.
The code became as follows (I'm assuming there's a "normalize" function that makes the label of the choice a usable function name):
def dynamic_add_checks(instance, field):
    if hasattr(field, 'choices'):
        choices = getattr(field, 'choices')
        for (value, label) in choices:
            def fun(instance):
                return getattr(instance, field.name) == value
            normalized_func_name = "is_%s_%s" % (field.name, normalize(label))
            # note: this stores the evaluated boolean, not a callable
            setattr(instance, normalized_func_name, fun(instance))

class MyModel(models.Model):
    some_choicefield = models.IntegerField(choices=SOME_CHOICES)

    def __init__(self, *args, **kwargs):
        super(MyModel, self).__init__(*args, **kwargs)
        dynamic_add_checks(self, self._meta.get_field('some_choicefield'))
Now, this works, but I have the feeling there is a better way to do it. Perhaps at class creation time (with metaclasses or in the __new__ method)? Do you have any thoughts/suggestions about that?
Well I am not sure how to do this in your way, but in such cases I think the way to go is to simply create a new model, where you keep your choices, and change the field to ForeignKey. This is simpler to code and manage.
You can find a lot of information at a basic level in the Django docs: Models: Relationships. In there, there are many links to follow expanding on various topics. Beyond that, I believe it just needs a bit of imagination, and maybe trial and error in the beginning.
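A rough sketch of that shape (the ChoiceOption model and its fields are made up for illustration):
class ChoiceOption(models.Model):
    label = models.CharField(max_length=50)

class MyModel(models.Model):
    some_choicefield = models.ForeignKey(ChoiceOption, on_delete=models.PROTECT)
The checks then become ordinary comparisons or queries against ChoiceOption rows instead of generated properties.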
I came across a similar problem where I needed to write a large number of properties at runtime to provide backward compatibility while changing model fields. There are two standard ways to handle this.
The first is to use a custom metaclass in your models, which inherits from the models' default metaclass.
The second is to use class decorators. Class decorators sometimes provide an easy alternative to metaclasses, unless you have to do something before the creation of the class, in which case you have to go with metaclasses; a sketch of the decorator route follows.
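Here is what that sketch might look like (it reuses the question's hypothetical normalize helper; binding field_name and value as default arguments avoids the classic late-binding pitfall, and the checks become real properties on the class rather than per-instance attributes):
def add_choice_checks(*field_names):
    """Class decorator adding an is_<field>_<label> property per choice."""
    def decorator(model_cls):
        for field_name in field_names:
            field = model_cls._meta.get_field(field_name)
            for value, label in field.choices:
                def check(self, field_name=field_name, value=value):
                    return getattr(self, field_name) == value
                prop_name = "is_%s_%s" % (field_name, normalize(label))
                setattr(model_cls, prop_name, property(check))
        return model_cls
    return decorator

@add_choice_checks('some_choicefield')
class MyModel(models.Model):
    some_choicefield = models.IntegerField(choices=SOME_CHOICES)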
I bet you know that Django fields defined with choices automatically get a display function.
Say you have a field defined like this:
category = models.SmallIntegerField(choices=CHOICES)
You can simply call a function called get_category_display() to access the display value. Here is the Django source code of this feature:
https://github.com/django/django/blob/baff4dd37dabfef1ff939513fa45124382b57bf8/django/db/models/base.py#L962
https://github.com/django/django/blob/baff4dd37dabfef1ff939513fa45124382b57bf8/django/db/models/fields/__init__.py#L704
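For example, on a model with the category field above (SomeModel and the CHOICES values are placeholder names), assuming CHOICES = ((1, 'Math'), (2, 'Art')):
obj = SomeModel(category=1)
obj.get_category_display()  # -> 'Math'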
So we can follow this approach to achieve our dynamically set property goal.
Here is my scenario, a little bit different from yours but in the end it's the same:
I have two classes, Course and Lesson. Lesson has a ForeignKey field to Course, and I want to add a property named cached_course to Lesson which tries to get the Course from cache first and falls back to the database on a miss:
Here is a typical solution:
from django.db import models

class Course(models.Model):
    # some fields
    pass

class Lesson(models.Model):
    course = models.ForeignKey(Course)

    @property
    def cached_course(self):
        # key_func and get_model_from_db are placeholders for the real logic
        key = key_func()
        course = cache.get(key)
        if not course:
            course = get_model_from_db()
            cache.set(key, course)
        return course
It turns out I have many ForeignKey fields to cache, so here is code following an approach similar to Django's get_FIELD_display feature:
from django.db import models
from django.utils.functional import curry

class CachedForeignKeyField(models.ForeignKey):
    def contribute_to_class(self, cls, name, **kwargs):
        super(CachedForeignKeyField, self).contribute_to_class(cls, name, **kwargs)
        setattr(cls, "cached_%s" % self.name,
                property(curry(cls._cached_FIELD, field=self)))

class BaseModel(models.Model):
    def _cached_FIELD(self, field):
        value = getattr(self, field.attname)
        Model = field.related_model
        # cache here is the answerer's own caching helper
        return cache.get_model(Model, pk=value)

    class Meta:
        abstract = True

class Course(BaseModel):
    # some fields
    pass

class Lesson(BaseModel):
    course = CachedForeignKeyField(Course)
By customizing CachedForeignKeyField and overriding its contribute_to_class method, together with a BaseModel class providing a _cached_FIELD method, every CachedForeignKeyField automatically gets a corresponding cached_FIELD property.
Too good to be true, bravo!
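Usage is then just attribute access, e.g.:
lesson = Lesson.objects.first()
course = lesson.cached_course  # tries the cache first, falls back to the DB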
I currently have a model in NDB and I'd like to change the property name without necessarily touching NDB. Let's say I have the following:
from google.appengine.ext import ndb

class User(ndb.Model):
    company = ndb.KeyProperty(repeated=True)
What I would like to have is something more like this:
class User(ndb.Model):
    company_ = ndb.KeyProperty(repeated=True)

    @property
    def company(self):
        return '42'

    @company.setter
    def company(self, new_company):
        # set company here
        pass
Is there a relatively pain-free way to do so? I'd like the convenience of using property getters/setters, but given the current implementation I would like to avoid touching the underlying datastore.
You can change the class-level property name while keeping the underlying NDB property name by specifying the name="xx" param in the Property() constructor.
So something like this could be done:
class User(ndb.Model):
    company_ = ndb.KeyProperty(name="company", repeated=True)

    @property
    def company(self):
        return self.company_

    @company.setter
    def company(self, new_company):
        self.company_ = new_company
So now any time you access .company_, NDB will actually set/get "company" internally... and you don't have to do any data migrations.
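A quick sketch of what that buys you (the key value is made up):
user = User()
user.company = [ndb.Key('Company', 1)]  # goes through the Python setter
user.put()  # stored under the original datastore name "company"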
From my understanding of ndb, the property names are stored in the database along with their contents for every entity. You would have to rewrite every entity with the new property name (and without the old one).
Since that is not pain-free, maybe you could choose other names for your getter and setter like get_company and set_company.