I've been using a Node MongoEngine document for a while.
I am trying to go from a simple Node model to more specific elements inheriting from it.
What I've done so far
At first, I was not aware of the inheritance possibility offered by MongoEngine (see here), so I was using a 'label' field to distinguish between 3 types of Nodes (respectively Keyword, Url and Domain).
Here is the original model:
class Node(Document):
    project = ReferenceField(Project,
                             reverse_delete_rule=CASCADE,
                             required=True,)
    name = StringField(required=True, unique_with=['project', 'label'])
    label = StringField(required=True)
    volume = IntField()
    clusters = ListField(ReferenceField(Cluster, reverse_delete_rule=PULL))
    x = FloatField(default=random.random())
    y = FloatField(default=random.random())
    connections = IntField(default=0)

    meta = {
        'indexes': ['project', 'label', 'name', 'clusters'],
    }
I worked for some time with this model, so the node collection is currently populated with thousands of documents.
Then I implemented inheritance by adding 'allow_inheritance': True to the model and creating the following model:
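For reference, that change amounts to adding a single key to the existing meta dictionary of Node (a minimal sketch; the fields themselves stay exactly as above):

class Node(Document):
    # ... fields unchanged ...
    meta = {
        'indexes': ['project', 'label', 'name', 'clusters'],
        'allow_inheritance': True,
    }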
Inherited model
class Keyword(Node):
    """ A MongoEngine Document for keyword management. """
    a_keyword_specific_field = IntField()

    def foo(self):
        print('this is a keyword specific method')
Now this works fine for creating and saving new Keyword documents.
The thing I'm having trouble with is querying the old Nodes added before this change.
Question
If I try to query all the existing nodes, only the one I added after the inheritance change is returned:
In [21]: Node.objects()
Out[21]: [<Keyword: Keyword object>]
How can I access all the Nodes that were added before introducing inheritance?
Is there any way to migrate those old Nodes to Keywords, Urls and Domains based on their original label attribute?
Thanks!
This happens because once you enable inheritance, MongoEngine stores a _cls attribute on every new document and filters queries on it. The old records don't have this field, so they no longer match.
Fill in this attribute on the old records.
As for your second question:
If you write a migration script to fill in the _cls field, you can derive its value from the value of the label field.
You can find the required _cls values by inserting a record of each model and inspecting what gets stored.
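A minimal migration sketch of that idea, assuming the old label values are 'keyword', 'url' and 'domain' and that the inherited classes store 'Node.Keyword'-style _cls values (both are assumptions: check documents created after the change for the exact strings). It uses the raw pymongo collection so the update itself is not filtered by _cls:

# Hypothetical label values and _cls strings: verify them against your data.
collection = Node._get_collection()

label_to_cls = {
    'keyword': 'Node.Keyword',
    'url': 'Node.Url',
    'domain': 'Node.Domain',
}

for label, cls in label_to_cls.items():
    # Only touch the old documents that do not have _cls yet.
    collection.update_many(
        {'label': label, '_cls': {'$exists': False}},
        {'$set': {'_cls': cls}},
    )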
Related
How can I make sure that a parent object has only one child/type?
class Property(...):
    class Meta:
        abstract = False

class Flat(Property):
    pass

class House(Property):
    pass

class Land(Property):
    pass
I want every Property object to have at most one child: it can be either a Flat, a House or a Land (or none at all).
Is it possible to create a DB constraint for this?
My idea was to create a constraint that checks:
class Meta:
    constraints = [
        models.CheckConstraint(
            check=Q(Q(flat__isnull=True) & Q(house__isnull=True))
                  | Q(Q(flat__isnull=True) & Q(land__isnull=True))
                  | Q(Q(house__isnull=True) & Q(land__isnull=True)),
            name="constraint",
        )
    ]
But apparently there are no such fields at the DB level (you can get flat through the property.flat getter in Django, but not in the DB).
Edit:
properties.Property: (models.E012) 'constraints' refers to the nonexistent field 'flat'.
But apparently there are no such fields at the DB level (you can get flat through the property.flat getter in Django, but not in the DB).
That is correct: Django adds a property to the Property model that lazily loads the related Flat object, but there is no database column named flat. It is simply a reverse query, roughly:
SELECT * FROM app_name_flat WHERE property_ptr_id = pk
with pk the primary key of the Property object, so accessing it makes a query.
A CHECK constraint [w3-schools] spans only a single row: it can not look at other rows, nor at other tables, and is therefore limited in what it can enforce. It can, for example, prevent one column from having a certain value based on the value of another column in the same row (record), but that is as far as a CHECK constraint normally reaches.
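To make that limitation concrete, here is a small sketch of a constraint that is allowed, because it only compares columns within the same row (the model and field names are made up for illustration):

from django.db import models
from django.db.models import F, Q

class Booking(models.Model):
    starts_at = models.DateTimeField()
    ends_at = models.DateTimeField()

    class Meta:
        constraints = [
            # Valid: both columns live in the same row of the same table.
            models.CheckConstraint(
                check=Q(ends_at__gt=F('starts_at')),
                name='booking_ends_after_start',
            ),
        ]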
In Odoo v13, the crm.lead model is inherited and modified by the sale_crm module.
In the sale_crm module, the model crm.lead is inherited and a one2many field is added, order_ids. This is an array of sales orders associated with the lead.
I am trying to inherit the crm.lead model, and create a new field that is computed using the order_ids field.
I added sale_crm to the manifest dependencies.
I inherit the crm.lead model and attempt to concat the names of all the associated SOs:
class Co7Lead(models.Model):
    _inherit = "crm.lead"

    so_list = fields.Text(
        compute='_get_sos_text',
        string="Text list of associated SOs",
        help="A comma separated list of SOs associated with this lead")

    def _get_sos_text(self):
        txt = ""
        for order in super(Co7Lead, self).order_ids:
            txt += order.name + ""
        return txt
Unfortunately, this causes a stack overflow (haha!)
I believe I need to use .browse on the order_ids field but I'm not able to find any examples on how to do this.
The compute method must assign the computed value to the field. If it uses the values of other fields (order_ids.name), it should declare those fields with the api.depends() decorator.
You don't need to use super here, self is a record set, so loop over it to access the value of order_ids for each record.
Example:
@api.depends('order_ids.name')
def _get_sos_text(self):
    for lead in self:
        lead.so_list = "\n".join(order.name for order in lead.order_ids)
I just came across a scenario that I don't know how to resolve with the existing structure of my documents. As shown below, I could obviously solve this with some refactoring, but I am curious how it can be resolved as efficiently as possible while keeping the same structure.
Please note that this question is different from How to Do An Atomic Update on an EmbeddedDocument in a ListField in MongoEngine?
Let's suppose the following models:
class Scans(mongoengine.EmbeddedDocument):
    peer = mongoengine.ReferenceField(Peers, required=True)
    site = mongoengine.ReferenceField(Sites, required=True)
    process_name = mongoengine.StringField(default=None)
    documents = mongoengine.ListField(mongoengine.ReferenceField('Documents'))
    is_complete = mongoengine.BooleanField(default=False)
    to_start_at = mongoengine.DateTimeField()
    started = mongoengine.DateTimeField()
    finished = mongoengine.DateTimeField()

class ScanSettings(mongoengine.Document):
    site = mongoengine.ReferenceField(Sites, required=True)
    max_links = mongoengine.IntField(default=100)
    max_size = mongoengine.IntField(default=1024)
    mime_types = mongoengine.ListField(default=['text/html'])
    is_active = mongoengine.BooleanField(default=True)
    created = mongoengine.DateTimeField(default=datetime.datetime.now)
    repeat = mongoengine.StringField(choices=REPEAT_PATTERN)
    scans = mongoengine.EmbeddedDocumentListField(Scans)
What I would like to do is insert a ScanSettings object if and only if every element of the scans field - the list of Scans embedded documents - has a documents list whose entries are unique. By unique I mean unique across all such lists at the database level, not merely within a single list - that would be easy.
In plain English: if, at the time of inserting a ScanSettings, any element of its scans list contains a Scans instance whose documents are already referenced elsewhere, the insertion should not happen. I mean uniqueness at the database level, taking into account existing records, if any.
Given that Mongo does not support uniqueness across all elements of a list within the same document I find two solutions:
Option A
I refactor my "schema": make Scans inherit from Document rather than EmbeddedDocument, and change the scans field of ScanSettings to a ListField of ReferenceFields to Scans documents. Then it is easy: I just save the Scans first using updates with the add_to_set operator and upsert=True, and once that has succeeded, save the ScanSettings. This takes one query per Scans instance to insert, plus one.
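A rough sketch of the update described in Option A, assuming the refactor has been applied (the variable names are placeholders):

# One atomic update per Scans record: $addToSet keeps the documents list free
# of duplicates within that record, and upsert=True creates it if missing.
Scans.objects(id=scan_id).update(
    add_to_set__documents=document,
    upsert=True,
)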
Option B
I keep the same "schema" but somehow generate unique IDs for the Scans embedded documents. Then, before any insertion of a ScanSettings with a non-empty scans field, I fetch the already existing records to see whether any document ObjectIds are duplicated between the retrieved records and the ones about to be inserted.
In other words, I would check uniqueness in Python rather than through MongoEngine/MongoDB. This takes 2 x the number of scan instances to insert (a read plus an update with the add_to_set operator), plus one ScanSettings save.
Option C
Ignore uniqueness. Given how my model will be structured, I am pretty sure there will be no duplicates or, if any, they will be negligible. Then deal with duplicates at read time. For those like me coming from relational databases this solution feels uncomfortable.
I am a novice in Mongo so I appreciate any comments. Thanks.
PS: I am using the latest MongoEngine and the free MongoDB.
Thanks a lot in advance.
I finally went for Option A, so I refactored my model to:
a) Create a Mixin class that inherits from Document and adds two methods: an overridden 'save' that only allows saves when the list of unique documents is empty, and 'save_with_uniqueness', which handles saves and updates when the list of documents is non-empty. The idea is to enforce uniqueness.
b) Refactor both Scans and ScanSettings so that the former inherits from Document rather than EmbeddedDocument, and the latter redefines the 'scans' field as a ListField of references to Scans.
c) Both Scans and ScanSettings now inherit from the Mixin class, as both need to guarantee uniqueness for their attributes 'documents' and 'scans', respectively. Hence the Mixin class.
With a) and b) I can guarantee uniqueness and save each Scans instance first, so it can later be added to ScanSettings.scans in the usual way.
A few points for novices like me:
Note that I am using inheritance. For it to work you need to add an attribute in the meta dictionary to allow inheritance, as shown in the model below.
In my case I wanted Scans and ScanSettings in different collections, so I had to make the Mixin 'abstract', as shown in its meta dictionary too.
For save_with_uniqueness I used upsert=True so that a record is created if it does not exist yet. The idea is to use 'save_with_uniqueness' the same way as 'save', creating or updating a document whether it exists or not.
I also used the 'full_result' flag, as I need the ObjectId of the record just inserted.
Document._fields is a dictionary that contains the fields composing the document. I wanted a general-purpose save_with_uniqueness method, so I did not want to type the document's fields in manually or duplicate code unnecessarily - hence the Mixin.
Finally the code. It's not fully tested but enough to get the main idea right for what I need.
class UniquenessMixin(mongoengine.Document):
    def save(self, *args, **kwargs):
        try:
            many_unique = kwargs['many_unique']
        except KeyError:
            pass
        else:
            attribute = getattr(self, many_unique)
            self_name = self.__class__.__name__
            if len(attribute):
                raise errors.DbModelOperationError(
                    f"It looks like you are trying to save a {self.__class__.__name__} "
                    f"object with a non-empty list of {many_unique}. "
                    f"Please use '{self_name.lower()}.save_with_uniqueness()' instead")
        return super().save(*args, **kwargs)

    def save_with_uniqueness(self, many_unique):
        attribute = getattr(self, many_unique)
        self_name = self.__class__.__name__
        if not len(attribute):
            raise errors.DbModelOperationError(
                f"It looks like you are trying to save a {self_name} object with an "
                f"empty list {many_unique}. Please use '{self_name.lower()}.save()' "
                f"instead")

        updates, removals = self._delta()
        if not updates:
            raise errors.DbModelOperationError(
                f"It looks like you are trying to update '{self.__class__.__name__}' "
                f"but no fields were modified since this object was created")

        kwargs = {(key if key != many_unique else 'add_to_set__' + key): value
                  for key, value in updates.items()}
        pk = bson.ObjectId() if not self.id else self.id
        result = self.__class__.objects(id=pk).update(upsert=True, full_result=True, **kwargs)

        try:
            self.id = result['upserted']
        except KeyError:
            pass
        finally:
            return self.id

    meta = {'allow_inheritance': True, 'abstract': True}

class Scans(UniquenessMixin):
    peer = mongoengine.ReferenceField(Peers, required=True)
    site = mongoengine.ReferenceField(Sites, required=True)
    process_name = mongoengine.StringField(default=None)
    documents = mongoengine.ListField(mongoengine.ReferenceField('Documents'))
    is_complete = mongoengine.BooleanField(default=False)
    to_start_at = mongoengine.DateTimeField()
    started = mongoengine.DateTimeField()
    finished = mongoengine.DateTimeField()
    meta = {'collection': 'Scans'}

class ScanSettings(UniquenessMixin):
    site = mongoengine.ReferenceField(Sites, required=True)
    max_links = mongoengine.IntField(default=100)
    max_size = mongoengine.IntField(default=1024)
    mime_types = mongoengine.ListField(default=['text/html'])
    is_active = mongoengine.BooleanField(default=True)
    created = mongoengine.DateTimeField(default=datetime.datetime.now)
    repeat = mongoengine.StringField(choices=REPEAT_PATTERN)
    scans = mongoengine.ListField(mongoengine.ReferenceField(Scans))
    meta = {'collection': 'ScanSettings'}
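A short, untested usage sketch of the above, mirroring the author's description (the variable names are made up, and the referenced Peers, Sites and Documents records are assumed to exist already):

# Save the Scans document first, enforcing uniqueness on its 'documents' list,
# then reference it from ScanSettings and enforce uniqueness on 'scans'.
scan = Scans(peer=peer, site=site, documents=[document])
scan.save_with_uniqueness('documents')

settings = ScanSettings(site=site, scans=[scan])
settings.save_with_uniqueness('scans')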
I have two models:
class BusinessCard(models.Model):
    name = models.CharField(_("name"), null=True, max_length=50)

class Contacts(models.Model):
    businesscard_id = models.OneToOneField(BusinessCard, null=True, blank=True,
                                           related_name='contact_detail',
                                           db_column="businesscard_id")
    bcard_json_data = JsonField(null=True)
I just want to access the Contacts model data through the BusinessCard model:
target_bacard=BusinessCard.objects.filter(id=target_bacard_id).select_related()
When I access target_bacard.contact_detail it gives key errors.
How can I get the Contacts data using the target_bacard queryset?
use get() instead of filter() like:
target_bacard = BusinessCard.objects.get(id=target_bacard_id)
target_bacard.contact_detail
If you want to access the Contacts instance that is in the 1-to-1 relationship with a BusinessCard instance bacard, use the related name you specified in Contacts:
contact = bacard.contact_detail
Also, you have some misleading names: Contacts should rather be Contact, since an instance of this model represents only one contact. And its field businesscard_id would better be named businesscard (in that case the table column will automatically be called businesscard_id at the database level and store the id of the related BusinessCard), because in the ORM you get a BusinessCard model instance when you access it, not just its id.
You have not passed the related field argument to select_related():
target_bacard=BusinessCard.objects.filter(id=target_bacard_id).select_related('contact_detail')
Assuming the id of BusinessCard is unique, you may want to use ...objects.get(id=target_bacard_id) in place of ...objects.filter(id=target_bacard_id). Either way, select_related() will work.
select_related() is used to save database queries.
Here is the documentation.
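A small sketch of the query it saves, using the names from the question:

# Without select_related, accessing the reverse one-to-one triggers a second query.
card = BusinessCard.objects.get(id=target_bacard_id)
card.contact_detail            # extra query here

# With select_related, the related row is fetched by the same JOIN.
card = BusinessCard.objects.select_related('contact_detail').get(id=target_bacard_id)
card.contact_detail            # no extra query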
I'm currently building an app using Django and django-rest-framework.
My problem is relatively simple, but I got stuck at some point.
Basically, I manage Collection and Collectible objects. A Collectible object is assigned to a Collection. Both objects have a "created_at" field.
I would like to generate a view containing all Collections and, for each, all its Collectibles. That works easily.
Now, I'm looking to generate the very same structure but with a filtering param "createdfrom", so that only the Collections and Collectibles newer than the provided date are returned.
Here is the code I have using django-filters:
class CollectionFilter(django_filters.FilterSet):
    # /api/collections/?createdfrom=2013-11-20
    createdfrom = django_filters.DateTimeFilter(name="collectibles__created_at", lookup_type='gt')

    class Meta:
        model = Collection
This works almost great. There are only a couple of issues:
It displays all Collectibles of a Collection in which at least one of them matches the filter (so it also shows the outdated items along with the new ones).
It doesn't show new Collections created after that date.
Could anyone help me ?
Thanks a lot
If I understand your problem correctly, both the Collection and Collectible must have a date > the date provided for the filter. Thus, we will define an "action" to be taken with the QuerySet and value once provided. This is outlined in the django-filter documentation covering Core Arguments (specifically action).
def action(queryset, value):
    return queryset.filter(
        collectibles__created_at__gt=value,
        created_at__gt=value,
    )

class CollectionFilter(django_filters.FilterSet):
    # /api/collections/?createdfrom=2013-11-20
    createdfrom = django_filters.DateTimeFilter(
        name="collectibles__created_at",
        action=action,
    )

    class Meta:
        model = Collection
We created an action function that is called with the QuerySet as the first argument and the value (the date) as the second argument.
action: An optional callable that tells the filter how to handle the queryset. It receives a QuerySet and the value to filter on and should return a QuerySet that is filtered appropriately. - From the documentation
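For clarity, here is what a request like /api/collections/?createdfrom=2013-11-20 amounts to at the ORM level with this action in place (the date is just an example value):

# Collections created after the date that also have at least one Collectible
# created after the date.
Collection.objects.filter(
    collectibles__created_at__gt='2013-11-20',
    created_at__gt='2013-11-20',
)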