I have two classes declared as follows:
import datetime

from mongoengine import Document, fields, DynamicDocument

class Booking(DynamicDocument):
    booking_id = fields.StringField(required=True)
    pickup_timestamp = fields.DateTimeField()

class Assignment(Document):
    created_timestamp = fields.DateTimeField(default=datetime.datetime.utcnow)
    pickup_time = fields.DateTimeField()
    bookings = fields.ListField(fields.ReferenceField(Booking))
My application allows a user to club Bookings together under an Assignment object, by selecting the Booking objects from a list on the UI.
When creating the Assignment object, I automatically set the pickup time to the earliest of the Bookings' pickup_timestamp values, like so:
assignment.pickup_time = min([booking.pickup_timestamp for booking in assignment.bookings])
However I also need to set other attributes on the assignment object based on other fields in the earliest Booking object.
My question: is there a way to sort a ListField containing ReferenceFields by a field on the referenced objects?
I did find this answer; however, it does not talk about ReferenceFields inside a ListField. I tried setting the field type to SortedListField as well, but I wasn't able to figure out how to specify which key to sort on.
Solved with -
assignment.bookings=sorted(assignment.bookings, key=lambda k: k.pickup_timestamp)
Which is pretty much the same as this answer. I didn't know that the MongoEngine ListField behaves just like a regular Python list in this regard!
If there is a more efficient/better way to do this, would be very keen to know!
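Since the other attributes also come from the earliest Booking, one alternative to sorting is to find that single document with min() and a key, which is one O(n) pass instead of an O(n log n) sort. A minimal sketch with plain stand-in objects (FakeBooking and the sample data are illustrative; only the field names come from the question):

```python
import datetime
from dataclasses import dataclass

@dataclass
class FakeBooking:
    # Stand-in for the Booking document, with just the fields used here.
    booking_id: str
    pickup_timestamp: datetime.datetime

bookings = [
    FakeBooking("b2", datetime.datetime(2020, 1, 2, 9, 0)),
    FakeBooking("b1", datetime.datetime(2020, 1, 1, 9, 0)),
    FakeBooking("b3", datetime.datetime(2020, 1, 3, 9, 0)),
]

# min() with a key finds the earliest booking in a single pass, so any
# of its fields can be read without sorting the whole list first.
earliest = min(bookings, key=lambda b: b.pickup_timestamp)
pickup_time = earliest.pickup_timestamp
first_booking_id = earliest.booking_id
```

With real documents, `earliest` would be the dereferenced Booking, so any of its other fields are available for copying onto the Assignment.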
Is there a way I could add an attribute to all query objects using annotate? I basically just need to get a value from an m2m relationship of the object and save it as an attribute of the object.
Something like this:
query.annotate(value_to_be_added=("value_from_m2m"))
Basically I have two different queries of the same model, one query A needs to have a "value" changed or added for all of its objects (and that value comes from the m2m relationship). Query B doesn't need to have those values changed.
How would I do this?
I solved it. What needs to be used is a simple F() expression.
from django.db.models import F
query.annotate(value_to_be_added=F("value_from_m2m"))
https://docs.djangoproject.com/en/3.0/ref/models/expressions/#f-expressions
I have been trying to define a custom Django model field in Python. I referred to the Django docs at https://docs.djangoproject.com/en/1.10/howto/custom-model-fields/. However, I am confused by the following methods (which I have divided into groups as per my understanding):
Group 1 (Methods in this group are inter-related as per docs)
__init__()
deconstruct()
Group 2
db_type()
rel_db_type()
get_internal_type()
Group 3
from_db_value()
to_python()
get_prep_value()
get_db_prep_value()
get_db_prep_save()
value_from_object()
value_to_string()
Group 4
formfield()
I have the following questions:
When is deconstruct() used? The docs say it's useful during migrations, but don't explain it clearly. Moreover, when is it called?
Difference between db_type() and get_internal_type()
Difference between get_prep_value() and get_db_prep_value()
Difference between value_from_object() and value_to_string(). value_from_object() is not covered in the docs.
from_db_value(), value_to_string(), and to_python() all seem to produce a Python object from a stored value. Why do all these different methods exist?
I know this is a bit of a lengthy question, but I couldn't find a better way to ask it.
Thanks in advance.
I'll try to answer them:
Q: When is deconstruct() used?
A: This method is used when Django has an instance of your Field and needs to know how to re-create it from the arguments originally passed to __init__(). It is called during migrations, when the migration framework serializes your field.
As the docs mention, if you set the max_length argument to a static value inside your __init__() method, your instances don't need it serialized, so you can delete it from the keyword arguments in deconstruct(). That way, max_length won't show up in migrations for your field. You can think of deconstruct() as the inverse of __init__(): a last clean-up and control point before your field is written out.
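As a framework-free sketch of that contract (the class name and module path are hypothetical; a real field would subclass django.db.models.Field, and its deconstruct() would return a 4-tuple of name, path, args, kwargs):

```python
class HandField:
    """Hypothetical custom field, sketched without Django installed."""

    def __init__(self, *args, **kwargs):
        # max_length is always forced to the same static value,
        # as in the docs' HandField example.
        kwargs["max_length"] = 104
        self.max_length = kwargs.pop("max_length")
        self.kwargs = kwargs  # the remaining init arguments

    def deconstruct(self):
        # Return just what a migration needs to re-create this field.
        # max_length is static, so it is *not* serialized here.
        return ("myapp.fields.HandField", (), dict(self.kwargs))

field = HandField(null=True)
path, args, kwargs = field.deconstruct()
```

The round trip `HandField(*args, **kwargs)` would rebuild an equivalent field, which is exactly what the migration framework relies on.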
Q: What's the difference between db_type() and get_internal_type()?
A: They are related, but operate at different levels.
If your custom field's column type depends on which database you are using, db_type() is the place to handle that. As the docs mention, if your field holds a date/time value, you may need to check whether the current database is PostgreSQL or MySQL, because the column type is called timestamp in PostgreSQL but datetime in MySQL.
get_internal_type() is a higher-level version of db_type(). Staying with the date/time example: if you don't want to handle each backend's type yourself, you can inherit the data type mapping from a built-in Django field. Instead of deciding between datetime and timestamp, you can simply return DateField from your get_internal_type() method. As the docs mention, if you have already written db_type(), in most cases you do not need get_internal_type().
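A sketch of the two levels, using a stub in place of the connection object Django passes to db_type() (the field class and the returned type strings are illustrative; `connection.vendor` is the attribute real Django backends expose):

```python
class FakeConnection:
    """Stand-in for the connection Django passes to db_type()."""
    def __init__(self, vendor):
        self.vendor = vendor

class TimestampField:
    def db_type(self, connection):
        # Low level: choose the concrete column type per backend.
        if connection.vendor == "postgresql":
            return "timestamp"
        return "datetime"

    def get_internal_type(self):
        # High level: reuse a built-in field's backend mapping instead
        # of checking vendors yourself.
        return "DateField"

field = TimestampField()
pg_type = field.db_type(FakeConnection("postgresql"))
mysql_type = field.db_type(FakeConnection("mysql"))
```

In practice a field implements one or the other: db_type() when it needs full control, get_internal_type() when an existing field's mapping is good enough.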
Q: What's the difference between get_prep_value() and get_db_prep_value()?
A: The split between these two mirrors the one between db_type() and get_internal_type(). Both methods convert Python objects into values suitable for the database (note the direction: Python to database, the reverse of from_db_value()). The difference is that get_prep_value() performs the general, backend-independent conversion, while get_db_prep_value(), like db_type(), handles backend-specific conversions on top of it.
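A toy sketch of that direction and split: the method names match Django's, but the field, the stub connection, and the "oracle quirk" are all made up for illustration:

```python
import datetime

class StubConnection:
    """Stand-in for the connection Django passes to get_db_prep_value()."""
    def __init__(self, vendor):
        self.vendor = vendor

class DateAsStringField:
    def get_prep_value(self, value):
        # General step: Python object -> backend-independent query value.
        if isinstance(value, datetime.date):
            return value.isoformat()
        return value

    def get_db_prep_value(self, value, connection):
        # Backend-specific step: builds on get_prep_value() and may
        # adjust the result for the current database.
        value = self.get_prep_value(value)
        if connection.vendor == "oracle":
            return value.replace("-", "")  # hypothetical backend quirk
        return value

field = DateAsStringField()
d = datetime.date(2020, 5, 17)
generic_value = field.get_prep_value(d)
oracle_value = field.get_db_prep_value(d, StubConnection("oracle"))
```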
Q: What's the difference between value_from_object() and value_to_string()? value_from_object() is not covered in the docs.
A: From the docs:
To customize how the values are serialized by a serializer, you can
override value_to_string(). Using value_from_object() is the best way
to get the field’s value prior to serialization.
So we usually don't need to override value_from_object(). It is used to get the field's raw value prior to serialization: fetch the value with this method, and customize how it should be serialized in value_to_string(). The docs even include example code.
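A minimal sketch of that division of labor (both classes are stand-ins; in real Django, attname is set on the field by contribute_to_class()):

```python
import json

class FakeInstance:
    """Stand-in for a model instance carrying the field's value."""
    tags = ["a", "b"]

class TagsField:
    attname = "tags"  # in real Django, set by contribute_to_class()

    def value_from_object(self, obj):
        # Raw value straight off the instance, before serialization.
        return getattr(obj, self.attname)

    def value_to_string(self, obj):
        # Custom serialization built on top of value_from_object().
        return json.dumps(self.value_from_object(obj))

field = TagsField()
raw = field.value_from_object(FakeInstance())
serialized = field.value_to_string(FakeInstance())
```

Note how value_to_string() delegates to value_from_object() first, which is exactly the pattern the docs recommend.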
Q: from_db_value(), value_to_string(), and to_python() all seem to produce a Python object from a stored value. Why do all these different methods exist?
A: While to_python() converts a field value into a valid Python object, value_to_string() converts field values to strings using your custom serialization. They do different jobs.
And from_db_value() converts the value returned by the database into a Python object. I had actually never heard of it before, but check this part of the docs:
This method is not used for most built-in fields as the database
backend already returns the correct Python type, or the backend itself
does the conversion.
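To make the three jobs concrete, here is an illustrative sketch (not Django's actual code: the real signatures take extra arguments such as expression and connection, and the real value_to_string() takes a model instance):

```python
import datetime

class SimpleDateField:
    def from_db_value(self, value):
        # Database -> Python: called on every value loaded from the DB.
        if isinstance(value, str):
            return datetime.date.fromisoformat(value)
        return value

    def to_python(self, value):
        # Anything (form input, serialized data, or an already-correct
        # Python object) -> valid Python object; must cope with all three.
        if value is None or isinstance(value, datetime.date):
            return value
        return datetime.date.fromisoformat(value)

    def value_to_string(self, value):
        # Python -> string: the serialization direction.
        return value.isoformat()

f = SimpleDateField()
loaded = f.from_db_value("2020-05-17")            # DB string -> date
cleaned = f.to_python("2020-05-17")               # form string -> date
dumped = f.value_to_string(datetime.date(2020, 5, 17))  # date -> string
```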
I don't know exactly how to ask this, but it's very simple.
I have an entity called "State" and another entity called "City".
I'm doing a query to get specific cities with a given parameter:
cities = City.objects.filter(code__exact=12345).values("id","name","state")
And then I serialize the list (or is it a dict?) in order to return them via JSON:

result = []
for c in cities:
    result.append(c)
return HttpResponse(json.dumps(result))
The problem is that I'm getting only the state ID, but I need other attributes from that object. How can I include the state object inside the city, or at least get specific attributes from the state object?
The result from a values() call is a ValuesQuerySet, which is special in a few ways. One of them is:
A ValuesQuerySet is useful when you know you’re only going to need
values from a small number of the available fields and you won’t need
the functionality of a model instance object. It’s more efficient to
select only the fields you need to use.
The important part is "you won't need the functionality of a model instance object". It means that the results of the queryset will not be model instances; you have to tell it exactly what you need fetched.
So, if you know which fields of the state model you want in your result, you can add them to your values() call. If you just put state, it will give you the state id, because that is the default identity field.
cities = City.objects.filter(code__exact=12345).values(
    "id", "name", "state__name", "state__id")
If you are doing this just to convert the results to json, use the built-in serializers:
from django.core import serializers
result = serializers.serialize(
    'json',
    City.objects.filter(code__exact=12345),
    fields=('id', 'name', 'state__name', 'state__id'))
I'm currently building an app using Django and django-rest-framework.
My problem is relatively simple, but I got stuck at some point.
Basically, I manage Collection and Collectible objects. A Collectible object is assigned to a Collection. Both objects have a "created_at" field.
I would like to generate a view containing all Collections and, for each, all of its Collectibles. That works easily.
Now, I'm looking to generate the very same structure, but with a filtering param "createdfrom", to get only the new Collections and new Collectibles from the provided date.
Here is the code I have using django-filters:
class CollectionFilter(django_filters.FilterSet):
    # /api/collections/?createdfrom=2013-11-20
    createdfrom = django_filters.DateTimeFilter(
        name="collectibles__created_at", lookup_type='gt')

    class Meta:
        model = Collection
This works almost great. There are only a couple of issues:
It displays all Collectibles from any Collection in which at least one of them matches the filter (basically, it also displays the outdated items along with the new ones).
It doesn't show new Collections created after the given date.
Could anyone help me?
Thanks a lot
If I understand your problem correctly, both the Collection and Collectible must have a date > the date provided for the filter. Thus, we will define an "action" to be taken with the QuerySet and value once provided. This is outlined in the django-filter documentation covering Core Arguments (specifically action).
def action(queryset, value):
    return queryset.filter(
        collectibles__created_at__gt=value,
        created_at__gt=value,
    )

class CollectionFilter(django_filters.FilterSet):
    # /api/collections/?createdfrom=2013-11-20
    createdfrom = django_filters.DateTimeFilter(
        name="collectibles__created_at",
        action=action,
    )

    class Meta:
        model = Collection
We created an action function definition to be called with the QuerySet as the first argument, and the value (date) as the second argument.
action: An optional callable that tells the filter how to handle the queryset. It receives a QuerySet and the value to filter on, and should return a QuerySet that is filtered appropriately. -From the documentation
I'd like to query a model for instances where the generic relation field is not empty (that is, in the example below I'm looking for instances where document.count() > 0):
class Report(models.Model):
    document = generic.GenericRelation(Document)
Something like:
Report.objects.filter(date__gte=twomonths).exclude(document__isnull=True)
Unfortunately this doesn't work: the query returns objects that have no "document" (i.e. it returns objects where document.count() is 0).
Is there a way to query for instances where the generic relationship is empty?
I believe there may still be some contradictions in your question. Note: "I'm looking for instances where document.count() == 0" and then later, "Unfortunately this doesn't work - the query returns objects that have no 'document' (ie. it returns objects where document.count() is 0)".
If you want Reports that have no documents, you can use:
Report.objects.filter(document__isnull=True)
Or
Report.objects.exclude(document__isnull=False)
If you want Reports that have at least one document, you can use:
Report.objects.filter(document__isnull=False)
Or
Report.objects.exclude(document__isnull=True)