Django REST framework serializing model combinations - python

I'm programming an online game with a JavaScript client and I use Django REST framework for the backend. I have written a quest system for it.
My quest objects are dynamically created from a Django model QuestTemplate, which stores the information that is the same for every user, such as the quest description and the title; and another model QuestHistory, where I put the information about the state of a quest for a certain user, with fields like user and completed. They also have some nested objects: Tasks and Rewards, which are created in a similar way to the Quest objects.
I added a pure Python class Quest that combines all the fields of those models, and then I wrote a serializer for this class. The drawback is that I have to define all the fields again in the QuestSerializer.
I have seen that for the ModelSerializer you can use an inner class Meta where you specify the model and its fields. Is there also a way to do this with a normal Python class instead of a model (with my Quest class)?
http://www.django-rest-framework.org/api-guide/serializers#specifying-nested-serialization
Or:
Is it possible to specify more than one model in this inner class, so that it takes fields from my model QuestTemplate and some other fields from my model QuestHistory?
(I'm also not sure whether this structure makes sense and asked about it here: django models and OOP design)

In the class Meta of the ModelSerializer you can specify only one model, as far as I know. However, there are possibilities to add custom fields to the serializer. In your case you could maybe try:
custom_field = serializers.SerializerMethodField('some_method_in_your_serializer')
You should add the method to your serializer like this:
def some_method_in_your_serializer(self, obj):
    # here comes your logic to get fields from other models, probably some query
    return some_value  # this is the value that comes into your custom_field
And add the custom_field to fields in the class Meta:
class Meta:
    fields = ('custom_field', 'all_other_fields_you_need')
Take a look at the documentation on SerializerMethodField for a deeper understanding.
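For example, a sketch of how the quest case could look, assuming QuestHistory has a foreign key quest_template to QuestTemplate (the field and method names here are illustrative, not taken from your code):

from rest_framework import serializers
from .models import QuestHistory  # assuming the models live in the same app

class QuestSerializer(serializers.ModelSerializer):
    # shared fields pulled in from the related QuestTemplate via `source`
    title = serializers.ReadOnlyField(source='quest_template.title')
    description = serializers.ReadOnlyField(source='quest_template.description')
    # a computed field backed by a serializer method
    open_tasks = serializers.SerializerMethodField()

    class Meta:
        model = QuestHistory
        fields = ('id', 'user', 'completed', 'title', 'description', 'open_tasks')

    def get_open_tasks(self, obj):
        # hypothetical query; adjust to however your TaskHistory relates to QuestHistory
        return obj.taskhistory_set.filter(completed=False).count()

This way the serializer is driven by a single model (QuestHistory) but still exposes the template data, so the Quest wrapper class is not needed for serialization.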

Related

Django Like mechanism. Database performance question

I have a CustomUser model and a Post model. I am considering adding a lightweight like mechanism to the posts.
What comes to my mind is defining a Like model in such a fashion as to connect the models to each other:
class LikeFeedback(models.Model):
    likingUser = models.ForeignKey(CustomUser)
    post_liked = models.ManyToManyField(Post)
But this design produces a new row in the database with each like.
Another option is to define the CustomUser and Post models in such a way that:
class Post(models.Model):
    ...
    users_liked = models.ManyToManyField(CustomUser)

class CustomUser(models.Model):
    ...
    posts_liked = models.ManyToManyField(Post)
I am not sure if this approach creates a new row or uses a different indexing mechanism, but it looks tidier.
In terms of DB performance, which approach is the fastest? Do I need to define the ManyToMany connection in both models to speed up DB processes? Because 15 posts are to be displayed on the webpage at once, and with every post it is necessary to check whether the visitor already liked the post. Also, with each like and takeback a write operation is performed on the DB.
I am not sure if this approach creates a new row or uses a different indexing mechanism, but it looks tidier.
A ManyToManyField will create an extra table, called a junction table [wiki], with ForeignKeys to the model where you define the ManyToManyField and to the model that the ManyToManyField targets.
You furthermore only need one ManyToManyField; otherwise you create two relations that act independently. You thus model this as:
from django.conf import settings
from django.db import models


class Post(models.Model):
    # ...
    likes = models.ManyToManyField(
        settings.AUTH_USER_MODEL,
        related_name='liked_posts'
    )


class CustomUser(models.Model):
    # ...
    # no ManyToManyField to Post
Note: It is normally better to make use of the settings.AUTH_USER_MODEL [Django-doc] to refer to the user model, than to use the User model [Django-doc] directly. For more information you can see the referencing the User model section of the documentation.
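As for checking whether the visitor has already liked each of the 15 posts on a page, that can usually be done with a single extra query. A rough sketch using the fields above (the view function and the liked_by_me attribute are illustrative, and an authenticated user is assumed):

def posts_with_like_status(request):
    posts = list(Post.objects.order_by('-id')[:15])
    # one query: which of the displayed posts the current user has liked
    liked_ids = set(
        request.user.liked_posts.filter(
            pk__in=[post.pk for post in posts]
        ).values_list('pk', flat=True)
    )
    for post in posts:
        post.liked_by_me = post.pk in liked_ids
    return posts

Adding or removing a like is then post.likes.add(request.user) or post.likes.remove(request.user), each a single write against the junction table.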

Call nested Serializer's .update() method

I have a JSONField in my model that stores some configuration data. I want to access this field (both read and write) with the ability to make partial updates of inner fields and their values.
For the purpose of example, let the model be called MyModel, with a JSONField called config:
class MyModel(models.Model):
    config = JSONField(default=dict)
    ...
I created a separate ViewSet to access the information stored in the config field. Assume that the user model has a ForeignKey relation to MyModel. A simplified version of this ViewSet is:
class ConfigurationFieldViewSet(mixins.RetrieveModelMixin,
                                mixins.UpdateModelMixin,
                                viewsets.GenericViewSet):
    serializer_class = MyModelConfigurationSerializer

    def get_object(self):
        return self.request.user.my_model
Data stored in config has a certain structure with several possible inner objects:
{
    "C1": {"counter": 42, "active": false},
    "C2": {"counter": 13, "active": true}
}
To access and correctly serialize the MyModel instance at all levels of nesting I have created serializers for each level of the field. To access the config field on MyModel itself I'm using this serializer:
class MyModelConfigurationSerializer(serializers.ModelSerializer):
    configuration = ConfigurationFieldSerializer(required=True)

    class Meta:
        model = MyModel
        fields = ('configuration',)
To access and serialize the first layer of the configuration field there's a second serializer:
class ConfigurationFieldSerializer(serializers.Serializer):
    C1 = BaseConfigurationSerializer(required=True)
    C2 = BaseConfigurationSerializer(required=True)
Finally, to access the inner structure of the C1 and C2 fields there's a third serializer:
class BaseConfigurationSerializer(serializers.Serializer):
    counter = serializers.IntegerField(
        required=False,
        help_text=_('Some integer field help text')
    )
    active = serializers.BooleanField(
        required=False,
        help_text=_('Some boolean field description')
    )
The code above works perfectly for reading the data stored in the config field and correctly serializes its inner objects. The problem appears when I try to perform a PUT on this field.
If I override the update method at the level of MyModelConfigurationSerializer, the serializers validate the data I submit, but only as one chunk, and I can only save it all at once. If I try to submit just some inner field I still correctly receive validation errors from the inner serializers.
def update(self, instance, validated_data):
    instance.configuration = validated_data.get(
        'configuration', instance.configuration
    )
    instance.save()
    return instance
What I'm unable to do, though, is call the update methods of the inner serializers (ConfigurationFieldSerializer and BaseConfigurationSerializer in this case): if I implement their update methods, they simply never get called.
According to the DRF documentation, writable nested representations are possible, and the corresponding update or create methods should be called whenever update is called on the top-level serializer.
I've had this problem recently as well, and it looks like the way you are doing it is the "only way" when it comes to nested writable serializers.
From the same DRF docs you've probably already seen:
Because the behavior of nested creates and updates can be ambiguous, and may require complex dependencies between related models, REST framework 3 requires you to always write these methods explicitly. The default ModelSerializer .create() and .update() methods do not include support for writable nested representations.
There are however, third-party packages available such as DRF Writable Nested that support automatic writable nested representations.
Basically this means that when you have nesting, DRF won't even try to call any of the nested serializers' storage methods.
That may seem like a bit of a pain, but in retrospect it's probably better for the design of your application. Your example is pretty simple, but in other situations the order in which things are saved might be important. If the update of each nested serializer were run automatically, DRF would somehow have to know when to save each thing.
As an example, if your case were about create rather than update, you would need to first store MyModel before storing the configuration on top of it. However, DRF cannot know that.
Also it could as easily have been that configuration was actually another related model which needed to be saved first before you could save a relation to it from MyModel. So DRF takes the route of just telling you to do it yourself, at the root serializer.
From my own experience this is also helpful to allow you to fine-tune performance later (ex. in your case you can avoid saving MyModel twice).
Finally, if you want to make your code more modular, you can still do it (send segments of the validated data to different handlers, e.g. to a new update_configurations() function); it just won't be done automatically using the nested serializers.
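A minimal sketch of that kind of delegation, reusing the serializer from the question (update_configurations is just a hypothetical helper here, not a DRF hook):

class MyModelConfigurationSerializer(serializers.ModelSerializer):
    configuration = ConfigurationFieldSerializer(required=True)

    class Meta:
        model = MyModel
        fields = ('configuration',)

    def update(self, instance, validated_data):
        config_data = validated_data.pop('configuration', None)
        if config_data is not None:
            self.update_configurations(instance, config_data)
        instance.save()
        return instance

    def update_configurations(self, instance, config_data):
        # merge the validated inner objects into the stored JSON instead of
        # replacing the whole blob; adjust the attribute name to your model field
        current = instance.configuration or {}
        for key, inner in config_data.items():
            current.setdefault(key, {}).update(inner)
        instance.configuration = current

The nested serializers still do all of the validation; the root serializer just decides how the validated pieces get written.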

How to think about Django's normal class based views vs. using a REST API

I've been writing a webapp with Django to replace a clumsy, spreadsheet based sports picking game that I play with some friends. I've learned a lot, and had a great time getting to know Django and how to build something like this from scratch.
I recently realized that I wanted to use something more powerful on the frontend (Ember, Angular, etc) with the end goal being a single page app. To that end, I installed Django REST Framework (DRF) and started reading the docs and following the tutorial. It's super interesting, and I'm finally starting to see why a client-server model with an API is really the only way to achieve the smooth interactivity that's all over now.
I'm trying to implement one of my class based views as an API endpoint, and I've been having a lot of trouble conceptualizing it. I thought I'd start with a simple, GET-only endpoint. Here's the simple CBV I'm trying to replicate in API form:
class MatchupDetail(DetailView):
    template_name = 'app/matchups.html'
    context_object_name = 'pick_sheet'

    def get_object(self):
        # logic to find and return object

    def get_opponent(self, username, schedule, week, **kwargs):
        # logic to find and return the opponent in the matchup

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        # logic to pull the opponent's details and set them in the context
I feel like I have a handle on this flow: a user clicks a link, and this view retrieves the object at the heart of the requested page, supplements it with content in the context, then renders it.
As I began thinking about turning this into an API endpoint, it didn't make a whole lot of sense. Should I be putting all the user-related data into a single JSON response? Or should the frontend basically handle the flow of this logic and the API simply be composed of a collection of endpoints- for example, one to retrieve the object, and one or more to retrieve what's now being passed in the context?
What prompted me to make this post was some trouble with my (super basic) API implementation of the above view:
class MatchupDetailApi(generics.ListAPIView):
    queryset = Sheet.objects.all()
    serializer_class = SheetSerializer
With serializer:
class SheetSerializer(serializers.ModelSerializer):
    user = serializers.ReadOnlyField()

    class Meta:
        model = Sheet
I added the user field when I noticed that without it, the returned serialized Sheet objects are literally just the rows in the database: an integer ID, an integer foreign key to the User object, and so on. With a 'traditional' CBV the entire object is passed to the template, so it's very intuitive to access related fields, and with Django it's also easy to traverse object relationships.
Does a REST implementation offer the same sort of thing? From what I've read, it seems like I'd need an extension to DRF (django-rest-multiple-models) to return more than one model in a single response, which leads me to think I should be creating endpoints for every model and leaving presentation logic to the frontend. Is that typical? Or is it feasible to have an API endpoint that returns something like an object and several related objects?
Note: the basic endpoint above stopped working when I added the user to the SheetSerializer. I realized I should have a UserSerializer as well, which is:
class UserSerializer(serializers.HyperlinkedModelSerializer):
    class Meta:
        model = User
However, when I try to browse the API, I get a TypeError that the first user isn't serializable. Specifically: <User: dkhaupt> is not JSON serializable. Isn't this what the UserSerializer is for?
Is it feasible to have an API endpoint that does return something like an object and several related objects?
Yes!
And it sounds like you are off to a great start. I would structure it something like this:
class UserSerializer(serializers.ModelSerializer):
    """serializes a user"""

    class Meta:
        model = User
        fields = ('id', 'first_name', 'last_name',)


class SheetSerializer(serializers.ModelSerializer):
    """serializes a sheet, and nests the user relationship"""

    user = UserSerializer(read_only=True)

    class Meta:
        model = Sheet
        fields = ('id', 'sheet_name', 'user',)
I don't think you need django-rest-multiple-models for what you are trying to achieve. In my sketch (where I'm guessing field names) the serializer returns the sheet along with the associated user object.
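To expose that as an endpoint, a minimal sketch for a single-sheet detail view (the class name mirrors yours; RetrieveAPIView is a reasonable fit for a detail page) could be:

from rest_framework import generics

class MatchupDetailApi(generics.RetrieveAPIView):
    queryset = Sheet.objects.all()
    serializer_class = SheetSerializer

A GET on that endpoint then returns the sheet's fields with the related user nested as its own JSON object, rather than as a bare foreign-key integer.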
You can also add fields from a related model using the source attribute. For example:
class SheetSerializer(serializers.ModelSerializer):
    user_id = serializers.ReadOnlyField(source='user.user_id')
    username = serializers.ReadOnlyField(source='user.username')

    class Meta:
        model = Sheet
Here the serializer returns the information from the related user model as flat fields on the Sheet representation.
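For illustration only, with those guessed field names, the flat response for a single sheet might look roughly like:

{
    "id": 1,
    "sheet_name": "week 3 picks",
    "user_id": 7,
    "username": "dkhaupt"
}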

django models and OOP design

I wrote a quest system for an online game. My quests are serialized into JSON objects for a JavaScript client that fetches them from a REST backend (I use Django REST Framework).
Now I'm wondering on which class or django model I should put the "behaviour" that belongs to the data.
I stored the data that belongs to a quest in several separate models:
A model QuestHistory: with model fields like the Boolean completed and the DateTime started, where I put the information belonging to a specific user (it also has a user field).
Then I have a model QuestTemplate: the part that is always the same, with fields like quest_title and quest_description.
I also have a model Rewards and models Task and TaskHistory that are linked to a quest with a foreign key field.
To combine this information back into a quest I created a pure Python class Quest(object) and defined methods on it like check_quest_completion. This class is then later serialized. The problem with this approach is that it becomes quite verbose, for example when I instantiate the class or when I define the serializer.
Is there a python or django "shortcut" to put all fields of a django model into another class (my Quest class here), something similar to the dict.update method maybe?
Or should I try to put the methods on the models instead and get rid of the Quest class?
I have some other places in my game that look very similar to the quest system for example the inventory system so I'm hoping for a more elegant solution.
You should put the methods of the Quest class on the model itself and get rid of the Quest class.
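A sketch of what that could look like, based on the fields mentioned in the question (the quest_template foreign key and the task lookup are assumptions about your schema):

from django.conf import settings
from django.db import models

class QuestHistory(models.Model):
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    quest_template = models.ForeignKey('QuestTemplate', on_delete=models.CASCADE)
    completed = models.BooleanField(default=False)
    started = models.DateTimeField(auto_now_add=True)

    def check_quest_completion(self):
        # hypothetical rule: the quest is complete once no open task entries remain
        if self.taskhistory_set.filter(completed=False).exists():
            return False
        self.completed = True
        self.save(update_fields=['completed'])
        return True

The REST serializer can then be a plain ModelSerializer over QuestHistory, pulling the template data (quest_title, quest_description) in with source='quest_template.quest_title' style fields, so no extra Quest wrapper class is needed.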

How do I apply Django model Meta options to models that I did not write?

I want to apply the "ordering" Meta option to the Django model User from django.contrib.auth.models. Normally I would just put the Meta class in the model's definition, but in this case I did not define the model. So where do I put the Meta class to modify the User model?
This is how the Django manual recommends you do it:
You could also use a proxy model to define a different default ordering on a model. The standard User model has no ordering defined on it (intentionally; sorting is expensive and we don't want to do it all the time when we fetch users). You might want to regularly order by the username attribute when you use the proxy. This is easy:
class OrderedUser(User):
    class Meta:
        ordering = ["username"]
        proxy = True
Now normal User queries will be unordered and OrderedUser queries will be ordered by username.
Note that for this to work you will need to have a trunk checkout of Django as it is fairly new.
If you don't have access to it, you will need to get rid of the proxy part and implement it that way, which can get cumbersome. Check out this article on how to accomplish this.
Paolo's answer is great; I wasn't previously aware of the new proxy support. The only issue with it is that you need to target your code to the OrderedUser model - which is in a sense similar to simply doing a User.objects.filter(....).order_by('username'). In other words, it's less verbose but you need to explicitly write your code to target it. (Of course, as mentioned, you'd also have to be on trunk.)
My sense is that you want all User queries to be ordered, including in third party apps that you don't control. In such a circumstance, monkeypatching the base class is relatively easy and very unlikely to cause any problems. In a central location (such as your settings.py), you could do:
from django.contrib.auth.models import User

# the options declared in Meta are collected onto _meta when the class is created
User._meta.ordering = ['username']
UPDATE: Django 1.5 now supports configurable User models.
You can either subclass User:
class OrderedUser(User):
    class Meta:
        ordering = ['-id', 'username']
Or you could use the ordering in ModelAdmin:
class UserAdmin(admin.ModelAdmin):
    ordering = ['-id', 'username']

# unregister User since it's already been registered by auth
admin.site.unregister(User)
admin.site.register(User, UserAdmin)
Note: the ModelAdmin method will only change the ordering in the admin; it won't change the ordering of ordinary queries.
Contact the author and ask them to make a change.
