I have a JSONField in my model that stores some configuration data. I want to access this field (both read and write) with ability to make partial updates of inner fields and their values.
For the purpose of example, let the model be called MyModel, with a JSONField called config:
class MyModel(models.Model):
    config = JSONField(default=dict)  # pass the callable `dict`, not `dict()`, so instances don't share one mutable default
    ...
I created a separate ViewSet to access the information stored in the config field. Assume that the user model has a ForeignKey relation to MyModel. A simplified version of this ViewSet is:
class ConfigurationFieldViewSet(mixins.RetrieveModelMixin,
                                mixins.UpdateModelMixin,
                                viewsets.GenericViewSet):
    serializer_class = MyModelConfigurationSerializer

    def get_object(self):
        return self.request.user.my_model
Data stored in config has a certain structure with several possible inner objects:
{
    "C1": {"counter": 42, "active": false},
    "C2": {"counter": 13, "active": true}
}
To access and correctly serialize the MyModel instance at all levels of nesting, I created a serializer for each level. To access the config field on MyModel itself I'm using this serializer:
class MyModelConfigurationSerializer(serializers.ModelSerializer):
    configuration = ConfigurationFieldSerializer(required=True)

    class Meta:
        model = MyModel
        fields = ('configuration',)
To access and serialize the first layer of the configuration field, there's a second serializer:
class ConfigurationFieldSerializer(serializers.Serializer):
    C1 = BaseConfigurationSerializer(required=True)
    C2 = BaseConfigurationSerializer(required=True)
Finally, to access the inner structure of the C1 and C2 fields, there's a third serializer:
class BaseConfigurationSerializer(serializers.Serializer):
    counter = serializers.IntegerField(
        required=False,
        help_text=_('Some integer field help text')
    )
    active = serializers.BooleanField(
        required=False,
        help_text=_('Some boolean field description')
    )
The code above works perfectly for reading the data stored in the config field and correctly serializes its inner objects. The problem appears when I try to perform a PUT on this field.
If I override the update method at the level of MyModelConfigurationSerializer, the serializers validate the data I submit, but only as one chunk, and I can only save it all at once. If I submit an invalid inner field I still correctly receive validation errors from the inner serializers:
def update(self, instance, validated_data):
    instance.configuration = validated_data.get(
        'configuration', instance.configuration
    )
    instance.save()
    return instance
What I'm unable to do, though, is call the update methods of the inner serializers (ConfigurationFieldSerializer and BaseConfigurationSerializer in this case): if I implement their update methods, they simply never get called.
According to the DRF documentation, writable nested representations are possible, and the corresponding update or create methods should be called whenever update is called on the top-level serializer.
I've had this problem recently as well, and it looks like the way you are doing it is the "only way" when it comes to nested writable serializers.
From the same DRF docs you've probably already seen:
Because the behavior of nested creates and updates can be ambiguous, and may require complex dependencies between related models, REST framework 3 requires you to always write these methods explicitly. The default ModelSerializer .create() and .update() methods do not include support for writable nested representations.
There are however, third-party packages available such as DRF Writable Nested that support automatic writable nested representations.
Basically it means that when you have nesting, DRF won't even try to call any of the nested serializers' storage methods.
That may seem like a bit of a pain, but in retrospect it's probably better for the design of your application. Your example is pretty simple, but in other situations the order in which things are saved might matter. If the update of each nested serializer were run automatically, DRF would somehow have to know when to save each thing.
As an example, if your case were about create rather than update, you would need to store your MyModel instance first, before storing the configuration on top of it. However, DRF cannot know that.
It could just as easily have been that configuration was actually another related model which needed to be saved first, before you could save a relation to it from MyModel. So DRF takes the route of just telling you to do it yourself, at the root serializer.
From my own experience this is also helpful to allow you to fine-tune performance later (ex. in your case you can avoid saving MyModel twice).
Finally, if you want to make your code more modular, you can still do it (send segments of the validated data to different handlers, e.g. to a new update_configurations() function); it just won't be done automatically by the nested serializers.
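As a sketch of that modular approach (the helper name deep_update is made up, and this is plain Python so it works outside DRF too), the root serializer's update() can dispatch each validated segment to a small merge helper instead of overwriting the whole config dict:

```python
def deep_update(target, patch):
    """Recursively merge `patch` into `target`, touching only the keys
    that are actually present in the patch."""
    for key, value in patch.items():
        if isinstance(value, dict) and isinstance(target.get(key), dict):
            deep_update(target[key], value)
        else:
            target[key] = value
    return target

# Inside the root serializer's update() you would call something like:
#     deep_update(instance.config, validated_data['configuration'])
#     instance.save()
config = {
    "C1": {"counter": 42, "active": False},
    "C2": {"counter": 13, "active": True},
}
deep_update(config, {"C1": {"active": True}})
print(config["C1"])  # {'counter': 42, 'active': True}
```

This achieves the partial update the question asks about (only C1's active flag changes, the rest survives) while still keeping all the saving logic in the root serializer, as DRF expects.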
Related
I receive a number of similar objects as an input for my API, and I deserialize them using my own serializer with the parameter many=True like this:
serializer = MySerializer(data=request.data, many=True)
The serializer is an instance of ListSerializer.
Then I need to make sure that certain combinations of objects are present in that list. However, I don't seem to find a way to write a .validate() method on the ListSerializer, or to replace it with my own ListSerializer implementation.
Is there a way to do this validation in the serializer, or do I have to iterate over the deserialized objects and check them?
The Django REST framework documentation has a section on customizing ListSerializer behavior.
This entails creating a custom subclass of ListSerializer. You would probably want to create some custom validation in your subclass.
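The cross-object check itself is plain Python over the incoming list, so it can be sketched without DRF (the "kind" field and the rule below are invented for illustration). In a real ListSerializer subclass this logic would live in its validate(self, data) method, raising serializers.ValidationError instead of ValueError, and the child serializer would point to it via Meta.list_serializer_class:

```python
def validate_combinations(data):
    """Cross-object validation over a deserialized list of dicts.

    Hypothetical rule: a batch containing any 'order' item must also
    contain at least one 'payment' item.
    """
    kinds = {item.get("kind") for item in data}
    if "order" in kinds and "payment" not in kinds:
        raise ValueError("a batch with orders must also contain a payment")
    return data

validate_combinations([{"kind": "order"}, {"kind": "payment"}])  # passes
```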
I have a model with a custom manager (replacing what used to be properties) that annotates its queryset with pseudo-fields (it adds some annotations inside get_queryset). That's great because I get one database query per list view, and so on.
But in some cases this is a problem. For example, I want to read my_annotated_field inside a signal handler, or in a Django REST Framework ViewSet on save (where the object was not fetched via my manager). The model class doesn't have those fields.
Is there a way to solve this without an extra query like obj = MyModel.objects.get(pk=obj.id)?
I'm writing an application in Django (which I'm very new to) where the admin area will be exposed to 'customers' of the application, not just staff/superusers, because of the nature of the application and the way Django automatically generates admin forms with so little code.
As such I need a robust and manageable way to maintain authentication and separate data, so that data created by a user is seen only by that user.
At the moment I'm just using the default admin package and changing permissions for 'client users' to filter what data they can see (I only want them to see data they've created), using code like the below:
class MyModelAdmin(admin.ModelAdmin):
    def get_queryset(self, request):
        qs = super(MyModelAdmin, self).get_queryset(request)
        return qs.filter(user=request.user)

    def save_model(self, request, obj, form, change):
        # field not editable in admin area so handle it here...
        obj.user = request.user
        obj.save()
However, as the application scales, I can see this kind of data filtering becoming difficult to manage, for example if there are chains of foreign keys between tables (A->B, B->C, C->D): to filter the table at the end of the chain I need various JOINs to get the rows that relate to the current user.
A couple of solutions I'm pondering are creating a separate admin app per user, but this feels like overkill and even more unmanageable.
Or just adding the user column to every Model where data filtering by user is required to make it easier to filter.
Any thoughts on the best approach to take?
First off, from experience, you're better off offering editing and creating functionality to your users in an actual django app, using Views. Generic views make this very easy. Once you let your users into the admin, they will get used to it and it's hard to get them to leave.
Additionally you should use contrib.auth.Group together with django-guardian to keep track of object-level permissions instead of implementing it yourself. It pays off in the long run.
If you want to build this yourself, however, you have more than one sensible choice:
owner on root objects in the ForeignKey pyramid
owner on every model
To realize the first option, you should implement two methods on every model down the ForeignKey chain:
def get_parent(self):
    """Return the object that should be queried for ownership information."""
    raise NotImplementedError  # each model returns its ForeignKey target here

def owned_by(self, user):
    """Return whether `user` owns this object, by delegating up the chain."""
    return self.get_parent().owned_by(user)
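A minimal, Django-free sketch of this delegation chain (the Project/Task names and attributes are invented for illustration) shows how the ownership question walks up the ForeignKey pyramid to the root object that actually stores the owner:

```python
class Project:
    """Root of the pyramid: stores the owner directly."""
    def __init__(self, owner):
        self.owner = owner

    def owned_by(self, user):
        return self.owner == user


class Task:
    """Child object: delegates ownership to its parent project."""
    def __init__(self, project):
        self.project = project

    def get_parent(self):
        return self.project

    def owned_by(self, user):
        return self.get_parent().owned_by(user)


project = Project(owner="peter")
task = Task(project)
print(task.owned_by("peter"))  # True
print(task.owned_by("alex"))   # False
```

With real models each hop in the chain becomes a JOIN, which is exactly the cost discussed next.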
However, as you stated, this incurs many JOINs if your schema is sufficiently complex.
I would advise you to store the information about the owner in every model; everything else is a maintenance nightmare in my experience.
Instead of adding the field manually to every model, you should use inheritance. An abstract base model can define the owner models.ForeignKey (give it a related_name containing %(class)s so the reverse accessors don't clash), but an abstract base has no table of its own, so you cannot query all owned objects in one place. For that you need table-based (concrete) inheritance, which brings another problem with itself. Consider these models:
from django.conf import settings
from django.db import models

class Base(models.Model):
    owner = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)

class ChildA(Base):
    name = models.CharField(max_length=5)

class ChildB(Base):
    location = models.CharField(max_length=5)
It is easy to find the owner of a given instance of ChildA or ChildB:
>>> obj = ChildA.objects.create(owner=Peter, name="alex")
>>> obj.owner
Peter
However it is non-trivial to find all objects owned by a particular user:
>>> Base.objects.filter(owner=Peter)
<QuerySet [<Base: Base object (1)>]>
The default manager returns plain Base objects and doesn't tell you whether each one is actually a ChildA or a ChildB instance, which can be troublesome.
To circumvent this, I recommend a polymorphic approach with django-polymorphic or django-model-utils, which is more lightweight. They both provide means to retrieve the child classes for a given Base model in the queries.
See my answer here for more information on polymorphism in django.
These also incur JOINs, but at least the complexity is manageable.
I'm programming an online game with a JavaScript client and I use Django REST framework for the backend. I have written a quest system for it.
My quest objects are dynamically created from a Django model QuestTemplate, which stores information like the quest description and the title (the part that is the same for every user), and another model QuestHistory, where I put information about the state of the quest for a certain user: it has fields like user and completed. They also have some nested objects, Tasks and Rewards, which are created in a similar way to the Quest objects.
I added a pure Python class Quest that combines all the fields of those models, and then I wrote a serializer for this class. The drawback is that I have to define all the fields again in the QuestSerializer.
I have seen that for the ModelSerializer you can use an inner class Meta where you specify the model and its fields. Is there also a way to do this with a normal Python class instead of a model (with my Quest class)?
http://www.django-rest-framework.org/api-guide/serializers#specifying-nested-serialization
Or:
Is it possible to specify more than one model in this inner class, so that it takes fields from my model QuestTemplate and some other fields from my model QuestHistory?
(I'm also not sure about whether this structure makes sense and asked about it here: django models and OOP design )
In the class Meta of the ModelSerializer you can specify only one model, as far as I know. However, it is possible to add custom fields to the serializer. In your case you could try:
custom_field = serializers.SerializerMethodField('some_method_in_your_serializer')
You should add the method to your serializer like this:
def some_method_in_your_serializer(self, obj):
    # here comes your logic to get fields from other models, probably some query
    return some_value  # this is the value that ends up in your custom_field
And add the custom_field to fields in the class Meta:
class Meta:
    fields = ('custom_field', 'all_other_fields_you_need')
Take a look at the documentation on SerializerMethodField for a deeper understanding.
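The idea behind SerializerMethodField, deriving an output field from data outside the serializer's single model at serialization time, can be sketched without DRF (the field names below are taken from the question; serialize_quest itself is a hypothetical helper):

```python
def serialize_quest(template, history):
    """Combine fields from a QuestTemplate-like dict and a QuestHistory-like
    dict into one flat representation, the way a SerializerMethodField pulls
    extra data into a ModelSerializer's output."""
    return {
        "title": template["title"],
        "description": template["description"],
        "completed": history["completed"],
    }

quest = serialize_quest(
    {"title": "Slay the dragon", "description": "Find and defeat it."},
    {"user": "alice", "completed": False},
)
print(quest["title"])  # Slay the dragon
```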
Can anyone tell me if it's possible to create a Model class with some regular model fields and some other fields taking their data from external data sources? The point is that I would like this model to be usable the same way as any other model, by ModelForm for instance. That is, if I redefine the model's objects Manager to specify how to fetch the data for the special fields (those not backed by the database), would a ModelForm link its inputs to those non-database fields? A similar question applies to related objects: if I have a model with a relation to this special model, can I get its instances through the classic related-object accessors (with both the classic model fields and the non-database fields)?
Please tell me if I'm not clear, I'll reformulate.
Thanks.
EDIT: I tried building a model with custom fields and then overriding the default Manager and its functions (all, get, ...) so objects are returned just as they would be with a classical Model and Manager, and it works. However, I don't use a QuerySet, and it seems that the only way to get ModelForm, related objects and the admin functionality working with it is to build the QuerySet properly and have the manager return it. So now I'm wondering whether it's possible to properly and manually build a QuerySet from data fetched from external sources, or to tell django-admin, model forms and related objects to use another class than QuerySet for this model.
Thanks
The way is to define custom methods:
Define custom methods on a model to add custom "row-level"
functionality to your objects. Whereas Manager methods are intended to
do "table-wide" things, model methods should act on a particular model
instance.
This is a valuable technique for keeping business logic in one place
-- the model.
I now have a partial solution. I overrode the Manager, in particular its all() and get() functions (because those are the only ones I need for now). all() returns a queryset to which I add the result of some logic that builds objects from external data (fetched through XML-RPC in my case). I add those objects to the queryset through its _result_cache attribute.
I don't think it's clean, and in fact my model is now a custom model with no database fields at all. I may use it to fill database models... However, I can use it the same way as classic models: MyModel.objects.all(), for example.
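Stripped of Django, the shape of that workaround (all names here are invented; a plain list stands in for both the database hit and the XML-RPC call) is a manager whose all() splices externally fetched objects into the result it hands back:

```python
class ExternalManager:
    """Toy stand-in for a Django manager: `all()` returns locally stored
    objects plus objects rebuilt from an external source."""

    def __init__(self, local, fetch_external):
        self._local = local
        self._fetch_external = fetch_external  # e.g. an XML-RPC call

    def all(self):
        # In the real workaround this list is smuggled into a QuerySet
        # via its private _result_cache attribute.
        return list(self._local) + list(self._fetch_external())


manager = ExternalManager(["db-obj-1"], lambda: ["xmlrpc-obj-1", "xmlrpc-obj-2"])
print(manager.all())  # ['db-obj-1', 'xmlrpc-obj-1', 'xmlrpc-obj-2']
```

This makes the caveat concrete: the result merely looks like a queryset to callers, which is why ModelForm, related objects and the admin, all of which expect a real QuerySet, don't cooperate.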
If anyone has another idea I'd really appreciate.
Regards