I have BaseSchema and BaseDocument models that define the necessary configuration for two types of models in my app: one for the application's request-response side of things, and the other for defining MongoDB documents using beanie. For a simple setup, I have defined my models like this:
from datetime import datetime, timezone
from typing import Any
from beanie import Document
from pydash import camel_case
from pydantic import BaseModel, Extra
def format_datetime_into_isoformat(timestamp: datetime) -> str:
    return timestamp.replace(tzinfo=timezone.utc).isoformat().replace("+00:00", "Z")

def format_dict_key_to_camel_case(key: str) -> str:
    return camel_case(key)
class BaseDocument(BaseModel):
    class Config:
        orm_mode: bool = True
        use_enum_values: bool = True
        validate_assignment: bool = True
        allow_population_by_field_name: bool = True
        json_encoders: dict = {datetime: format_datetime_into_isoformat}

class BaseSchema(BaseModel):
    class Config:
        extra = Extra.forbid
        allow_population_by_field_name: bool = True
        alias_generator: Any = format_dict_key_to_camel_case
class UserBase(BaseDocument):
    user_name: str
    email: str
    password: str

class User(Document, UserBase):
    class Settings:
        name = "users"
        indexes = ["user_name", "email"]

class UserAPI(BaseSchema, UserBase):
    _id: str
The problem is, when I try to use the UserAPI model, it only contains the fields defined on the UserBase model, not the _id field that I defined in the model itself. I am unable to figure out what's going on with the inheritance here, but I am pretty sure that fields defined in the model itself should take priority over any fields in the models it inherits from, and even override the ones with the same names. Does anyone know what's wrong here?
I have also printed the schema for the UserAPI model; it outputs as such:
{'title': 'UserAPI', 'type': 'object', 'properties': {'userName': {'title': 'Username', 'type': 'string'}, 'email': {'title': 'Email', 'type': 'string'}, 'password': {'title': 'Password', 'type': 'string'}}, 'required': ['userName', 'email', 'password'], 'additionalProperties': False}
I am expecting to have a UserBase model which defines the general schema, and a User model which maps to a MongoDB collection using beanie by inheriting from both Document and UserBase (this part works perfectly fine). And then I want a UserAPI model which adds another field on top of UserBase, so that I can include the ids in my output after a record is saved in the MongoDB collection.
The leading underscore in "_id" makes it a private class attribute and excludes it from the model's fields. Pydantic info here
You can work around this by calling the attribute "id" and using an alias with an underscore:
from bson import ObjectId
from pydantic import Field

class UserAPI(BaseSchema, UserBase):
    id: ObjectId = Field(None, alias="_id", title="Primary key", description="MongoDB's primary key")
inspiration from this
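For reference, a minimal sketch of how the aliased field behaves (this is only an assumption-laden illustration: it uses beanie's PydanticObjectId for the annotation, since pydantic v1 cannot validate a raw bson ObjectId without arbitrary_types_allowed, and the sample values are made up):

from beanie import PydanticObjectId
from pydantic import Field

class UserAPI(BaseSchema, UserBase):
    # populated from a MongoDB document via the "_id" alias
    id: PydanticObjectId = Field(None, alias="_id")

doc = {"_id": "64a1f0c2e4b0f2a5d6c7b8a9", "userName": "jdoe",
       "email": "jdoe@example.com", "password": "secret"}
user = UserAPI.parse_obj(doc)
print(user.id)                   # 64a1f0c2e4b0f2a5d6c7b8a9
print(user.dict(by_alias=True))  # keys come out as "_id" and the camelCase aliases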
Currently I have a schema with the field "name". I use constr to specify it:
from pydantic import BaseModel, constr

class MySchema(BaseModel):
    name: constr(strict=True, min_length=1, max_length=50)
I want to use the pydantic StrictStr type like this:
from pydantic import BaseModel, StrictStr, Field

class MySchema(BaseModel):
    name: StrictStr = Field(min_length=1, max_length=50)
But it raises an error:
E ValueError: On field "name" the following field constraints are set but not enforced: max_length, min_length.
E For more details see https://pydantic-docs.helpmanual.io/usage/schema/#unenforced-field-constraints
In the docs, as I understand it, the advice is to use the raw attribute name like maxLength instead of max_length (like exclusiveMaximum for int), but those constraints are not enforced, so the validations are not applied.
My question is: how can I use the StrictStr type for the name field and apply native validations like min_length and max_length?
You have multiple options here: either you create a new type based on StrictStr, you inherit from StrictStr, or you use constr with strict set to True. Creating a type as done below does the same as inheriting from StrictStr, just with a different syntax. All of these give you the necessary type validations. In code, that would read like:
from pydantic import StrictStr, constr

MyStrictStr1 = type('MyStrictStr', (StrictStr,), {"min_length": 1, "max_length": 5})

class MyStrictStr2(StrictStr):
    min_length = 1
    max_length = 5

MyStrictStr3 = constr(min_length=1, max_length=5, strict=True)
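A quick sanity check of how these constrained types behave in a model (assuming pydantic v1; the model and values below are made up for illustration):

from pydantic import BaseModel, ValidationError

class Item(BaseModel):
    name: MyStrictStr3  # any of the three variants above works the same way

Item(name="abc")   # ok
try:
    Item(name="")  # min_length violated
except ValidationError as e:
    print(e)
try:
    Item(name=123)  # rejected: strict means no coercion from int to str
except ValidationError as e:
    print(e)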
I have several complex mongo models. For example, a User that references a Role with some properties. Now when I retrieve users, I want the role property to be populated with the fields of the referenced Role object, not the object id.
from mongoengine import *

connect('test_database')

class Role(Document):
    name = StringField(required=True)
    description = StringField(required=True)

class User(Document):
    role = ReferenceField(Role, reverse_delete_rule=DENY)

r = Role(name='test', description='foo').save()
User(role=r).save()

print(User.objects().select_related()[0].to_mongo().to_dict())
# prints: {'_id': ObjectId('5c769af4e98fc24f4a82fd99'), 'role': ObjectId('5c769af4e98fc24f4a82fd98')}
# want:   {'_id': '5c769af4e98fc24f4a82fd99', 'role': {'name': 'test', 'description': 'foo'}}
How do I go about achieving this, for any complex mongoengine object?
Mongoengine does not provide anything out of the box, but you can either define a method (e.g. to_dict(self)) on your Document class, or use a serialisation library like marshmallow-mongoengine.
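As an illustration of the first option, a minimal hand-rolled helper (the function name and the exact output format are only assumptions for this sketch):

def user_to_dict(user: User) -> dict:
    # accessing user.role dereferences the Role document,
    # so its fields can be inlined instead of the raw ObjectId
    return {
        '_id': str(user.id),
        'role': {'name': user.role.name, 'description': user.role.description},
    }

print(user_to_dict(User.objects.first()))
# {'_id': '...', 'role': {'name': 'test', 'description': 'foo'}}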
Using Django REST Framework, I want to validate fields.
Correct input request:
{
    "test_field_a": {"test_field_c": 25},
    "test_field_b": {}
}
My serializers.py (I don't have any associated models, nor a models.py at all):
from rest_framework import serializers

class TestSerializer(serializers.Serializer):
    test_field_a = serializers.JSONField(label='test_field_a', allow_null=False, required=True)
    test_field_b = serializers.JSONField(label='test_field_b', required=True)
    test_field_c = serializers.IntegerField(label='test_field_c')
Wrong input request (which should state that an int field is required):
{
    "test_field_a": {"test_field_c": "wrong"},
    "test_field_b": {}
}
Now test_field_a and test_field_b are validated as required. But how do I validate fields on different levels of the request (in this case test_field_c)?
JSONField just checks that a field contains a correct JSON structure. You need to do that plus check the values inside this JSON.
There are several ways to do it:
You can write your own custom field type (it's nice if you are planning to do something similar in other serializers); a sketch of this option is shown after the list;
You can change the field validation (try something like this):
from rest_framework import serializers

class TestSerializer(serializers.Serializer):
    test_field_a = serializers.JSONField(label='test_field_a', allow_null=False, required=True)
    test_field_b = serializers.JSONField(label='test_field_b', required=True)

    def validate_test_field_a(self, value):
        """
        Check that test_field_a contains an integer test_field_c.
        """
        if not isinstance(value.get('test_field_c'), int):
            raise serializers.ValidationError("Some error message")
        return value
You can try nested validation:
from rest_framework import serializers

class Test1Serializer(serializers.Serializer):
    test_field_c = serializers.IntegerField(label='test_field_c')

class TestSerializer(serializers.Serializer):
    test_field_a = Test1Serializer()
    test_field_b = serializers.JSONField(label='test_field_b', required=True)
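For the first option, a rough sketch of what a custom field type could look like (the class name and error message are made up; it simply reuses JSONField and adds the nested check):

from rest_framework import serializers

class IntKeyJSONField(serializers.JSONField):
    # JSONField that additionally requires an integer 'test_field_c' key
    def to_internal_value(self, data):
        data = super().to_internal_value(data)
        if not isinstance(data.get('test_field_c'), int):
            raise serializers.ValidationError("'test_field_c' must be an integer")
        return data

class TestSerializer(serializers.Serializer):
    test_field_a = IntKeyJSONField(allow_null=False, required=True)
    test_field_b = serializers.JSONField(required=True)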
The serializer's JSONField does not have validation for nested fields because it is not meant to nest explicitly declared fields, and as far as I know there is currently no way to specify a JSON schema to validate it.
What you can do is validate the field yourself by declaring a validate_test_field_a validation method.
For example:
def validate_test_field_a(self, value):
    if 'test_field_c' not in value:
        raise serializers.ValidationError('`test_field_c` is required')
    return value
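Assuming the method above is added to TestSerializer, validation then plays out roughly like this (the payload is made up for illustration):

serializer = TestSerializer(data={'test_field_a': {}, 'test_field_b': {}})
serializer.is_valid()
print(serializer.errors)
# roughly: {'test_field_a': ['`test_field_c` is required']}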
Generally, if you find yourself needing to validate the nested type inside a JSONField, it is a sign of bad architecture and you should consider using nested serializers instead. The same applies to using a JSONField in the model.
Overview
I am using Flask-SqlAlchemy and now I am looking into marshmallow to help me serialize and deserialize request data.
I was able to successfully:
Create my models using Flask-SqlAlchemy
Use Flask-Marshmallow to serialize database objects using the same model, by using the Optional Flask-SqlAlchemy Integration
Use marshmallow-jsonapi to quickly generate JSON API compliant responses. This required me to declare new Schemas to specify which attributes I want to include (this duplicates the Flask-SqlAlchemy models)
Code Samples
Flask-SqlAlchemy Declarative Model
class Space(db.Model):
    __tablename__ = 'spaces'
    id = sql.Column(sql.Integer, primary_key=True)
    name = sql.Column(sql.String)
    version = sql.Column(sql.String)
    active = sql.Column(sql.Boolean)
flask_marshmallow Schema Declaration (Inherits from SqlAlchemy Model)
ma = flask_marshmallow.Marshmallow(app)

class SpaceSchema(ma.ModelSchema):
    class Meta:
        model = Space

# API Response
space = Space.query.first()
return SpaceSchema().dump(space).data

# Returns:
{
    'id': 123,
    'version': '0.1.0',
    'name': 'SpaceName',
    'active': True
}
marshmallow-jsonapi: requires a new Schema declaration; each attribute and type must be included manually
class SpaceJsonSchema(marshmallow_json.Schema):
    id = fields.Str(dump_only=True)
    name = fields.Str()
    version = fields.Str()
    active = fields.Bool()

    class Meta:
        type_ = 'spaces'
        self_url = '/spaces/{id}'
        self_url_kwargs = {'id': '<id>'}
        self_url_many = '/spaces/'
        strict = True
# Returns JSON API compliant output
{
    'data': {
        'id': '1',
        'type': 'spaces',
        'attributes': {
            'name': 'Phonebooth',
            'active': True,
            'version': '0.1.0'
        },
        'links': {'self': '/spaces/1'}
    },
    'links': {'self': '/spaces/1'}
}
Issue
As shown in the code, marshmallow-jsonapi allows me to create JSON API compliant responses, but I end up having to maintain a declarative model plus a separate Schema response model.
flask-marshmallow allows me to create Schema responses from the SqlAlchemy models, so I don't have to maintain a separate set of properties for each model.
Question
Is it at all possible to use flask-marshmallow and marshmallow-jsonapi together to 1) create a Marshmallow Schema from a SqlAlchemy model, AND 2) automatically generate JSON API compliant responses?
I tried creating a Schema declaration that inherited from both ma.ModelSchema and marshmallow_json.Schema, in both orders, but it does not work (it raises exceptions about missing methods and properties).
marshmallow-jsonapi
marshmallow-jsonapi provides a simple way to produce JSON API-compliant data in any Python web framework.
flask-marshmallow
Flask-Marshmallow includes useful extras for integrating with Flask-SQLAlchemy and marshmallow-sqlalchemy.
Not a solution to this exact problem, but I ran into similar issues when implementing this library: https://github.com/thomaxxl/safrs (sqlalchemy + flask-restful + JSON API compliant spec).
I don't remember exactly how I got around it, but if you try it and serialization doesn't work, I can help you resolve it if you open an issue on GitHub.
I have parent-child relationships in my Datastore model: a Building entity with a reference to an Office entity. I perform a query on the Building model and I would like to limit the fields of the Office entity in the JSON response.
Here is my code:
@Building.query_method(collection_fields=('id', 'name', 'office'), path='buildings', name='list')
def List(self, query):
    return query
The collection_fields attribute works great, but only for defining the parent entity's (Building) fields. How do I limit the fields of the child entity?
Here is my response message in JSON:
{
    "id": 5,
    "name": "building name",
    "office": {
        "name": "office name",
        "field1": "test",
        "field2": "test",
        "field3": "test"
    }
}
I would like to remove some fields from the Office object (i.e. field1, field2, etc.) to reduce the JSON response size.
Defining limited_message_fields_schema on the Office object is not a good solution, because it works globally. I would like to format only this single query.
You can create an EndpointsAliasProperty on the Building model, where you can transform self.office, and use that value in collection_fields:
@EndpointsAliasProperty
def office_ltd(self):
    limited = doSomethingWith(self.office)
    return limited

@Building.query_method(collection_fields=('id', 'name', 'office_ltd'),
                       path='buildings', name='list')
def List(self, query):
    return query
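doSomethingWith is only a placeholder in the snippet above; one possible (purely hypothetical) implementation, assuming the alias property is exposed as a plain string value, is to serialize just the Office fields you want:

import json

def doSomethingWith(office):
    # keep only the Office fields that should appear in the response
    return json.dumps({'name': office.name})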