Overview
I am using Flask-SQLAlchemy, and now I am looking into marshmallow to help me serialize and deserialize request data.
I was able to successfully:
Create my models using Flask-SQLAlchemy
Use Flask-Marshmallow to serialize database objects from the same models, via the optional Flask-SQLAlchemy integration
Use marshmallow-jsonapi to quickly generate JSON API-compliant responses. This required me to declare new Schemas to specify which attributes I want to include (duplicating the Flask-SQLAlchemy models)
Code Samples
Flask-SqlAlchemy Declarative Model
class Space(db.Model):
    __tablename__ = 'spaces'
    id = sql.Column(sql.Integer, primary_key=True)
    name = sql.Column(sql.String)
    version = sql.Column(sql.String)
    active = sql.Column(sql.Boolean)
flask_marshmallow Schema Declaration (references the SQLAlchemy model)
ma = flask_marshmallow.Marshmallow(app)

class SpaceSchema(ma.ModelSchema):
    class Meta:
        model = Space

# API Response
space = Space.query.first()
return SpaceSchema().dump(space).data
# Returns:
{
    'id': 123,
    'version': '0.1.0',
    'name': 'SpaceName',
    'active': True
}
marshmallow-jsonapi: requires a new Schema declaration; each attribute and its type must be included manually
class SpaceJsonSchema(marshmallow_json.Schema):
    id = fields.Str(dump_only=True)
    name = fields.Str()
    version = fields.Str()
    active = fields.Bool()

    class Meta:
        type_ = 'spaces'
        self_url = '/spaces/{id}'
        self_url_kwargs = {'id': '<id>'}
        self_url_many = '/spaces/'
        strict = True
# Returns a JSON API-compliant response:
{
    'data': {
        'id': '1',
        'type': 'spaces',
        'attributes': {
            'name': 'Phonebooth',
            'active': True,
            'version': '0.1.0'
        },
        'links': {'self': '/spaces/1'}
    },
    'links': {'self': '/spaces/1'}
}
Issue
As shown in the code, marshmallow-jsonapi lets me create JSON API-compliant responses, but I end up having to maintain both a declarative model and a separate schema/response model.
flask-marshmallow allows me to create Schema responses from the SQLAlchemy models, so I don't have to maintain a separate set of properties for each model.
Question
Is it at all possible to use flask-marshmallow and marshmallow-jsonapi together, so I can (1) create a marshmallow Schema from a SQLAlchemy model, AND (2) automatically generate JSON API-compliant responses?
I tried creating a Schema declaration that inherits from both ma.ModelSchema and marshmallow_json.Schema, in both orders, but it does not work (it raises exceptions for missing methods and properties).
marshmallow-jsonapi
marshmallow-jsonapi provides a simple way to produce JSON API-compliant data in any Python web framework.
flask-marshmallow
Flask-Marshmallow includes useful extras for integrating with Flask-SQLAlchemy and marshmallow-sqlalchemy.
Not a solution to this exact problem, but I ran into similar issues when implementing this library: https://github.com/thomaxxl/safrs (SQLAlchemy + Flask-RESTful + JSON API-compliant spec).
I don't remember exactly how I got around it, but if you try it and serialization doesn't work, I can help you resolve it if you open an issue on GitHub.
Related
I'm trying to do something pretty simple: get the current time, validate my object with marshmallow, and store it in Mongo.
python 3.7
requirements:
datetime==4.3
marshmallow==3.5.1
pymongo==3.10.1
schema.py
from marshmallow import Schema, fields
...

class MySchema(Schema):
    user_id = fields.Str(required=True)
    user_name = fields.Str()
    date = fields.DateTime()
    account_type = fields.Str()
    object = fields.Raw()
preparedata.py
from datetime import datetime
from schema import MySchema
...

dt = datetime.now()
x = dt.isoformat()

data = {
    "user_id": '123123123',
    "user_name": 'my cool name',
    "date": x,
    "account_type": 'another string',
    "trade": {'some': 'dict'}
}

# validate the schema for storage
validator = MySchema().load(data)
if 'errors' in validator:
    log.info('validator.errors')
    log.info(validator.errors)
...

res = MyService().create(
    data
)
myservice.py
def create(self, data):
    log.info("in creating data service")
    log.info(data)
    self.repo.create(data)
    return MySchema().dump(data)
The connector to Mongo is fine; I am saving other data that has no datetime with no issue.
I seem to have gone through a hundred different variations of formatting the datetime before passing it to the date key, as well as specifying the 'format' option in the schema field both inline and in the meta class, example:
# class Meta:
#     datetimeformat = '%Y-%m-%dT%H:%M:%S+03:00'
Most variations I try result in:
{'date': ['Not a valid datetime.']}
I finally managed to pass validation going in simply by using
x = dt.isoformat()
and leaving the schema field as the default (date = fields.DateTime()),
but when I dump back through marshmallow I get
AttributeError: 'str' object has no attribute 'isoformat'
The record is created in MongoDB fine, but the field type is string; ideally I'd like to leverage the native Mongo date field.
If I try to pass
datetime.now()
to the date, it fails with
{'date': ['Not a valid datetime.']}
Same for
datetime.utcnow()
Any guidance really appreciated.
Edit: when bypassing marshmallow and using either
datetime.now(pytz.utc)
or
datetime.utcnow()
the field data is stored in Mongo as a date, as expected. So the issue can be stated more succinctly as: how can I have marshmallow's fields.DateTime() validate either of these formats?
Edit 2:
So we have already begun refactoring thanks to Jérôme's insightful answer below.
For anyone who wants to 'twist' marshmallow to behave as the original question stated, we ended up going with:
date = fields.DateTime(
    # dump_only=True,
    default=lambda: datetime.utcnow(),
    missing=lambda: datetime.utcnow(),
    allow_none=False
)
i.e. we skip passing date at all and have marshmallow generate it from missing, which satisfied our use case.
The point of marshmallow is to load data from serialized (say, JSON, isoformat string, etc.) into actual Python objects (int, datetime,...). And conversely to dump it from object to a serialized string.
Marshmallow also provides validation on load, and only on load. When dumping, the data comes from the application and shouldn't need validation.
It is useful in an API to load and validate data from the outside world before using it in an application. And to serialize it back to the outside world.
If your data is in serialized form, which is the case when you call isoformat() on your datetime, then marshmallow can load it, and you get a Python object, with a real datetime in it. This is what you should feed pymongo.
from marshmallow import ValidationError

# load/validate the schema for storage
try:
    loaded_data = MySchema().load(data)
except ValidationError as exc:
    log.info('validator.errors')
    log.info(exc.messages)
...

# Store object in database
res = MyService().create(loaded_data)
Since marshmallow 3, load always returns deserialized content and you need to try/catch validation errors.
If your data does not come to your application in serialized form (if it is in object form already), then maybe marshmallow is not the right tool for the job, because it does not perform validation on deserialized objects (see https://github.com/marshmallow-code/marshmallow/issues/1415).
Or maybe it is. You could use an Object-Document Mapper (ODM) to manage validation and database management. This is an extra layer over pymongo. umongo is a marshmallow-based MongoDB ODM. There are other ODMs out there: mongoengine, pymodm.
BTW, what is this:
datetime==4.3
Did you install DateTime? You don't need it; the standard library's datetime module is not a pip package.
Disclaimer: marshmallow and umongo maintainer speaking.
I have the following Joi schema validation in my Node project, which I am planning to convert into Python using the marshmallow library.
Joi Schema:
aws_access_key: Joi.string().label('AWS ACCESS KEY').required().token().min(20),
aws_secret_key: Joi.string().label('AWS SECRET KEY').required().base64().min(40),
encryption: Joi.string().label('AWS S3 server-side encryption').valid('SSE_S3', 'SSE_KMS', 'CSE_KMS').optional(),
kmsKey: Joi.string().label('AWS S3 server-side encryption KMS key').when('encryption', { is: Joi.valid('SSE_KMS', 'CSE_KMS'), then: Joi.string().required() })
Here is what I did so far using marshmallow in Python:
from marshmallow import Schema, fields
from marshmallow.validate import OneOf, Length

class AWSSchema(Schema):
    aws_access_key = fields.String("title", required=True, validate=Length(min=20))
    aws_secret_key = fields.String(required=True, validate=Length(min=40))
    encryption = fields.String(required=False, validate=OneOf(['SSE_S3', 'SSE_KMS', 'CSE_KMS']))
    kmskey = fields.String(validate=lambda obj: fields.String(required=True) if obj['encryption'] in ('SSE_KMS', 'CSE_KMS') else fields.String(required=False))
demo = {
    "aws_access_key": "AKXXXXXXXXXXXXXXXXXXX",
    "aws_secret_key": "YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY",
    "encryption_type": "SSE_KMS"
}

schema = AWSSchema()
print(schema.dump(demo))
If encryption_type is set to SSE_KMS or CSE_KMS, then I need the kmskey field to be required. But the validation is not working as expected. Any help is appreciated.
Marshmallow has methods you can overwrite to do top-level validation at various points in the dump or load process. The documentation for pre_dump can be found here. Also checkout pre_load and post_dump.
https://marshmallow.readthedocs.io/en/stable/api_reference.html#marshmallow.decorators.pre_dump
I want to have a model that doesn't need to be created in the database as a table, but instead uses data from other sources such as JSON files and other model objects. So I am using a non-managed model, created dynamically as described here in the docs.
Now let me explain how I want to create the fields in this non-managed model. I have a JSON file that defines what fields should be in the model; let's call it contact-model.json. Fields in this JSON file are mapped to Django fields and the dynamic model is created. This part is done.
{
    "model_name": "Contact",
    "fields": {
        "name": "CharField",
        "email": "EmailField"
    }
}
Now I have a model that stores all data related to the above contact-model.json. The code for that model is given below.
class GenericAnswer(models.Model):
    answer = JSONField()
    model = models.CharField(default='Contact', max_length=30)
And the value of that answer would be:
{
    "name": "Adil Malik",
    "email": "sample#email.com"
}
What I want to do is that when I fetch Contact.objects.all(), it should return objects fetched from GenericAnswer based on the model name (in this context, Contact).
Can I do this? If yes, please explain how.
I'm using Flask-RESTful to create an API to store GeoJSON data. GeoJSON allows for storing 'properties', and makes no restrictions on what these parameters can be (they could store a color, a nickname, etc.). I would like to transmit and store this data using Flask-RESTful, but I'm not sure that I can do this with open-ended data. It appears that when I use marshal for my data, I need to specify exactly the fields I expect.
from flask import Flask, abort
from flask_restful import Api, Resource, fields, marshal, reqparse

class GeoAPI(Resource):
    def get(self, id):
        geo = session.query(data.Geo).filter_by(id=id).first()
        if geo:
            return {'geo': marshal(geo, geo_fields)}
        else:
            abort(404)

geo_fields = {
    "name": fields.String,
    "coordinates": fields.List(fields.List(fields.List(fields.Float))),
    "parameters": fields.String,  # String may be the wrong type, tried nested?
    'version': fields.String,
    'uri': fields.Url('geo')
}

api.add_resource(GeoAPI, '/pathopt/api/0.1/geos/<int:id>', endpoint='geo')
The data for Geo pulls from a SQLAlchemy query.
Is it possible to state that 'properties' is an object which can contain many different fields, or does this require me to explicitly state my field names?
I was trying out MongoKit and I'm having a problem. I thought it would be possible to add fields not present in the schema on the fly, but apparently I can't save the document.
The code is as below:
from mongokit import *

connection = Connection()

@connection.register
class Test(Document):
    structure = {'title': unicode, 'body': unicode}
On the python shell:
test = connection.testdb.testcol.Test()
test['foo'] = u'bar'
test['title'] = u'my title'
test['body'] = u'my body'
test.save()
This gives me a
StructureError: unknown fields ['foo'] in Test
I have an application where, while I have a core of fields that are always present, I can't predict what new fields will be necessary beforehand. Basically, in this case, it's up to the client to insert whatever fields it finds necessary. I'll just receive whatever it sends, do my thing, and store the fields in MongoDB.
But there is still a core of fields that are common to all documents, so it would be nice to type and validate them.
Is there a way to solve this with mongokit?
According to the MongoKit structure documentation you can have optional fields if you use the Schemaless Structure feature.
As of version 0.7, MongoKit allows you to save partially structured documents.
So if you set up your class like this, it should work:
from mongokit import *

class Test(Document):
    use_schemaless = True
    structure = {'title': unicode, 'body': unicode}
    required_fields = ['title', 'body']
That will require title and body but should allow any other fields to be present. According to the docs:
MongoKit will raise an exception only if required fields are missing