How to convert Pymodm objects to JSON? - python

I am using PyMODM as a MongoDB ODM with Python Flask. I have looked through the code and documentation (https://github.com/mongodb/pymodm and http://pymodm.readthedocs.io/en/latest) but could not find what I was looking for.
I am looking for an easy way to fetch data from the database as plain JSON, without converting it to a PyMODM object first. Is this possible with PyMODM?
Currently, I am overloading the Flask JSONEncoder to handle datetime and ObjectId values and use that to convert the PyMODM object to JSON.
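For reference, a rough sketch of that kind of encoder override (a sketch only; it assumes a Flask version before 2.3, where flask.json.JSONEncoder is still available, and the encoder name is made up):
from datetime import datetime
from bson import ObjectId
from flask.json import JSONEncoder

class MongoJSONEncoder(JSONEncoder):  # hypothetical name
    def default(self, obj):
        if isinstance(obj, ObjectId):
            return str(obj)
        if isinstance(obj, datetime):
            return obj.isoformat()
        return super(MongoJSONEncoder, self).default(obj)

# app.json_encoder = MongoJSONEncoder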

It is not obvious from the PyMODM documentation, but here's how to do it:
pymodm_obj.to_son().to_dict()
Actually, I just re-read your question, and I don't think anything is forcing you to use PyMODM everywhere in your project once you have made the decision to use it. So if you are just looking for the JSON structures, you could just use the base pymongo package functionality.
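For example, a minimal sketch of reading plain documents with pymongo and serializing them with bson.json_util, which already knows how to encode ObjectId and datetime (the connection string, database, and collection names are assumptions):
from pymongo import MongoClient
from bson import json_util

client = MongoClient('mongodb://localhost:27017')  # hypothetical connection
collection = client['mydb']['foo']                 # hypothetical database/collection

# find() returns plain dicts, no PyMODM involved
docs = list(collection.find({}))
json_string = json_util.dumps(docs)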

Having:
from pymodm import MongoModel, fields
import json

class Foo(MongoModel):
    name = fields.CharField(required=True)

a = Foo()
You can do:
jsonFooString = json.dumps(a.to_son().to_dict())

If you need to build a CRUD API, you might also want to check out this little package, which is basically DRF for PyMODM.
So if you want CREATE/UPDATE/DELETE endpoints, it would look like this:
from api.pymodm_rest import viewsets

class ServiceAreaViewSet(viewsets.ModelViewSet):
    queryset = ServiceArea.objects
    instance_class = ServiceArea
    lookup_field = '_id'
https://github.com/lokoArt/pymodm_rest

Related

Where to specify "use_natural_primary_keys" with generic class views?

I've been reading about natural_keys and have added the get_by_natural_key() and natural_key() methods to my model(s), but the Django docs (and several posts here in SO) say: "Then, when you call serializers.serialize(), you provide use_natural_foreign_keys=True or use_natural_primary_keys=True arguments" ...followed by this example:
>>> serializers.serialize('json', [book1, book2], indent=2,
... use_natural_foreign_keys=True, use_natural_primary_keys=True)
But that example is from running in a python shell, not in the actual context of where to put it in code. From DRF, I'm using generic class based views. Where should I specify those arguments in that case?
EDIT: The ultimate goal is to be able to import fixtures using natural_keys instead of actual IDs.
You would not do this at all. serializers.serialize is Django's built-in - and very basic - serialization functionality. But you are using DRF, which has much more powerful abilities to serialize. In DRF you would define your serializer to use the relevant relational field.
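For example, a minimal sketch of a DRF serializer that exposes a related object by a natural-key-like field instead of its primary key (the Book/Author models and the 'name' field are hypothetical):
from rest_framework import serializers
from myapp.models import Author, Book  # hypothetical models

class BookSerializer(serializers.ModelSerializer):
    # Represent the related author by name rather than by its numeric primary key
    author = serializers.SlugRelatedField(
        slug_field='name',
        queryset=Author.objects.all(),
    )

    class Meta:
        model = Book
        fields = ('title', 'author')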
Edit: But I don't understand your edit at all. What do DRF's generic views have to do with fixtures?

Convert Django models to normal objects

Disclaimer: I'm coming from PHP, where there's stdClass, whereas I don't know if something like that exists in Python.
I'm trying to add a few custom properties to Django models and convert them to JSON using json.dumps(). I tried converting the models to dicts, but the custom properties don't get included. So I'm trying to convert the models to simple objects like PHP's stdClass, so that I can add whatever properties I like to them.
Is this possible, or is there an easier way to add custom properties to a model and JSON-encode it?
Do you want the JSON blob in the database as well?
If not, then you simply define the properties as usual, like normal Python methods on the model class, and add the code to export them in your serializer.
If so, then there is a nice field from django-extensions for this:
from django_extensions.db.fields.json import JSONField
You just add the field to your model, and it should handle the conversion to and from the Python / database representations automagically for you.
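A minimal sketch of what that might look like (the Profile model and the field name are just examples):
from django.db import models
from django_extensions.db.fields.json import JSONField

class Profile(models.Model):  # hypothetical model
    # Exposed as a dict/list in Python, stored as serialized JSON in the database
    extra = JSONField()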

Passing JSON (dict) vs. Model Object to Templates in Django

Question
In Django, when using data from an API (that doesn't need to be saved to the database) in a view, is there reason to prefer one of the following:
Convert API data (json) to a json dictionary and pass to the template
Convert API data (json) to the appropriate model object from models.py and then pass that to the template
What I've Considered So Far
Performance: I timed both approaches and averaged them over 25 iterations. Converting the API response to a model object was slower by approximately 50ms (0.4117 vs. 0.4583 seconds, +11%). This did not include timing rendering.
Not saving this data to the database does prevent me from creating many-to-many relationships with the API's data (an object must be saved before M2M relationships can be added); however, I want the API to act as the store for this data, not my app.
DRY: If I find myself using this API data in multiple views, I may find it convenient to put all my consumption/cleaning/etc. code in the appropriate object's __init__ in models.py.
Thanks very much in advance.
Converting the data to model objects doesn't require storing it in the database.
Also, if you are sure you don't want to store it, making it a Django model in models.py is probably the wrong idea. It should probably just be a normal Python class, e.g. in resources.py or something like that, so it isn't mistaken for a model. I prefer this approach because, even though the conversion is (very slightly) slower, it lets you add not only a custom constructor but other methods and properties as well, which is very helpful. Using normal classes and objects is also simply convenient and keeps things organized.
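For instance, a rough sketch of such a plain class (the module, class, and attribute names are hypothetical):
# resources.py -- thin wrapper around the parsed JSON payload from the API
class WeatherReport:
    def __init__(self, payload):
        self.city = payload['city']
        self.temperature_c = payload['temp_c']

    @property
    def temperature_f(self):
        return self.temperature_c * 9 / 5 + 32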
Pass the list of dictionaries directly to the template. When you need to use it with models, use .values() to get a list of dictionaries from it instead.
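For example, a sketch of both cases in a single view (the view, model, helper, and template names are assumptions):
from django.shortcuts import render

def reports(request):
    api_reports = fetch_reports()  # hypothetical helper returning a list of dicts from the API
    db_reports = list(Report.objects.values('city', 'temperature'))  # dicts, not model instances
    return render(request, 'reports.html', {'reports': api_reports + db_reports})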

Implementing OData JSON interface on Django (Python)

We would like to have an OData JSON interface on our Django (Python 2.5.4) website. At the time of writing there seems to be no library available.
I'm thinking of writing "some" logic to handle this ourselves.
Would it be a good idea to extend the Django JSON serializer?
Where and how to store the URI's related to the models?
I think it would be a good idea to extend the Django JSON serializer, but have a look at django-piston; that might be the better route to go.
The URIs will have to be defined in your app's urls.py, and then in your models you could define a method
get_odata_uri()
which would work like Django's get_absolute_url(). Instead of hardcoding the URI into your model, make sure you make use of the reverse function from django.core.urlresolvers.
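A rough sketch of such a method (the model, field, and URL pattern name are assumptions; the import path matches the Django 1.x era of the question):
from django.core.urlresolvers import reverse
from django.db import models

class Product(models.Model):  # hypothetical model
    name = models.CharField(max_length=100)

    def get_odata_uri(self):
        # 'product-odata' would be a named pattern in urls.py
        return reverse('product-odata', kwargs={'pk': self.pk})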

How to store a dictionary on a Django Model?

I need to store some data in a Django model. This data is not the same for all instances of the model.
At first I thought about subclassing the model, but I’m trying to keep the application flexible. If I use subclasses, I’ll need to create a whole class each time I need a new kind of object, and that’s no good. I’ll also end up with a lot of subclasses only to store a pair of extra fields.
I really feel that a dictionary would be the best approach, but there’s nothing in the Django documentation about storing a dictionary in a Django model (or I can’t find it).
Any clues?
If it's really dictionary like arbitrary data you're looking for you can probably use a two-level setup with one model that's a container and another model that's key-value pairs. You'd create an instance of the container, create each of the key-value instances, and associate the set of key-value instances with the container instance. Something like:
class Dicty(models.Model):
    name = models.CharField(max_length=50)

class KeyVal(models.Model):
    container = models.ForeignKey(Dicty, db_index=True)
    key = models.CharField(max_length=240, db_index=True)
    value = models.CharField(max_length=240, db_index=True)
It's not pretty, but it'll let you access/search the innards of the dictionary using the DB whereas a pickle/serialize solution will not.
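A quick sketch of how those two models might be used together:
# Build the "dictionary"
d = Dicty.objects.create(name='settings')
KeyVal.objects.create(container=d, key='theme', value='dark')
KeyVal.objects.create(container=d, key='page_size', value='25')

# Rebuild it as a plain dict from the database
data = dict(KeyVal.objects.filter(container=d).values_list('key', 'value'))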
Another clean and fast solution can be found here: https://github.com/bradjasper/django-jsonfield
For convenience I copied the simple instructions.
Install
pip install jsonfield
Usage
from django.db import models
from jsonfield import JSONField

class MyModel(models.Model):
    json = JSONField()
If you don't need to query by any of this extra data, then you can store it as a serialized dictionary. Use repr to turn the dictionary into a string, and eval to turn the string back into a dictionary. Take care with eval that there's no user data in the dictionary, or use a safe_eval implementation.
For example, in the create and update methods of your views, you can add:
if not isinstance(request.data, dict):
    req_data = request.data.dict().copy()
else:
    req_data = request.data.copy()

dict_key = 'request_parameter_that_has_a_dict_inside'
if dict_key in req_data and isinstance(req_data[dict_key], dict):
    req_data[dict_key] = repr(req_data[dict_key])
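To read the value back, a safer alternative to a bare eval is ast.literal_eval, which only evaluates Python literals:
import ast

stored = "{'theme': 'dark', 'page_size': 25}"  # string previously produced by repr()
data = ast.literal_eval(stored)                 # a dict again, without executing arbitrary code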
I came to this post via Google's fourth result for "django store object".
A little bit late, but django-picklefield looks like a good solution to me.
Example from doc:
To use, just define a field in your model:
>>> from picklefield.fields import PickledObjectField
>>> class SomeObject(models.Model):
...     args = PickledObjectField()
and assign whatever you like (as long as it's picklable) to the field:
>>> obj = SomeObject()
>>> obj.args = ['fancy', {'objects': 'inside'}]
>>> obj.save()
As Ned answered, you won't be able to query "some data" if you use the dictionary approach.
If you still need to store dictionaries then the best approach, by far, is the PickleField class documented in Marty Alchin's new book Pro Django. This method uses Python class properties to pickle/unpickle a python object, only on demand, that is stored in a model field.
The basis of this approach is to use Django's contribute_to_class method to dynamically add a new field to your model, and getattr/setattr to do the serialization on demand.
One of the few online examples I could find that is similar is this definition of a JSONField.
I'm not sure exactly sure of the nature of the problem you're trying to solve, but it sounds curiously similar to Google App Engine's BigTable Expando.
Expandos allow you to specify and store additional fields on an database-backed object instance at runtime. To quote from the docs:
import datetime
from google.appengine.ext import db

class Song(db.Expando):
    title = db.StringProperty()

crazy = Song(title='Crazy like a diamond',
             author='Lucy Sky',
             publish_date='yesterday',
             rating=5.0)

crazy.last_minute_note = db.Text('Get a train to the station.')
Google App Engine currently supports both Python and the Django framework. Might be worth looking into if this is the best way to express your models.
Traditional relational database models don't have this kind of column-addition flexibility. If your datatypes are simple enough you could break from traditional RDBMS philosophy and hack values into a single column via serialization as #Ned Batchelder proposes; however, if you have to use an RDBMS, Django model inheritance is probably the way to go. Notably, it will create a one-to-one foreign key relation for each level of derivation.
This question is old, but I was having the same problem, ended here and the chosen answer couldn't solve my problem anymore.
If you want to store dictionaries in Django or REST Api, either to be used as objects in your front end, or because your data won't necessarily have the same structure, the solution I used can help you.
When saving the data in your API, use the json.dumps() method so it is stored as a proper JSON string, as described in this question.
If you use this structure, your data will already be in the appropriate json format to be called in the front end with JSON.parse() in your ajax (or whatever) call.
I use a textfield and json.loads()/json.dumps()
models.py
import json
from django.db import models
class Item(models.Model):
    data = models.TextField(blank=True, null=True, default='{}')

    def save(self, *args, **kwargs):
        ## load the current string and
        ## convert the string to a python dictionary
        data_dict = json.loads(self.data)
        ## do something with the dictionary
        for something in somethings:
            data_dict[something] = some_function(something)
        ## if it is empty, save it back to a '{}' string,
        ## if it is not empty, convert the dictionary back to a json string
        if not data_dict:
            self.data = '{}'
        else:
            self.data = json.dumps(data_dict)
        super(Item, self).save(*args, **kwargs)
Django-Geo includes a "DictionaryField" you might find helpful:
http://code.google.com/p/django-geo/source/browse/trunk/fields.py?r=13#49
In general, if you don't need to query across the data, use a denormalized approach to avoid extra queries. User settings are a pretty good example!
I agree that you should refrain from stuffing otherwise structured data into a single column. But if you must do that, Django has an XMLField built in.
There's also a JSONField at Django snippets.
Being "not equal to all instances of the model" sounds to me like a good match for a "Schema-free database". CouchDB is the poster child for that approach and you might consider that.
In a project I moved several tables which never played very nice with the Django ORM over to CouchDB and I'm quite happy with that. I use couchdb-python without any of the Django-specific CouchDB modules. A description of the data model can be found here. The movement from five "models" in Django to 3 "models" in Django and one CouchDB "database" actually slightly reduced the total lines of code in my application.
I know this is an old question, but today (2021) the cleanest alternative is to use the native JSONField, available since Django 3.1.
Docs: https://docs.djangoproject.com/en/3.2/ref/models/fields/#django.db.models.JSONField
You just add a JSONField to your model class and voilà.
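A minimal sketch (the model and field name are just examples):
from django.db import models

class MyModel(models.Model):
    # Native JSONField, supported on all database backends since Django 3.1
    data = models.JSONField(default=dict)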
Think it over, and find the commonalities of each data set... then define your model. It may require the use of subclasses or not. Foreign keys representing commonalities aren't to be avoided, but encouraged when they make sense.
Stuffing random data into a SQL table is not smart, unless it's truly non-relational data. If that's the case, define your problem and we may be able to help.
If you are using Postgres, you can use an hstore field: https://docs.djangoproject.com/en/1.10/ref/contrib/postgres/fields/#hstorefield.
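A minimal sketch (assumes 'django.contrib.postgres' is in INSTALLED_APPS and the hstore extension is enabled; the model here is just an example):
from django.contrib.postgres.fields import HStoreField
from django.db import models

class Dog(models.Model):  # hypothetical model
    name = models.CharField(max_length=200)
    # hstore stores a mapping of string keys to string values
    data = HStoreField(default=dict)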
