Passing JSON (dict) vs. Model Object to Templates in Django - python

Question
In Django, when using data from an API (that doesn't need to be saved to the database) in a view, is there a reason to prefer one of the following:
Convert the API data (JSON) to a Python dictionary and pass it to the template
Convert the API data (JSON) to the appropriate model object from models.py and then pass that to the template
What I've Considered So Far
Performance: I timed both approaches and averaged them over 25 iterations. Converting the API response to a model object was slower by approximately 50ms (0.4117 vs. 0.4583 seconds, +11%). This did not include timing rendering.
Not saving this data to the database does prevent me from creating many-to-many relationships with the API's data (an object must be saved before M2M relationships can be added). However, I want the API, not my app, to act as the store for this data.
DRY: If I find myself using this API data in multiple views, it may be convenient to put all of my consumption/cleaning code in the appropriate object's __init__ in models.py.
Thanks very much in advance.

Converting this to a model object doesn't require storing it in the database.
Also, if you are sure you don't want to store it, then placing it in models.py and making it a Django model is probably the wrong idea. It should probably just be a normal Python class, e.g. in resources.py or something similar, so it isn't mistaken for a model. I prefer this approach because, although the conversion is (very slightly) slower, it lets you add not only a custom constructor but other methods and properties as well, which is very helpful. Using normal classes and objects is also simply convenient and keeps things organized.
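A minimal sketch of that idea (the file name, class name, and API fields here are only illustrative):
# resources.py -- plain Python classes wrapping API data, kept out of models.py

class WeatherReport:
    def __init__(self, api_data):
        # do all consumption/cleaning of the raw JSON dict in one place
        self.city = api_data.get('city', '')
        self.temperature = float(api_data.get('temp', 0))

    @property
    def is_freezing(self):
        # behaviour a bare dict can't offer
        return self.temperature <= 0.0
A view can then build these objects from the API response and pass them straight to the template, with no database involved.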

Pass the list of dictionaries directly to the template. When you need to combine it with model data, use .values() on the queryset to get a list of dictionaries from it as well.
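A rough sketch of what that might look like in a view (Product and fetch_api_items are placeholders, not from the question):
from django.shortcuts import render

from .models import Product  # hypothetical model


def item_list(request):
    api_items = fetch_api_items()  # assumed helper returning a list of dicts from the API
    # .values() also yields dicts, so the template can treat both sources identically
    db_items = list(Product.objects.values('name', 'price'))
    return render(request, 'items.html', {'items': api_items + db_items})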

Related

Django Model with API as data source

I want to create a new model which uses the https://developer.microsoft.com/en-us/graph/graph-explorer API as a data source, as I want to have additional info on the user.
Using a computed property on the model does not work, as it would query the API for each instance in the queryset.
So, I want to have the model relate to a new model which has the API as its data source.
I could not find anything on this topic, besides maybe abusing the from_db() method, if that even works.
It appears that what you're trying to do is cache data from an external API that relates to, and augments/enriches, your user model data. If so, you can use a custom user model (instead of Django's default; this is a highly recommended practice anyway) and store the external API data in serialized form in a TextField attribute of your custom user model (let's call it user_extras; you can write model methods that serialize and deserialize this field for convenient access in your views).
The key challenge then is how to keep user_extras fresh, without doing something terrible performance-wise or hitting some constraint like API call limits. As you said, we can't do API queries in computed properties. (At least not synchronously.) One option then is to have a batch job/background task that regularly goes through your user database to update the user_extras in some controlled, predictable fashion.
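A minimal sketch of that idea (the field and helper names are illustrative, not from the answer):
import json

from django.contrib.auth.models import AbstractUser
from django.db import models


class User(AbstractUser):
    # cached, serialized copy of the external API data (e.g. Microsoft Graph)
    user_extras = models.TextField(blank=True, default='{}')

    def get_extras(self):
        # deserialize on access in views/templates
        return json.loads(self.user_extras or '{}')

    def set_extras(self, data):
        self.user_extras = json.dumps(data)
A periodic management command or background task can then call set_extras() for each user at a controlled rate.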

Convert Django models to normal objects

Disclaimer: I'm coming from PHP, where there's stdClass; I don't know if something like that exists in Python.
I'm trying to add a few custom properties to Django models and converting them to JSON using json.dumps(). I tried converting them to dicts, but the custom properties don't get converted too. So I'm trying to convert the models to simple objects like PHP's stdClass, so that I can add whatever properties I like to it.
Is this possible, or is there an easier way to add custom properties to a model and JSON-encode it?
Do you want the JSON blob in the database as well?
If not, then you simply define the properties as usual, like normal Python methods on the model class, and add the code to export them in your serialiser.
If so, then there is a nice field from django-extensions for this:
from django_extensions.db.fields.json import JSONField
You just add the field to your model, and it should handle the conversion to and from Python/database representations automagically for you.
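A minimal sketch (the model name is illustrative):
from django.db import models
from django_extensions.db.fields.json import JSONField


class Profile(models.Model):
    # stored as text in the database, exposed as a Python dict/list on the instance
    extra = JSONField()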

Querying through several models

I have a Django project with 5 different models in it. All of them have a date field. Let's say I want to get all entries from all models with today's date. Of course, I could just filter every model and put the results in one big list, but I believe that's bad. What would be an efficient way to do that?
I don't think that it's a bad idea to query each model separately - indeed, from a database perspective, I can't see how you'd be able to do otherwise, as each model will need a separate SQL query. Even if, as @Nagaraj suggests, you set up a common Date model that every other model references, you'd still need to query each model separately. You are probably correct, however, that putting the results into a list is bad practice, unless you actually need to load every object into memory, as explained here:
Be warned, though, that [evaluating a QuerySet as a list] could have a large memory overhead, because Django will load each element of the list into memory. In contrast, iterating over a QuerySet will take advantage of your database to load data and instantiate objects only as you need them.
It's hard to suggest other options without knowing more about your use case. However, I think I'd probably approach this by making a list or dictionary of QuerySets, which I could then use in my view, e.g.:
querysets = [cls.objects.filter(date=now) for cls in [Model1, Model2, Model3]]
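Building on that, the querysets can be iterated lazily without building one big list (itertools.chain is standard library; Model1..Model3 and now come from the snippet above):
from itertools import chain

today_entries = chain.from_iterable(querysets)
# consumed lazily, e.g. when the template loops over it
context = {'today_entries': today_entries}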
Take a look at using Multiple Inheritance (docs here) to define those date fields in a class that you can subclass in the classes you want to return in the query.
For example:
class DateStuff(db.Model):
    date = db.DateProperty()

class MyClass1(DateStuff):
    ...

class MyClass2(DateStuff):
    ...
I believe Django will let you query over the DateStuff class, and it'll return objects from MyClass1 and MyClass2.
Thanks to @nrabinowitz for pointing out my previous error.
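For reference, the example above uses App Engine's db module; in plain Django the shared field would typically live in an abstract base class. A rough sketch:
from django.db import models

class DateStuff(models.Model):
    date = models.DateField()

    class Meta:
        abstract = True

class MyClass1(DateStuff):
    ...

class MyClass2(DateStuff):
    ...
An abstract base class gets no table of its own, so each subclass still has to be queried separately, as in the list-of-querysets approach above.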

Python app to django web app

I've written some python code to accomplish a task. Currently, there are 4-5 classes that I'm storing in separate files. I'd now like to change this whole thing into a database-backed web app. I've been reading tutorials on Django, and so far I get the impression that I'll need to manually specify the fields and their types for every "model" that I use. This is a little surprising to me, since I was expecting some kind of ORM capability that would just take the existing classes I've already defined, and map them onto a database somehow, in a manner abstracted away from me.
Is this not the case? Am I missing something? It looks like I need to specify all the fields and types in the file 'models.py'.
Okay, now beyond those specifics, does anyone have any general tips on the best way to migrate an object-oriented desktop application to a web application?
Thanks!
That is Django's ORM: it maps classes to tables. What else did you expect? There needs to be some way of specifying what the fields are, though, before you can use them, and that's managed through the models.Model class and the various models.Field subclasses. You can certainly use your classes as mixins in order to use the existing business logic on top of the field definitions.
If you are building a database-backed web app, you have to specify which fields of the data you want to store and what type each value should be.
There is a tool that introspects the database and converts it into Django's models.py format, but I don't know of any that introspects a Python class and stores arbitrary data in the database. How would that even work? Would the objects be stored as pickles?
You're going to have to check the output, but you can have Django automatically create models from existing databases through one-time introspection.
Taken from the link below, you would set up your database in settings.py, and then call
python manage.py inspectdb
This will dump the sample models.py file to standard out for your inspection. In order to create the file, simply redirect the output
python manage.py inspectdb > models.py
See for more:
http://docs.djangoproject.com/en/dev/howto/legacy-databases/?from=olddocs#auto-generate-the-models

How to store a dictionary on a Django Model?

I need to store some data in a Django model. This data is not the same for all instances of the model.
At first I thought about subclassing the model, but I’m trying to keep the application flexible. If I use subclasses, I’ll need to create a whole class each time I need a new kind of object, and that’s no good. I’ll also end up with a lot of subclasses only to store a pair of extra fields.
I really feel that a dictionary would be the best approach, but there’s nothing in the Django documentation about storing a dictionary in a Django model (or I can’t find it).
Any clues?
If it's really dictionary like arbitrary data you're looking for you can probably use a two-level setup with one model that's a container and another model that's key-value pairs. You'd create an instance of the container, create each of the key-value instances, and associate the set of key-value instances with the container instance. Something like:
from django.db import models

class Dicty(models.Model):
    name = models.CharField(max_length=50)

class KeyVal(models.Model):
    container = models.ForeignKey(Dicty, on_delete=models.CASCADE, db_index=True)
    key = models.CharField(max_length=240, db_index=True)
    value = models.CharField(max_length=240, db_index=True)
It's not pretty, but it'll let you access/search the innards of the dictionary using the DB whereas a pickle/serialize solution will not.
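Usage would look roughly like this (the values are made up):
d = Dicty.objects.create(name='prefs')
KeyVal.objects.create(container=d, key='theme', value='dark')
KeyVal.objects.create(container=d, key='lang', value='en')

# the "dictionary" contents can be searched with ordinary ORM lookups
KeyVal.objects.filter(container__name='prefs', key='theme')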
Another clean and fast solution can be found here: https://github.com/bradjasper/django-jsonfield
For convenience I copied the simple instructions.
Install
pip install jsonfield
Usage
from django.db import models
from jsonfield import JSONField
class MyModel(models.Model):
    json = JSONField()
If you don't need to query by any of this extra data, then you can store it as a serialized dictionary. Use repr to turn the dictionary into a string, and eval to turn the string back into a dictionary. Take care with eval that there's no user data in the dictionary, or use a safe_eval implementation.
For example, in the create and update methods of your views, you can add:
if not isinstance(request.data, dict):
    req_data = request.data.dict().copy()
else:
    req_data = request.data.copy()

dict_key = 'request_parameter_that_has_a_dict_inside'
if dict_key in req_data and isinstance(req_data[dict_key], dict):
    req_data[dict_key] = repr(req_data[dict_key])
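For reading the value back, ast.literal_eval is a safer stand-in for eval when the stored string only contains Python literals (this is my suggestion, not part of the original answer):
import ast

stored = req_data[dict_key]          # the repr() string produced above
restored = ast.literal_eval(stored)  # back to a dict, without executing arbitrary code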
I came to this post via Google's 4th result for "django store object".
A little bit late, but django-picklefield looks like a good solution to me.
Example from doc:
To use, just define a field in your model:
>>> from picklefield.fields import PickledObjectField
>>> class SomeObject(models.Model):
>>>     args = PickledObjectField()
and assign whatever you like (as long as it's picklable) to the field:
>>> obj = SomeObject()
>>> obj.args = ['fancy', {'objects': 'inside'}]
>>> obj.save()
As Ned answered, you won't be able to query "some data" if you use the dictionary approach.
If you still need to store dictionaries, then the best approach, by far, is the PickleField class documented in Marty Alchin's new book Pro Django. This method uses Python class properties to pickle/unpickle a Python object stored in a model field, only on demand.
The basics of this approach are to use Django's contribute_to_class method to dynamically add a new field to your model, and to use getattr/setattr to do the serializing on demand.
One of the few online examples I could find that is similar is this definition of a JSONField.
I'm not exactly sure of the nature of the problem you're trying to solve, but it sounds curiously similar to Google App Engine's BigTable Expando.
Expandos allow you to specify and store additional fields on a database-backed object instance at runtime. To quote from the docs:
import datetime
from google.appengine.ext import db

class Song(db.Expando):
    title = db.StringProperty()

crazy = Song(title='Crazy like a diamond',
             author='Lucy Sky',
             publish_date='yesterday',
             rating=5.0)
crazy.last_minute_note = db.Text('Get a train to the station.')
Google App Engine currently supports both Python and the Django framework. Might be worth looking into if this is the best way to express your models.
Traditional relational database models don't have this kind of column-addition flexibility. If your datatypes are simple enough, you could break from traditional RDBMS philosophy and hack values into a single column via serialization, as @Ned Batchelder proposes; however, if you have to use an RDBMS, Django model inheritance is probably the way to go. Notably, it will create a one-to-one foreign key relation for each level of derivation.
This question is old, but I was having the same problem, ended up here, and the chosen answer couldn't solve my problem anymore.
If you want to store dictionaries in Django or REST Api, either to be used as objects in your front end, or because your data won't necessarily have the same structure, the solution I used can help you.
When saving the data in your API, use json.dumps() so that it is stored in proper JSON format, as described in this question.
If you use this structure, your data will already be in the appropriate JSON format to be parsed on the front end with JSON.parse() in your AJAX (or whatever) call.
I use a textfield and json.loads()/json.dumps()
models.py
import json

from django.db import models

class Item(models.Model):
    data = models.TextField(blank=True, null=True, default='{}')

    def save(self, *args, **kwargs):
        ## load the current string and
        ## convert string to python dictionary
        data_dict = json.loads(self.data)

        ## do something with the dictionary here, for example
        ## (placeholder -- replace with your own logic):
        # for something in somethings:
        #     data_dict[something] = some_function(something)

        ## if it is empty, save it back to a '{}' string,
        ## if it is not empty, convert the dictionary back to a json string
        if not data_dict:
            self.data = '{}'
        else:
            self.data = json.dumps(data_dict)

        super(Item, self).save(*args, **kwargs)
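Reading the field back is the mirror image (illustrative values, assuming the placeholder logic in save() has been filled in or left commented out):
item = Item.objects.create(data=json.dumps({'colour': 'red', 'size': 'M'}))
json.loads(item.data)['colour']   # 'red'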
Django-Geo includes a "DictionaryField" you might find helpful:
http://code.google.com/p/django-geo/source/browse/trunk/fields.py?r=13#49
In general, if you don't need to query across the data use a denormalized approach to avoid extra queries. User settings are a pretty good example!
I agree that you should refrain from stuffing otherwise structured data into a single column. But if you must do that, Django has an XMLField built in.
There's also a JSONField at Django snippets.
Being "not equal to all instances of the model" sounds to me like a good match for a "Schema-free database". CouchDB is the poster child for that approach and you might consider that.
In a project I moved several tables which never played very nice with the Django ORM over to CouchDB and I'm quite happy with that. I use couchdb-python without any of the Django-specific CouchDB modules. A description of the data model can be found here. The movement from five "models" in Django to 3 "models" in Django and one CouchDB "database" actually slightly reduced the total lines of code in my application.
I know this is an old question, but today (2021) the cleanest alternative is to use the native JSONField (available since Django 3.1).
docs: https://docs.djangoproject.com/en/3.2/ref/models/fields/#django.db.models.JSONField
You just add a JSONField to your model class and voilà.
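A minimal sketch (the model name is illustrative):
from django.db import models

class Entry(models.Model):
    data = models.JSONField(default=dict)

entry = Entry.objects.create(data={'answers': [1, 2, 3]})
entry.data['answers']                           # normal Python dict access
Entry.objects.filter(data__has_key='answers')   # and it is queryable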
Think it over, and find the commonalities of each data set... then define your model. It may require the use of subclasses or not. Foreign keys representing commonalities aren't to be avoided, but encouraged when they make sense.
Stuffing random data into a SQL table is not smart, unless it's truly non-relational data. If that's the case, define your problem and we may be able to help.
If you are using Postgres, you can use an hstore field: https://docs.djangoproject.com/en/1.10/ref/contrib/postgres/fields/#hstorefield.
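A minimal sketch (this assumes 'django.contrib.postgres' is in INSTALLED_APPS and the hstore extension has been activated, e.g. via the HStoreExtension migration operation; the model name is illustrative):
from django.contrib.postgres.fields import HStoreField
from django.db import models

class Setting(models.Model):
    values = HStoreField()  # maps string keys to string values

Setting.objects.create(values={'theme': 'dark', 'lang': 'en'})
Setting.objects.filter(values__has_key='theme')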
