I am trying to build an accounting database using Flask as the front end. The main page is the ledger, with nine columns: "date", "description", "debit", "credit", "amount", "account", "reference", "journal" and "year". I need to be able to query each column, and sometimes two at once; there are over 8,000 entries, and growing. My code so far displays all the rows, 200 at a time with pagination. I have read PEP 8, which talks about readable code, and I have read a couple of questions about handling multiple parameters, and I like the idea of using
request.args.get
But I need to display all the rows until a query is made. I have also looked at a question about nested ifs, and I thought perhaps I could write a function and its "if" for each query outside of the view function and then call each one in the view function, but I am not sure how. Or I could have a view function for each query, but I am not sure how that would work either. Here is my code so far:
@bp.route('/books', methods=['GET', 'POST'])
@bp.route('/books/<int:page_num>', methods=['GET', 'POST'])
@bp.route('/books/<int:page_num>/<int:id>', methods=['GET', 'POST'])
@bp.route('/books/<int:page_num>/<int:id>/<ref>', methods=['GET', 'POST'])
@login_required
def books(page_num, id=None, ref=None):
    if ref is not None:
        books = Book.query.order_by(Book.date.desc()).filter(Book.REF == ref).paginate(per_page=100, page=page_num, error_out=True)
    else:
        books = Book.query.order_by(Book.date.desc()).paginate(per_page=100, page=page_num, error_out=True)
    if id is not None:
        obj = Book.query.get(id) or Book()
        form = AddBookForm(request.form, obj=obj)
        if form.validate_on_submit():
            form.populate_obj(obj)
            db.session.add(obj)
            db.session.commit()
            return redirect(url_for('books.books'))
    else:
        form = AddBookForm()
        if form.validate_on_submit():
            obj = Book(id=form.id.data, date=form.date.data, description=form.description.data, debit=form.debit.data,
                       credit=form.credit.data, montant=form.montant.data, AUX=form.AUX.data, TP=form.TP.data,
                       REF=form.REF.data, JN=form.JN.data, PID=form.PID.data, CT=form.CT.data)
            db.session.add(obj)
            db.session.commit()
            return redirect(url_for('books.books', page_num=1))
    return render_template('books/books.html', title='Books', books=books, form=form)
With this code there are no error messages. This is a question asking for advice on how to keep my code as readable and as simple as possible while being able to query nine columns of the database, displaying the matching rows when a query is active and all the rows when it is not.
All help is greatly appreciated. Paul
I am running this on Debian 10 with Python 3.7.
Edit: I am used to working with LibreOffice Base.
My question is: how do I search one or two columns at a time in my database? Nine of the twelve columns are ones I want to be able to search, one or more at a time. For example, the "reference" column holds a document reference such as "A32", and "account" holds the name of a supplier such as "FILCUI"; I may need to search both at the same time. I have carried out more research and found that most people advocate a full-text search engine such as Elasticsearch or Whoosh, but in my case I feel that if I search for "A32" (a document number) I will get anything across the model's twelve columns that partially matches it. I have looked at Flask search tutorials and at Whoosh, all very good tutorials by excellent people. I thought about trying to use SQLAlchemy for this, but in the first Flask tutorial the author says
but given the fact that SQLAlchemy does not support this functionality,
so I assumed that an SQLAlchemy integration would not work either.
So, is there a way to search/query/filter several different columns of a model, possibly with a form for each search, without ending up with a "sack of knots" of code that is impossible to read or test? I would like to stick to SQLAlchemy if possible.
I need just a little pointer in the right direction or a simple personal opinion that I can test.
Warm regards.
EDIT:
I have not answered my question, but I have made progress. I can query one column at a time and display all the results on the one page, without a single "if" statement, and I think my code is clear and readable(?). I divided each query into its own view function returning to the same main page, and each function has its own submit button. This has enabled me to render the same page. Here is my routes code:
@bp.route('/search_aux', methods=['GET', 'POST'])
@login_required
def search_aux():
    page_num = request.args.get('page_num', default=1, type=int)
    books = Book.query.order_by(Book.date.desc()).paginate(per_page=100, page=page_num, error_out=True)
    add_form = AddBookForm()
    aux_form = SearchAuxForm()
    date_form = SearchDateForm()
    debit_form = SearchDebitForm()
    credit_form = SearchCreditForm()
    montant_form = SearchMontantForm()
    jn_form = SearchJNForm()
    pid_form = SearchPIDForm()
    ref_form = SearchREForm()
    tp_form = SearchTPForm()
    ct_form = SearchCTForm()
    des_form = SearchDescriptionForm()
    if request.method == 'POST':
        aux = aux_form.selectaux.data
        books = Book.query.order_by(Book.date.desc()).filter(Book.AUX == str(aux)).paginate(per_page=100, page=page_num, error_out=True)
    return render_template('books/books.html', books=books, add_form=add_form, aux_form=aux_form, date_form=date_form, debit_form=debit_form,
                           credit_form=credit_form, montant_form=montant_form, jn_form=jn_form, pid_form=pid_form, ref_form=ref_form,
                           tp_form=tp_form, ct_form=ct_form, des_form=des_form)
There is a simple form for each query, and it works a treat for each single query. Here are the form and HTML code:
def AUX():
    return Auxilliere.query

class SearchAuxForm(FlaskForm):
    selectaux = QuerySelectField('Aux', query_factory=AUX, get_label='id')
    submitaux = SubmitField('submit')
HTML:
<div class="AUX">
<form action="{{ url_for('books.search_aux') }}" method="post">
{{ aux_form.selectaux(class="input") }}{{ aux_form.submitaux(class="submit") }}
</form>
</div>
I tried to do this as a single function with one submit button, but it ended in disaster. I have not submitted this as an answer because it does not do everything I asked, but it is a start.
FINAL EDIT:
I would like to thank the person(s) who reopened this question, allowing Mr Lucas Scott to provide a fascinating and informative answer to help me and others.
There are many ways to achieve your desired result of being able to query/filter multiple columns in a table. I will give you an example of how I would approach creating an endpoint that will allow you to filter on one column, or multiple columns.
Here is our basic Books model and the /books endpoint as a stub
import flask
from flask_sqlalchemy import SQLAlchemy

app = flask.Flask(__name__)
db = SQLAlchemy(app)  # uses an in-memory sqlite3 db by default


class Books(db.Model):
    __tablename__ = "book"

    id = db.Column(db.Integer, primary_key=True, autoincrement=True)
    title = db.Column(db.String(255), nullable=False)
    author = db.Column(db.String(255), nullable=False)
    supplier = db.Column(db.String(255))
    published = db.Column(db.Date, nullable=False)


db.create_all()


@app.route("/books", methods=["GET"])
def all_books():
    pass
The first step is to decide on a method of querying a collection using URL parameters. I will use the fact that multiple instances of the same key in the query string can be retrieved as a list (e.g. with request.args.getlist), which allows us to filter on multiple columns.
For example /books?filter=id&filter=author will turn into {"filter": ["id", "author"]}.
For our querying syntax we will use comma-separated values for the filter value.
example:
/books?filter=author,eq,jim&filter=supplier,eq,self published
Which turns into {"filter": ["author,eq,jim", "supplier,eq,self published"]}. Notice the space in self published. Flask will handle the URL decoding for us and give back a string with a space instead of %20.
Let's clean this up a bit by adding a Filter class to represent our filter query parameter.
class QueryValidationError(Exception):
    """ We can handle specific exceptions and
    return a http response with flask """
    pass


class Filter:
    supported_operators = ("eq", "ne", "lt", "gt", "le", "ge")

    def __init__(self, column, operator, value):
        self.column = column
        self.operator = operator
        self.value = value
        self.validate()

    def validate(self):
        if self.operator not in self.supported_operators:
            # We will deal with catching this later
            raise QueryValidationError(
                f"operator `{self.operator}` is not one of supported "
                f"operators `{self.supported_operators}`"
            )
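As a quick sanity check, a single filter term splits cleanly into the three constructor arguments (the values here are made up):

column, op, value = "author,eq,jim".split(",")
f = Filter(column, op, value)  # validates the operator on construction
# f.column == "author", f.operator == "eq", f.value == "jim"
# Filter("author", "like", "jim") would raise QueryValidationError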
Now we will create a function for processing our list of filters into a list of Filter objects.
def create_filters(filters):
    filters_processed = []
    if filters is None:
        # No filters given
        return filters_processed
    elif isinstance(filters, str):
        # if only one filter given
        filter_split = filters.split(",")
        filters_processed.append(
            Filter(*filter_split)
        )
    elif isinstance(filters, list):
        # if more than one filter given
        try:
            filters_processed = [Filter(*_filter.split(",")) for _filter in filters]
        except Exception:
            raise QueryValidationError("Filter query invalid")
    else:
        # Programmer error
        raise TypeError(
            f"filters expected to be `str` or `list` "
            f"but was of type `{type(filters)}`"
        )
    return filters_processed
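Called with the parsed query parameter, it normalises every case into a list of Filter objects (inputs mirror the example URL above):

create_filters(None)             # -> [] (no filters given)
create_filters("author,eq,jim")  # -> [Filter("author", "eq", "jim")]
create_filters(["author,eq,jim", "supplier,eq,self published"])
# -> one Filter per entry, each split on its commas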
Now we can use our create_filters helper in the endpoint:
#app.route("/books", methods=["GET"])
def all_books():
args = flask.request.args
filters = create_filters(args.get("filter"))
SQLAlchemy allows us to do filtering through operator overloading, i.e. filter(Book.author == "some value"). The == here does not trigger the default equality behaviour; instead, SQLAlchemy overloads the operator so that it builds the SQL condition that checks for equality and adds it to the query. We can leverage this behaviour by using Python's built-in operator module. For example:
import operator
from models import Book
authors = Book.query.filter(operator.eq(Book.author, "some author")).all()
This does not seem helpful by itself, but it gets us a step closer to creating a generic and dynamic filtering mechanism. The next important step in making this more dynamic is the built-in getattr, which allows us to look up attributes on a given object using strings. Example:
class Anything:
    def say_hi(self):
        print("hello")


# use getattr to say hello
getattr(Anything, "say_hi")       # returns the function `say_hi`
getattr(Anything(), "say_hi")()   # calls the method `say_hi` on an instance
We can now tie this all together by creating a generic filtering function:
def filter_query(filters, query, model):
    for _filter in filters:
        # get our operator
        op = getattr(operator, _filter.operator)
        # get the column to filter on
        column = getattr(model, _filter.column)
        # value to filter for
        value = _filter.value
        # build up a query by adding multiple filters
        query = query.filter(op(column, value))
    return query
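For example, filtering the Books table on two columns at once (the values are invented for the sketch):

filters = create_filters(["author,eq,jim", "supplier,eq,self published"])
query = filter_query(filters, Books.query, Books)
jims_self_published_books = query.all()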
We can filter any model with our implementation, and not just by one column.
#app.route("/books", methods=["GET"])
def all_books():
args = flask.request.args
filters = create_filters(args.get("filter"))
query = Books.query
query = filter_query(filters, query, Books)
result = []
for book in query.all():
result.append(dict(
id=book.id,
title=book.title,
author=book.author,
supplier=book.supplier,
published=str(book.published)
))
return flask.jsonify(result), 200
Here is everything together, including the error handling for validation errors:
import flask
import operator

from flask_sqlalchemy import SQLAlchemy

app = flask.Flask(__name__)
db = SQLAlchemy(app)  # uses an in-memory sqlite3 db by default


class Books(db.Model):
    __tablename__ = "book"

    id = db.Column(db.Integer, primary_key=True, autoincrement=True)
    title = db.Column(db.String(255), nullable=False)
    author = db.Column(db.String(255), nullable=False)
    supplier = db.Column(db.String(255))
    published = db.Column(db.Date, nullable=False)


db.create_all()
class QueryValidationError(Exception):
    pass


class Filter:
    supported_operators = ("eq", "ne", "lt", "gt", "le", "ge")

    def __init__(self, column, operator, value):
        self.column = column
        self.operator = operator
        self.value = value
        self.validate()

    def validate(self):
        if self.operator not in self.supported_operators:
            raise QueryValidationError(
                f"operator `{self.operator}` is not one of supported "
                f"operators `{self.supported_operators}`"
            )
def create_filters(filters):
    filters_processed = []
    if filters is None:
        # No filters given
        return filters_processed
    elif isinstance(filters, str):
        # if only one filter given
        filter_split = filters.split(",")
        filters_processed.append(
            Filter(*filter_split)
        )
    elif isinstance(filters, list):
        # if more than one filter given
        try:
            filters_processed = [Filter(*_filter.split(",")) for _filter in filters]
        except Exception:
            raise QueryValidationError("Filter query invalid")
    else:
        # Programmer error
        raise TypeError(
            f"filters expected to be `str` or `list` "
            f"but was of type `{type(filters)}`"
        )
    return filters_processed
def filter_query(filters, query, model):
    for _filter in filters:
        # get our operator
        op = getattr(operator, _filter.operator)
        # get the column to filter on
        column = getattr(model, _filter.column)
        # value to filter for
        value = _filter.value
        # build up a query by adding multiple filters
        query = query.filter(op(column, value))
    return query
@app.errorhandler(QueryValidationError)
def handle_query_validation_error(err):
    return flask.jsonify(dict(
        errors=[dict(
            title="Invalid filter",
            details=str(err),
            status="400")
        ]
    )), 400
#app.route("/books", methods=["GET"])
def all_books():
args = flask.request.args
filters = create_filters(args.get("filter"))
query = Books.query
query = filter_query(filters, query, Books)
result = []
for book in query.all():
result.append(dict(
id=book.id,
title=book.title,
author=book.author,
supplier=book.supplier,
published=str(book.published)
))
return flask.jsonify(result), 200
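To try it out without a browser, Flask's built-in test client can issue the filtered requests (paths and values are only examples):

with app.test_client() as client:
    print(client.get("/books").get_json())                      # all books
    print(client.get("/books?filter=author,eq,jim").get_json()) # one filter
    # two filters at once; %20 is decoded back to a space for us
    print(client.get("/books?filter=author,eq,jim&filter=supplier,eq,self%20published").get_json())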
I hope this answers your question, or gives you some ideas on how to tackle your problem.
I would also recommend looking at serialisation and marshalling tools like marshmallow-sqlalchemy, which will help you simplify turning models into JSON and back again. It is also helpful for nested object serialisation, which can be a pain if you are returning relationships.
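As a rough illustration of that suggestion (my own sketch, not part of the answer's code, and assuming a reasonably recent marshmallow-sqlalchemy), a schema for the Books model could replace the manual dict-building in all_books:

from marshmallow_sqlalchemy import SQLAlchemyAutoSchema

class BookSchema(SQLAlchemyAutoSchema):
    class Meta:
        model = Books         # the model defined above
        load_instance = True  # deserialize back into Books instances

# dump a query result straight to JSON-ready dicts
books_json = BookSchema(many=True).dump(Books.query.all())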
I'm trying to make a table with a clickable field which changes the boolean for the entry to its opposite value. It works, but I want alternative text, as "False" or "True" does not look nice and the users are mainly Norwegian.
def bool_to_norwegian(boolean):
    if boolean:
        return "Ja"
    else:
        return "Nei"


class OrderTable(tables.Table):
    id = tables.LinkColumn('admin_detail', args=[A('id')])
    name = tables.Column()
    address = tables.Column()
    order = tables.Column()
    order_placed_at = tables.DateTimeColumn()
    order_delivery_at = tables.DateColumn()
    price = tables.Column()
    comment = tables.Column()
    sent = tables.LinkColumn('status_sent', args=[A('id')])
    paid = tables.LinkColumn('status_paid', args=[A('id')], text=[A('paid')])

    class Meta:
        attrs = {'class': 'order-table'}
If you look under the "paid" entry I am testing this right now, why can't I access the data with the same accessor as I do in the args? If I change the args to args=[A('paid')] and look at the link, it does indeed have the correct data on it. The model names are the same as the ones in this table, and "paid" and "sent" are BooleanFields.
This is kind of what I ultimately want:
text=bool_to_norwegian([A('paid')])
Here is what I send to the table:
orders = Order.objects.order_by("-order_delivery_at")
orders = orders.values()
table = OrderTable(orders)
RequestConfig(request).configure(table)
The text argument expects a callable that accepts a record and returns a text value. You are passing it a list (which it will just ignore), and your function expects a boolean instead of a record. There is also no need to use accessors here.
Something like this should work:
def bool_to_norwegian(record):
    if record.paid:
        return "Ja"
    else:
        return "Nei"
Then in your column:
paid = tables.LinkColumn('status_paid', text=bool_to_norwegian)
(Note, it is not clear from your question where the data is coming from - is paid a boolean? You may need to adjust this to fit).
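Since the table in the question is built from orders.values(), each record will actually be a plain dict rather than a model instance, so (assuming that is the case) a dict-based variant would be:

def bool_to_norwegian(record):
    # record is a dict when the table is fed from queryset.values()
    return "Ja" if record["paid"] else "Nei"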
As an aside, the way you are passing args to your columns is weird (it seems the documentation also recommends this, but I don't understand why - it's very confusing). A more standard approach would be:
id = tables.LinkColumn('admin_detail', A('id'))
or using named arguments:
id = tables.LinkColumn('admin_detail', accessor=A('id'))
I'm having an issue with a Django 1.6 project.
I have a view that reports a bunch of web logs. When the view is loaded, it gets all logs from the DB. However, the user has access to two datepickers to limit the from_date and to_date on the query. To accomplish this, I added a check in get_queryset() to verify whether either of those datepickers had been used. This feels really stupid and clunky, and I'm sure there's a better way to do it, but I'm not sure what it is. Here's my view code; let me know if there's anything I'm missing here.
class SiteUsersView(generic.ListView):
    """
    EX: localhost/reports/site/facebook
    """
    model = Cleanedlog
    template_name = "trafficreport/site_chunk_log.html"

    def get_context_data(self, **kwargs):
        context = super(SiteUsersView, self).get_context_data(**kwargs)
        context["url_chunk"] = self.kwargs["url_chunk"]
        return context

    def get_queryset(self):
        from_date = self.request.GET.get("fromDate")
        to_date = self.request.GET.get("toDate")
        if from_date is not None and to_date is not None:
            return self.from_to_date(from_date, to_date)
        elif from_date is not None and to_date is None:
            return self.from_date(from_date)
        elif from_date is None and to_date is not None:
            return self.to_date(to_date)
        else:
            return self.all_dates()

    def from_date(self, from_date):
        return Cleanedlog.objects.filter(dest_url__contains=self.kwargs["url_chunk"],
                                         time_received__gte=from_date).values('user__name').annotate(
            count=Sum('size')).order_by('-count')

    def to_date(self, to_date):
        return Cleanedlog.objects.filter(dest_url__contains=self.kwargs["url_chunk"],
                                         time_received__lte=to_date).values('user__name').annotate(
            count=Sum('size')).order_by('-count')

    def from_to_date(self, from_date, to_date):
        return Cleanedlog.objects.filter(dest_url__contains=self.kwargs["url_chunk"],
                                         time_received__gte=from_date,
                                         time_received__lte=to_date).values('user__name').annotate(
            count=Sum('size')).order_by('-count')

    def all_dates(self):
        return Cleanedlog.objects.filter(dest_url__contains=self.kwargs["url_chunk"]).values('user__name').annotate(
            count=Sum('size')).order_by('-count')
What's more, I have other reports (Users/IPs) that follow the exact same format, just with a different model and marginally different queries. The datepickers and all the data presentation are in a higher-level template, and that works well, but it feels really stupid to just duplicate all this code. Am I missing something obvious?
Thanks!
I've solved problems like this in the past by inverting the logic and using exclude, e.g.:
return Cleanedlog.objects.filter(dest_url__contains=self.kwargs["url_chunk"]) \
    .exclude(time_received__lt=from_date).exclude(time_received__gt=to_date)
And then take values and annotate and count as desired.
Comparisons against None always fail, so nothing gets excluded unless the date is provided.
That's probably the simplest way to write it, but you can also take advantage of the ability to chain querysets without evaluating them:
base_qs = Cleanedlog.objects.filter(dest_url__contains=self.kwargs["url_chunk"])
if from_date is not None:
    base_qs = base_qs.filter(time_received__gte=from_date)
if to_date is not None:
    base_qs = base_qs.filter(time_received__lte=to_date)
return base_qs.values('user__name').annotate(count=Sum('size')).order_by('-count')  # as above
That avoids putting the unnecessary None comparisons into the query at all.
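Dropped into the view from the question, that chaining approach collapses get_queryset and the four helper methods into one method (a sketch, reusing the names from the question):

def get_queryset(self):
    qs = Cleanedlog.objects.filter(dest_url__contains=self.kwargs["url_chunk"])
    from_date = self.request.GET.get("fromDate")
    to_date = self.request.GET.get("toDate")
    # only add the date bounds the user actually supplied
    if from_date is not None:
        qs = qs.filter(time_received__gte=from_date)
    if to_date is not None:
        qs = qs.filter(time_received__lte=to_date)
    return qs.values('user__name').annotate(count=Sum('size')).order_by('-count')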
I was wondering if there is a way to use Django's filter() on querysets with a dynamically generated Python property created using property(). I have the first_name and last_name of every user, and I want to filter based on their concatenated name, first_name last_name. (The reason behind this is that when I do autocomplete I check whether the query matches the first name, the last name, or part of the concatenation. I want "John S" to match "John Smith", for example.)
I created a property of name:
def _get_name(self):
    return self.first_name + " " + self.last_name

name = property(_get_name)
This way I can call user.name to get the concatenated name.
However, if I try to do User.objects.filter(name__istartswith=query) I get the error Cannot resolve keyword 'name' into field.
Any ideas on how to do this? Do I have to create another field in the database to store the full name?
The accepted answer is not entirely true.
For many cases, you can override get() in the model manager to pop dynamic properties from the keyword arguments, then add the actual attributes you want to query against back into the kwargs dictionary. Be sure to return via super() so any regular get() calls return the expected result.
I'm only pasting my own solution, but for __startswith and other conditional queries you could add some logic to split on the double underscore and handle it appropriately.
Here was my work-around to allow querying by a dynamic property:
class BorrowerManager(models.Manager):
    def get(self, *args, **kwargs):
        full_name = kwargs.pop('full_name', None)
        # Override #1) Query by dynamic property 'full_name'
        if full_name:
            names = full_name_to_dict(full_name)
            kwargs.update(names)
        return super(BorrowerManager, self).get(*args, **kwargs)
In models.py:
class Borrower(models.Model):
    objects = BorrowerManager()

    first_name = models.CharField(null=False, max_length=30)
    middle_name = models.CharField(null=True, max_length=30)
    last_name = models.CharField(null=False, max_length=30)
    created = models.DateField(auto_now_add=True)
In utils.py (for the sake of context):
def full_name_to_dict(full_name):
    ret = dict()
    values = full_name.split(' ')
    if len(values) == 1:
        raise NotImplementedError("Not enough names to unpack from full_name")
    elif len(values) == 2:
        ret['first_name'] = values[0]
        ret['middle_name'] = None
        ret['last_name'] = values[1]
        return ret
    elif len(values) >= 3:
        ret['first_name'] = values[0]
        ret['middle_name'] = values[1:len(values) - 1]
        ret['last_name'] = values[len(values) - 1]
        return ret
    raise NotImplementedError("Error unpacking full_name to first, middle, last names")
filter() operates at the database level (it actually writes SQL), so it won't be possible to use it for any queries based on your Python code (the dynamic property in your question).
This is an answer put together from many other answers in this department :)
I had a similar problem and was looking for a solution. Granted that a search engine would be the best option (e.g. django-haystack with Elasticsearch), here is how I would implement it for your needs using only the Django ORM (you can replace icontains with istartswith):
from django.db.models import Value
from django.db.models.functions import Concat

queryset = User.objects.annotate(full_name=Concat('first_name', Value(' '), 'last_name'))
return queryset.filter(full_name__icontains=value)
In my case I didn't know whether the user would insert 'first_name last_name' or vice versa, so I used the following code.
from django.db.models import Q, Value
from django.db.models.functions import Concat
queryset = User.objects.annotate(first_last=Concat('first_name', Value(' '), 'last_name'), last_first=Concat('last_name', Value(' '), 'first_name'))
return queryset.filter(Q(first_last__icontains=value) | Q(last_first__icontains=value))
With Django <1.8, you would probably need to resort to extra with the SQL CONCAT function, something like the following:
queryset.extra(where=['UPPER(CONCAT("auth_user"."last_name", \' \', "auth_user"."first_name")) LIKE UPPER(%s) OR UPPER(CONCAT("auth_user"."first_name", \' \', "auth_user"."last_name")) LIKE UPPER(%s)'], params=['%'+value+'%', '%'+value+'%'])
I don't think it's possible in Django to filter on a property that is not present as a database field, but what you can do to make a nice autocomplete search is something like this:
from itertools import chain

from django.db.models import Q

if ' ' in query:
    query = query.split()
    search_results = list(chain(User.objects.filter(first_name__icontains=query[0], last_name__icontains=query[1]),
                                User.objects.filter(first_name__icontains=query[1], last_name__icontains=query[0])))
else:
    search_results = User.objects.filter(Q(first_name__icontains=query) | Q(last_name__icontains=query))
This code gives the users of your system the flexibility to start typing either the first name or the last name, and they will be thankful to you for allowing it.
I want to create a new type of field for Django models that is basically a ListOfStrings. So in your model code you would have the following:
models.py:
from django.db import models

class ListOfStringsField(???):
    ???

class myDjangoModelClass(models.Model):
    myName = models.CharField(max_length=64)
    myFriends = ListOfStringsField()
other.py:
myclass = myDjangoModelClass()
myclass.myName = "bob"
myclass.myFriends = ["me", "myself", "and I"]
myclass.save()

id = myclass.id
loadedmyclass = myDjangoModelClass.objects.filter(id__exact=id)
myFriendsList = loadedmyclass.myFriends
# myFriendsList is a list and should equal ["me", "myself", "and I"]
How would you go about writing this field type, with the following stipulations?
We don't want to create a field which just crams all the strings together separated by a token in one column, like this. It is a good solution in some cases, but we want to keep the string data normalized so tools other than Django can query the data.
The field should automatically create any secondary tables needed to store the string data.
The secondary table should ideally have only one copy of each unique string. This is optional, but would be nice to have.
Looking in the Django code it looks like I would want to do something similar to what ForeignKey is doing, but the documentation is sparse.
This leads to the following questions:
Can this be done?
Has it been done (and if so where)?
Is there any documentation on Django about how to extend and override their model classes, specifically their relationship classes? I have not seen a lot of documentation on that aspect of their code, but there is this.
This comes from this question.
There's some very good documentation on creating custom fields here.
However, I think you're overthinking this. It sounds like you actually just want a standard foreign key, but with the additional ability to retrieve all the elements as a single list. So the easiest thing would be to just use a ForeignKey, and define a get_myfield_as_list method on the model:
class Friends(models.Model):
    name = models.CharField(max_length=100)
    my_items = models.ForeignKey('MyModel')

class MyModel(models.Model):
    ...

    def get_my_friends_as_list(self):
        return list(self.friends_set.values_list('name', flat=True))
Now calling get_my_friends_as_list() on an instance of MyModel will return you a list of strings, as required.
What you have described sounds really similar to tags.
So, why not use django-tagging?
It works like a charm; you can install it independently from your application, and its API is quite easy to use.
I also think you're going about this the wrong way. Trying to make a Django field create an ancillary database table is almost certainly the wrong approach. It would be very difficult to do, and would likely confuse third party developers if you are trying to make your solution generally useful.
If you're trying to store a denormalized blob of data in a single column, I'd take an approach similar to the one you linked to, serializing the Python data structure and storing it in a TextField. If you want tools other than Django to be able to operate on the data then you can serialize to JSON (or some other format that has wide language support):
from django.db import models
from django.utils import simplejson

class JSONDataField(models.TextField):
    __metaclass__ = models.SubfieldBase

    def to_python(self, value):
        if value is None:
            return None
        if not isinstance(value, basestring):
            return value
        return simplejson.loads(value)

    def get_db_prep_save(self, value):
        if value is None:
            return None
        return simplejson.dumps(value)
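Usage is then just a normal field declaration; a sketch reusing the names from the question and assuming the old-style (SubfieldBase-era) field above:

class MyModel(models.Model):
    myName = models.CharField(max_length=64)
    myFriends = JSONDataField()  # the list is stored as JSON text in one column

m = MyModel(myName="bob", myFriends=["me", "myself", "and I"])
m.save()
MyModel.objects.get(id=m.id).myFriends  # -> ["me", "myself", "and I"]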
If you just want a Django Manager-like descriptor that lets you operate on a list of strings associated with a model, then you can manually create a join table and use a descriptor to manage the relationship. It's not exactly what you need, but this code should get you started.
Thanks to all those who answered. Even if I didn't use your answer directly, the examples and links got me going in the right direction.
I am not sure if this is production ready, but it appears to be working in all my tests so far.
class ListValueDescriptor(object):

    def __init__(self, lvd_parent, lvd_model_name, lvd_value_type, lvd_unique, **kwargs):
        """
        This descriptor object acts like a django field, but it will accept
        a list of values, instead of a single value.

        For example:
            # define our model
            class Person(models.Model):
                name = models.CharField(max_length=120)
                friends = ListValueDescriptor("Person", "Friend", "CharField", True, max_length=120)

            # Later in the code we can do this
            p = Person("John")
            p.save()  # we have to have an id
            p.friends = ["Jerry", "Jimmy", "Jamail"]
            ...
            p = Person.objects.get(name="John")
            friends = p.friends
            # and now friends is a list.

        lvd_parent     - The name of our parent class
        lvd_model_name - The name of our new model
        lvd_value_type - The value type of the value in our new model.
                         This has to be the name of one of the valid django
                         model field types such as 'CharField', 'FloatField',
                         or a valid custom field name.
        lvd_unique     - Set this to true if you want the values in the list to
                         be unique in the table they are stored in. For
                         example if you are storing a list of strings and
                         the strings are always "foo", "bar", and "baz", your
                         data table would only have those three strings listed in
                         it in the database.
        kwargs         - These are passed to the value field.
        """
        self.related_set_name = lvd_model_name.lower() + "_set"
        self.model_name = lvd_model_name
        self.parent = lvd_parent
        self.unique = lvd_unique

        # only set this to true if they have not already set it.
        # this helps speed up the searches when unique is true.
        kwargs['db_index'] = kwargs.get('db_index', True)

        filter = ["lvd_parent", "lvd_model_name", "lvd_value_type", "lvd_unique"]

        evalStr = """class %s (models.Model):\n""" % (self.model_name)
        evalStr += """    value = models.%s(""" % (lvd_value_type)
        evalStr += self._params_from_kwargs(filter, **kwargs)
        evalStr += ")\n"
        if self.unique:
            evalStr += """    parent = models.ManyToManyField('%s')\n""" % (self.parent)
        else:
            evalStr += """    parent = models.ForeignKey('%s')\n""" % (self.parent)
        evalStr += "\n"
        evalStr += """self.innerClass = %s\n""" % (self.model_name)

        print(evalStr)

        exec(evalStr)  # build the inner class
    def __get__(self, instance, owner):
        value_set = instance.__getattribute__(self.related_set_name)
        l = []
        for x in value_set.all():
            l.append(x.value)
        return l

    def __set__(self, instance, values):
        value_set = instance.__getattribute__(self.related_set_name)
        for x in values:
            value_set.add(self._get_or_create_value(x))

    def __delete__(self, instance):
        pass  # I should probably try and do something here.

    def _get_or_create_value(self, x):
        if self.unique:
            # Try and find an existing value
            try:
                return self.innerClass.objects.get(value=x)
            except django.core.exceptions.ObjectDoesNotExist:
                pass
        v = self.innerClass(value=x)
        v.save()  # we have to save to create the id.
        return v

    def _params_from_kwargs(self, filter, **kwargs):
        """Given a dictionary of arguments, build a string which
        represents it as a parameter list, and filter out any
        keywords in filter."""
        params = ""
        for key in kwargs:
            if key not in filter:
                value = kwargs[key]
                params += "%s=%s, " % (key, value.__repr__())
        return params[:-2]  # chop off the last ', '
class Person(models.Model):
    name = models.CharField(max_length=120)
    friends = ListValueDescriptor("Person", "Friend", "CharField", True, max_length=120)
Ultimately I think this would still be better if it were pushed deeper into the django code and worked more like the ManyToManyField or the ForeignKey.
I think what you want is a custom model field.