Postgres Sequences as Default Value for Django Model Field - python

I have a PostgreSQL database that is used by a front-end application built with Django, but populated by a scraping tool in Node.js. I have made a sequence that I want to use across two different tables/entities; it can be accessed via a function (nextval('serial')) and is called on every insert. This is not the primary key for these tables, but simply a way to maintain order through some metadata. Using it in Node.js during the insertion of the data into the tables is trivial, as I am using raw SQL queries. However, I am struggling with how to represent this using Django models: there does not seem to be any way to associate this Postgres function with a model's field.
Question:
Is there a way to use a Postgres function as the default value of a Django model field?

You can also set your own function as the default:
from django.db import connection, models

def sequence_id():
    with connection.cursor() as cursor:
        cursor.execute("SELECT nextval('model_someid_seq')")
        return cursor.fetchone()[0]

class MyModel(models.Model):
    field_id = models.IntegerField(default=sequence_id)

My eventual solution:
Override the model's save() method: inside the override, run a raw query to SELECT nextval('serial'), assign the result to the necessary field, then call the parent's save() (super().save()).

Related

Django - best way to manage existing, auto increment fields with existing data?

This is still in dev, so I am relatively flexible in how to approach this.
So I'm importing data (via fixtures; the import works fine) from an existing database into a Django application. The source data is not uniform in how it manages IDs and primary keys. Some tables use what seems to be an auto-increment (similar to what Django would produce by default); others use some other sort of integers. The relationships in the data dump are established based on those fields. It seems I can keep using auto-increments in all cases.
Inconveniently, they are not named uniformly: id, pk, pk_sometablename, etc.
The fixtures I use to import look like this (I generated them using a script based on the datadump, so this can be changed if needs be):
{
    "model": "admin_account.client",
    "pk": "168",
    "fields":
    {
        "pk_client": "168",
My Django model:
class Client(models.Model):
    pk_client = models.IntegerField(verbose_name='Pk_client', blank=True, null=True)
I need to be able to import the data in such a way that the pk_client field is used as the primary key (it can still remain an auto-increment). So I tried changing to this:
class Client(models.Model):
    pk_client = models.AutoField(primary_key=True, verbose_name="pk_client", default=-9999)
However if I try this migration with my currently populated dev DB, I get an error:
django.db.utils.OperationalError: foreign key mismatch - "purchase_orders_apent" referencing "admin_client"
I assume Django complains because the apent table used to look up client.id, and since I now tell Django to use pk_client as the primary key, perhaps the tables referencing it can no longer find their match. But there are a lot of tables involved.
What's the easiest way to manage this?
Should I completely define those models against an empty DB (i.e. define the AutoField for each model, assign the old DB's auto-increment value to that same field, and only THEN import the data)?
Or is there something I should change in my fixture definition/the way I import the data?

Django model setup on existing Oracle table

I have a particular situation which I need help clarifying.
I have an existing Oracle table with an auto increment ID as a primary key
I am creating a Django model to sync with that table so I can make use of Django's ORM methods such as save(), filter(), etc.
I read in the Django docs that the .save() method can perform either an UPDATE or an INSERT, depending on whether the primary key attribute evaluates to a true value (i.e. is not None or null).
In my table I have two columns which together will form a composite primary key.
If I specify primary_key=True on the two attributes of the Django model, do I need to remove the primary-key constraint from the Oracle table?
Also, do I need to specify unique_together to tell the Django model that they are unique, or will it be able to derive this from the index I created on the Oracle table?
Thanks.

How can I load data variables that are accessible to all views but get initialised only after data is available in the database?

I'm creating an API with Django REST framework. I have one table which holds the types for the products, and another table which maps those types to products; consider it a producttypesmapping table. I'm creating a product-type update endpoint, which will only update producttypesmapping.
The issue is that I've used ChoiceField() in the serializer, so I need a tuple of tuples to prevent unwanted values from being stored. It is initialized in utils.py, and to make it dynamic it is loaded by querying the producttypes table directly, so I only have to query the data once.
TAG_CHOICES_TYPE_ONE = []
tags = ProductTypes.objects.filter(tag_type_id=1).values('id', 'value')
for item in tags:
    TAG_CHOICES_TYPE_ONE.append((item["id"], item["value"]))
TAG_CHOICES_TYPE_ONE = tuple(TAG_CHOICES_TYPE_ONE)
But the problem is that utils.py is executed before the producttypes table has even been initialized with any data.
First of all, unless you want to attach some additional data about the mapping, your mapping model shouldn't be necessary. Simply use a ForeignKey or a ManyToManyField.
Then, you should consider using a PrimaryKeyRelatedField instead of a ChoiceField. This field takes a queryset as an argument, which will help you limit your choices. If you chose a ChoiceField to be able to get a nice display in the browsable API, you can achieve the same by creating a string representation for your model.

Using Django ORM get_or_create with multiple databases

Django's ORM supports querying from a specific database (when multiple are defined in your project) via the .using() function for filter-based operations.
e.g. MyModel.objects.filter(name='Bob').using('my_non_default_database')
How would you do the equivalent when creating new records, via the class MyModel() or shortcut like get_or_create()?
using is a method on the MyModel.objects manager, so you can do
MyModel.objects.using('my_non_default_database').get_or_create(name="Bob")
If you have a MyModel instance, you can use the using keyword to specify the database to save to. The django docs point out some gotchas.
my_model = MyModel(name="Bob")
my_model.save(using='my_non_default_database')
using is just part of the standard query chain, so you can use it anywhere before the query is sent to the DB. get_or_create, unlike filter, executes immediately rather than returning a lazy queryset, so you just need to call using() before it.
MyModel.objects.using('my_non_default_database').get_or_create(name='Bob')

Using 'old' database with django

I'm using a hand-built (Postgres) database with Django. With inspectdb I was able to automatically create a model for it. The problem is that some tables have composite primary keys (for many-to-many relations) and they are not accessible via Django.
What's the best way to access these tables?
There is no way to use composite primary keys in Django's ORM as of now (up to v1.0.2).
I can only think of three solutions/workarounds:
There is a fork of Django with a composite-pk patch on GitHub that you might want to try.
You could use SQLAlchemy together with Django.
You could add a single-field primary key to those tables.
Django does have support for many-to-many relationships. If you want to use a helper table to manage these relationships, the ManyToManyField takes a through argument which specifies the table to use. You can't model anything terribly complex this way, but it is good for most simple applications.
