For our Django app, we'd like an AutoField to start at a number other than 1. There doesn't seem to be an obvious way to do this. Any ideas?
Like the others have said, this would be much easier to do on the database side than the Django side.
For Postgres, it'd be like so: ALTER SEQUENCE sequence_name RESTART WITH 12345; Look at your own DB engine's docs for how you'd do it there.
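The same idea carries over to other engines. As an illustrative sketch (table and column names invented), here is the equivalent on SQLite, which has no ALTER SEQUENCE but tracks AUTOINCREMENT state in its sqlite_sequence table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# AUTOINCREMENT makes SQLite record the last-used id in sqlite_sequence
conn.execute("CREATE TABLE product (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)")
conn.execute("INSERT INTO product (name) VALUES ('first')")

# Bump the stored sequence value so the next insert starts at 12345
conn.execute("UPDATE sqlite_sequence SET seq = 12344 WHERE name = 'product'")
conn.execute("INSERT INTO product (name) VALUES ('second')")

print(conn.execute("SELECT id, name FROM product ORDER BY id").fetchall())
# [(1, 'first'), (12345, 'second')]
```

Note that sqlite_sequence only exists for tables declared with AUTOINCREMENT, and only after the first insert.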
For MySQL I created a signal that does this after syncdb:

from django.db.models.signals import post_syncdb
from project.app import models as app_models

def auto_increment_start(sender, **kwargs):
    from django.db import connection, transaction
    cursor = connection.cursor()
    cursor.execute("ALTER TABLE app_table AUTO_INCREMENT = 2000")
    transaction.commit_unless_managed()

post_syncdb.connect(auto_increment_start, sender=app_models)

After a syncdb the ALTER TABLE statement is executed. This saves you from having to log into MySQL and issue it manually.
EDIT: I know this is an old thread, but I thought it might help someone.
A quick peek at the source shows that there doesn't seem to be any option for this, probably because it doesn't always increment by one; it picks the next available key: "An IntegerField that automatically increments according to available IDs" — djangoproject.com
Here is what I did:

def update_auto_increment(value=5000, app_label="xxx_data"):
    """Update the AUTO_INCREMENT start value for every table in the app."""
    from django.conf import settings
    from django.db import connection, transaction, router
    from django.db.models import get_models

    models = [m for m in get_models() if m._meta.app_label == app_label]
    cursor = connection.cursor()
    for model in models:
        db_name = settings.DATABASES[router.db_for_write(model)]['NAME']
        alter_str = "ALTER TABLE {}.{} AUTO_INCREMENT = {}".format(
            db_name, model._meta.db_table, value)
        cursor.execute(alter_str)
    transaction.commit_unless_managed()
I found a really easy solution to this! AutoField uses the last value inserted to determine the next value assigned. So if I insert a dummy row whose ID is the desired AutoField start value, the following insertions will increment from that value.
A simple example in a few steps:
1.)
models.py

class Product(models.Model):
    id = models.AutoField(primary_key=True)  # this is a dummy PK for now
    productID = models.IntegerField(default=0)
    productName = models.TextField()
    price = models.DecimalField(max_digits=6, decimal_places=2)

python manage.py makemigrations
python manage.py migrate
Once that is done, you will need to insert the initial row where "productID" holds a value of your desired AutoField start value. You can write a method or do it from django shell.
From a view the insertion could look like this:
views.py

from app.models import Product

dummy = {
    'productID': 100000,
    'productName': 'Item name',
    'price': 5.98,
}
Product.objects.create(**dummy)
Once inserted you can make the following change to your model:
models.py
class Product(models.Model):
    productID = models.AutoField(primary_key=True)
    productName = models.TextField()
    price = models.DecimalField(max_digits=6, decimal_places=2)
All following insertions will get a "productID" incrementing starting at 100000...100001...100002...
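The same effect can be seen outside Django entirely. A minimal sketch with SQLite (table and column names made up for illustration): the dummy row seeds the key space, and the next auto-assigned ID continues from it.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (productID INTEGER PRIMARY KEY, productName TEXT)")

# Insert the dummy row with an explicit high ID ...
conn.execute("INSERT INTO product (productID, productName) VALUES (100000, 'dummy')")
# ... and subsequent inserts continue incrementing from it
conn.execute("INSERT INTO product (productName) VALUES ('Item name')")

print(conn.execute("SELECT productID, productName FROM product ORDER BY productID").fetchall())
# [(100000, 'dummy'), (100001, 'Item name')]
```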
The auto fields depend, to an extent, on the database driver being used.
You'll have to look at the objects actually created for the specific database to see what's happening.
I needed to do something similar. I avoided the complex stuff and simply created two fields:
id_no = models.AutoField(unique=True)
my_highvalue_id = models.IntegerField(null=True)
In views.py, I then simply added a fixed number to the id_no:
my_highvalue_id = id_no + 1200
I'm not sure if it helps resolve your issue, but I think you may find it an easy go-around.
In the model you can add this:
def save(self, *args, **kwargs):
    if self.id is None:  # only assign IDs to new rows, never on update
        if not User.objects.count():
            self.id = 100
        else:
            self.id = User.objects.last().id + 1
    super(User, self).save(*args, **kwargs)

The first item saved into an empty table is assigned ID 100, and subsequent inserts follow with last ID + 1. Note that this is not safe under concurrent inserts, since two requests can read the same last ID.
For those who are interested in a modern solution, I found it quite useful to run the following handler in a post_migrate signal.
Inside your apps.py file:
import logging

from django.apps import AppConfig
from django.db import connection
from django.db.models.signals import post_migrate

logger = logging.getLogger(__name__)

def auto_increment_start(sender, **kwargs):
    min_value = 10000
    with connection.cursor() as cursor:
        logger.info('Altering BigAutoField starting value...')
        cursor.execute(f"""
            SELECT setval(pg_get_serial_sequence('"apiV1_workflowtemplate"', 'id'), coalesce(max("id"), {min_value}), max("id") IS NOT null) FROM "apiV1_workflowtemplate";
            SELECT setval(pg_get_serial_sequence('"apiV1_workflowtemplatecollection"', 'id'), coalesce(max("id"), {min_value}), max("id") IS NOT null) FROM "apiV1_workflowtemplatecollection";
            SELECT setval(pg_get_serial_sequence('"apiV1_workflowtemplatecategory"', 'id'), coalesce(max("id"), {min_value}), max("id") IS NOT null) FROM "apiV1_workflowtemplatecategory";
        """)
    logger.info(f'BigAutoField starting value changed successfully to {min_value}')

class Apiv1Config(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'apiV1'

    def ready(self):
        post_migrate.connect(auto_increment_start, sender=self)
Of course the downside of this, as some already have pointed out, is that this is DB specific.
I am working on a Django app and I want to make several lookups in a queryset.
My problem is that subsequent calls like .count() trigger extra database hits.
I tried using Django's cache framework but it doesn't seem to work.
This is what I've done so far
# app/models.py
from django.core.cache.backends.base import DEFAULT_TIMEOUT
from django.views.decorators.cache import cache_page
from django.core.cache import cache

class my_table(models.Model):
    class Meta:
        db_table = 'table_name'

    name = models.CharField(max_length=200, blank=True)
    date_created = models.DateTimeField(auto_now_add=True, blank=True)
    ip_address = models.CharField(max_length=100, null=True, blank=True)
    user = models.ForeignKey(User, on_delete=models.CASCADE)

    def save(self, *args, **kwargs):
        cache.set(self.user, my_table.objects.filter(user=self.user))
        super(my_table, self).save(*args, **kwargs)
I am updating the cache every time the database is updated.
I tried printing connection.queries in my views
# views.py
def myview(request):
    print(len(connection.queries))  # prints 0
    records = my_table.objects.filter(user=request.user)
    print(records)
    print(len(connection.queries))  # prints 1
    if record.count() > 0:
        # ... some code here
        print(len(connection.queries))  # prints 2
print() and .count() are making extra DB hits.
Now I tried getting results from the cache
# views.py
def myview(request):
    print(len(connection.queries))  # prints 0
    records = cache.get(request.user)
    print(records)
    print(len(connection.queries))  # prints 0
    if record.count() > 0:
        # ... some code here
        print(len(connection.queries))  # prints 1
There was no extra query for print(), but .count() still hits the database.
How can I perform ORM operations on cached queries without hitting the database multiple times?
I want to perform filtering, aggregations, and count/exist on this queryset without hitting the database.
Also, cache.get(request.user) returns None after some time.
Any help would be appreciated
The Django source code suggests that calling .count() should not hit the database if the queryset has already been fully retrieved (docs were only updated in Django 3.2, see ticket, but 2.2 has the same code).
I'm not sure about explicit caches and how that might interact with it, but the above is true if you don't use any explicit cache (it relies on the temporary cache built in to QuerySet).
Presumably you're calling records.count() rather than record.count() (in case record happens to exist in your full code and be something else)?
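The mechanism described above can be sketched without Django at all. This toy class (my own simplification, not Django's actual QuerySet) mimics how a fully evaluated queryset answers count() from its result cache instead of issuing another query:

```python
class ToyQuerySet:
    """Simplified stand-in for Django's QuerySet result cache."""

    def __init__(self, fetch):
        self._fetch = fetch          # callable standing in for a DB query
        self._result_cache = None
        self.db_hits = 0

    def _evaluate(self):
        if self._result_cache is None:
            self.db_hits += 1        # a real QuerySet would hit the DB here
            self._result_cache = list(self._fetch())
        return self._result_cache

    def __iter__(self):
        return iter(self._evaluate())

    def count(self):
        if self._result_cache is not None:
            return len(self._result_cache)  # no DB hit: reuse the cache
        self.db_hits += 1                   # otherwise: SELECT COUNT(*)
        return len(list(self._fetch()))

qs = ToyQuerySet(lambda: [1, 2, 3])
list(qs)            # first evaluation: one simulated DB hit
qs.count()          # answered from the cache, no second hit
print(qs.db_hits)   # 1
```

So in the view above, iterating records first (or calling list() on it) means a later records.count() is free; calling count() on a never-evaluated queryset issues its own query.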
Despite numerous recipes and examples in peewee's documentation, I have not been able to find how to accomplish the following:
For finer-grained control, check out the Using context manager / decorator. This allows you to specify the database to use with a given list of models for the duration of the wrapped block.
I assume it would go something like...
db = MySQLDatabase(None)

class BaseModelThing(Model):
    class Meta:
        database = db

class SubModelThing(BaseModelThing):
    '''imagine all the fields'''
    class Meta:
        db_table = 'table_name'

runtime_db = MySQLDatabase('database_name.db', fields={'''imagine field mappings here'''}, **extra_stuff)

@Using(runtime_db, [SubModelThing])
@runtime_db.execution_context()
def some_kind_of_query():
    '''imagine the queries here'''
but I have not found examples, so an example would be the answer to this question.
Yeah, there's not a great example of using Using or the execution_context decorators, so the first thing is: don't use the two together. It doesn't appear to break anything, just seems to be redundant. Logically that makes sense as both of the decorators cause the specified model calls in the block to run in a single connection/transaction.
The only(/biggest) difference between the two is that Using allows you to specify the particular database that the connection will be using - useful for master/slave (though the Read slaves extension is probably a cleaner solution).
If you run with two databases and try using execution_context on the 'second' database (in your example, runtime_db) nothing will happen with the data. A connection will be opened at the start of the block and closed and the end, but no queries will be executed on it because the models are still using their original database.
The code below is an example. Every run should result in only 1 row being added to each database.
from peewee import *

db = SqliteDatabase('other_db')
db.connect()
runtime_db = SqliteDatabase('cmp_v0.db')
runtime_db.connect()

class BaseModelThing(Model):
    class Meta:
        database = db

class SubModelThing(BaseModelThing):
    first_name = CharField()

    class Meta:
        db_table = 'table_name'

db.create_tables([SubModelThing], safe=True)
SubModelThing.delete().where(True).execute()  # cleaning out previous runs

with Using(runtime_db, [SubModelThing]):
    runtime_db.create_tables([SubModelThing], safe=True)
    SubModelThing.delete().where(True).execute()

@Using(runtime_db, [SubModelThing], with_transaction=True)
def execute_in_runtime(throw):
    SubModelThing(first_name='asdfasdfasdf').save()
    if throw:  # to demo transaction handling in Using
        raise Exception()

# Create an instance in the 'normal' database
SubModelThing.create(first_name='name')

try:  # Try to create but throw during the transaction
    execute_in_runtime(throw=True)
except:
    pass  # Failure is expected, no row should be added

execute_in_runtime(throw=False)  # Create a row in the runtime_db

print 'db row count: {}'.format(len(SubModelThing.select()))
with Using(runtime_db, [SubModelThing]):
    print 'Runtime DB count: {}'.format(len(SubModelThing.select()))
I have this small project to create my bills through Django and LaTeX, which worked flawlessly until today. Now when I try to add another customer, Django throws
duplicate key value violates unique constraint "kunden_kundearbeitsamt_pkey"
DETAIL: Key (id)=(4) already exists.
These are the model definitions in question:
class Kunde(models.Model):
    name = models.CharField('Name', max_length=200)
    vorname = models.CharField('Vorname', max_length=200)
    geburtsdatum = models.DateField('Geburtsdatum', max_length=200)
    untersuchungsdatum = models.DateField('Untersuchungsdatum', max_length=200)

    class Meta:
        abstract = True

class KundeArbeitsamt(Kunde):
    kundennummer = models.CharField('Kundennummer', max_length=100)
    bglnummer = models.CharField('BGL-Nummer', max_length=100)
    empfaenger = models.ForeignKey('rechnungen.NumberToEmpfaenger', blank=True, null=True)

    class Meta:
        verbose_name = "Proband Arbeitsamt"
        verbose_name_plural = "Proband Arbeitsamt"

    def __str__(self):
        return '{}, {}'.format(self.name, self.vorname)
The admin part where the object is created (nothing special, I guess):
from django.contrib import admin
from .models import KundeArbeitsamt

class KundeArbeitsamtAdmin(admin.ModelAdmin):
    ordering = ('name',)

admin.site.register(KundeArbeitsamt, KundeArbeitsamtAdmin)
I swear, I did not make any migrations or other changes to the database (Postgres) whatsoever. Django is handling the creation of the objects. What is causing this error and how to fix it?
This error is raised by your database because Django tries to insert a new row with an ID (4) that is already in use.
To investigate further you need to find the part of your app responsible for creating the IDs. Django usually delegates this task to your database. In the case of Postgres the datatype serial is used. Postgres uses so-called sequences for this purpose and generates and executes the following SQL for you:
CREATE SEQUENCE tablename_colname_seq;
CREATE TABLE tablename (
colname integer NOT NULL DEFAULT nextval('tablename_colname_seq')
);
ALTER SEQUENCE tablename_colname_seq OWNED BY tablename.colname;
I would now start with checking the database sanity like that:
-- view the contents of the table
SELECT * FROM kunden_kundearbeitsamt;
-- check the sequence
SELECT currval('kunden_kundearbeitsamt_id_seq');
If the first shows 4 records with IDs 1, 2, 3 and 4, and the sequence answers with 4, everything is alright. I would then dig into the Django sources to figure out why an ID is passed on object creation instead of relying on the sequence. The Django shell might be a good place to start in that case.
Otherwise I would fix the sequence and ask myself how this happened, as it is rarely the case that Postgres makes mistakes at this point:
SELECT setval('kunden_kundearbeitsamt_id_seq', (SELECT max(id) FROM kunden_kundearbeitsamt));
I've added a UUID field to some of my models and then migrated with South. Any new objects I create have the UUID field populated correctly. However the UUID fields on all my older data is null.
Is there any way to populate UUID data for existing data?
For the following sample class:
from django_extensions.db.fields import UUIDField

class MyClass(models.Model):
    uuid = UUIDField(editable=False, blank=True)
    name = models.CharField(max_length=255)
If you're using South, create a data migration:
python ./manage.py datamigration <appname> --auto
And then use the following code to update the migration with the specific logic to add a UUID:
from django_extensions.utils import uuid

def forwards(self, orm):
    for item in orm['myapp.myclass'].objects.all():
        if not item.uuid:
            item.uuid = uuid.uuid4()  # creates a random GUID
            item.save()

def backwards(self, orm):
    for item in orm['myapp.myclass'].objects.all():
        if item.uuid:
            item.uuid = None
            item.save()
You can create different types of UUIDs, each generated differently. The uuid.py module in django-extensions has the complete list of the types of UUIDs you can create.
It's important to note that if you run this migration in an environment with a lot of objects, it has the potential to time out (for instance, if using fabric to deploy). An alternative method of filling in already existing fields will be required for production environments.
It's possible to run out of memory while trying to do this to a large number of objects (we found ourselves running out of memory and having the deployment fail with 17,000+ objects).
To get around this, you need to create a custom iterator in your migration (or stick it where it's really useful, and refer to it in your migration). It would look something like this:
def queryset_iterator(queryset, chunksize=1000):
    import gc
    if queryset.count() < 1:
        return  # nothing to iterate (checked first to avoid indexing an empty queryset)
    pk = 0
    last_pk = queryset.order_by('-pk')[0].pk
    queryset = queryset.order_by('pk')
    while pk < last_pk:
        for row in queryset.filter(pk__gt=pk)[:chunksize]:
            pk = row.pk
            yield row
        gc.collect()
And then your migrations would change to look like this:
class Migration(DataMigration):
    def forwards(self, orm):
        for item in queryset_iterator(orm['myapp.myclass'].objects.all()):
            if not item.uuid:
                item.uuid = uuid.uuid1()
                item.save()

    def backwards(self, orm):
        for item in queryset_iterator(orm['myapp.myclass'].objects.all()):
            if item.uuid:
                item.uuid = None
                item.save()
To add UUID values to all existing records, first you will need to make sure your model has the UUID field declared with blank=True, null=True.
Then run the schemamigration command with South and open up the resulting migration file.
Then edit your migration file with the following, as shown in this post.
Quote:

You'll need the following import:

import uuid

At the end of the forwards() function add the following:

def forwards(self, orm):
    ...
    for a in MyModel.objects.all():
        a.uuid = u'' + str(uuid.uuid1().hex)
        a.save()
As stated that will loop through existing instances and add a uuid to it as part of the migration.
There is now an excellent, updated answer for Django 1.9 to this exact question in the Django docs.
Saved me a lot of time!
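The core of that modern recipe, stripped of Django's migration machinery, is just "walk the existing rows and fill in missing UUIDs". A runnable sketch with sqlite3 (table and column names invented for illustration):

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myclass (id INTEGER PRIMARY KEY, uuid TEXT)")
# Three pre-existing rows whose uuid column is still NULL
conn.executemany("INSERT INTO myclass (uuid) VALUES (?)", [(None,)] * 3)

# Backfill: give every row without a UUID a fresh random one
for (pk,) in conn.execute("SELECT id FROM myclass WHERE uuid IS NULL").fetchall():
    conn.execute("UPDATE myclass SET uuid = ? WHERE id = ?", (str(uuid.uuid4()), pk))

filled = conn.execute("SELECT count(*) FROM myclass WHERE uuid IS NOT NULL").fetchone()[0]
print(filled)  # 3
```

In an actual Django migration this loop would live inside a RunPython operation, as the docs show.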
Basically, I've created a view to populate my database with Serial models from 0000 to 9999. Below is the code I'm using for the view.
def insert_serials(request):
    for i in range(0, 10000):
        serial = Serial(i, False)
        serial.save()
    else:
        print 'The for loop is over'
What is the right way to do this? I'm getting an IntegrityError (duplicate keys). My model definition is below:
class Serial(models.Model):
    serial = models.CharField(max_length=4)
    closed = models.BooleanField()

    def __unicode__(self):
        return "%s" % (self.serial)

    def get_absolute_url(self):
        return "/draw/serial/%s/" % (self.serial)
Your code is working on my site - Mac OS X, Python 2.6.3, django from trunk, sqlite3
I changed your view function code a bit, though -
from django.http import HttpResponse
from models import Serial

def insert_serials(request):
    for i in range(0, 10000):
        serial = Serial(i, False)
        serial.save()
    return HttpResponse("Serials are inserted")
There may be positional default arguments; try using keywords:

from django.db import transaction

@transaction.commit_manually
def insert_serials(request):
    for i in range(0, 10000):
        serial = Serial(serial=str(i), closed=False)
        serial.save()
    transaction.commit()
    print 'The for loop is over'

It's wrapped in a transaction, which should speed it up a bit.
See transaction.commit_manually for details.
Your id field (implied by the absence of a PK definition in your model) is not being autonumbered correctly and therefore every INSERT after the first is failing with a duplicate id value. What's your database? Did you have Django create the table, or did you do it yourself?
Try adding unique=False in the closed field declaration.
Also, you're trying to put integers into a string field. You should do it like Serial('%04d' % i, False) to put values from '0000' to '9999'.
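To illustrate the padding point, '%04d' left-pads the integer with zeros to four digits, which gives the '0000' to '9999' strings the model expects:

```python
serials = ['%04d' % i for i in (0, 7, 42, 9999)]
print(serials)  # ['0000', '0007', '0042', '9999']
```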