I have a small project that creates my bills with Django and LaTeX, and it worked flawlessly until today. Now, when I try to add another customer, Django throws:
duplicate key value violates unique constraint "kunden_kundearbeitsamt_pkey"
DETAIL: Key (id)=(4) already exists.
These are the model definitions in question:
class Kunde(models.Model):
    name = models.CharField('Name', max_length=200)
    vorname = models.CharField('Vorname', max_length=200)
    geburtsdatum = models.DateField('Geburtsdatum', max_length=200)
    untersuchungsdatum = models.DateField('Untersuchungsdatum', max_length=200)

    class Meta:
        abstract = True


class KundeArbeitsamt(Kunde):
    kundennummer = models.CharField('Kundennummer', max_length=100)
    bglnummer = models.CharField('BGL-Nummer', max_length=100)
    empfaenger = models.ForeignKey('rechnungen.NumberToEmpfaenger', blank=True, null=True)

    class Meta:
        verbose_name = "Proband Arbeitsamt"
        verbose_name_plural = "Proband Arbeitsamt"

    def __str__(self):
        return '{}, {}'.format(self.name, self.vorname)
The admin part where the object is created (nothing special, I guess):
from django.contrib import admin
from .models import KundeArbeitsamt


class KundeArbeitsamtAdmin(admin.ModelAdmin):
    ordering = ('name',)


admin.site.register(KundeArbeitsamt, KundeArbeitsamtAdmin)
I swear, I did not make any migrations or other changes to the database (Postgres) whatsoever; Django is handling the creation of the objects. What is causing this error, and how can I fix it?
This error is raised by your database because Django tries to insert a new row with an ID (4) that is already in use.
To investigate further, you need to find the part of your app responsible for generating the IDs. Django usually delegates this task to your database. In the case of Postgres, the serial data type is used. Postgres implements it with so-called sequences, and Django generates and executes the following SQL for you:
CREATE SEQUENCE tablename_colname_seq;
CREATE TABLE tablename (
    colname integer NOT NULL DEFAULT nextval('tablename_colname_seq')
);
ALTER SEQUENCE tablename_colname_seq OWNED BY tablename.colname;
I would now start by checking the database's sanity like this:
-- view the contents of the table
SELECT * FROM kunden_kundearbeitsamt;

-- check the sequence
SELECT currval('kunden_kundearbeitsamt_id_seq');
If the first query shows 4 records with IDs 1, 2, 3 and 4, and the sequence answers with 4, everything is all right. In that case I would dig into the Django side to figure out why an ID is being passed explicitly on object creation instead of relying on the sequence; the Django shell might be a good place to start.
Otherwise, I would fix the sequence and ask myself how this happened, as it is rarely the case that Postgres makes mistakes at this point:
SELECT setval('kunden_kundearbeitsamt_id_seq', (SELECT max(id) FROM kunden_kundearbeitsamt));
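Django also ships a management command, sqlsequencereset, that prints the equivalent setval statements for a whole app. If you prefer to do the fix from a Django shell instead of psql, a minimal sketch (assuming the default kunden_kundearbeitsamt table and sequence names from above) could look like this:

from django.db import connection

# Reset the primary-key sequence to the current maximum id so the next
# INSERT gets a fresh value.
with connection.cursor() as cursor:
    cursor.execute(
        "SELECT setval('kunden_kundearbeitsamt_id_seq', "
        "(SELECT COALESCE(MAX(id), 1) FROM kunden_kundearbeitsamt));"
    )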
My @api.constrains method is not working for id_number, and I can't figure out why.
from odoo import models, fields, api
from odoo.exceptions import ValidationError


class KindeGarden(models.Model):
    _inherits = {'res.partner': 'partner_id'}
    _name = 'kindergarten.model'
    _description = 'Kindergarten'

    age = fields.Integer(string="Amžius", required=False, default="1")
    group = fields.Char(string="Grupė", compute="_compute_group", store=True)
    height = fields.Float(string="Ūgis", required=False)
    weight = fields.Float(string="Svoris", required=False)
    id_number = fields.Integer(string="Registravimo Nr", required=True)

    @api.constrains('id_number')
    def _check_id_number_field(self):
        for i in self:
            if i.id_number < 10:
                raise ValidationError("Number is too small")
I am also getting this warning:
WARNING -Kindegarden odoo.models.schema: Table 'kindergarten_model': unable to set a NOT NULL constraint on column 'id_number' !
If you want to have it, you should update the records and execute manually:
ALTER TABLE kindergarten_model ALTER COLUMN id_number SET NOT NULL
As the warning says, it looks like some records already had NULL in that column before you set the required parameter to True.
Odoo has a shell you can use to access your database if you are not familiar with SQL:
odoo-bin -d <database_name> shell
Inside the shell, run the following to check how many records are affected:
>> records = env['kindergarten.model'].search([('id_number','=',False)])
>> len(records)
If it returns a number other than 0, those records hold NULL values, so fix them like this:
>>> for record in records:
...     record.write({'id_number': 0})
>>> env.cr.commit()
Then update your module again.
If this doesn't work you will need to do it manually with SQL.
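A minimal sketch of that manual route, run from the same odoo-bin shell (table and column names are taken from the warning above):

# Fill the existing NULLs, then apply the constraint the warning suggested.
env.cr.execute("UPDATE kindergarten_model SET id_number = 0 WHERE id_number IS NULL")
env.cr.execute("ALTER TABLE kindergarten_model ALTER COLUMN id_number SET NOT NULL")
env.cr.commit()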
Did you add the constraint after a few records had already been added?
The error you got generally comes up when Postgres is unable to set NOT NULL on the column because it already contains NULL values.
I am building a service that makes short URLs. I have the models:
from django.db import models


class ShortURL(models.Model):
    url = models.CharField(max_length=50)


class LongURL(models.Model):
    name = models.CharField(max_length=100, null=True)
    url_to_short = models.ForeignKey(ShortURL)
I have already run the command: python manage.py migrate
If I open the interpreter, using python manage.py shell and run this code:
>>> from appshort.models import LongURL
>>> a = LongURL(name = 'hello_long_link')
>>> a.save()
then I get the error:
django.db.utils.IntegrityError: NOT NULL constraint failed: appshort_longurl.url_to_short_id
What did I do wrong?
class LongURL(models.Model):
    name = models.CharField(max_length=100, null=True)
    url_to_short = models.ForeignKey(ShortURL)
The way you have set it up, the url_to_short foreign key is not optional. So when you try to save:
>>> a = LongURL(name = 'hello_long_link')
>>> a.save()
Django is telling you that you didn't provide the url_to_short relation when creating the a instance.
You'll need to either
Provide the ShortURL relation when you create the LongURL instance
Make the url_to_short relation optional with null=True, blank=True.
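Both options, as a minimal sketch (the example URL and the on_delete choice are placeholders):

# Option 1: create (or look up) a ShortURL first and pass it explicitly.
short = ShortURL.objects.create(url='https://sho.rt/abc')
long_link = LongURL.objects.create(name='hello_long_link', url_to_short=short)

# Option 2: make the relation optional in models.py (requires a new migration);
# on newer Django versions an explicit on_delete is required as well.
# url_to_short = models.ForeignKey(ShortURL, null=True, blank=True, on_delete=models.SET_NULL)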
While creating an entry for LongURL, you must create a ShortURL object or look up an existing one, because the ForeignKey field cannot be left blank. You say that you have sometimes been able to achieve the desired behaviour; that happens in the places where you already had a ShortURL object that was not null. The error in this discussion arises when you pass a null object during the creation of LongURL. For example:
from django.db import IntegrityError

...
short_url_obj = ShortURL.objects.filter(...).first()
# you have to ensure that this query does not return None
try:
    new_long_url = LongURL(url_to_short=short_url_obj, name="some_name")
    new_long_url.save()
except IntegrityError:
    # raised when short_url_obj is None
    print("The short_url_obj was null, cannot save to database")
...
One can also use an if/else block instead, but I would not advise that.
class PersonSite(models.Model):
    vps_id = models.AutoField(primary_key=True)
    person = models.ForeignKey(CanonPerson, db_column='p_id', null=True)
    site = models.ForeignKey(CanonSite, db_column='s_id', null=True)


person_sites = PersonSite.objects.filter(person=cp)

for person_site in person_sites:
    if person_site and person_site.site_id and person_site.site.s_id:
        # crashes for some records
        ...
We have a problem with the data, where PersonSite may point to a site that no longer exists.
In the debugger I can see that person_site.site_id has a value of 5579, however that id doesn't exist in the database:
select * from tbl_vpd_sites where s_id = 5579
Hence person_site.site_id is not null, yet just accessing person_site.site within the conditional crashes the app with the message:
DoesNotExist: CanonSite matching query does not exist.
This is a very difficult situation; I can't even check for this case in order to bypass it.
PersonSite.site has null=True, so it makes sense that you have to check whether an object exists before accessing it.
Instead of doing all those checks (if person_site and person_site.site_id and person_site.site.s_id:), you can just query the database and filter the empty sites out:
person_sites = PersonSite.objects.filter(person=cp).filter(site__isnull=False)
This will return only the PersonSite objects where site IS NOT NULL and therefore have a pk.
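Note that the question also describes rows whose site_id points at a site row that no longer exists; those still pass the IS NOT NULL filter, since only the local column is checked. A hedged sketch for skipping such dangling references when dereferencing the relation:

from django.core.exceptions import ObjectDoesNotExist

for person_site in PersonSite.objects.filter(person=cp, site__isnull=False):
    try:
        site = person_site.site  # raises CanonSite.DoesNotExist for dangling site_id values
    except ObjectDoesNotExist:
        continue  # skip rows whose site row was deleted
    # work with site.s_id here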
Django 1.2.5
Python: 2.5.5
My admin list of a sports model has just gone really slow (5 minutes for 400 records). It was returning in a second or so until we got 400 games, 50 odd teams and 2 sports.
I have fixed it in an awful way so I'd like to see if anyone has seen this before. My app looks like this:
models:

Sport(models.Model)
    name

Venue(models.Model)
    name

Team(models.Model)
    name

Fixture(models.Model)
    date
    sport = models.ForeignKey(Sport)
    venue = models.ForeignKey(Venue)

TeamFixture(Fixture)
    team1 = models.ForeignKey(Team, related_name="Team 1")
    team2 = models.ForeignKey(Team, related_name="Team 2")

admin:

TeamFixture_ModelAdmin(ModelAdmin)
    list_display = ('date', 'sport', 'venue', 'team1', 'team2',)
If I remove all foreign keys from list_display, then it's quick; as soon as I add any foreign key, it is slow again.
I fixed it by using non-foreign-key attributes, calculated in the model's __init__, so this works:
models:

TeamFixture(Fixture)
    team1 = models.ForeignKey(Team, related_name="Team 1")
    team2 = models.ForeignKey(Team, related_name="Team 2")

    sport_name = ""
    venue_name = ""
    team1_name = ""
    team2_name = ""

    def __init__(self, *args, **kwargs):
        super(TeamFixture, self).__init__(*args, **kwargs)
        self.sport_name = self.sport.name
        self.venue_name = self.venue.name
        self.team1_name = self.team1.name
        self.team2_name = self.team2.name

admin:

TeamFixture_ModelAdmin(ModelAdmin)
    list_display = ('date', 'sport_name', 'venue_name', 'team1_name', 'team2_name',)
Administration for all other models is fine with several thousand records at the moment, and all views in the actual site are functioning fine.
It's driving me crazy. list_select_related is set to True, yet adding a foreign key to User in list_display generates one query per row in the admin, which makes the listing slow. Since select_related is enabled, the Django admin shouldn't issue this query for each row.
What is going on ?
The first thing I would look at are the database calls. If you haven't done so already, install django-debug-toolbar. That awesome tool lets you inspect all SQL queries executed for the current request, and I assume there are lots of them. Once you look at them, you will know where to look for the problem.
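For reference, a minimal setup sketch for a current django-debug-toolbar release (older releases, such as those contemporary with Django 1.2, use MIDDLEWARE_CLASSES and a different URL include):

# settings.py
INSTALLED_APPS += ['debug_toolbar']
MIDDLEWARE += ['debug_toolbar.middleware.DebugToolbarMiddleware']
INTERNAL_IPS = ['127.0.0.1']  # the toolbar only renders for these IPs

# urls.py
from django.urls import include, path
urlpatterns += [path('__debug__/', include('debug_toolbar.urls'))]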
One problem I myself have run into: When the __unicode__ method of a model uses a foreign key, that leads to one database hit per instance. I know of two ways to overcome this problem:
use select_related, which usually is your best bet.
make your __unicode__ return a static string and override the save method to update this string accordingly.
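A hedged sketch of that second approach for the TeamFixture model above (the display_name field and its format are made up for illustration):

class TeamFixture(Fixture):
    # ... team1 / team2 foreign keys as above ...

    # Denormalised label so __unicode__ never has to follow a foreign key.
    display_name = models.CharField(max_length=255, editable=False, default='')

    def save(self, *args, **kwargs):
        self.display_name = u'%s vs %s' % (self.team1.name, self.team2.name)
        super(TeamFixture, self).save(*args, **kwargs)

    def __unicode__(self):
        return self.display_name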
This is a very old problem with the Django admin and foreign keys. Whenever the admin renders an object, it also fetches the objects behind its foreign keys. So, say you are loading a fixture that references teams and there are about 100 teams: it will keep pulling in all 100 teams in one go. You can try to optimize this with something called raw_id_fields. Instead of fetching everything at once, it limits the number of calls and makes sure a call is only made when an event is triggered (i.e. when you are selecting a team).
If that seems like a bit of a UI mess, you can try using this class:
"""
For Raw_id_field to optimize django performance for many to many fields
"""
class RawIdWidget(ManyToManyRawIdWidget):
def label_for_value(self, value):
values = value.split(',')
str_values = []
key = self.rel.get_related_field().name
for v in values:
try:
obj = self.rel.to._default_manager.using(self.db).get(**{key: v})
x = smart_unicode(obj)
change_url = reverse(
"admin:%s_%s_change" % (obj._meta.app_label, obj._meta.object_name.lower()),
args=(obj.pk,)
)
str_values += ['<strong>%s</strong>' % (change_url, escape(x))]
except self.rel.to.DoesNotExist:
str_values += [u'No input or index in the db']
return u', '.join(str_values)
class ImproveRawId(admin.ModelAdmin):
raw_id_fields = ('created_by', 'updated_by')
def formfield_for_dbfield(self, db_field, **kwargs):
if db_field.name in self.raw_id_fields:
kwargs.pop("request", None)
type = db_field.rel.__class__.__name__
kwargs['widget'] = RawIdWidget(db_field.rel, site)
return db_field.formfield(**kwargs)
return super(ImproveRawId, self).formfield_for_dbfield(db_field, **kwargs)
Just make sure that you inherit from the class properly, e.g. something like TeamFixture_ModelAdmin(ImproveRawId). This will most likely give you a pretty cool performance boost in your Django admin.
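For instance, a hypothetical subclass (the field names are placeholders for whichever foreign keys you want as raw-id fields):

class TeamFixture_ModelAdmin(ImproveRawId):
    raw_id_fields = ('team1', 'team2')
    list_display = ('date', 'sport', 'venue', 'team1', 'team2')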
I fixed my problem by setting list_select_related to the list of related model fields instead of just True.
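With the models from the question, that would look something like the following (passing a tuple of field names to list_select_related requires Django 1.6 or newer):

class TeamFixture_ModelAdmin(admin.ModelAdmin):
    list_display = ('date', 'sport', 'venue', 'team1', 'team2')
    # JOIN these relations in the changelist query instead of issuing one query per row.
    list_select_related = ('sport', 'venue', 'team1', 'team2')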
For our Django App, we'd like to get an AutoField to start at a number other than 1. There doesn't seem to be an obvious way to do this. Any ideas?
Like the others have said, this would be much easier to do on the database side than the Django side.
For Postgres, it'd be like so: ALTER SEQUENCE sequence_name RESTART WITH 12345; Look at your own DB engine's docs for how you'd do it there.
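If you want to keep that statement under version control on the Django side, one option (a sketch; the app, table and sequence names are placeholders) is to execute it from a data migration:

from django.db import migrations


class Migration(migrations.Migration):
    dependencies = [('myapp', '0001_initial')]

    operations = [
        # Postgres only: start the id sequence of myapp_mymodel at 12345.
        migrations.RunSQL("ALTER SEQUENCE myapp_mymodel_id_seq RESTART WITH 12345;"),
    ]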
For MySQL, I created a signal that does this after syncdb:
from django.db.models.signals import post_syncdb
from project.app import models as app_models


def auto_increment_start(sender, **kwargs):
    from django.db import connection, transaction
    cursor = connection.cursor()
    cursor.execute("""
        ALTER TABLE app_table AUTO_INCREMENT = 2000
    """)
    transaction.commit_unless_managed()


post_syncdb.connect(auto_increment_start, sender=app_models)
After a syncdb, the ALTER TABLE statement is executed. This saves you from having to log into MySQL and issue it manually.
EDIT: I know this is an old thread, but I thought it might help someone.
A quick peek at the source shows that there doesn't seem to be any option for this, probably because it doesn't always increment by one; it picks the next available key: "An IntegerField that automatically increments according to available IDs" — djangoproject.com
Here is what I did:
def update_auto_increment(value=5000, app_label="xxx_data"):
    """Update our AUTO_INCREMENT start values."""
    from django.conf import settings
    from django.db import connection, transaction, router
    from django.db.models import get_models

    models = [m for m in get_models() if m._meta.app_label == app_label]
    cursor = connection.cursor()
    for model in models:
        db_name = settings.DATABASES[router.db_for_write(model)]['NAME']
        alter_str = "ALTER TABLE {}.{} AUTO_INCREMENT={}".format(
            db_name, model._meta.db_table, value)
        cursor.execute(alter_str)
    transaction.commit_unless_managed()
I found a really easy solution to this! AutoField uses the previously inserted value to determine what the next value will be, so if I insert a dummy row with the start value I want, the following insertions will increment from that value.
A simple example in a few steps:
1.)
models.py
class Product(models.Model):
    id = models.AutoField(primary_key=True)  # this is a dummy PK for now
    productID = models.IntegerField(default=0)
    productName = models.TextField()
    price = models.DecimalField(max_digits=6, decimal_places=2)
makemigrations
migrate
Once that is done, you will need to insert an initial row whose "productID" holds your desired AutoField start value. You can write a method for it or do it from the Django shell.
From a view, the insertion could look like this:
views.py
from app.models import Product

dummy = {
    'productID': 100000,
    'productName': 'Item name',
    'price': 5.98,
}

Product.objects.create(**dummy)
Once inserted you can make the following change to your model:
models.py
class Product(models.Model):
    productID = models.AutoField(primary_key=True)
    productName = models.TextField()
    price = models.DecimalField(max_digits=6, decimal_places=2)
All following insertions will get a "productID" incrementing starting at 100000...100001...100002...
The auto fields depend, to an extent, on the database driver being used.
You'll have to look at the objects actually created for the specific database to see what's happening.
I needed to do something similar. I avoided the complex stuff and simply created two fields:
id_no = models.AutoField(unique=True)
my_highvalue_id = models.IntegerField(null=True)
In views.py, I then simply added a fixed number to the id_no:
my_highvalue_id = id_no + 1200
I'm not sure if it helps resolve your issue, but you may find it an easy workaround.
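Spelled out, that might look something like this (a sketch; MyModel is a hypothetical model holding the two fields above, and 1200 is the offset from the example):

# Create the row first so id_no is assigned, then derive the high-value id from it.
obj = MyModel.objects.create()
obj.my_highvalue_id = obj.id_no + 1200
obj.save(update_fields=['my_highvalue_id'])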
In the model you can add this:
def save(self, *args, **kwargs):
    if not User.objects.count():
        self.id = 100
    else:
        self.id = User.objects.last().id + 1
    super(User, self).save(*args, **kwargs)
Note that this only works if the database is currently empty (no objects yet): the first item will be assigned id 100, and subsequent inserts will follow the last id + 1.
For those interested in a more modern solution, I found it quite useful to run the following handler from a post_migrate signal.
Inside your apps.py file:
import logging

from django.apps import AppConfig
from django.db import connection, transaction
from django.db.models.signals import post_migrate

logger = logging.getLogger(__name__)


def auto_increment_start(sender, **kwargs):
    min_value = 10000
    with connection.cursor() as cursor:
        logger.info('Altering BigAutoField starting value...')
        cursor.execute(f"""
            SELECT setval(pg_get_serial_sequence('"apiV1_workflowtemplate"','id'), coalesce(max("id"), {min_value}), max("id") IS NOT null) FROM "apiV1_workflowtemplate";
            SELECT setval(pg_get_serial_sequence('"apiV1_workflowtemplatecollection"','id'), coalesce(max("id"), {min_value}), max("id") IS NOT null) FROM "apiV1_workflowtemplatecollection";
            SELECT setval(pg_get_serial_sequence('"apiV1_workflowtemplatecategory"','id'), coalesce(max("id"), {min_value}), max("id") IS NOT null) FROM "apiV1_workflowtemplatecategory";
        """)
        transaction.atomic()
        logger.info(f'BigAutoField starting value changed successfully to {min_value}')


class Apiv1Config(AppConfig):
    default_auto_field = 'django.db.models.BigAutoField'
    name = 'apiV1'

    def ready(self):
        post_migrate.connect(auto_increment_start, sender=self)
Of course, the downside of this, as others have already pointed out, is that it is DB-specific.