I'm using the peewee ORM to manage a few Postgres databases. I've recently had a problem where the primary keys are not being automatically set when save() or execute() is called, as they should be.
Here's the code that's being called:
Macro.insert(name=name, display_text=text).on_conflict(
    conflict_target=(Macro.name,),
    preserve=(Macro.display_text,),
    update={Macro.name: name},
).execute()
Here's the error:
Command raised an exception: IntegrityError: null value in column "id" violates non-null constraint;
DETAIL: Failing row contains (null, nametexthere, displaytexthere)
The Macro class has an id (an AutoField set as the primary key), a name (CharField), and a display_text (CharField). I've tried using the built-in PrimaryKeyField and an IntegerField set as primary key, with no change.
Before, I was using Heroku with no issue. I've since migrated my apps to my Raspberry Pi and that's when this issue popped up.
This also isn't the only place I've had this problem. I have another database with the same AutoField primary key that seems to have broken in the transition from Heroku to the Pi. That one uses the save() method rather than insert()/execute(), but the same failing-row error shows up.
I should also mention that other, non-insert queries work fine; I can still select without issue.
The problem didn't have anything to do with peewee; it had to do with the dump. Heroku does not dump sequences for you automatically, so I had to add them all back manually. Once those were added, everything worked fine.
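For the record, recreating a missing sequence looks something like the following in psql. The table and column names here are assumptions based on the Macro model above; adjust them to your schema:

```sql
-- Recreate the missing sequence and wire it up as the id column's default.
CREATE SEQUENCE macro_id_seq OWNED BY macro.id;
ALTER TABLE macro ALTER COLUMN id SET DEFAULT nextval('macro_id_seq');

-- Start the sequence after the highest existing id.
SELECT setval('macro_id_seq', COALESCE(MAX(id), 1)) FROM macro;
```

Repeat for each table whose inserts fail with a null id.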
Related
So I've made some changes to my schema on a Flask Server, using SQLite and SQLAlchemy. The database is generally functional and I'm able to add some of the models, as well as update and query all models without issues.
I've changed the id in two of my models from Integer to String in order to use uuid ids, and since then I've been getting an IntegrityError for the mismatched parameter types when I do db.session.add(new_post) and db.session.commit().
If I do flask db migrate, it reports that no changes have been detected. Should I manually fill out a revision file or is there something else I am missing?
In addition to migrating your database, you'll need to upgrade:
flask db upgrade
If this results in an error, you might be encountering database specific constraints. For example, some primary keys cannot be removed without setting cascade rules on deletion for relationships.
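If autogenerate keeps reporting no changes, note that Alembic does not always detect column type changes (in some setups this can be enabled via its compare_type option), and SQLite cannot change a column's type in place. Writing the revision by hand with batch mode is one way forward. A rough sketch of the revision body, where the table name post and the 36-character string length are assumptions that must match your actual models:

```python
# Hypothetical Alembic revision body; 'post' and the column details
# are assumptions, not taken from the question's code.
import sqlalchemy as sa
from alembic import op


def upgrade():
    # SQLite cannot ALTER a column's type in place, so batch mode
    # rebuilds the table behind the scenes.
    with op.batch_alter_table('post') as batch_op:
        batch_op.alter_column('id',
                              existing_type=sa.Integer(),
                              type_=sa.String(length=36))


def downgrade():
    with op.batch_alter_table('post') as batch_op:
        batch_op.alter_column('id',
                              existing_type=sa.String(length=36),
                              type_=sa.Integer())
```

Generate an empty revision with flask db revision, fill it in like this, then run flask db upgrade.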
I have been working on an offline version of my Django web app and have frequently deleted model instances for a certain ModelX.
I have done this from the admin page with no issues. The model only has two fields, name and order, and no relationships to other models.
New instances are given the next available pk which makes sense, and when I have deleted all instances, adding a new instance yields a pk=1, which I expect.
Moving the code online to my actual database, I noticed that this is not the case. I needed to change the model instances, so I deleted them all, but to my surprise the primary keys kept incrementing instead of resetting back to 1.
Checking the database through the Django API, the old instances are indeed gone, but adding new instances yields a primary key that picks up where the last deleted instance left off, instead of 1.
Wondering if anyone knows what might be the issue here.
I wouldn't call it an issue; this is default behaviour for many database systems. Basically, the auto-increment counter for a table is persistent, and deleting entries does not affect it. The actual value of the primary key does not affect performance; it has only aesthetic value (if you ever reach the 2-billion limit, you'll most likely have bigger problems to worry about).
If you really want to reset the counter, you can drop and recreate the table:
python manage.py sqlclear <app_name> | python manage.py dbshell
Or, if you need to keep the data from other tables in the app, you can manually reset the counter:
python manage.py dbshell
mysql> ALTER TABLE <table_name> AUTO_INCREMENT = 1;
The most probable reason you see different behaviour in your offline and online apps is that, in some databases (notably MySQL's InnoDB engine before version 8.0), the auto-increment value is only stored in memory, not on disk. It is recalculated as MAX(<column>) + 1 each time the database server restarts, so if the table is empty, the counter is effectively reset on a restart. That probably happens very often in your offline environment, and almost never in your online one.
As others have stated, this is entirely the responsibility of the database.
But you should realize that this is the desirable behaviour. An ID uniquely identifies an entity in your database. As such, it should only ever refer to one row. If that row is subsequently deleted, there's no reason why you should want a new row to re-use that ID: if you did that, you'd create a confusion between the now-deleted entity that used to have that ID, and the newly-created one that's reused it. There's no point in doing this and you should not want to do so.
Did you actually drop them from your database or did you delete them using Django? Django won't change AUTO_INCREMENT for your table just by deleting rows from it, so if you want to reset your primary keys, you might have to go into your db and:
ALTER TABLE <my-table> AUTO_INCREMENT = 1;
(This assumes you're using MySQL or similar).
There is no issue; that's the way databases work. Django doesn't have anything to do with generating ids; it just tells the database to insert a row and gets the id back in response. The id starts at 1 for each table and increments every time you insert a row. Deleting rows doesn't cause the id to go back. You usually shouldn't be concerned with that; all you need to know is that each row has a unique id.
You can of course change the counter that generates the id for your table with a database command and that depends on the specific database system you're using.
If you are using SQLite, you can reset the primary key with the following SQL statements (in the sqlite3 shell):
DELETE FROM your_table;
DELETE FROM sqlite_sequence WHERE name='your_table';
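A minimal sketch of this behaviour, using Python's built-in sqlite3 module and SQLite's AUTOINCREMENT keyword (which persists the counter in the sqlite_sequence bookkeeping table); the table entry here is just an example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entry (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)")
conn.execute("INSERT INTO entry (name) VALUES ('a')")   # id 1
conn.execute("INSERT INTO entry (name) VALUES ('b')")   # id 2

# Deleting every row does NOT reset the counter...
conn.execute("DELETE FROM entry")
conn.execute("INSERT INTO entry (name) VALUES ('c')")
ids_after_delete = [r[0] for r in conn.execute("SELECT id FROM entry")]
print(ids_after_delete)  # [3], not [1]

# ...but clearing the table's row in sqlite_sequence does.
conn.execute("DELETE FROM entry")
conn.execute("DELETE FROM sqlite_sequence WHERE name = 'entry'")
conn.execute("INSERT INTO entry (name) VALUES ('d')")
ids_after_reset = [r[0] for r in conn.execute("SELECT id FROM entry")]
print(ids_after_reset)  # [1]
```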
Another option for Postgres databases is to do it from the UI of your admin tool: select your table, look for the 'sequences' dropdown, open its settings, and adjust the sequence there.
I'm not sure when this was added, but the following management command will delete all data from all tables and reset the auto-increment counters to 1.
./manage.py sqlflush | psql DATABASE_NAME
I inserted data into a table directly in PostgreSQL. Now, when I try to insert data from the Django application, I get a primary key duplication error. How can I resolve this issue?
Run
python manage.py sqlsequencereset [app_name]
and execute all of the printed SQL statements (or just those for the required table) in the database to reset the sequences.
Explanation:
You probably inserted rows with the primary keys already set, instead of letting PostgreSQL auto-generate the ids. That is fine by itself.
However, it means the internal PostgreSQL sequence used to get the next available id still holds its old value. You need to reset the sequence so it continues from the maximum id present in the table.
Django's manage.py has a command intended for exactly that: it prints SQL you can execute in the database to reset the sequences.
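On PostgreSQL, the statements it prints look roughly like this (the table name myapp_mymodel here is just an example):

```sql
BEGIN;
SELECT setval(pg_get_serial_sequence('"myapp_mymodel"', 'id'),
              coalesce(max("id"), 1), max("id") IS NOT null)
FROM "myapp_mymodel";
COMMIT;
```

Each setval call moves the sequence to the table's current maximum id, so the next auto-generated id no longer collides.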
I think the problem is not in the database. Please check your Django code; you are probably using get_or_create somewhere.
I'm trying to add a new entry by using the admin panel in Django
The problem is that I've already populated my DB with 200 records, and if I try to add a new entry from the admin I get a duplicate key error, with a key value that keeps increasing whenever I retry.
error:
duplicate key value violates unique constraint "app_entry_pkey"
admin.py:
admin.site.register(Entry)
model:
class Entry(models.Model):
    title = models.CharField(max_length=255)
    url = models.TextField(max_length=255)
    img = models.CharField(max_length=255)

    def __unicode__(self):
        return self.title
If you created the database table using Django, then most likely your auto_increment value was not updated when you imported the data outside of Django.
It may also be that when you imported the data, you did not give each of the 200 records its own unique primary key. I think (some versions of) SQLite will sometimes allow that in mass imports.
MySQL
For example, I’m looking at a MySQL table in Sequel Pro and see that it has an “auto_increment” value of 144. This means that the next primary key value will be 144.
You can see this value for your table (in MySQL) using:
SHOW TABLE STATUS FROM databaseName where name="entry"
Replacing “databaseName” with the name of your Django database. Other database software will likely have different syntax.
You can set the next auto_increment value (in MySQL) using:
ALTER TABLE databaseName.entry AUTO_INCREMENT ###
Again replacing databaseName with the name of your database; and as before, the syntax may vary depending on the database software you’re using.
If this doesn’t help, you may find it useful to show the table’s status and copy that into your question. This might also be useful in tracking down the issue:
SHOW CREATE TABLE databaseName.entry
Postgres
In Postgres, you can get the current value of the auto increment variable (called sequences in Postgres) using something like:
SELECT last_value FROM app_entry_pkey;
And you will likely set it to a new value with something like:
ALTER SEQUENCE app_entry_pkey RESTART WITH ###
or
SELECT setval('app_entry_pkey', ###)
Note, though, that I do not have a Postgres database handy to test these on. You may also find the following commands useful:
SELECT MAX(id) FROM entry
SELECT nextval('app_entry_pkey')
The latter should generally be larger than the former. Note that "id" is the name of the column in your "entry" model's table; it may be different in yours. See http://www.postgresql.org/docs/8.1/static/functions-sequence.html for more information.
I create my Trac environments using an SQLite database, and it works very well.
Now I want to get some information directly from the database, and I'm using C# to do it via System.Data.SQLite. The problem I have is an error in the designer because the tables don't have primary keys.
After getting this error, I noticed that all tables with more than one primary key column defined in the schema were not 'converted' to SQLite; that information is lost.
I believe the problem is in sqlite_backend.py, but Python isn't my specialty and I'm in a hurry, so please guide me to a quick fix.
UPDATE (a little more detail):
The System.Data.SQLite documentation says:
"Support for the ADO.NET 3.5 Entity Framework. Supports nearly all the entity framework functionality that Sql Server supports, and passes 99% of the tests in MS's EFQuerySamples demo application."
"Visual Studio 2005/2008 Design-Time Support: You can add a SQLite connection to the Server Explorer, create queries with the query designer, drag-and-drop tables onto a Typed DataSet and more!"
When I drag the tables onto the designer, some tables don't make it. The reason is:
"The table/view 'main.attachment' does not have a primary key defined and no valid primary key could be inferred. This table/view has been excluded. To use the entity, you will need to review your schema, add the correct keys, and uncomment it."
The problem is this: no entities = no data.
UPDATE (more info):
My objective isn't to change the data model.
In the Trac schema, the tables attachment, auth_cookie, enum, node_change, permission, session, session_attribute, ticket_change and ticket_custom are defined with primary keys.
When I browse the file trac.db (the default), those tables aren't defined with the primary keys specified in the schema.
I want a solution for this little quirk of the Trac SQLite db.
I don't think editing the tables after creation to add the primary keys that weren't created is the best solution.
UPDATE
Any ideas?
You may want to look at the Trac Database API. It's written in Python, but you could probably rewrite it in C# fairly easily. At the very least it'll give you a starting point for finding your solution.
http://trac.edgewall.org/wiki/TracDev/DatabaseApi
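If you do end up fixing the tables in place despite the reservations above, note that SQLite cannot add a primary key to an existing table; the usual workaround is to rebuild it. A sketch for the attachment table, where the column list is an assumption taken from the Trac schema and should be checked against your trac.db:

```sql
-- Rebuild the table with the composite primary key the Trac schema declares.
BEGIN;
CREATE TABLE attachment_new (
    type        TEXT,
    id          TEXT,
    filename    TEXT,
    size        INTEGER,
    time        INTEGER,
    description TEXT,
    author      TEXT,
    ipnr        TEXT,
    PRIMARY KEY (type, id, filename)
);
INSERT INTO attachment_new SELECT * FROM attachment;
DROP TABLE attachment;
ALTER TABLE attachment_new RENAME TO attachment;
COMMIT;
```

The same pattern applies to the other tables whose keys were dropped.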