I added orgname = models.CharField(max_length=50) to an existing class in my models.py and ran python manage.py syncdb, but discovered that it doesn't create new columns (I'm using PostgreSQL, by the way). So I ran python manage.py sqlall <myapp>, which output the following:
BEGIN;
CREATE TABLE "file_uploader_files" (
    "id" serial NOT NULL PRIMARY KEY,
    "file" varchar(100) NOT NULL,
    "orgname" varchar(50) NOT NULL
)
;
COMMIT;
Yet, when I go into the Django shell or look in pgAdmin3, the column still isn't created. What am I doing wrong? I'd add it manually, but I'm not sure how.
P.S. The table itself was already created beforehand, and so was the file varchar; orgname was added after the table initially existed.
The documentation for the sqlall command says:
Prints the CREATE TABLE and initial-data SQL statements for the given app name(s).
It prints the SQL, it doesn't run anything. Django will never modify your schema, you'll need to do it yourself - the output above can help by showing you the type of the orgname field. Or use something like South.
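Adding the missing column by hand comes down to a single ALTER TABLE statement. Here's a sketch, demonstrated against an in-memory SQLite database purely so it's runnable end to end; against PostgreSQL you would run the same ALTER TABLE via manage.py dbshell. The DEFAULT '' is an assumption, needed because the new column is NOT NULL and the table already contains rows.

```python
import sqlite3

# Recreate the "before" state: the table exists with id and file only.
con = sqlite3.connect(":memory:")
con.execute(
    'CREATE TABLE "file_uploader_files" ('
    '"id" integer PRIMARY KEY, '
    '"file" varchar(100) NOT NULL)'
)
con.execute("INSERT INTO file_uploader_files (file) VALUES ('a.txt')")

# The manual fix: add the column with a default so existing rows stay valid.
con.execute(
    'ALTER TABLE "file_uploader_files" '
    "ADD COLUMN \"orgname\" varchar(50) NOT NULL DEFAULT ''"
)

cols = [row[1] for row in con.execute('PRAGMA table_info("file_uploader_files")')]
print(cols)  # ['id', 'file', 'orgname']
```

In PostgreSQL the equivalent would be `ALTER TABLE "file_uploader_files" ADD COLUMN "orgname" varchar(50) NOT NULL DEFAULT '';`.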
Also see this SO question: update django database to reflect changes in existing models (the top two answers cover your question).
From the accepted answer:
note: syncdb can't update your existing tables. Sometimes it's impossible to decide what to do automagically - that's why South's scripts are so great.
And from another answer...
python manage.py reset <your_app>
This will update the database tables for your app, but will completely destroy any data that existed in those tables.
Related
I inserted data into the table directly in PostgreSQL. Now, when I try to insert data from the Django application, it generates a primary-key duplication error. How can I resolve this issue?
Run
python manage.py sqlsequencereset [app_name]
and execute all of the resulting SQL statements (or just the ones for the required table) in the database to reset the sequences.
Explanation:
You probably inserted rows with primary keys already set, instead of letting PostgreSQL auto-generate the ids. That is fine by itself.
It means, however, that the internal PostgreSQL sequence used to get the next available id still has an old value. You need to reset the sequence so it starts after the maximum id currently present in the table.
Django's manage.py has a command intended for exactly that: it prints SQL you can execute in the database to reset the sequences.
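For reference, the SQL that sqlsequencereset prints for a PostgreSQL table looks roughly like the following (the exact form varies by Django version; the table and column names here match the example above):

```sql
BEGIN;
SELECT setval(pg_get_serial_sequence('"file_uploader_files"', 'id'),
              coalesce(max("id"), 1), max("id") IS NOT null)
FROM "file_uploader_files";
COMMIT;
```

Running this sets the sequence to the current maximum id, so the next auto-generated id no longer collides with manually inserted rows.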
I think the problem is not in the database. Please check your Django code - you probably use get_or_create.
I use Django 1.11, PostgreSQL 9.6 and the Django migration tool. I haven't found a way to specify column order. In the initial migration, changing the ordering of the fields is fine, but what about migrations.AddField() calls? AddField calls can also happen for foreign key additions in the initial migration. Is there any way to specify the ordering, or am I just obsessed with the order when I shouldn't be?
Update after the discussion
The PostgreSQL DBMS doesn't support positional column addition (there is no AFTER clause as in MySQL), so it is practically meaningless to expect this facility from the migration tool.
AFAIK, there's no officially supported way to do this, because fields are supposed to be atomic and it shouldn't be relevant. However, it messes with my obsessive-compulsive side as well, and I like my columns to be ordered for when I need to debug things in dbshell, for example. Here's what I've found you can do:
Make a migration with python manage.py makemigrations
Edit the migration file and reorder the fields in migrations.CreateModel
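For example, the edited initial migration might look like this (a hypothetical model; the order of the tuples in fields is what you reorder):

```python
# 0001_initial.py -- illustrative sketch; model and field names are made up
from django.db import migrations, models

class Migration(migrations.Migration):
    initial = True
    operations = [
        migrations.CreateModel(
            name='Profile',
            fields=[
                ('id', models.AutoField(primary_key=True)),
                # Reorder these tuples to control the column order in the
                # initial CREATE TABLE:
                ('first_name', models.CharField(max_length=50)),
                ('last_name', models.CharField(max_length=50)),
                ('email', models.EmailField()),
            ],
        ),
    ]
```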
I am not 100% sure there is a PostgreSQL equivalent, but this is what it looks like in MySQL after you have created the database (note that PostgreSQL has no AFTER clause, so there is no direct equivalent):
ALTER TABLE yourtable.yourmodel
CHANGE COLUMN columntochange columntochange INT(11) NOT NULL AFTER columntoplaceunder;
Or if you have a GUI (MySQL Workbench in my case) you can go to the table settings, simply drag and drop columns as you wish, and click APPLY.
I added a column to my models.py and it was giving me issues. While trying to solve the problem I did a couple of things.
Dropped the table: ./manage.py sqlclear app | ./manage.py dbshell
Tried to "reset" the schema: ./manage.py schemamigration app --initial
Tried to migrate: ./manage.py migrate app
After doing all these things, I get this error after trying to migrate:
FATAL ERROR - The following SQL query failed: CREATE TABLE "projects_project" ("id" integer NOT NULL PRIMARY KEY)
The error was: table "projects_project" already exists
Question: How do I repair my database? I don't care about any of the data in the db.
Edit:
One of the related posts took me to this link: Django South - table already exists. Apparently, if you fake the migration, all is well.
./manage.py migrate myapp --fake
I'm still unsure of all the repercussions of this, but I guess that's what the docs are for.
Well, the error points it out pretty clearly:
table "projects_project" already exists
You can either do it the quick and dirty way and drop the table. In that case, log into your DBMS. If it's MySQL, open the terminal and type (you will be prompted for your password):
mysql -u root -p
Then select the database:
use your_database;
Finally, drop the table:
DROP TABLE projects_project;
You should be able to migrate now.
The elegant way would be to undo the migration. But every framework has its own way to do that, so you need to figure that out first - or give us more information.
I want to remove null=True from a TextField:
- footer=models.TextField(null=True, blank=True)
+ footer=models.TextField(blank=True, default='')
I created a schema migration:
manage.py schemamigration fooapp --auto
Since some footer columns contain NULL I get this error if I run the migration:
django.db.utils.IntegrityError: column "footer" contains null values
I added this to the schema migration:
for sender in orm['fooapp.EmailSender'].objects.filter(footer=None):
    sender.footer=''
    sender.save()
Now I get:
django.db.utils.DatabaseError: cannot ALTER TABLE "fooapp_emailsender" because it has pending trigger events
What is wrong?
Another reason for this may be that you are trying to set a column to NOT NULL when it actually still contains NULL values.
Every migration is inside a transaction. In PostgreSQL you must not update the table and then alter the table schema in one transaction.
You need to split the data migration and the schema migration. First create the data migration with this code:
for sender in orm['fooapp.EmailSender'].objects.filter(footer=None):
    sender.footer=''
    sender.save()
Then create the schema migration:
manage.py schemamigration fooapp --auto
Now you have two transactions and the migration in two steps should work.
In the operations list I put SET CONSTRAINTS:
operations = [
    migrations.RunSQL('SET CONSTRAINTS ALL IMMEDIATE;'),
    migrations.RunPython(migration_func),
    migrations.RunSQL('SET CONSTRAINTS ALL DEFERRED;'),
]
If you are adding a non-nullable field, you need to do it in two migrations:
AddField and RunPython to populate it
AlterField to change the field to be non-nullable
Explanation
On PostgreSQL and SQLite, this problem can occur if you have a sufficiently complex RunPython command combined with schema alterations in the same migration. For example, if you are adding a non-nullable field, the typical migration steps are:
AddField to add the field as nullable
RunPython to populate it
AlterField to change the field to be non-nullable
On SQLite and Postgres, this can cause problems because the whole thing is being done in one transaction.
The Django docs have a specific warning about this:
On databases that do support DDL transactions (SQLite and PostgreSQL), RunPython operations do not have any transactions automatically added besides the transactions created for each migration. Thus, on PostgreSQL, for example, you should avoid combining schema changes and RunPython operations in the same migration or you may hit errors like OperationalError: cannot ALTER TABLE "mytable" because it has pending trigger events.
If this is the case, the solution is to separate your migration into multiple migrations. In general, split so that the first migration contains the steps up through the RunPython command and the second migration contains all the ones after it. Thus, in the case described above, the pattern would be AddField and RunPython in one migration, and AlterField in a second.
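Sketched as Django migration files, the split looks like the following (app, model, and field names are hypothetical):

```python
# 0002_add_field.py -- AddField (nullable) + RunPython in one migration
from django.db import migrations, models

def populate(apps, schema_editor):
    MyModel = apps.get_model('myapp', 'MyModel')
    MyModel.objects.filter(new_field__isnull=True).update(new_field='')

class Migration(migrations.Migration):
    dependencies = [('myapp', '0001_initial')]
    operations = [
        migrations.AddField(
            model_name='mymodel',
            name='new_field',
            field=models.CharField(max_length=50, null=True),
        ),
        migrations.RunPython(populate, migrations.RunPython.noop),
    ]

# 0003_make_non_nullable.py -- the AlterField lives in a separate migration,
# so it runs in its own transaction
class Migration(migrations.Migration):
    dependencies = [('myapp', '0002_add_field')]
    operations = [
        migrations.AlterField(
            model_name='mymodel',
            name='new_field',
            field=models.CharField(max_length=50),
        ),
    ]
```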
I've just hit this problem. You can also use db.start_transaction() and db.commit_transaction() in the schema migration to separate data changes from schema changes. It's probably not as clean as a separate data migration, but in my case I would need a schema, then a data, and then another schema migration, so I decided to do it all at once.
You are altering the column schema, so the footer column can no longer contain NULL values. There are most likely NULL values already stored in the DB for that column. Django will update those NULL rows to the new default value as part of the migrate command. It seems Django tries to update the rows where the footer column is NULL and to change the schema at the same time (I'm not sure).
The problem is that you can't alter the schema of the very column whose values you are updating at the same time.
One solution is to delete the migration file that updates the schema. Then run a script to update all those values to your default value. Then re-create and re-run the migration to update the schema. This way, the data update is already done and the Django migration only alters the schema.
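A minimal sketch of such a one-off data fix, run for example in manage.py shell (the model name is taken from the question above; this assumes a working Django environment):

```python
from fooapp.models import EmailSender

# Replace existing NULLs with the new default before migrating the schema.
EmailSender.objects.filter(footer__isnull=True).update(footer='')
```

After this, the schema-only migration can run without touching any row data.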
In my case I had:
AddField
RunPython
RemoveField
Then I just moved the last RemoveField into a new migration file, and that fixed the problem.
Step 1) Remove the latest migration from the migrations folder and remove the latest added fields from the models.
Step 2) Run makemigrations and migrate again.
Step 3) Add back the field that was removed in the first step.
Step 4) Run makemigrations and migrate again.
Problem solved.
I'm trying to save a Stripe (the billing service) company id (around 200 characters or so) to my database in Django.
The specific error is:
database error: value too long for type character varying(4)
How can I enable Django to allow for longer values?
I saw:
value too long for type character varying(N)
and:
Django fixture fails, stating "DatabaseError: value too long for type character varying(50)"
My database is already encoded in UTF-8, according to my web host.
EDIT: I see that one answer recommends making the column wider. Does that involve modifying the PostgreSQL database?
My specific system is Webfaction, CentOs shared machine, Django running on PostgreSQL. I would really appreciate a conceptual overview of what's going on and how I can fix it.
Yes, make the column wider. The error message is quite clear: your 200 characters are too big to fit in a varchar(4).
First, update your model fields max_length attribute from 4 to a number that you expect will be long enough to contain the data you're feeding it.
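For example, the model change might look like this (the model and field names are hypothetical; 255 is an arbitrary choice that comfortably fits a ~200-character id):

```python
from django.db import models

class Customer(models.Model):
    # Was max_length=4, which caused
    # "value too long for type character varying(4)".
    stripe_company_id = models.CharField(max_length=255)
```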
Next, you have to update the database column itself, as Django will not automatically alter existing columns.
Here are a few options:
1:
Drop the database and run syncdb again. Warning: you will lose all your data.
2: Manually update the column via SQL:
Type python manage.py dbshell to get into your database shell and run:
ALTER TABLE my_table ALTER COLUMN my_column TYPE varchar(200);
3: Learn and use a database migration tool like Django South, which will help keep your database in sync with your model code.
Using Django 1.11 and Postgres 9.6 I ran into this, for no apparent reason.
I set max_length=255 in the migration file and executed:
manage.py migrate
Then I set the correct length on the model's max_length, ran another makemigrations, and then ran migrate again.
Just in case someone runs into this error with Django 3 and PostgreSQL:
Step 1: Go to your migrations folder.
Step 2: Open your most recent migration file.
Step 3: Find the operations list in the migration file.
Step 4: Update the field's (i.e. the table column's) max_length to a number high enough to accommodate your data.
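The edited operations list might then look roughly like this (app, model, and field names are made up):

```python
# migrations/0005_widen_stripe_id.py -- illustrative sketch
from django.db import migrations, models

class Migration(migrations.Migration):
    dependencies = [('billing', '0004_previous')]
    operations = [
        migrations.AlterField(
            model_name='customer',
            name='stripe_company_id',
            field=models.CharField(max_length=255),  # was max_length=4
        ),
    ]
```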