Django South migration doesn't set default - Python

I use South to migrate my Django models. There is, however, a nasty bug in South: it doesn't set default values in PostgreSQL databases. Example:
created_at = models.DateTimeField(default=datetime.now)
tag_id = models.PositiveIntegerField(default=0)
South will add these two fields to the database, but fails to set their default values, which then has to be done manually.
Is there any patch for this bug?
UPDATE
I had already tried setting the default date with auto_now_add=True, but that doesn't set a default either. Adding null=True to the field adds a db.alter_column to the migration script produced by South, but that only removes the NOT NULL constraint; it doesn't add a default. The same goes for the integer field.

If you are auto-generating your migrations using:
./manage.py schemamigration app_name --auto
Then you need to make a small edit to the migration before you actually apply it. Go into the generated migration (should be called something like app_name/migrations/000X__auto_add_field_foo.py) and look for the argument:
keep_default=False
in the db.add_column call. Simply change this to:
keep_default=True
South will now apply your default value to the actual schema, in addition to populating any existing rows. It would be great if South had some kind of setting to generate this parameter as True by default, but no such luck: you will need to make this edit every time.
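For illustration, the relevant call in the generated migration might look something like this after the edit (the app, table, and field names here are placeholders, not from the question):

```python
# Inside app_name/migrations/000X__auto_add_field_foo.py
def forwards(self, orm):
    db.add_column('app_name_mymodel', 'created_at',
                  self.gf('django.db.models.fields.DateTimeField')(
                      default=datetime.datetime.now),
                  keep_default=True)  # changed from the generated False
```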

This is not a bug, in South or elsewhere.
I think you are confused about how default values work in Django generally. Django does not set default values in the database schema; it applies them directly in Python, when a new instance is created. You can verify this by running manage.py sqlall and seeing that the generated SQL contains no DEFAULT clauses.
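To illustrate the point, here is a minimal framework-free sketch (FakeField is a made-up stand-in that mimics the behaviour of Django's field defaults, not real Django API):

```python
from datetime import datetime

# Defaults are resolved in Python at instance-creation time,
# never stored as a DDL DEFAULT clause in the database.
class FakeField:
    def __init__(self, default=None):
        self.default = default

    def get_default(self):
        # Callable defaults (like datetime.now) are invoked each time,
        # so every new row gets a fresh value.
        if callable(self.default):
            return self.default()
        return self.default

created_at = FakeField(default=datetime.now)
tag_id = FakeField(default=0)

print(tag_id.get_default())                            # 0
print(isinstance(created_at.get_default(), datetime))  # True
```

This is why rows inserted outside Django (raw SQL, another application) don't get the default: nothing in the schema supplies it.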

As mentioned in earlier answers, the default mechanism in Django is implemented in the model class and is not relevant to South migrations.
Also, since South 0.8, the keep_default flag is deprecated and won't add the default value to your column.
What I do to solve this is writing a custom migration to add the default value. You can do that by creating a separate data migration:
./manage.py datamigration your_app_name migration_name
and add the following line to the forwards function:
orm.YourModel.objects.update(field_name=DEFAULT_VALUE)
Alternatively, instead of creating a new migration, you can modify your original migration:
add no_dry_run = True to the class itself (so you will have access to the ORM).
add orm.YourModel.objects.update(field_name=DEFAULT_VALUE) to the end of the forwards function.
This way you don't have to write a backwards migration, because you already have the original delete-column one.
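A sketch of what the standalone data migration might look like (the model name, field name, and value are placeholders):

```python
class Migration(DataMigration):
    # Allow the ORM to be used even during a --dry-run
    no_dry_run = True

    def forwards(self, orm):
        # Backfill the default into all existing rows
        orm.YourModel.objects.update(field_name=DEFAULT_VALUE)

    def backwards(self, orm):
        # Nothing to undo: rolling back the schema migration
        # that added the column reverts everything
        pass
```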

Related

Django error when trying to migrate models with foreignkeys that don't have null=True or default set

I have a model for a post, like on Twitter, that has a creator. Ideally I'd like the post to always require a creator_id, and if the creator gets deleted, delete the post as well.
class Post(AbstractBaseModel):
    creator_id = models.ForeignKey(User, on_delete=models.CASCADE, related_name="post_creator_id")
    body = models.CharField(max_length=511)
Whenever I try to run python manage.py migrate, I get this error:
"You are trying to change the nullable field 'creator_id' on cheerpost to non-nullable without a default; we can't do that (the database needs something to populate existing rows)."
The options offered to solve this are 1) provide a one-off default or 2) ignore for now. Neither of these seems to fulfill the constraint I want to enforce, which is that creator_id must exist and be the person who created the post, or the entity gets deleted.
I've tried deleting the DB and recreating it from scratch in Postgres, as well as deleting the data using the following queries:
TRUNCATE Post;
DELETE FROM Post;
If you've deleted the DB, only the data and tables are deleted.
That doesn't change anything on the Django side: all the changes you've made to your model's fields still exist in the migrations. You have to delete the old migrations too.
Delete the old migration files from your app, create new migrations from scratch, and apply them:
python manage.py makemigrations
python manage.py migrate
Django is asking you to provide a one-off default for any rows you already have in your database, since the field was nullable before the migration. The issue is that Django doesn't know whether the existing database contains rows where that column is null, so it needs instructions on what to do if it finds any. You can just provide one and forget about it; it will never be used again after the migration is complete.
Also, you may want to review how the related_name parameter works; you’ve got it backwards.
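As a hedged sketch of what that might look like (the field and accessor names here are illustrative choices, not from the question): related_name names the reverse accessor on User, so it should describe the posts, not the creator.

```python
class Post(AbstractBaseModel):
    creator = models.ForeignKey(
        User,
        on_delete=models.CASCADE,  # delete posts when the creator is deleted
        related_name="posts",      # user.posts.all() gives the user's posts
    )
    body = models.CharField(max_length=511)
```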

Django migrations not getting applied on Postgres after changing the datatype of an attribute in models.py

I had this model where the price was an Integer Field. I did run migrations and all was fine.
from django.db import models

class Bill(models.Model):
    price = models.IntegerField()
Then due to requirement changes, I had to make the price field as JSONField which would store the price value based on certain keys similar to this
price={"actual_price":100, "tax_price":20}
I made the changes in the model like below:
from django.db import models

class Bill(models.Model):
    price = models.JSONField(blank=True, null=True)
I performed the makemigrations and migrate operations, but the changes are not reflected in the DB, and there are no errors either. I get the error "Column: price does not exist" when my code tries to read from the DB.
I tried the following things by referring to StackOverflow other questions:
Removed the field, ran migrations, and applied them. Removing the field works fine, but when I add the field back, instead of being treated as a new field to add, it just doesn't get inserted.
Removed the migration files from the migrations folder, re-ran makemigrations, and re-applied them.
Please note:
I am using a Postgres DB. The same thing works with the SQLite DB that Django provides out of the box, but not with Postgres.
Losing the data is not a feasible option, as the DB data is from the production server.
I tried adding the column manually in Postgres using an ALTER TABLE ... ADD COLUMN query, which worked perfectly fine, but that was a workaround I used only as a last resort.
The initial integer field had data for some of the records before the updated migration was applied. The strange thing is that Django also never asked me to set a value in case I wanted to override the data.
I need Django migrations to apply the changes automatically. Due to this issue, only adding a new column and modifying the datatype of a column are not working (other operations, like removing a column, work fine).
Since there was no other option and no solutions provided, I had to point my application to a new DB in Postgres, and then it started working fine. Lessons learnt for the future:
Don't directly update the datatype of a model attribute and then apply migrations.
First remove the field and apply migrations, then re-create the field with the same name and the required datatype, and apply migrations again.
Use the same database in local development that you use in production; this way you can avoid conflicts after deploying. I had used SQLite locally and Postgres in production. From now on I will use Postgres locally too, so that I can debug issues on my own machine.
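The two-step approach described above can be sketched as management commands (the app name here is a placeholder):

```shell
# Step 1: delete the `price` field from the model, then:
python manage.py makemigrations your_app
python manage.py migrate your_app

# Step 2: re-add the field with the new type, e.g.
#   price = models.JSONField(blank=True, null=True)
# then:
python manage.py makemigrations your_app
python manage.py migrate your_app
```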

Django: Quickly deal with adding non-nullable field

When developing models I quite often get the non-nullable field error when running makemigrations:
You are trying to add a non-nullable field 'user' to randommodel without a default; we can't do that (the database needs something to populate existing rows).
Please select a fix:
1) Provide a one-off default now (will be set on all existing rows)
2) Quit, and let me add a default in models.py
Select an option:
Almost every time I get this error I'm quite happy to delete the data in that table (it's normally only a couple of test entries made while developing), and it would be more efficient to just delete it than to work out a suitable default.
However, I currently don't have a suitable method for doing this and end up flushing the database and/or deleting the migrations, which is pretty heavy-handed but works.
What's the best way to delete the data just in that model/table to remove the error? (Would it be via shell/shell_plus?)
Model:
class RandomModel(models.Model):
    user_details = JSONField(unique=True)
    user = models.ForeignKey(User)
Even if you have deleted all the records in that table, you'll be asked to provide default values again when running makemigrations. This is because you're making a new migration file for an existing table.
One solution I can think of is to tell Django that you're starting app_name over by running migrate app_name zero. This unapplies every migration that has ever been applied to your database for that app.
Then delete all the migration files in app_name and run makemigrations again. This creates a new initial migration file, which you then apply to your database with migrate.
Since you've said you don't mind deleting your data, this is even better: you don't have to delete any records at all. It will just create a new table with the same name, with all the new fields and zero records.
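The reset described above can be sketched as follows, assuming the app is called app_name (back up anything you care about first, since this unapplies everything for the app):

```shell
python manage.py migrate app_name zero     # unapply all of the app's migrations
rm app_name/migrations/0*.py               # delete migration files, keep __init__.py
python manage.py makemigrations app_name   # regenerate a fresh initial migration
python manage.py migrate app_name          # recreate the table from scratch
```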

Edit database outside Django ORM

If one is using Django, what happens with changes made directly to the database (in my case Postgres) through either pgAdmin or psql?
How are such changes handled by migrations? Do they take precedence over what the ORM thinks the state of affairs is, or does Django override them and impose its own sense of change history?
Finally, how are any of these issues affected, or avoided, by git, if at all?
Thanks.
You can exclude a model completely from Django migrations; you are then responsible for adjusting the schema to match the Django code (or the Django code to match the existing schema):

class SomeModel(models.Model):
    class Meta:
        managed = False
        db_table = "some_table_name"

    name = models.Fields....

Note that you can't have it both ways, so migrations are preferred when possible. You can always define a custom SQL migration, which removes the need for external changes. However, sometimes you do need to manage the schema outside of migrations, and then managed = False is the tool to use.
The migrations system does not look at your current schema at all. It builds up its picture from the graph of previous migrations and the current state of models.py. That means that if you make changes to the schema from outside this system, it will be out of sync; if you then make the equivalent change in models.py and create migrations, when you run them you will probably get an error.
For that reason, you should avoid doing this. If it's done already, you could apply the conflicting migration in fake mode, which simply marks it as done without actually running the code against the database. But it's simpler to do everything via migrations in the first place.
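As a sketch, faking a migration looks like this (the app name and migration number are placeholders):

```shell
# Mark migration 0003 of app_name as applied without running its SQL,
# useful when the schema change was already made by hand:
python manage.py migrate app_name 0003 --fake
```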
git has no impact on this at all, other than to reiterate that migrations are code, and should be added to your git repo.

Change Django model column default using South

I'm using South with a Postgresql DB for a Django project.
There is a model field whose default value I'd like to change going forward. I don't need previous records affected, just new records.
Do I need to do a migration for this, or just change the model?
OLD FIELD DETAIL:
background_style = models.CharField(max_length=1, choices=BACKGROUND_STYLE, default=BackgroundStyleCode.CENTERED)
NEW FIELD DETAIL:
background_style = models.CharField(max_length=1, choices=BACKGROUND_STYLE, default=BackgroundStyleCode.STRETCHED)
(model name is "Page")
You should run a migration. Any time you make a change to a model, no matter how insignificant, you should create a schema migration so that you can move backwards and forwards to any point in time without any "magic" edits.
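With South, generating and applying that migration might look like this (the app name is a placeholder):

```shell
./manage.py schemamigration your_app --auto   # picks up the changed default
./manage.py migrate your_app
```

Since Django applies defaults in Python rather than in the schema, this migration mainly keeps South's history in sync with models.py; new records will pick up the new default from the model either way.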
