I have an existing database/production system that predates Django's migration support. We started using Django's migrations two years back, and it turns out I forgot to create migrations for one model, which now causes problems.
Two years back I had the following models: Location, Tool, and a third, Log, that points to instances of Location and Tool.
The 0001_initial.py for the Log model has a dependency on Tool's 0001_initial, but for Location it points to '__first__'.
Now I am trying to get Location to use migrations for the first time (so I can later add things to it, which is needed now)...
Running makemigrations locations works and generates a new, clean migrations directory and all, but then when doing migrate --fake I get the following: django.db.migrations.exceptions.InconsistentMigrationHistory: Migration log.0001_initial is applied before its dependency locations.0001_initial on database 'default'.
I understand this is caused by the earlier mistake of forgetting to create migrations for Location when we started using migrations in Django. Any ideas how to resolve this in a good way?
Solved it - here is how it went.
1) reset the migration history for Log
>> python manage.py migrate --fake log zero
2) create first migrations for Location (this is what we forgot 2 years back)
>> python manage.py makemigrations locations
3) Edited the 0001_initial.py for Log and changed the reference in its dependencies for Location from '__first__' to '0001_initial' (see the sketch after this list)
4) clear any migration history for locations
>> python manage.py migrate --fake locations zero
5) recreate the project's entire migration history
>> python manage.py showmigrations
>> python manage.py migrate --fake
6) Done - tested it by running migrate to see if things work
>> python manage.py migrate
No migrations to apply
:-)
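For reference, the edit in step 3 was along these lines in log/migrations/0001_initial.py (the tool app label and the operations are abbreviated here, so treat them as approximate):

    from django.db import migrations

    class Migration(migrations.Migration):

        dependencies = [
            ('tool', '0001_initial'),       # unchanged
            ('locations', '0001_initial'),  # was: ('locations', '__first__')
        ]

        operations = [
            # ... the original CreateModel for Log, unchanged ...
        ]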
I renamed some field in my model, and ran
python manage.py makemigrations # successful
python manage.py migrate
On the second command I get
NotSupportedError: Renaming the 'my_model'.'my_column' while in a transaction is not supported on SQLite because it would break referential integrity. Try adding atomic = False to the Migration class
However, I don't see which transaction it means. There is no Python or SQLite process running at the time I get that error. Is some lock left in an SQLite or Django file? And how do I fix that?
Go to the app folder in which you renamed the field in the model.
When you ran this command:
python manage.py makemigrations
it created a migration file inside that app's migrations folder (the last file, e.g. 0001_initial).
Open that file; a Migration class is defined near the top of it. Add this inside that class:
atomic = False
It will look something like this
class Migration(migrations.Migration):
atomic = False
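A fuller sketch of such a migration file, using purely hypothetical app, model and field names, could look like this:

    from django.db import migrations

    class Migration(migrations.Migration):

        # run the operations outside a transaction so SQLite can rename the column
        atomic = False

        dependencies = [
            ('my_app', '0001_initial'),  # placeholder dependency
        ]

        operations = [
            migrations.RenameField(
                model_name='my_model',
                old_name='my_old_column',
                new_name='my_column',
            ),
        ]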
That will help you run this command error-free:
python manage.py migrate
For more reference, check: https://docs.djangoproject.com/en/2.1/howto/writing-migrations/
I uploaded my first Django project to DigitalOcean. After the command python manage.py loaddata initial_data.json, I received this message:
django.db.utils.IntegrityError: Problem installing fixture
'/webapps/django_shop/shop/initial_data.json': Could not load
contenttypes.ContentType(pk=3): duplicate key value violates unique
constraint "django_content_type_app_label_76bd3d3b_uniq" DETAIL: Key
(app_label, model)=(auth, permission) already exists.
How can I fix it?
I had the same problem and I solved it this way:
DB with data to export from
python manage.py dumpdata --exclude auth.permission --exclude contenttypes > db.json
New DB to import to
python manage.py flush
// Important! Disable all signals on models pre_save and post_save (see the sketch after these commands)
python manage.py loaddata db.json
// Do not forget to enable all signals that you disabled
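How to disable the signals depends on your project, but a sketch of what it could look like, assuming a hypothetical update_stock receiver hooked to a hypothetical Product model, is:

    # run in ./manage.py shell; the names below are placeholders for your own receivers/models
    from django.core.management import call_command
    from django.db.models.signals import pre_save, post_save
    from myapp.models import Product            # hypothetical model
    from myapp.signals import update_stock      # hypothetical receiver

    pre_save.disconnect(update_stock, sender=Product)
    post_save.disconnect(update_stock, sender=Product)
    try:
        call_command('loaddata', 'db.json')
    finally:
        # re-enable the signals that were disabled above
        pre_save.connect(update_stock, sender=Product)
        post_save.connect(update_stock, sender=Product)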
It looks like you've generated fixtures that include Django's default data set, i.e. the built-in entries that are normally inserted as part of the first migrate run for some of Django's plumbing data types.
You should review your fixture process, because content type entries will be created automatically when your (and Django's) apps' migrations are run, so they should not be present in fixtures. It's possible there are other tables that will have this same problem, so now would be a good time to make sure you're not including any other data that would result in this situation.
As in this question, I set up a dumpdata-based backup system for my database. The setup is akin to running a cron script that calls dumpdata and moves the backup to a remote server, with the aim of simply using loaddata to recover the database. However, I'm not sure this plays well with migrations. loaddata now has an ignorenonexistent switch to deal with deleted models/fields, but it is not able to resolve cases where columns were added with one-off defaults or apply RunPython code.
The way I see it, there are two sub-problems to address:
Tag each dumpdata output file with the current version of each app
Splice the fixtures into the migration path
I'm stumped about how to tackle the first problem without introducing a ton of overhead. Would it be enough to save an extra file per backup that contained an {app_name: migration_number} mapping?
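As a rough sketch of what I mean, something like this could run next to each dumpdata call (the file name is arbitrary, and I'm assuming it runs inside a Django context such as a management command or ./manage.py shell):

    import json

    from django.db import connection
    from django.db.migrations.recorder import MigrationRecorder

    # record the latest applied migration per app at backup time
    recorder = MigrationRecorder(connection)
    state = {}
    for migration in recorder.migration_qs.order_by('applied'):
        state[migration.app] = migration.name

    with open('backup_migration_state.json', 'w') as fh:
        json.dump(state, fh, indent=2)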
The second problem I think is easier once the first one is solved, since the process is roughly:
Create a new database
Run migrations forward to the appropriate point for each app
Call loaddata with the given fixture file
Run the rest of the migrations
There's some code in this question (linked from the bug report) that I think could be adapted for this purpose.
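Roughly, I imagine the restore side looking something like this (app labels and file names are placeholders, and it assumes the mapping file sketched above):

    import json

    from django.core.management import call_command

    with open('backup_migration_state.json') as fh:
        state = json.load(fh)

    # 1. migrate each app forward to the point it was at when the fixture was taken
    for app_label, migration_name in state.items():
        call_command('migrate', app_label, migration_name)

    # 2. load the fixture
    call_command('loaddata', 'backup.json')

    # 3. run the rest of the migrations
    call_command('migrate')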
Since these are fairly regular/large snapshots of the database, I don't want to keep them as data migrations cluttering up the migrations directory.
I take the following steps to back up, restore, or transfer my PostgreSQL database between any instances of my project:
The idea is to keep the least possible migrations as if manage.py makemigrations was run for the first time on an empty database.
Let's assume that we have a working database in our development environment. This database is a current copy of the production database that should not be open to any changes. We have added models, altered attributes, etc., and those actions have generated additional migrations.
Now the database is ready to be migrated to production which, as stated before, is not open to the public, so it is not altered in any way. In order to achieve this:
I perform the normal procedure in the development environment.
I copy the project to the production environment.
I perform the normal procedure in the production environment.
We make the changes in our development environment. No changes should happen in the production database because they will be overridden.
Normal Procedure
Before anything else, I have a backup of the project directory (which includes a requirements.txt file), a backup of the database and, of course, git is a friend of mine.
I take a dumpdata backup in case I need it. However, dumpdata has some serious limitations regarding content types, permissions or other cases where a natural foreign key should be used:
./manage.py dumpdata --exclude auth.permission --exclude contenttypes --exclude admin.LogEntry --exclude sessions --indent 2 > db.json
I take a pg_dump backup to use:
pg_dump -U $user -Fc $database --exclude-table=django_migrations > path/to/backup-dir/db.dump
Only if I want to merge the existing migrations into one do I delete all migrations from every application.
In my case the migrations folder is a symlink, so I use the following script:
#!/bin/bash
for dir in $(find -L -name "migrations")
do
rm -Rf $dir/*
done
I delete and recreate the database:
For example, a bash script can include the following commands:
su -l postgres -c "PGPASSWORD=$password psql -c 'drop database $database ;'"
su -l postgres -c "createdb --owner $username $database"
su -l postgres -c "PGPASSWORD=$password psql $database -U $username -c 'CREATE EXTENSION $extension ;'"
I restore the database from the dump:
pg_restore -Fc -U $username -d $database path/to/backup-dir/db.dump
If migrations were deleted in step 3, I recreate them in the following way:
./manage.py makemigrations <app1> <app2> ... <appn>
... by using the following script:
#!/bin/bash
apps=()
for app in $(find ./ -maxdepth 1 -type d ! -path "./<project-folder>" ! -path "./.*" ! -path "./")
do
apps+=(${app#??})
done
all_apps=$(printf "%s " "${apps[@]}")
./manage.py makemigrations $all_apps
I migrate using a fake migration:
./manage.py migrate --fake
In case something has gone completely wrong and everything is *** (this can happen, indeed), I can use the backups to revert everything to its previous working state. If I want to use the db.json file from step one, it goes like this:
When pg_dump or pg_restore fails
I perform the steps:
3 (delete migrations)
4 (delete and recreate the database)
6 (makemigrations)
and then:
Apply the migrations:
./manage.py migrate
Load the data from db.json:
./manage.py loaddata path/to/db.json
Then I try to find out why my previous effort was not successful.
When the steps are performed successfully, I copy the project to the server and perform the same steps on that box.
This way, I always keep the smallest number of migrations and I am able to use pg_dump and pg_restore on any box that shares the same project.
I cannot seem to get this working.
I need South to do migrations for a bunch of apps.
Downloaded south 0.7.3
Unzipped it and ran setup.py develop (as it says in the tutorial).
Double-checked that South is where it should be by going into the Python interpreter and running (no errors):
import south
I do
C:\Users\j\iMiCode\imi_admin>python ./manage.py syncdb
Syncing... No fixtures found.
Synced:
> django.contrib.auth
> django.contrib.contenttypes
> django.contrib.sessions
> django.contrib.sites
> django.contrib.messages
> django.contrib.admin
Not synced (use migrations):
- south (use ./manage.py migrate to migrate these)
At this point, from what I understand, South should have been synced, correct?
Anything else I do after this complains that I have no south_migrationhistory table in the database.
PS: I am working with Django 1.2.7, Python 2.6, on Windows 7.
It seems to me like a bug in South.
Also, this may be caused by doing wrong things like running schemamigration --auto south, etc. My suggestion would be to install it by running python setup.py install, or through easy_install or pip.
South documentation says: "Once South is added in, you’ll need to run ./manage.py syncdb to make the South migration-tracking tables (South doesn’t use migrations for its own models, for various reasons)."
But your output says that South skipped making tables for its own models because it thought the south app used migrations.
As a workaround you could use
python manage.py syncdb --all
which causes all tables, regardless of migrations, to be synchronized, and
python manage.py migrate --fake
to fake migrations.
For a new app with no existing tables, the steps for adding South are these:
Add 'south' to the list of INSTALLED_APPS (see the settings sketch after these steps).
Make sure the app you need to migrate is also in INSTALLED_APPS.
run ./manage.py syncdb (or python manage.py syncdb from inside your project directory). This adds the migration tables to the database.
from the command line, perform ./manage.py schemamigration yourappname --initial
run ./manage.py migrate yourappname
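For steps 1 and 2, the relevant part of settings.py would look roughly like this, with 'yourappname' as a placeholder for your own app:

    # settings.py (excerpt)
    INSTALLED_APPS = (
        'django.contrib.auth',
        'django.contrib.contenttypes',
        'django.contrib.sessions',
        'django.contrib.sites',
        'django.contrib.messages',
        'django.contrib.admin',
        'south',          # the migration tool itself
        'yourappname',    # the app whose models you want to migrate
    )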
Based on the error you're giving, it sounds like after step 1 & 2 you forgot to run syncdb to create the migration tables, and the South app isn't finding the place it wants to store schema migrations.
I ran into this same issue. Turns out, by some magic, I had created migrations inside of the south app.
Discovered by:
~ $ # cd to python library
~ $ cd `python -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())"`
python2.7/site-packages $ cd south
python2.7/site-packages/south $ ls migrations
0001_initial.py 0002_initial.py 0003_initial.py __init__.py
These are bad, should not be there, and are what triggers South to skip itself.
Removed all things south, reinstalled, then syncdb once again worked.
python2.7/site-packages $ rm -rf south* South*
~ $ pip install south
I want to generate a basic DB schema for my Django project to display all my apps with their models and model fields, with boundary conditions etc. Is there already a DB schema generator for Django in Python? Otherwise, how should I go about doing it?
If you're talking about needing to see the SQL schema, run ./manage.py sqlall <appname>
If you want a visualisation of the schema, you can get django-extensions and run ./manage.py graph_models -a -g -o my_project.png. This will produce a pretty schema graph for you, but generally omits boundary conditions. You may want to check the options to add more data. http://readthedocs.org/docs/django-extensions/en/latest/graph_models.html
manage.py sql <appname appname ...> (docs)
Using Your DB
As mentioned in the tutorial, you can use your database's command line client to get the schema.
Example using sqlite:
python manage.py dbshell
> .schema
You may need to install sqlite3 for this to work.
Using Django
You used to be able to use python manage.py sql ..., but it has been deprecated in 1.9 in favor of migrations. You can check out the initial migration scripts using:
python manage.py sqlmigrate myapp 0001_initial
(From Answer: Equivalent of sqlall in Django 1.9?)