Is there a way to execute a Django fixture just once - when the appropriate table is created? I have some initial data that should be put in the app tables, but once the tables are there, I don't want every ./manage.py syncdb to refresh the data. According to Django docs it seems this can only be done for fixtures in SQL format and not JSON / YAML:
http://docs.djangoproject.com/en/1.3/howto/initial-data/
You're going to want to use the post_syncdb signal, and filter/manually load the fixture via the underlying methods when specific apps or models are created.
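A rough sketch of that approach (using the old pre-1.7 signal API; the app, model, and fixture names are placeholders):

# myapp/management.py -- imported automatically by syncdb; names are illustrative
from django.core.management import call_command
from django.db.models.signals import post_syncdb

import myapp.models


def load_one_time_fixture(sender, created_models, **kwargs):
    # Only load the fixture when the relevant table was actually just created.
    if myapp.models.MyModel in created_models:
        call_command('loaddata', 'one_time_data.json')

post_syncdb.connect(load_one_time_fixture, sender=myapp.models)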
I want to connect my existing SQL Server database to Django, but the problem is that Django's models normally create their own database tables. I don't want to create a database through Django; I just want to use mine to retrieve data.
The one solution I saw was inspectdb, but the problem with inspectdb is that it sometimes doesn't pick up keys and constraints correctly, and many things have to be set manually. In my project the database is user-defined: the user connects their own database, so I don't actually know in advance how many tables the user's database has or what they are. I just want to connect that database to Django and use the values in it.
My existing database is SQL Server.
Is there any technique to use my existing database, without the Django-created database, and retrieve data from it?
Thank you.
As you mentioned, you should use inspectdb to create your models, as described in the docs (https://docs.djangoproject.com/en/4.0/howto/legacy-databases/).
By setting the managed option to False in the model's Meta class, you can instruct Django not to make any migrations or database schema modifications, but you need to tweak the model yourself to make sure every mapping is what you intended, and obviously to keep your DB schema and models coherent.
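For example, a cleaned-up model for an existing table might look like this (the table and column names here are invented):

from django.db import models

class Customer(models.Model):
    # Field names/types must be adjusted by hand to match the real columns.
    customer_id = models.AutoField(primary_key=True, db_column='CustomerID')
    name = models.CharField(max_length=100, db_column='Name')

    class Meta:
        managed = False          # Django will not create, alter or drop this table
        db_table = 'Customers'   # the existing SQL Server table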
In ASP.NET there is entity framework or something called "database first," where entities are generated from an existing database. Is there something similar for Django?
I usually work with a pre-existing database that I need to create a backend (and subsequently a front end) for. Some of these relational databases have many tables and relations so manually writing models isn't a good idea. I've scoured Google for solutions but have come up relatively empty handed.
You can use the information at the link below:
python manage.py inspectdb > models.py
https://docs.djangoproject.com/en/3.0/howto/legacy-databases/
It depends on your database, but I've worked with this approach and it works well.
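In practice you point settings.DATABASES at the existing database first and then run inspectdb against it. A sketch with made-up credentials (PostgreSQL shown only as an example backend):

# settings.py -- illustrative values; swap in your own backend and credentials
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'legacy_db',
        'USER': 'legacy_user',
        'PASSWORD': 'secret',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
# Then: python manage.py inspectdb > models.py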
The default in Django is a Code First-style approach: the framework creates a database schema for you based on your models and uses migrations to update it as the models change.
https://docs.djangoproject.com/en/1.11/intro/tutorial02/
What you're describing sounds like a Database First approach in the .Net world.
Integrating Django with a legacy database:
https://docs.djangoproject.com/en/1.11/howto/legacy-databases/
As the title suggests, I am interested in how can I create the same model in multiple databases. I have the following scenario: every user has his own database (this is the simplified case) and when the user adds a new module to his app, I want to be able to create the db table for the module that he added.
The idea that I currently have is to create dynamic models, add a custom app_label, and then use a database router that creates the model in the corresponding database.
Is there an easier way to obtain this?
Thanks in advance!
You should look at the syncdb command - you can call it programmatically like so:
from django.core.management import call_command
call_command('syncdb')
In your case, you'll want to use named arguments for any options you would have passed in at the command line:
call_command('syncdb', interactive=False, database=user_db)
Hope that helps.
Check this out also: https://docs.djangoproject.com/en/dev/ref/django-admin/#running-management-commands-from-your-code
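If you do go down the router route mentioned in the question, a minimal router could look roughly like this. It assumes, purely as an illustration, that each dynamic model's app_label is set to the alias of that user's database:

class PerUserRouter(object):
    """Sketch: route each dynamic model to the database named by its app_label."""

    def db_for_read(self, model, **hints):
        return model._meta.app_label

    def db_for_write(self, model, **hints):
        return model._meta.app_label

    def allow_syncdb(self, db, model):
        # Only create a model's tables in its own per-user database.
        return db == model._meta.app_label

Register it with DATABASE_ROUTERS = ['path.to.PerUserRouter'] in settings (allow_syncdb is the pre-1.7 hook; newer versions use allow_migrate instead).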
Use the migrate command with the --database option for each alias. For instance, if your db aliases are primary and secondary, do this:
python manage.py migrate --database=primary
python manage.py migrate --database=secondary
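If you need to do the same from code (for example in a deploy script), the programmatic equivalent is roughly:

from django.core.management import call_command

# Run migrations against each configured alias in turn.
for alias in ('primary', 'secondary'):
    call_command('migrate', database=alias, interactive=False)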
I have a bootstrap script that performs syncdb and migrate:
import settings
from django.core.management import setup_environ, call_command
setup_environ(settings) # Setting up the env settings
call_command('syncdb', migrate=True, interactive=False) # Sync the database
Pre-Requisites:
django-south for migrations.
Process happening:
initial_data fixture contains data for a model that is created by migrations.
When syncdb is executed, it creates all the tables except those for apps that have migrations.
After syncdb it tries to load initial_data and raises an error that the table does not exist, because the tables for apps with migrations have not been created by syncdb yet. [ Problem ]
Then it performs the migrations, which create those tables.
After the migrations it loads initial_data again, successfully this time.
Problem:
How can I get rid of the error when it tries to load the fixture for a table that has not been created yet?
Can I edit the above script so that it loads initial_data only after performing the migrations?
You could disable loading initial data when running syncdb:
call_command('syncdb', load_initial_data=False, interactive=False)
call_command('migrate', interactive=False)
From the source code of syncdb.py:
# Stealth option -- 'load_initial_data' is used by the testing setup
# process to disable initial fixture loading.
load_initial_data = options.get('load_initial_data', True)
There are a few ways you can solve this:
1. Exclude the apps from the initial data dump by dumping only the apps you want to populate data for
2. Try this library: https://github.com/davedash/django-fixture-magic
3. Write a custom management command to populate the models you require
4. Use a data migration, which comes with South: http://south.aeracode.org/docs/tutorial/part3.html
Personally I would go with either 1 or 3. With option 1, store the fixtures individually in each app under a fixtures folder; however, this is a pain to update if your models change, so writing a custom management command (option 3, sketched below) is probably the most painless.
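A minimal version of such a command might look like this (the app, command, model, and fixture names are all placeholders):

# myapp/management/commands/seed_data.py  -- illustrative path and names
from django.core.management import call_command
from django.core.management.base import BaseCommand

from myapp.models import Widget  # placeholder model


class Command(BaseCommand):
    help = "Load seed data only if it is not already present."

    def handle(self, *args, **options):
        if Widget.objects.exists():
            self.stdout.write("Seed data already loaded, skipping.\n")
            return
        # Load the fixture explicitly instead of relying on initial_data.
        call_command('loaddata', 'widgets_seed.json')
        self.stdout.write("Seed data loaded.\n")

Run it with ./manage.py seed_data whenever you want to seed a fresh database.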
I want to write tests that can show whether or not the database is in sync with my models.py file. Actually I have already written them, only to find out that django creates a new database each time the tests are run based on the models.py file.
Is there any way I can make the models.py test use the existing database schema? The one that's in mysql/postgresql, and not the one that's in /myapp/models.py ?
I don't care about the data that's in the database, I only care about it's schema i.e. I want my tests to notice if a table in the database has less fields than the schema in my models.py file.
I'm using the unittest framework (actually the django extension to it) if this has any relevance.
Thanks.
What we did was override the default test runner so that it wouldn't create a new database to test against. This way, it runs the tests against whatever our current local database looks like. Be very careful if you use this method, because any changes to data made in your tests will be permanent. I made sure that all our tests restore any changes back to their original state, and we keep a pristine version of our database on the server, backed up.
So to do this you need to copy the run_tests function from django.test.simple to a location in your project -- I put mine in myproject/test/test_runner.py
Then make the following changes to that method:
# change:
old_name = settings.DATABASE_NAME
from django.db import connection
connection.creation.create_test_db(verbosity, autoclobber=not interactive)
result = unittest.TextTestRunner(verbosity=verbosity).run(suite)
connection.creation.destroy_test_db(old_name, verbosity)
# to:
result = unittest.TextTestRunner(verbosity=verbosity).run(suite)
Make sure to do all the necessary imports at the top, and then set this in your settings file:
TEST_RUNNER = 'myproject.test.test_runner.run_tests'
Now when you run ./manage.py test Django will run the tests against the current state of your database rather than creating a new version based on your current model definitions.
Another thing you can do is create a copy of your database locally, and then add a check in your new run_tests() function like this:
if settings.DATABASE_NAME != 'my_test_db':
    sys.exit("You cannot run tests using the %s database. Please switch DATABASE_NAME to my_test_db in settings.py" % settings.DATABASE_NAME)
That way there's no danger of running tests against your main database.
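Putting both pieces together, the modified runner might look roughly like this. It is a sketch based on the old function-style django.test.simple.run_tests (pre class-based runners); copy the suite-building details from your own Django version rather than treating this as a drop-in:

# myproject/test/test_runner.py
import sys
import unittest

from django.conf import settings
from django.db.models import get_apps
from django.test.simple import build_suite
from django.test.utils import setup_test_environment, teardown_test_environment


def run_tests(test_labels, verbosity=1, interactive=True, extra_tests=[]):
    # Refuse to run against anything but the designated local copy.
    if settings.DATABASE_NAME != 'my_test_db':
        sys.exit("You cannot run tests using the %s database. "
                 "Please switch DATABASE_NAME to my_test_db in settings.py"
                 % settings.DATABASE_NAME)

    setup_test_environment()
    settings.DEBUG = False

    # Build the suite the same way the stock runner does (simplified here to
    # run every app; copy the test_labels handling from django.test.simple
    # if you need `./manage.py test someapp` to keep working).
    suite = unittest.TestSuite()
    for app in get_apps():
        suite.addTest(build_suite(app))
    for test in extra_tests:
        suite.addTest(test)

    # Note: no create_test_db()/destroy_test_db() calls -- the tests run
    # against the existing database named above.
    result = unittest.TextTestRunner(verbosity=verbosity).run(suite)

    teardown_test_environment()
    return len(result.failures) + len(result.errors)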