Django workflow when modifying models frequently?

As I usually don't do up-front design of my models in Django projects, I end up modifying the models a lot and thus deleting my test database every time (because "syncdb" won't ever alter existing tables for you). Below is my workflow and I'd like to hear about yours. Any thoughts welcome.
1. Modify the model.
2. Delete the test database. (Always a simple SQLite database for me.)
3. Run "syncdb".
4. Generate some test data via code.
5. Go to 1.
A secondary question regarding this: if your workflow is like the above, how do you execute step 4? Do you generate the test data manually, or is there a proper hook point in Django apps where you can inject the test-data-generating code at server startup?
TIA.

Steps 2 & 3 can be done in one step:
manage.py reset appname
Step 4 is most easily managed, from my understanding, by using fixtures.

This is a job for Django's fixtures. They are convenient because they are database independent and the test harness (and manage.py) have built-in support for them.
To use them:
1. Set up your data in your app (call it "foo") using the admin tool.
2. Create a fixtures directory in your "foo" app directory.
3. Type: python manage.py dumpdata --indent=4 foo > foo/fixtures/foo.json
Now, after your syncdb stage, you just type:
python manage.py loaddata foo.json
And your data will be re-created.
If you want them in a test case:
from django.test import TestCase

class FooTests(TestCase):
    fixtures = ['foo.json']
Note that you will have to recreate or manually update your fixtures if your schema changes drastically.
You can read more about fixtures in the Django docs for Fixture Loading.
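For reference, a dumpdata fixture is just a serialized list of objects. A minimal sketch of what foo.json might look like (the model and field names here are made up):

[
    {
        "model": "foo.bar",
        "pk": 1,
        "fields": {
            "name": "example"
        }
    }
]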

Here's what we do.
Apps are named with a schema version number: appa_2, appb_1, etc.
Minor changes don't change the number.
Major changes increment the number. Syncdb works. And a "data migration" script can be written.
def migrate_appa_2_to_3():
    # Copy each old-schema object into the new schema's models.
    for a in appa_2.SomeThing.objects.all():
        appa_3.AnotherThing.objects.create(this=a.this, that=a.that)
        appa_3.NewThing.objects.create(another=a.another, yetAnother=a.yetAnother)
    for b in ...
The point is that drop and recreate isn't always appropriate. It's sometimes helpful to move data from the old model to the new model without rebuilding from scratch.

South is the coolest.
Though good ol' reset works best when data doesn't matter.
http://south.aeracode.org/
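For the curious, the basic South workflow is roughly this (assuming an app called myapp; check the South docs for specifics):

python manage.py schemamigration myapp --auto   # generate a migration from your model changes
python manage.py migrate myapp                  # apply it to the database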

To add to Matthew's response, I often also use custom SQL to provide initial data as documented here.
Django just looks for files in <app>/sql/<modelname>.sql and runs them after creating tables during syncdb or sqlreset. I use custom SQL when I need to do something like populate my Django tables from other non-Django database tables.

Personally, the development DB for the project I'm working on right now is rather large, so I use dmigrations to create DB migration scripts to modify the DB (rather than wiping out the DB every time like I did in the beginning).
Edit: Actually, I'm using South now :-)

Related

Django: How to reset single model/class database in app managing several different classes?

Can Django delete/erase/clean the data of a single table/class in an app managing several disjoint classes? Or is it safer to build a separate app for each separate/disjoint table?
I have found how to delete all data from all project databases, i.e.:
python manage.py flush
I was only surprised that user IDs and class IDs were not reset to 1; the previously used ID numbers stayed somewhere in the project's memory.
I have also found that it is possible to delete/clean all the data of a single app:
python manage.py migrate <app> zero
Here even IDs were reset. Does somebody know how to reset user IDs to start again from number 1?
But nowhere have I found how to delete/erase/clean/reset a single table/class of an app with more than one model. I have only found, for Ruby, that it could be possible like this:
Model.delete_all
or
Model.destroy_all
But nobody there confirmed that those commands really work. Will/can this work in Django too?
Kind regards, Rene
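For what it's worth, the closest Django equivalent to Rails' Model.delete_all seems to be a bulk delete through the ORM. A minimal sketch, with a made-up app and model name:

from myapp.models import Book  # hypothetical model

# Removes every Book row without touching any other model's table.
Book.objects.all().delete()

Note that this does not reset auto-increment counters; manage.py sqlsequencereset <app> prints SQL for resetting sequences, which you can then run against the database.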

Using Alembic in unit testing a SQLAlchemy app?

I have an ORM app that uses SQLAlchemy, Alembic for migrations, and pytest for testing. In my testing, I have a database as a fixture. Before I used migrations, I dropped all the tables and recreated them for each testing session.
Now that I am using migrations, I want to use Alembic in creating my fixtures too, because I believe that mimics a production environment more closely. (Is that a good rationale?)
One way to do it is to downgrade() all the way down and upgrade() up each time. I don't really like this. I might be wrong.
Another would be to drop_all() and create_all() for unit tests, and just write another test that stamps the database with head and tests an upgrade and downgrade.
Is there another good/standard way to integrate migrations with fixtures so I do not have to use drop_tables?
Or is there a way, after drop_all(), to stamp the DB as "tail" or empty, without explicitly using the migration hash for revision 0 (because that creates dependencies)? Something like alembic downgrade -1 that will make it go back to year 0. Thank you.
I recommend starting a temporary database instance each time, e.g. with testing.mysqld or testing.postgresql. The advantage of this approach is that you're guaranteed to start fresh each time; the success of your tests will not depend on external factors. The downside is the extra handful of seconds that it takes to start the instance.
If you insist on using an existing database instance, you can, like you said, use create_all() + alembic stamp head. However, instead of doing drop_all(), simply drop the entire database (or schema, in the case of PostgreSQL) and recreate it.
If you insist on using drop_all(), you can drop the alembic_version table to tell alembic that the current version is "tail".
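As a rough illustration of the first suggestion, here is a minimal pytest fixture that migrates a throwaway PostgreSQL instance to head (this assumes the testing.postgresql package and an alembic.ini at the project root; adjust names to taste):

import pytest
import testing.postgresql
from sqlalchemy import create_engine
from alembic import command
from alembic.config import Config

@pytest.fixture(scope="session")
def migrated_engine():
    # Start a temporary PostgreSQL instance; it is destroyed on exit.
    with testing.postgresql.Postgresql() as pg:
        cfg = Config("alembic.ini")  # assumed location of your Alembic config
        cfg.set_main_option("sqlalchemy.url", pg.url())
        command.upgrade(cfg, "head")  # apply all migrations, as production would
        yield create_engine(pg.url())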

Do I Need to Migrate to Link my Database to Django

I'm working on a project that I inherited, and I want to add a table to my database that is very similar to one that already exists. Basically, we have a table to log users for our website, and I want to create a second table to specifically log users that our site fails to do a task for.
Since I didn't write the site myself, and am pretty new to both SQL and Django, I'm a little paranoid about running a migration (we have a lot of really sensitive data that I'm paranoid about wiping).
Instead of having a Django migration create the table itself, can I create the second table in MySQL and the corresponding model in Django, and then have this model "recognize" the SQL table, without explicitly running a migration?
SHORT ANSWER: Yes.
MEDIUM ANSWER: Yes. But you will have to figure out how Django would have created the table, and do it by hand. That's not terribly hard.
Django may also spit out some warnings on startup about migrations being needed...but those are warnings, and if the app works, then you're OK.
LONG ANSWER: Yes. But for the sake of your sanity and sleep quality, get a completely separate development environment and test your backups. (But you knew that already.)
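One concrete way to do this is Django's managed = False model option, which tells the ORM to map onto an existing table without ever creating or dropping it. A minimal sketch (the model, fields, and table name here are made up):

from django.db import models

class FailedUserLog(models.Model):
    user_id = models.IntegerField()
    failed_at = models.DateTimeField()

    class Meta:
        managed = False                # Django will never CREATE/DROP this table
        db_table = 'failed_user_log'   # the table you created by hand in MySQL

Migrations may still record the model's state, but they won't emit any CREATE/DROP SQL for it.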

Django: How to make "python manage.py syncdb" not only create a DB but also fill it with data I need?

When I run "python manage.py syncdb" I want not an empty DB but one filled with the data I want. Is there a hook in Django to run a number of foo.save() lines?
From Django docs about fixtures:
Providing initial data with fixtures
A fixture is a collection of data that Django knows how to import into a database. The most straightforward way of creating a fixture if you’ve already got some data is to use the manage.py dumpdata command. Or, you can write fixtures by hand; fixtures can be written as JSON, XML or YAML (with PyYAML installed) documents. The serialization documentation has more details about each of these supported serialization formats.
Before Django 1.7, there was a mechanism to load fixtures automatically:
If you create a fixture named initial_data.[xml/yaml/json], that fixture will be loaded every time you run migrate. This is extremely convenient, but be careful: remember that the data will be refreshed every time you run migrate. So don’t use initial_data for data you’ll want to edit.
If you are using Django >= 1.7, you must issue the loaddata management command or create a migration:
If an application uses migrations, there is no automatic loading of fixtures. Since migrations will be required for applications in Django 2.0, this behavior is considered deprecated. If you want to load initial data for an app, consider doing it in a data migration.
The JSON serializer used to choke on large inputs (tried to load everything in memory or something like that); the XML serializer used to behave better for larger fixtures.
Assuming you are using Django 1.7, you can write a data migration to insert any data you need.
There's a whole django docs page about it.
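A minimal sketch of such a data migration (the app, model, and file names here are illustrative), placed in e.g. myapp/migrations/0002_seed_data.py:

from django.db import migrations

def seed(apps, schema_editor):
    # Use the historical model via apps.get_model(), not a direct import,
    # so the migration keeps working as the model evolves.
    Foo = apps.get_model('myapp', 'Foo')
    Foo.objects.create(name='example')

class Migration(migrations.Migration):
    dependencies = [('myapp', '0001_initial')]
    operations = [migrations.RunPython(seed)]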
In short, you have three options:
Provide a set of "fixtures" that describe model instances. This way is DB-agnostic and can be used with any DB Django can talk to.
Provide a set of SQL scripts to run. The benefit here is that you can use database-specific data types, programming capabilities, etc. However, this is considered deprecated in Django 1.7 and will be removed in Django 2.0.
Create a set of data migrations (these come with Django >= 1.7; for earlier versions use South).
Also, loading initial data can occur every time migrate is run:
If you create a fixture named initial_data.[xml/yaml/json], that fixture will be loaded every time you run migrate. This is extremely convenient, but be careful: remember that the data will be refreshed every time you run migrate. So don’t use initial_data for data you’ll want to edit.

DB Permissions with Django unit testing

Disclaimer:
I'm very new to Django. I must say that so far I really like it. :)
(now for the "but"...)
But, there seems to be something I'm missing related to unit testing. I'm working on a new project with an Oracle backend. When you run the unit tests, it immediately gives a permissions error when trying to create the schema. So, I get what it's trying to do (create a clean sandbox), but what I really want is to test against an existing schema. And I want to run the test with the same username/password that my server is going to use in production. And of course, that user is NOT going to have any kind of DDL type rights.
So, the basic problem/issue that I see boils down to this: my system (and most) wants its "app_user" account to have ONLY the permissions needed to run. Usually, this is basic "CRUD" permissions. However, Django unit tests seem to need more than this to do a test run.
How do other people handle this? Is there some setting/workaround/feature of Django that I'm not aware of (please refer to the initial disclaimer)?
Thanks in advance for your help.
David
Don't force Django to do something unnatural.
Allow it to create the test schema. It's a good thing.
From your existing schema, do an unload to create .JSON dump files of the data. These files are your "fixtures". These fixtures are used by Django to populate the test database. This is The Greatest Testing Tool Ever. Once you get your fixtures squared away, this really does work well.
Put your fixture files into fixtures directories within each app package.
Update your unit tests to name the various fixtures files that are required for that test case.
This -- in effect -- tests with an existing schema. It rebuilds, reloads and tests in a virgin database so you can be absolutely sure that it works without destroying (or even touching) live data.
As you've discovered, Django's default test runner makes quite a few assumptions, including that it'll be able to create a new test database to run the tests against.
If you need to override this or any of these default assumptions, you probably want to write a custom test runner. By doing so you'll have full control over exactly how tests are discovered, bootstrapped, and run.
(If you're running Django's development trunk, or are looking forward to Django 1.2, note that defining custom test runners has recently gotten quite a bit easier.)
If you poke around, you'll find a few examples of custom test runners you could use to get started.
Now, keep in mind that once you've taken control of test running you'll need to ensure that you somehow meet the same assumptions about the environment that Django's built-in runner does. In particular, you'll need to somehow guarantee that whatever test database you use is a clean, fresh one for the tests -- you'll be quite unhappy if you try to run tests against a database with unpredictable contents.
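As a rough sketch against the Django 1.2-era API (the module path and class name here are made up, and you'd still need to guarantee a clean database yourself):

from django.test.simple import DjangoTestSuiteRunner

class ExistingDatabaseTestRunner(DjangoTestSuiteRunner):
    """Skip database creation/teardown and run the tests against
    whatever database the settings already point at."""

    def setup_databases(self, **kwargs):
        # Do nothing instead of creating a test database.
        return None

    def teardown_databases(self, old_config, **kwargs):
        # Likewise, never drop a database we didn't create.
        pass

# settings.py:
# TEST_RUNNER = 'myproject.testrunner.ExistingDatabaseTestRunner'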
After I read David's (OP) question, I was curious about this too, but I don't see the answer I was hoping to see. So let me try to rephrase at least part of what I think David is asking. In a production environment, his Django models probably will not have access to create or drop tables; his DBA will probably not allow him to have permission to do this (let's assume this is true). He will only be logged into the database with regular user privileges. But in his development environment, the Django unit test framework forces the tests to run with higher-level privileges than a regular user, because Django requires them to create/drop tables for the model unit tests. Since the unit tests are running at a higher privilege than will occur in production, you could argue that running them in development is not 100% valid, and errors could happen in production that might have been caught in development if Django could run the unit tests with user privileges.
I'm curious if Django unittests will ever have the ability to create/drop tables with one user's (higher) privileges, and run the unittests with a different user's (lower) privileges. This would help more accurately simulate the production environment in development.
Maybe in practice this is really not an issue. And the risk is so minor compared to the reward that it's not worth worrying about.
Generally speaking, when unit tests depend on test data being present, they also depend on it being in a specific format/state. As such, your framework's policy is not only to execute DML (delete/insert test data records) but also DDL (drop/create tables), to ensure that everything is in working order prior to running your tests.
What I would suggest is that you grant the necessary privileges for DDL to your app_user ONLY on your test_ database.
If you don't like that solution, then have a look at this blog entry where a developer also ran into your scenario and solved it with a workaround:
http://www.stopfinder.com/blog/2008/07/26/flexible-test-database-engine-selection-in-django/
Personally, my choice would be to modify the privileges for the test database. This way, I could rule out all other variables when comparing performance/results between testing/production environments.
HTH,
-aj
What you can do is create separate test settings.
As I've learned at http://mindlesstechnology.wordpress.com/2008/08/16/faster-django-unit-tests/ you can use the sqlite3 backend, which is created in memory by the Django unit test framework.
Quoting:
Create a new test-settings.py file next to your app's settings.py containing:

from projectname.settings import *

DATABASE_ENGINE = 'sqlite3'
Then, when you want to run tests real fast, instead of manage.py test, you run:
manage.py test --settings=test-settings
This runs my test suite in less than 5 seconds.
Obviously you still want to run tests on your real db backend, but
this is awesome for sanity checks, and while you’re doing test
development.
To load initial data, provide fixtures in your test case:

from django.test import TestCase

class MyAppTestCase(TestCase):
    fixtures = ['myapp/fixtures/filename']
