Problem with django-nose and South when using multiple databases - python

I had a django project with one database (default). South was installed for generating migration scripts and nose as a test framework. Models were built on this database. All tests were run successfully.
Subsequently, I needed to connect a second database (legacy), which I also added to the DATABASES configuration. I access this database using raw sql and no models. While trying to run the previously running tests, I noticed that:
- nose also creates a test database for the legacy database
- default Django tables (auth_... etc.) are also created in this database
- South runs all migration scripts against the legacy database as well, and they fail
What I would like is to disable the creation of the test legacy database and the running of the migration scripts on it. Ideally, I would like to create tables in the test legacy database myself by issuing raw sql create-insert statements. Is that possible?
Thank you for any help.

Your path of least resistance is probably to write your own test running management command. To do so you can either override the existing command or simply create a separate command with your desired behavior.
The docs for creating custom management commands are in the official Django documentation, and the django-test-extensions project has a decent example of overriding the stock "test" command.
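As a sketch of the idea (plain Python, not the actual django-nose API — in a real project this logic would live in a subclass of `django_nose.NoseTestSuiteRunner` set as your `TEST_RUNNER`, with `SOUTH_TESTS_MIGRATE = False` to keep South from migrating during tests): hide the legacy alias while the default test database is built, then restore it and create its schema by hand with raw SQL. All table and alias names below are made up.

```python
import sqlite3

def setup_databases_without_legacy(databases, build_test_dbs):
    """Temporarily remove the 'legacy' alias so the normal test-database
    setup (Django's, in real life) never sees it, then put it back so its
    schema can be built manually."""
    legacy_cfg = databases.pop('legacy', None)
    build_test_dbs(databases)          # only 'default' is visible here
    if legacy_cfg is not None:
        databases['legacy'] = legacy_cfg
    return legacy_cfg

def create_legacy_schema(connection):
    """Build the legacy test schema yourself with raw create/insert
    statements (table and columns here are placeholders)."""
    connection.execute("CREATE TABLE legacy_orders (id INTEGER, total REAL)")
    connection.execute("INSERT INTO legacy_orders VALUES (1, 9.99)")

databases = {'default': {'NAME': 'app_db'}, 'legacy': {'NAME': 'old_db'}}
seen_by_setup = []
setup_databases_without_legacy(databases,
                               lambda dbs: seen_by_setup.extend(dbs))
legacy_conn = sqlite3.connect(':memory:')  # stand-in for the legacy test db
create_legacy_schema(legacy_conn)
```

The key point is that Django's test-database creation only ever iterates over the aliases it can see, so hiding the alias during setup and restoring it afterwards gives you a connection you can populate however you like.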

Related

Will testing.postgresql allow me to add triggers?

I'm writing a Python app and I want to test some triggers I have on a postgres db. I've used testing.postgresql in the past for storing simple things. Does it support triggers? I'm running migrations against it and the triggers don't seem to react (or exist if I look in information_schema.triggers).

Storing the Django project with the PostgreSQL database on GitHub

I changed the database from SQLite3 to PostgreSQL in my Django project. Is it possible to store my new database in the GitHub repository so that, after cloning and running
python manage.py runserver
the project starts with the whole database?
You cannot save the database itself; instead, create fixtures. Whoever clones the project can simply load those fixtures to populate the database.
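The workflow looks roughly like this (a sketch — the file name and exclude flags are illustrative, and the excludes avoid clashes with rows Django recreates on migrate):

```shell
# Dump the current data into a fixture and commit it to the repository:
python manage.py dumpdata --indent=2 \
    --exclude=contenttypes --exclude=auth.permission > initial_data.json

# After cloning, rebuild the schema and load the fixture before runserver:
python manage.py migrate
python manage.py loaddata initial_data.json
python manage.py runserver
```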

Django on GAE - How to automatically 'migrate' on deploy?

Django v1.11
Postgresql v9.6
Currently, I use 2 Google CloudSQL databases, one for development and one for production. Whenever I make changes to my models, I run python manage.py migrate to update the tables in the development database. This migration does not affect the production database, however.
Now, whenever I git push changes to my Django project, TravisCI automatically runs tests and deploys the code to Google App Engine. Currently, it runs on GAE flexible environment (so I can use Python 3.5)
What I want to have is for Travis or GAE to automatically run python manage.py migrate on the production database before runserver. However, I can't figure out how to run custom commands during a deploy.
I've tried looking around GAE and Travis documentation and adding scripts to .travis.yml and app.yaml, but to no avail.
As of now, anytime there is a model change, I have to migrate the production database locally in a very hacky way. Ideally, GAE will migrate at the beginning of every deploy.
Not sure if you have seen this:
Travis CI Script Deployment
A reference from a similar issue:
How can I run a script as part of a Travis CI build?
Also, consider a database migration tool embedded in your source code; PostgreSQL is supported (something similar to FlywayDB migrations):
Yoyo database migrations
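One hedged option (a sketch, not a tested config — it assumes your production database credentials are exposed to the Travis build, e.g. as encrypted environment variables, and that the settings module path is adjusted to yours): run migrate in a Travis phase that fires only after a successful deploy.

```yaml
# .travis.yml (fragment)
deploy:
  provider: gae            # your existing GAE deployment step
  # ... existing deploy settings ...
after_deploy:
  - python manage.py migrate --noinput --settings=myproject.settings.production
```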

Django Migration Process for Elasticbeanstalk / Multiple Databases

I am developing a small web application using Django and Elasticbeanstalk.
I created an EB application with two environments (staging and production), created an RDS instance and assigned it to my EB environments.
For development I use a local database, because deploying to AWS takes quite some time.
However, I am having troubles with the migrations. Because I develop and test locally every couple of minutes, I tend to have different migrations locally and on the two environments.
So once I deploy the current version of the app to a certain environment, "manage.py migrate" fails most of the time because tables already exist, or do not exist even though they should (because another environment already created them).
So I was wondering how to handle the migration process when using multiple environments for development, staging and production with some common and some exclusive database instances that might not reflect the same structure all the time?
Should I exclude the migration files from the code repository and the eb deployment and run makemigrations & migrate after every deployment? Should I not run migrations automatically using the .ebextensions and apply all the migrations manually through one of the instances?
What's the recommended way of using the same Django application with different database instances on different environments?
It seems that you deleted a table or some migration files at some point.
When you run makemigrations, Django creates migration files; when you run migrate, it applies them to whichever database is specified in the settings file.
It is fine to keep creating migrations without running them against a particular database. Whenever you switch to a database and run migrate, Django handles it, because each database records how far migrations have been applied in its django_migrations table and only runs the ones that come after that point.
To solve your problem, since you are presumably still testing, you can delete all the databases and migration files and start afresh. Things will go fine until you again delete a migration or a database on any of the servers.
If you have precious data, you will need to dig into the migration files and the django_migrations table to analyse and reconcile things.
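When an environment's tables already exist but its django_migrations table disagrees, Django's built-in commands usually let you reconcile without dropping anything (app and migration names below are placeholders):

```shell
# See which migrations each database believes are applied:
python manage.py showmigrations

# Skip initial migrations whose tables already exist:
python manage.py migrate --fake-initial

# Or mark one specific migration as applied without running its SQL:
python manage.py migrate myapp 0004 --fake
```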

How do I get Django to log why an sql transaction failed?

I am trying to debug a Pootle (pootle is build on django) installation which fails with a django transaction error whenever I try to add a template to an existing language. Using the python debugger I can see that it fails when pootle tries to save a model as well as all the queries that have been made in that session.
What I can't see is what specifically causes the save to fail. I figure Pootle/Django must have added some database constraint; how do I figure out which one? MySQL (the database being used) apparently can't log only the failed transactions.
Install django-debug-toolbar; with it you can easily inspect all of the queries that have been executed.
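If the toolbar isn't an option (e.g. the failure happens outside a browser request), a settings-level alternative is to turn up the `django.db.backends` logger so Django logs every SQL statement it sends, including the one that fails. A minimal sketch (standard Django LOGGING config; note Django only emits query logs when DEBUG is True):

```python
# settings.py fragment: echo every SQL statement to the console.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'loggers': {
        # Django's database wrapper logs each query at DEBUG level.
        'django.db.backends': {
            'handlers': ['console'],
            'level': 'DEBUG',
        },
    },
}
```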
