Django development server using TestCase database - Python

GOAL: Run the Django development server using the TestCase database.
Case: Django produces this database when running a TestCase. I'm now filling the database using django-autofixture. It would be really nice to start the Django test server against this database, so I can check out how the website presents it. Unfortunately, I can't find anywhere how to do this.
Writing the test database to SQLite would make sense, but I don't see an option for this.
Any hints are appreciated! Thanks!

The TestCase database only lives within the context of a test run. It is created, migrated, and loaded with fixtures before your tests run. The default behavior is that after your test suite finishes (fail or succeed), the database is dropped.
I recommend just loading your fixtures via the django-admin testserver yourfixture.json command.
If you really wanted to reuse the test database you could, but I think it is not a good idea.
You have the option to pass a --keepdb argument to your test command. This will keep your test database after your test cases run. The name of that database will be your actual database's name prefixed with test_. You can then connect to that database via the database settings, as in the sketch below.
See the docs on keeping the test database alive (--keepdb), on the test database, and on database settings.
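A minimal illustration of that approach, assuming a PostgreSQL backend and a project database named myproject (all names and credentials here are placeholders):

    # settings.py -- after `python manage.py test --keepdb` has preserved the
    # test database, point the default connection at it and run the dev server
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'test_myproject',  # your database name prefixed with test_
            'USER': 'myuser',          # placeholder credentials
            'PASSWORD': 'mypassword',
            'HOST': 'localhost',
            'PORT': '5432',
        }
    }

With that in place, python manage.py runserver serves the site against the kept test database.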

Related

Django: How to disable Database status check at startup?

As far as I know, Django apps can't start if any of the databases set in settings.py are down when the application starts. Is there any way to make Django "lazy-load" the initial database connection?
I have two databases configured, and one of them is a little unstable and can sometimes be down for a few seconds, but it's only used for some specific use cases of the application. As you can imagine, I don't want the whole application to fail to start because of that. Is there any solution for this?
I'm using Django 1.6.11, and we also use Django South for database migrations (in case that's somehow related).
I do not know how safe it is, but you can do the following:
In your settings.py, start with an empty DATABASES:

    DATABASES = {}

In your apps.py, use the ready() hook to establish your database connections:

    from django.apps import AppConfig
    from django.conf import settings


    class YourAppConfig(AppConfig):
        name = 'yourapp'

        def ready(self):
            settings.DATABASES.update({
                # your database connections
            })
Now I can't vouch for that one, but Django will try to reinitialize the database connection when needed.
I did some more tests, and the problem only happens when you are using the development server (python manage.py runserver). In that case, it forces a connection to the database.
Using an actual WSGI server it doesn't happen, as @Alasdair pointed out.
@JohnMoutafis, in the end I didn't test your solution, but it could work.

How do I prevent Django from accessing a database during a test?

I'm trying to write a Django health check for a cloud app to determine whether the server is still healthy.
One of the checks determines whether the database is still accessible, and I want to test that this works, so I'm writing a Django test script to verify it.
Except Django is really good at keeping a database connection alive.
Things which haven't worked:
Using override_settings:

    @override_settings(DATABASES={})
    def test_dead_database(self):
        ...

Popping the default database settings:

    from django.conf import settings

    old_db_eng = settings.DATABASES.pop('default')
But Django keeps the database alive!
How can I force the Django database connection to go away within a test?
Edit: this is made more difficult because the test suite checks against a number of database engines, so I'd prefer a Django-focused solution.
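For reference, a runnable restatement of the two attempts above as one test case (the SimpleTestCase base, the check_database helper, and the assertions are assumptions added for illustration; as described above, neither attempt actually severs the connection):

    # tests.py -- restatement of the two attempts, not a working solution
    from django.conf import settings
    from django.test import SimpleTestCase, override_settings

    from myapp.health import check_database  # hypothetical health-check helper


    class DeadDatabaseTests(SimpleTestCase):
        @override_settings(DATABASES={})
        def test_dead_database_override(self):
            self.assertFalse(check_database())

        def test_dead_database_pop(self):
            old_db_eng = settings.DATABASES.pop('default')
            try:
                self.assertFalse(check_database())
            finally:
                # Restore the engine so later tests still see a database.
                settings.DATABASES['default'] = old_db_eng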

Django tests, transactions and Angular Protractor

I use django-rest-framework for the backend and AngularJS for the frontend. I started writing e2e tests using Protractor and ran into a problem: after each test, all changes to the database are kept.
In Django, every test is enclosed in a database transaction that is rolled back at the end of the test. Is there a way to enclose every Protractor test in a transaction? I know that I can use the Django live server, python-selenium and write the tests in Python, but then I lose the advantages of Protractor.
Unfortunately, there is no universal solution for this problem.
One option is to connect to your database directly from Protractor/Node.js with a database client of your choice and make the necessary database changes before, after or during the tests. You can even use ORMs like sequelize.js as an abstraction layer for your database tables. But, since your backend is not Node.js, having two database abstraction layers in two different languages would probably overcomplicate things.
Or, generally a better way: you can use your Django REST API in the "set up" and "tear down" phases of your Protractor tests to restore/prepare the necessary database state by making requests to the REST API with an HTTP client, please see more at:
Direct Server HTTP Calls in Protractor
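As a rough illustration of that second approach, here is a minimal sketch of a test-only Django view that Protractor's beforeEach/afterEach hooks could call over HTTP to reset state (the view, model, and settings flag are assumptions, not part of the original answer):

    # views.py -- hypothetical test-support endpoint; enable only in e2e settings
    from django.conf import settings
    from django.http import HttpResponse, HttpResponseForbidden
    from django.views.decorators.csrf import csrf_exempt

    from myapp.models import Widget  # hypothetical model touched by the e2e tests


    @csrf_exempt
    def reset_test_data(request):
        # Guard so this can never run against a production database.
        if not getattr(settings, 'E2E_TESTING', False):
            return HttpResponseForbidden('test endpoints are disabled')
        Widget.objects.all().delete()  # undo whatever the previous test created
        return HttpResponse('ok')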

Create database automatically in Django using settings.py or models.py

Current flow:
1) mysql >> CREATE {DATABASE} db_name [create_specification] ;
2) change database info in settings.py
3) python manage.py syncdb (assuming we have ready-made models)
Isn't there a way to do the same without step 1? Maybe by putting the database name and specifications somewhere in settings.py, so that I don't have to manually configure the database every time I move this project to some server.
EDIT -
WHY I want to dodge the first step:
In my case, different columns of different tables have different collation types. So during development, whenever I recreate the database, I need to manually change the configuration of individual columns/tables, which is frustrating.
All you need is a database user/password with the CREATE DATABASE grant; all other data is already in settings. You can connect a custom command to the pre_syncdb signal to gather this data.
Take a look at createsuperuser, which is raised on the post_syncdb signal, to learn about this.
EDITED
syncdb is no longer available. Since Django 1.9 you should use the pre_migrate signal instead, as in the skeleton below.
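A minimal skeleton of that idea on a modern Django, with the creation logic itself left as a stub (the app label and handler body are assumptions):

    # apps.py -- hook database preparation onto the pre_migrate signal
    from django.apps import AppConfig
    from django.db.models.signals import pre_migrate


    def ensure_database(sender, **kwargs):
        # Read the connection data from django.conf.settings here and,
        # e.g. via psycopg2 (see below), create the database on the
        # server if it does not exist yet.
        pass


    class MyAppConfig(AppConfig):
        name = 'myapp'

        def ready(self):
            pre_migrate.connect(ensure_database, sender=self)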
I don't think it's possible to dodge step one, at least if you're using any database backend other than SQLite in Django.
A note from docs: https://docs.djangoproject.com/en/dev/intro/tutorial01/#database-setup
If you’re using PostgreSQL or MySQL, make sure you’ve created a
database by this point. Do that with “CREATE DATABASE database_name;”
within your database’s interactive prompt.
If you’re using SQLite, you don’t need to create anything beforehand -
the database file will be created automatically when it is needed.
If you move your project to a platform like Heroku, database creation is automated, for example when using PostgreSQL.
I'm wondering why you want to dodge the first step; however, if you're desperate, you might still want to try going the direct Python way via psycopg2:
Creating a postgresql DB using psycopg2
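For example, a minimal sketch along those lines (database name and credentials are placeholders): it connects to the maintenance database, since the target database may not exist yet, and creates it only if it is missing:

    # create_db.py -- run once before `manage.py migrate`
    import psycopg2
    from psycopg2 import sql

    conn = psycopg2.connect(dbname='postgres', user='myuser',
                            password='mypassword', host='localhost')
    conn.autocommit = True  # CREATE DATABASE cannot run inside a transaction
    with conn.cursor() as cur:
        cur.execute('SELECT 1 FROM pg_database WHERE datname = %s', ['mydb'])
        if cur.fetchone() is None:
            cur.execute(sql.SQL('CREATE DATABASE {}').format(sql.Identifier('mydb')))
    conn.close()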
Granting the overall application database creation permissions to spare a minor manual initialization at deployment time sounds like a bad idea.
You should rather create a separate script that automates deployment for you. Fabric seems to have become the standard tool for that.
Done via django_extensions:
1) ./manage.py sqlcreate | ./manage.py dbshell
2) ./manage.py migrate
If the user in settings.py is already able to connect to the database server and has creation permissions, this should work.
To set this up, I suggest providing a DATABASE_URL envvar and using dj_database_url, as sketched below.
At least it is working for me on Postgres.
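A minimal settings sketch of that setup (the fallback URL is a placeholder; dj_database_url reads the DATABASE_URL environment variable by itself):

    # settings.py
    import dj_database_url

    DATABASES = {
        'default': dj_database_url.config(
            # Used only when the DATABASE_URL envvar is not set.
            default='postgres://myuser:mypassword@localhost:5432/mydb',
        )
    }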
You can use SQLite while you are still in the development stage; for example:
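This is the stock SQLite configuration; the database file is created automatically on first use:

    # settings.py
    import os

    BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        }
    }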

Lazy psql connection with Django

I have a Django app that has several database backends, all connected to different instances of a PostgreSQL database. One of them is not guaranteed to always be online, and it can even be offline when the application starts up.
Can I somehow configure Django to use lazy connections? I would like to:
Try querying
return "sorry, try again later" if database is offline
or return the results if database is online
Is this possible?
The original confusion was thinking that Django tries to connect to its databases on startup. This is actually not true: Django does not connect to a database until some app tries to access it.
Since my web application uses the auth and sites apps, it looks like it tries to connect on startup. But it's not tied to startup; it's tied to the fact that those apps access the database "early".
If one defines a second (non-default) database backend, Django will not try connecting to it unless the application queries it.
So the solution was very trivial: originally I had one database that hosted both the auth/site data and the "real" data that I expose to users. I wanted the "real" database connection to be volatile, so I defined a separate psql backend for it and switched the default backend to SQLite.
Now, when trying to access the "real" database through a query, I can easily wrap it in try/except and hand a "Sorry, try again later" message back to the user, as sketched below.
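A minimal sketch of that pattern (the model, the view, and the 'real' connection alias are assumptions):

    # views.py -- query the volatile backend and fail softly when it is down
    from django.db.utils import OperationalError
    from django.http import HttpResponse

    from myapp.models import Measurement  # hypothetical model on the volatile db


    def measurements(request):
        try:
            # 'real' is the alias of the separately defined psql backend.
            rows = list(Measurement.objects.using('real').all()[:50])
        except OperationalError:
            return HttpResponse('Sorry, try again later', status=503)
        return HttpResponse(', '.join(str(r) for r in rows))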
