Django testing hangs with --keepdb flag

For performance tuning reasons, I want to run Django tests against a copy of my production database. As I understand it, this should be possible by:
(1) adjusting Django settings like
DATABASES = {
    'default': {
        ...
        'TEST': {
            'NAME': 'my_database_copy',
        },
    }
}
and (2) using the --keepdb flag, as in python manage.py test --keepdb.[1]
But when I do this, the process hangs, looking like this:
bash-4.2$ python manage.py test --keepdb
Using existing test database for alias 'default'...
(The process won't close with ctrl+c. I'm using Docker, and I stop it by restarting Docker.)
There are no unapplied migrations for the database, and the test command (python manage.py test) works fine if --keepdb is omitted.
I confirmed that the database copy is properly restored and accessible: I can query it when I run python manage.py shell.
[1] https://docs.djangoproject.com/en/3.1/topics/testing/overview/#preserving-the-test-database

Adjust the settings dictionary by adding the SERIALIZE key, like this:
DATABASES = {
    'default': {
        ...
        'TEST': {
            'NAME': 'my_database_copy',
            'SERIALIZE': False,
        },
    }
}
When SERIALIZE is True (the default), Django tries to read a copy of the whole database into memory as a string; see [1]. This is considered helpful for tests when the database engine does not support transactions, but in my case it crashed due to insufficient memory. Deactivating this behavior through settings is covered here [2].
[1] https://github.com/django/django/blob/d5b526bf78a9e5d9760e0c0f7647622bf47782fe/django/db/backends/base/creation.py#L73
[2] https://docs.djangoproject.com/en/3.1/ref/settings/#serialize
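For context, here is a rough sketch of what the serialization step in the linked creation.py does (simplified for illustration; get_all_objects is a stand-in helper, not Django's actual code):

from io import StringIO

from django.apps import apps
from django.core import serializers

def get_all_objects():
    # Stand-in for Django's internal iteration over every model's rows.
    for model in apps.get_models():
        yield from model._default_manager.iterator()

def serialize_db_to_string_sketch():
    out = StringIO()
    # Every row of every table ends up in one JSON string held entirely
    # in memory, which is what exhausts RAM on a production-sized copy.
    serializers.serialize("json", get_all_objects(), stream=out)
    return out.getvalue()

With SERIALIZE set to False, this step is skipped entirely.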

Related

How to configure my django settings.py for production using postgresql

I'm already deploying my Django app, using PostgreSQL as the database and Heroku for hosting. However, I don't know what I should put in HOST instead of localhost.
Note: this works perfectly if I run it locally.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'trilz',
        'USER': 'postgres',
        'PASSWORD': 'franz123',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
You should probably use environment variables. Using hardcoded values leaves you vulnerable to a bunch of different risks. The Django guide also covers connection files.
Once you've done that, you need to figure out where your Postgres database is running. localhost means "my machine" (i.e. the same machine that runs the Django app). If you are using a database-as-a-service, it will expose all the environment variables you need. If you are using something like Heroku, it exposes environment variables to the running service, which you can use. If you are using a Kubernetes/Docker setup you control yourself, then you decide where the database runs and need to point HOST at it.
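A minimal sketch of the environment-variable approach (the variable names DB_NAME, DB_USER, DB_PASSWORD, DB_HOST, and DB_PORT are illustrative assumptions, not a convention Django or Heroku imposes):

import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DB_NAME', 'trilz'),
        'USER': os.environ.get('DB_USER', 'postgres'),
        'PASSWORD': os.environ['DB_PASSWORD'],  # fail loudly if unset
        'HOST': os.environ.get('DB_HOST', 'localhost'),
        'PORT': os.environ.get('DB_PORT', '5432'),
    }
}

Locally you export DB_PASSWORD=... before running; on a host like Heroku you set the same variables in the app's config.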
For Heroku
I've used https://pypi.org/project/dj-database-url/ for a hobby project (the package is no longer maintained, but it still worked the last time I used it).
My config then looked like this:
DATABASES = {"default": {"ENGINE": "django.db.backends.sqlite3", "NAME": "mydatabase"}}
if "DATABASE_URL" in os.environ:
logger.info("Adding $DATABASE_URL to default DATABASE Django setting.")
DATABASES["default"] = dj_database_url.config(conn_max_age=600)
DATABASES["default"]["init_command"] = "SET sql_mode='STRICT_TRANS_TABLES'"
That gives you a working SQLite3 fallback if no URL is set; you can use something else if you'd like. Heroku exposes an environment variable called DATABASE_URL that points at the database you configured in Heroku; the if "DATABASE_URL" in os.environ: check catches it, and the config is then used. Did this provide a sufficient answer?

Django sqlite3 timeout has no effect

I have a simple integration test in Django that spawns a single Celery worker to run a job, which writes a record to the database. The Django thread also writes a record to the database. Because it's a test, I use the default in-memory sqlite3 database. There are no transactions being used.
I often get this error:
django.db.utils.OperationalError: database table is locked
which according to the Django docs is due to one connection timing out while waiting for another to finish. It's "more concurrency than sqlite can handle in default configuration". This seems strange given that it's two records in two threads. Nevertheless, the same docs say to increase the timeout option to force connections to wait longer. Ok, I change my database settings to this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        'OPTIONS': {'timeout': 10000000},
    }
}
This has no effect. The error still appears and it clearly has not waited 1e7 seconds or 1e7 milliseconds or 1e7 microseconds before doing so. Is there an additional setting I'm missing?
I have tried both Python 3.5 and Python 3.6 and both Django 1.11 and Django 2.0.
I had the same issue, and my experiments gave me the following:
Django uses an in-memory SQLite DB in test mode unless you explicitly change this. That explains why I only saw the problem in my unit tests. To force Django to use a file-based SQLite DB, set DATABASES->TEST->NAME explicitly in your settings.py, for example like this:
DATABASES = {
    'default': {
        ...
        'TEST': {
            'NAME': 'testdb.sqlite3',
        },
    },
}
Setting a timeout value larger than 2147483.647 (looks familiar, right? :-) ) disables the timeout (or sets it to a negligibly small value).
As far as I understand, the root of the problem is that when SQLite uses a shared cache (as the in-memory test database does), the timeout value is not respected at all.
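Putting both findings together, a settings sketch (the 20-second value is illustrative; Django passes OPTIONS through to sqlite3.connect(), where timeout is in seconds):

import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        # Stay well below the ~2147483.647-second cap noted above.
        'OPTIONS': {'timeout': 20},
        # A file-backed test DB avoids the in-memory shared-cache mode.
        'TEST': {'NAME': 'testdb.sqlite3'},
    }
}

(BASE_DIR is the usual settings constant from the question.)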

How to connect Celery worker to django test database

I would like to get my Celery worker process to talk to the Django test database.
It's an Oracle database, so I believe the database/user is already created.
I am just trying to figure out what to pass to the Celery app configuration to get it to talk to the "TEST" database.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.oracle',
        .............
        'TEST': {
            'USER': WIT_TEST_DB_USER,
            'PASSWORD': WIT_TEST_DB_USER,
        }
    }
}
I have seen a Stack Overflow post that talks about passing the settings.conf from the parent test setUp() to the worker process. That may be necessary when the test database file is automatically generated, as with SQLite databases.
In my case, it's a well-defined Oracle test database that I think is already part of the config/settings files.
So I am looking for a way to start the worker process directly, independent of the test runner/test case code.
Can someone suggest an approach to doing this?
You are treating your test database as an ordinary database, so I think the best solution is to define your test database as the default database under the DATABASES setting in a separate settings file. When running your worker, you can then pass the new settings file to it like this:
export DJANGO_SETTINGS_MODULE='[python path to your celery specific settings file]'
# the command to run your celery worker
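A minimal sketch of such a settings file (assuming the base module is called settings and reusing the WIT_TEST_DB_* names from the question; both are assumptions):

# celery_test_settings.py
from settings import *  # start from the regular settings

# Point the default alias at the already-provisioned Oracle test schema,
# so the worker and the test runner talk to the same database.
DATABASES['default']['USER'] = WIT_TEST_DB_USER
DATABASES['default']['PASSWORD'] = WIT_TEST_DB_USER  # as in the question's TEST block

With DJANGO_SETTINGS_MODULE pointing at this module, the worker can be started normally, independent of the test runner.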

Django Test Error "Permission denied to create database" - Using Heroku Postgres

I am trying to do a simple Django test using a tests.py file in a Django app directory (mysite/polls/tests.py), but every time I run python manage.py test polls, I get this error:
C:\Python27\python.exe "C:\Program Files (x86)\JetBrains\PyCharm 3.0\helpers\pycharm\django_test_manage.py" test polls "C:\Users\<myname>\PycharmProjects\mysite"
Testing started at 8:40 PM ...
Creating test database for alias 'default'...
Got an error creating the test database: permission denied to create database
Type 'yes' if you would like to try deleting the test database 'test_<database>', or 'no' to cancel:
From what I've read, apparently Heroku PG uses a shared database, so I do not have the permission to create/destroy databases, which is necessary for testing. Is there an obvious solution to this? I am still developing on my local drive, so any workarounds would be appreciated. I know that testing is an important part of programming, so I would like to be able to implement a testing method as soon as possible.
I am trying to test using the Django TestCase class.
What I am using:
1) Heroku Postgres Hobby Dev Plan
2) Postgres 9.3.3
3) Python 2.7.6
4) Django 1.6.1
EDIT:
So after doing a bit more research, I found out that I can override my DATABASES dict in settings.py to use SQLite for local testing (when 'test' is an argument in the command), but I would still prefer a PostgreSQL setup, since from what I've read PostgreSQL is stricter (which I am a fan of).
For anyone interested in the semi-solution I have found (courtesy of another member of Stack Overflow):
if 'test' in sys.argv:
    DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}
Don't forget to import sys.
There's no reason for your tests to need to create a database. Instead, change your tests to access the local database when the DATABASE_URL environment variable is undefined. Here's what I do in Node.js, where I have local test and dev databases and a Heroku-provided production DB:
if (typeof(process.env.DATABASE_URL) !== 'undefined') {
    dbUrl = url.parse(process.env.DATABASE_URL);
} else if (process.env.NODE_ENV === 'test') {
    dbUrl = url.parse('tcp://postgres:postgres@127.0.0.1:5432/test');
} else {
    dbUrl = url.parse('tcp://postgres:postgres@127.0.0.1:5432/db');
}

How to run Django's test database only in memory?

My Django unit tests take a long time to run, so I'm looking for ways to speed that up. I'm considering installing an SSD, but I know that has its downsides too. Of course, there are things I could do with my code, but I'm looking for a structural fix. Even running a single test is slow, since the database needs to be rebuilt / South-migrated every time. So here's my idea...
Since I know the test database will always be quite small, why can't I just configure the system to always keep the entire test database in RAM? Never touch the disk at all. How do I configure this in Django? I'd prefer to keep using MySQL since that's what I use in production, but if SQLite 3 or something else makes this easy, I'd go that way.
Does SQLite or MySQL have an option to run entirely in memory? It should be possible to configure a RAM disk and then configure the test database to store its data there, but I'm not sure how to tell Django / MySQL to use a different data directory for a certain database, especially since it keeps getting erased and recreated each run. (I'm on a Mac FWIW.)
If you set your database engine to sqlite3 when you run your tests, Django will use an in-memory database.
I'm using code like this in my settings.py to set the engine to sqlite when running my tests:
if 'test' in sys.argv:
    DATABASE_ENGINE = 'sqlite3'
Or in Django 1.2:
if 'test' in sys.argv:
    DATABASES['default'] = {'ENGINE': 'sqlite3'}
And finally in Django 1.3 and 1.4:
if 'test' in sys.argv:
    DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}
(The full path to the backend isn't strictly necessary with Django 1.3, but makes the setting forward compatible.)
You can also add the following line, in case you are having problems with South migrations:
SOUTH_TESTS_MIGRATE = False
I usually create a separate settings file for tests and use it in the test command, e.g.
python manage.py test --settings=mysite.test_settings myapp
It has two benefits:
You don't have to check for test or any such magic word in sys.argv; test_settings.py can simply be:
from settings import *
# make tests faster
SOUTH_TESTS_MIGRATE = False
DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}
Or you can further tweak it for your needs, cleanly separating test settings from production settings.
Another benefit is that you can run tests with the production database engine instead of sqlite3, avoiding subtle bugs. So while developing, use
python manage.py test --settings=mysite.test_settings myapp
and before committing code run once
python manage.py test myapp
just to be sure that all tests really pass.
MySQL supports a storage engine called "MEMORY", which you can configure in your database config (settings.py) as such:
'USER': 'root',      # Not used with sqlite3.
'PASSWORD': '',      # Not used with sqlite3.
'OPTIONS': {
    "init_command": "SET storage_engine=MEMORY",
}
Note that the MEMORY storage engine doesn't support blob / text columns, so if you're using django.db.models.TextField this won't work for you.
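In context, the full DATABASES entry would look roughly like this (the names mydb and root are illustrative):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mydb',
        'USER': 'root',
        'PASSWORD': '',
        'OPTIONS': {
            # On MySQL >= 5.7.5 the variable is default_storage_engine.
            "init_command": "SET storage_engine=MEMORY",
        },
    }
}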
I can't answer your main question, but there are a couple of things that you can do to speed things up.
Firstly, make sure that your MySQL database is set up to use InnoDB. Then it can use transactions to roll back the state of the db before each test, which in my experience has led to a massive speed-up. You can pass a database init command in your settings.py (Django 1.2 syntax):
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'NAME': 'mydb',
        'USER': 'whoever',
        'PASSWORD': 'whatever',
        'OPTIONS': {"init_command": "SET storage_engine=INNODB"},
    }
}
Secondly, you don't need to run the South migrations each time. Set SOUTH_TESTS_MIGRATE = False in your settings.py and the database will be created with plain syncdb, which will be much quicker than running through all the historic migrations.
You can do two tweaks:
use transactional tables: the initial fixture state will be restored via a database rollback after every TestCase.
put your database data dir on a ramdisk: you gain a lot on database creation, and running the tests is faster too.
I'm using both tricks and I'm quite happy.
How to set it up for MySQL on Ubuntu:
$ sudo service mysql stop
$ sudo cp -pRL /var/lib/mysql /dev/shm/mysql
$ vim /etc/mysql/my.cnf
# set: datadir = /dev/shm/mysql
$ sudo service mysql start
Beware, this is just for testing: after a reboot, your in-memory database is lost!
Another approach: have another instance of MySQL running in a tmpfs (a RAM disk). Instructions in this blog post: Speeding up MySQL for testing in Django.
Advantages:
You use exactly the same database that your production server uses
no need to change your default MySQL configuration
Extending Anurag's answer, I simplified the process by creating the same test_settings and adding the following to manage.py:
if len(sys.argv) > 1 and sys.argv[1] == "test":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.test_settings")
else:
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
This seems cleaner, since sys is already imported in manage.py and manage.py is only used from the command line, so there's no need to clutter up the settings.
Use the following in your settings.py:
DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'
