I have a little Django app that uses PyMongo and MongoDB.
If I write (or update) something in the database, I have to restart the server for it to show up on the web page. I'm running the app with 'python manage.py runserver'.
I switched to the django dummy cache but that didn't help.
Every database action is wrapped in a 'with MongoClient' statement.
I figured it out. I was reading the data into class variables of my django_tables2 classes, so it was only loaded once and never refreshed...
Bangs forehead on desk...
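For anyone hitting the same thing: anything assigned at class level runs once, when the module is imported, not on every request. A minimal sketch of the pitfall and the fix, with purely illustrative database, collection and template names:

import django_tables2 as tables
from django.shortcuts import render
from pymongo import MongoClient

class ItemTable(tables.Table):
    name = tables.Column()
    # Don't do this: a class-level attribute is evaluated once at import time,
    # so the page keeps showing stale data until the server restarts.
    # data = list(MongoClient()['mydb']['items'].find())

def item_list(request):
    # Do this instead: query inside the view so it runs on every request.
    with MongoClient() as client:
        data = list(client['mydb']['items'].find())
    return render(request, 'items.html', {'table': ItemTable(data)})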
I am writing a REST API using flask_restful and managing the MySQL db using flask-sqlalchemy. I would like to know the best practice for loading existing data into a table when the app starts.
I am currently calling the db.create_all() method within an endpoint function decorated with @app.before_first_request. I would then like to fill one of the created tables with existing data from a CSV file. Should the code that pushes the data live in a separate script, or inside that function?
Thanks!
I would separate loading initial database data from application initialization: in my experience such initial data rarely changes, loading it can take some time if the file is big, and you usually don't need to reload it into the database every time the application starts.
I think you will almost certainly need database migrations at some point in your application's development, so I would suggest setting up Flask-Migrate to handle that and running its upgrade method on application creation (in the create_app method if you are using the Flask application factory pattern). Doing this from the start will save you some headache compared to introducing migrations later, on a database that is already populated with real data and was initialized with db.create_all().
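A rough sketch of what that could look like with an application factory; the module, config and object names below are just placeholders:

from flask import Flask
from flask_migrate import Migrate, upgrade

from myapp.extensions import db   # hypothetical module holding the SQLAlchemy() instance

migrate = Migrate()

def create_app(config_object='myapp.config.Config'):   # placeholder config path
    app = Flask(__name__)
    app.config.from_object(config_object)

    db.init_app(app)
    migrate.init_app(app, db)

    # Apply any pending migrations instead of calling db.create_all().
    with app.app_context():
        upgrade()

    return app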
For populating the database with seed data I would go with the Flask CLI or Flask-Script. In one of my recent projects I used Flask-Script for this and created a separate manage.py file which, among other application management commands, contained an initial data seeding command that looked something like this:
@manager.command
def seed():
    """Load initial data into the database."""
    db.session.add(...)
    db.session.commit()
It was then run on demand with the following command:
python manage.py seed
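Since the question mentions a CSV file, the body of such a command might look roughly like this; the model, file path and column names are assumptions for the sake of the example:

import csv

from myapp import db, manager          # hypothetical application imports
from myapp.models import Product       # hypothetical model to seed

@manager.command
def seed():
    """Load initial data from a CSV file into the database."""
    with open('data/products.csv') as f:            # assumed location of the seed file
        for row in csv.DictReader(f):
            # Map CSV columns onto model fields; adjust the names to your schema.
            db.session.add(Product(name=row['name'], price=row['price']))
    db.session.commit()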
I am trying to do a simple Django test using a tests.py file in a Django app directory (mysite/polls/tests.py), but every time I run 'python manage.py test polls', I get the error:
C:\Python27\python.exe "C:\Program Files (x86)\JetBrains\PyCharm 3.0\helpers\pycharm\django_test_manage.py" test polls "C:\Users\<myname>\PycharmProjects\mysite"
Testing started at 8:40 PM ...
Creating test database for alias 'default'...
Got an error creating the test database: permission denied to create database
Type 'yes' if you would like to try deleting the test database 'test_<database>', or 'no' to cancel:
From what I've read, apparently Heroku PG uses a shared database, so I do not have the permission to create/destroy databases, which is necessary for testing. Is there an obvious solution to this? I am still developing on my local drive, so any workarounds would be appreciated. I know that testing is an important part of programming, so I would like to be able to implement a testing method as soon as possible.
I am trying to test using the TestCase django class.
What I am using:
1) Heroku Postgres Hobby Dev Plan
2) Postgres 9.3.3
3) Python 2.7.6
4) Django 1.6.1
EDIT:
So after doing a bit more research, I found out that I can override my DATABASES dict in settings.py to use SQLite when testing locally (when 'test' is among the command-line arguments), but I would still prefer a PostgreSQL setup, since from what I've read PostgreSQL is stricter (which I am a fan of).
For anyone interested in the semi-solution I have found (courtesy of another member of Stack Overflow):
if 'test' in sys.argv:
    DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}
Don't forget to import sys.
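If you would rather keep testing against PostgreSQL, the same override can point at a local PostgreSQL server instead of SQLite. A sketch, assuming a local install with placeholder credentials; the local role needs the CREATEDB privilege so Django can create and drop the test database:

import sys

if 'test' in sys.argv:
    DATABASES['default'] = {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mysite',            # Django will create and drop 'test_mysite'
        'USER': 'postgres',          # placeholder local credentials
        'PASSWORD': 'postgres',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }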
There's no reason your tests need to create a database on Heroku. Instead, have them use a local database when the DATABASE_URL environment variable is undefined. Here's what I do in Node.js, where I have local test and dev databases and a Heroku-provided production db:
if (typeof(process.env.DATABASE_URL) !== 'undefined') {
    dbUrl = url.parse(process.env.DATABASE_URL);
}
else if (process.env.NODE_ENV === 'test') {
    dbUrl = url.parse('tcp://postgres:postgres@127.0.0.1:5432/test');
}
else {
    dbUrl = url.parse('tcp://postgres:postgres@127.0.0.1:5432/db');
}
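In a Django settings.py the same idea might look roughly like this, using the dj-database-url package (not mentioned above, just an assumption) to parse the connection URLs; the local credentials are placeholders:

import os
import sys

import dj_database_url   # assumed helper package for parsing database URLs

if os.environ.get('DATABASE_URL'):
    # On Heroku: use the database it provides.
    DATABASES = {'default': dj_database_url.config()}
elif 'test' in sys.argv:
    # Local test database.
    DATABASES = {'default': dj_database_url.parse('postgres://postgres:postgres@127.0.0.1:5432/test')}
else:
    # Local development database.
    DATABASES = {'default': dj_database_url.parse('postgres://postgres:postgres@127.0.0.1:5432/db')}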
I'd like to run local Django unit tests for a Google App Engine project. GAE recently received some python unit testing utilities that allow one to create stubs for e.g. memcache, the datastore, the task queue, etc.
I'd like to be able to use Django's unit testing framework. My first thought is to overload DjangoTestSuiteRunner to do the following for each test case:
# setUp
self.testbed = testbed.Testbed()
# Then activate the testbed, which prepares the service stubs for use.
self.testbed.activate()
# Next, declare which service stubs you want to use.
self.testbed.init_datastore_v3_stub()
self.testbed.init_memcache_stub()
# ... and after the tests:
# tearDown
self.testbed.deactivate()
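Rather than touching DjangoTestSuiteRunner, the same setup and teardown could also live in a shared base TestCase that individual tests inherit from. A rough sketch, assuming the standard google.appengine.ext.testbed import:

from django.test import TestCase
from google.appengine.ext import testbed

class GAETestCase(TestCase):
    """Activates the App Engine service stubs around each test."""

    def setUp(self):
        super(GAETestCase, self).setUp()
        self.testbed = testbed.Testbed()
        self.testbed.activate()
        self.testbed.init_datastore_v3_stub()
        self.testbed.init_memcache_stub()

    def tearDown(self):
        self.testbed.deactivate()
        super(GAETestCase, self).tearDown()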
I'd like to know if anyone else has tried to run Django's testing framework with the new unittests that can be run from the command line for GAE, and if so what pitfalls they've encountered. For example, are there any issues with calling Django's django.test.utils.setup_test_environment and teardown_test_environment? What other issues might come up?
Incidentally, I'm not using any Django-GAE helpers such as google-app-engine-django.
Thank you for reading.
Just wanted to mention: standard Django unit testing worked very nicely for me with django-nonrel and GAE Test Bed, including task queues, memcache, etc. I think it is the same Python unit testing code that you mentioned.
My Django unit tests take a long time to run, so I'm looking for ways to speed that up. I'm considering installing an SSD, but I know that has its downsides too. Of course, there are things I could do with my code, but I'm looking for a structural fix. Even running a single test is slow since the database needs to be rebuilt / south migrated every time. So here's my idea...
Since I know the test database will always be quite small, why can't I just configure the system to always keep the entire test database in RAM? Never touch the disk at all. How do I configure this in Django? I'd prefer to keep using MySQL since that's what I use in production, but if SQLite 3 or something else makes this easy, I'd go that way.
Does SQLite or MySQL have an option to run entirely in memory? It should be possible to configure a RAM disk and then configure the test database to store its data there, but I'm not sure how to tell Django / MySQL to use a different data directory for a certain database, especially since it keeps getting erased and recreated each run. (I'm on a Mac FWIW.)
If you set your database engine to sqlite3 when you run your tests, Django will use an in-memory database.
I'm using code like this in my settings.py to set the engine to sqlite when running my tests:
if 'test' in sys.argv:
    DATABASE_ENGINE = 'sqlite3'
Or in Django 1.2:
if 'test' in sys.argv:
    DATABASES['default'] = {'ENGINE': 'sqlite3'}
And finally in Django 1.3 and 1.4:
if 'test' in sys.argv:
    DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}
(The full path to the backend isn't strictly necessary with Django 1.3, but makes the setting forward compatible.)
You can also add the following line, in case you are having problems with South migrations:
SOUTH_TESTS_MIGRATE = False
I usually create a separate settings file for tests and use it with the test command, e.g.
python manage.py test --settings=mysite.test_settings myapp
It has two benefits:
You don't have to check for 'test' or any such magic word in sys.argv; test_settings.py can simply be:
from settings import *
# make tests faster
SOUTH_TESTS_MIGRATE = False
DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}
Or you can further tweak it for your needs, cleanly separating test settings from production settings.
Another benefit is that you can run tests with the production database engine instead of sqlite3, avoiding subtle bugs. So while developing, use
python manage.py test --settings=mysite.test_settings myapp
and before committing code run once
python manage.py test myapp
just to be sure that all tests really pass.
MySQL supports a storage engine called "MEMORY", which you can configure in your database config (settings.py) as such:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mydb',
        'USER': 'root',
        'PASSWORD': '',
        'OPTIONS': {"init_command": "SET storage_engine=MEMORY"},
    }
}
Note that the MEMORY storage engine doesn't support blob / text columns, so if you're using django.db.models.TextField this won't work for you.
I can't answer your main question, but there are a couple of things that you can do to speed things up.
Firstly, make sure that your MySQL database is set up to use InnoDB. Then it can use transactions to roll back the state of the db before each test, which in my experience has led to a massive speed-up. You can pass a database init command in your settings.py (Django 1.2 syntax):
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'NAME': 'mydb',
        'USER': 'whoever',
        'PASSWORD': 'whatever',
        'OPTIONS': {"init_command": "SET storage_engine=INNODB"},
    }
}
Secondly, you don't need to run the South migrations each time. Set SOUTH_TESTS_MIGRATE = False in your settings.py and the database will be created with plain syncdb, which will be much quicker than running through all the historic migrations.
You can apply two tweaks:
use transactional tables: the initial fixture state will be restored with a database rollback after every TestCase.
put your database data directory on a ramdisk: you will gain a lot on database creation, and running the tests will be faster too.
I'm using both tricks and I'm quite happy.
How to set up it for MySQL on Ubuntu:
$ sudo service mysql stop
$ sudo cp -pRL /var/lib/mysql /dev/shm/mysql
$ vim /etc/mysql/my.cnf
# in my.cnf, set: datadir = /dev/shm/mysql
$ sudo service mysql start
Beware: this is only for testing; after a reboot, the database held in memory is lost!
Another approach: run a second MySQL instance on a tmpfs (RAM-backed) filesystem. Instructions in this blog post: Speeding up MySQL for testing in Django.
Advantages:
You use exactly the same database engine that your production server uses
No need to change your default MySQL configuration
Extending Anurag's answer, I simplified the process by creating the same test_settings file and adding the following to manage.py:
if len(sys.argv) > 1 and sys.argv[1] == "test":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.test_settings")
else:
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
This seems cleaner, since sys is already imported in manage.py and manage.py is only used from the command line, so there is no need to clutter up the settings file.
Use the following in your settings.py:
DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'