I have a number of projects that use the following configuration model:
settings.py includes default configuration and config specifications, mainly used for development purposes.
The default settings (including db settings) can be overridden by external configuration files, usually defined by admins for the various environments they manage. To make the admins' job easier, I have written a management command, packaged separately, which adds the option to create example configuration files based on the default configuration in settings.py.
Trying to run the command without the possibility for successful db connection, however, raises django.db.utils.OperationalError.
How can I make the command work without a db connection? It does not need one, and when the command is actually needed, the db connection is most probably not configured properly yet.
Note that settings.DATABASES != {}, since the default db settings are present.
Django 1.10.6, Python 3.5.3
Set requires_system_checks = False on your command class:
```python
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    requires_system_checks = False
    ...

    def add_arguments(self, parser):
        # rest of code
        ...
```
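For context, a fuller sketch of such a command might look like the following. This is a minimal illustration, not the asker's actual command: the --output option and the settings dumped are assumptions.

```python
# Hypothetical sketch of a config-dumping command that never touches the db.
from django.conf import settings
from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Write an example configuration file based on settings.py defaults"
    requires_system_checks = False  # skip checks that may open a db connection

    def add_arguments(self, parser):
        # Hypothetical option; adapt to your own packaging.
        parser.add_argument('--output', default='example.conf')

    def handle(self, *args, **options):
        # Dump a couple of defaults from settings.py; no db access needed.
        with open(options['output'], 'w') as fh:
            for name in ('DEBUG', 'ALLOWED_HOSTS'):
                fh.write('%s = %r\n' % (name, getattr(settings, name)))
        self.stdout.write('Wrote %s' % options['output'])
```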
I need to execute some housekeeping code, but only in the development or production environment. Unfortunately, all management commands execute it as well, just like runserver does. Is there any clean way to classify the execution environment and run the code selectively?
I saw some solutions like checking for 'runserver' in sys.argv, but that does not work for production, and it does not look very clean.
Does Django provide anything to classify all these different scenarios the code is executing in?
Edit
The real problem is that we need to initialise our local cache with some frequently accessed data once the apps are loaded. In general, I want to fetch some specific information from the DB and cache it (currently in memory). The issue is that when it tries to fetch from the DB, the table may not be created yet; in fact, there may not be any migration files created at all. So when I run makemigrations/migrate, this code runs, tries to fetch from the DB, and throws an error saying the table does not exist. But if I can't run makemigrations/migrate, there will be no table: it is a kind of loop I'm trying to avoid. This code runs for all management commands, but I would like to restrict its execution to when the app is actually serving requests (which is when the cache is needed) and not to any management commands (including user-defined ones).
```python
from django.apps import AppConfig

from my_app.signals import app_created


class MyAppConfig(AppConfig):
    name = 'my_app'

    def ready(self):
        import my_app.signals
        # Code below should be executed only on actual app execution
        # and not in makemigrations/migrate etc. management commands
        app_created.send(sender=MyAppConfig, sent_by="MyApp")
```
Q) Send the app-created signal only for actual app execution, not for runs triggered by management commands like makemigrations, migrate, etc.
There are many different ways to do this, but generally, when I create a production (or staging, or development) server, I set an environment variable and dynamically decide which settings file to load based on it.
Imagine something like this in a Django settings file:
```python
import os

ENVIRONMENT = os.environ.get('ENVIRONMENT', 'development')
```
Then you can use:

```python
from django.conf import settings

if settings.ENVIRONMENT == 'production':
    # do something only on production
    ...
```
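If you take the "decide which settings file to load" part further and keep one settings module per environment, the same variable can pick the module. A minimal sketch, assuming hypothetical my_project.settings.production / my_project.settings.development modules:

```python
# Sketch: choose a settings module from the ENVIRONMENT variable.
# The my_project.settings.* module names are assumptions for illustration.
import os

env = os.environ.get('ENVIRONMENT', 'development')
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_project.settings.%s' % env)
```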
Since I did not get a convincing answer and I managed to pull off a solution, although not a 100% clean one, I thought I would share the solution I ended up with.
```python
import sys

from django.conf import settings

if (settings.DEBUG and 'runserver' in sys.argv) or not settings.DEBUG:
    """your code to run only in development and production"""
```
The rationale: if it is not in DEBUG mode, run the code no matter what; if it is in DEBUG mode, run it only when the process was started with runserver in the arguments.
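Applied to the AppConfig from the question above, the same guard can wrap the signal in ready(). This is just the question's code combined with the check, nothing more:

```python
import sys

from django.apps import AppConfig
from django.conf import settings

from my_app.signals import app_created


class MyAppConfig(AppConfig):
    name = 'my_app'

    def ready(self):
        import my_app.signals
        # Fire only when actually serving requests, not for
        # makemigrations/migrate or other management commands.
        if (settings.DEBUG and 'runserver' in sys.argv) or not settings.DEBUG:
            app_created.send(sender=MyAppConfig, sent_by="MyApp")
```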
I would like to allow people who don't have access to the Django project root folder (so no access to the settings.py file and no DB password) to use specific models outside of Django.
I would like some users to be able to query certain tables in the database using the powerful Django structure (querysets etc...) without giving full access to all the tables.
Would any of the following strategies work? Is it even possible to do that? And what are the best practices for this kind of issue?
Idea 1: Setting up Django using my_project.settings
```python
import os

import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "my_project.settings")
django.setup()
```
The problem is that it requires access to the project root folder and gives access to the whole database.
Idea 2: Setting up Django using a different database user with restricted access
```python
from django.conf import settings

settings.configure(
    DATABASE_ENGINE='django.db.backends.mysql',
    DATABASE_NAME='<db_name>',
    DATABASE_USER='new_user',
    DATABASE_PASSWORD='<psw>',
    DATABASE_HOST='<host>',
    INSTALLED_APPS=...,
)
```
I haven't managed to make this work yet, but I feel like django.setup() will fail because the DB user won't have access to all the tables. Also, it still needs access to the project root folder.
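For what it's worth, current Django versions expect a DATABASES dict rather than the old DATABASE_* keys, so a sketch of Idea 2 would look more like the following; the credentials are the same placeholders as above and the INSTALLED_APPS entry is hypothetical:

```python
# Sketch of Idea 2 with the modern DATABASES setting; 'new_user' is the
# restricted account, and the app name is a placeholder.
import django
from django.conf import settings

settings.configure(
    DATABASES={
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': '<db_name>',
            'USER': 'new_user',
            'PASSWORD': '<psw>',
            'HOST': '<host>',
        }
    },
    INSTALLED_APPS=['the_app_with_the_models'],
)
django.setup()
```

Note that django.setup() itself only populates the app registry and does not open a database connection, so a restricted user should not make it fail; permission errors would only surface when a query actually touches a forbidden table.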
Idea 3: Using the database (MySQL) directly from Python, with no link at all to Django
This is probably not advised, but it's the only solution I have so far. Is there any Django-like Python package that would allow querying the database in a similar way to Django?
Idea 4: Maybe something using a custom manage.py command?
In Pyramid, I need to render my templates according to different runtime environments: enable Google Analytics, use minified code, etc. (when in production). Is there an easy way to find out the current environment, perhaps an existing flag that tells which ini file was used?
Pyramid INI files can hold arbitrary configuration entries, so why not include a flag in your files that distinguishes between production and development deployments?
I'd do it like this; in your production .ini file:
```ini
[app:main]
# Set to False in your development .ini file
production_deployment = True
```
Pass this value on to the Pyramid Configurator:
```python
from pyramid.config import Configurator
from pyramid.settings import asbool


def main(global_config, **settings):
    # ...
    production_deployment = asbool(
        settings.get('production_deployment', 'false'))
    settings['production_deployment'] = production_deployment
    config = Configurator(settings=settings)
```
You can now access the settings from just about anywhere in your Pyramid code. For example, in a request handler:
```python
settings = request.registry.settings

if settings['production_deployment']:
    # Enable some production code here.
    ...
```
However, I'd also use more fine-grained settings in this case: a flag for enabling Google Analytics, one for minifying resources, etc. That way you can test each individual setting in your development environment, write unit tests for these switches, etc.
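As a sketch of that fine-grained approach (the flag names use_google_analytics and minify_assets are made up for illustration):

```python
from pyramid.settings import asbool

# Hypothetical per-feature flags instead of one production switch.
FEATURE_FLAGS = ('use_google_analytics', 'minify_assets')


def parse_feature_flags(settings):
    """Coerce each feature flag in the settings dict to a real boolean."""
    for flag in FEATURE_FLAGS:
        settings[flag] = asbool(settings.get(flag, 'false'))
    return settings
```

Each flag can then be flipped independently in a development .ini file and asserted in unit tests.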
I set this sort of thing as an environment variable named something like PYRAMID_ENV, which can be read via os.environ. For example, in your code:
```python
import os

pyramid_env = os.environ.get('PYRAMID_ENV', 'debug')

if pyramid_env == 'debug':
    # Setup debug things...
    ...
else:
    # Setup production things...
    ...
```
Then you can set the variable in the init script or when starting the server:
PYRAMID_ENV=production python server.py
Docs on access to environment variables: http://docs.python.org/library/os.html#os.environ
My Django unit tests take a long time to run, so I'm looking for ways to speed that up. I'm considering installing an SSD, but I know that has its downsides too. Of course, there are things I could do with my code, but I'm looking for a structural fix. Even running a single test is slow, since the database needs to be rebuilt / South-migrated every time. So here's my idea...
Since I know the test database will always be quite small, why can't I just configure the system to always keep the entire test database in RAM? Never touch the disk at all. How do I configure this in Django? I'd prefer to keep using MySQL since that's what I use in production, but if SQLite 3 or something else makes this easy, I'd go that way.
Does SQLite or MySQL have an option to run entirely in memory? It should be possible to configure a RAM disk and then configure the test database to store its data there, but I'm not sure how to tell Django / MySQL to use a different data directory for a certain database, especially since it keeps getting erased and recreated each run. (I'm on a Mac FWIW.)
If you set your database engine to sqlite3 when you run your tests, Django will use an in-memory database.
I'm using code like this in my settings.py to set the engine to sqlite when running my tests:
```python
import sys

if 'test' in sys.argv:
    DATABASE_ENGINE = 'sqlite3'
```
Or in Django 1.2:
```python
if 'test' in sys.argv:
    DATABASES['default'] = {'ENGINE': 'sqlite3'}
```
And finally in Django 1.3 and 1.4:
```python
if 'test' in sys.argv:
    DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}
```
(The full path to the backend isn't strictly necessary with Django 1.3, but makes the setting forward compatible.)
You can also add the following line, in case you are having problems with South migrations:
SOUTH_TESTS_MIGRATE = False
I usually create a separate settings file for tests and use it in the test command, e.g.:
python manage.py test --settings=mysite.test_settings myapp
It has two benefits:
You don't have to check for test or any such magic word in sys.argv; test_settings.py can simply be:
```python
from settings import *

# make tests faster
SOUTH_TESTS_MIGRATE = False

DATABASES['default'] = {'ENGINE': 'django.db.backends.sqlite3'}
```
Or you can further tweak it for your needs, cleanly separating test settings from production settings.
Another benefit is that you can run tests with the production database engine instead of sqlite3, avoiding subtle bugs. So while developing, use
python manage.py test --settings=mysite.test_settings myapp
and before committing code, run once
python manage.py test myapp
just to be sure that all tests are really passing.
MySQL supports a storage engine called "MEMORY", which you can configure in your database config (settings.py) as follows:
```python
DATABASES = {
    'default': {
        # ... ENGINE/NAME as usual ...
        'USER': 'root',      # Not used with sqlite3.
        'PASSWORD': '',      # Not used with sqlite3.
        'OPTIONS': {
            "init_command": "SET storage_engine=MEMORY",
        },
    }
}
```
Note that the MEMORY storage engine doesn't support blob / text columns, so if you're using django.db.models.TextField this won't work for you.
I can't answer your main question, but there are a couple of things that you can do to speed things up.
Firstly, make sure that your MySQL database is set up to use InnoDB. Then it can use transactions to roll back the state of the db before each test, which in my experience has led to a massive speed-up. You can pass a database init command in your settings.py (Django 1.2 syntax):
```python
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'NAME': 'mydb',
        'USER': 'whoever',
        'PASSWORD': 'whatever',
        'OPTIONS': {"init_command": "SET storage_engine=INNODB"},
    }
}
```
Secondly, you don't need to run the South migrations each time. Set SOUTH_TESTS_MIGRATE = False in your settings.py and the database will be created with a plain syncdb, which will be much quicker than running through all the historic migrations.
You can apply two tweaks:
use transactional tables: the initial fixture state will be restored via a database rollback after every TestCase.
put your database data dir on a ramdisk: you will gain a lot as far as database creation is concerned, and running tests will be faster too.
I'm using both tricks and I'm quite happy.
How to set it up for MySQL on Ubuntu:
$ sudo service mysql stop
$ sudo cp -pRL /var/lib/mysql /dev/shm/mysql
$ vim /etc/mysql/my.cnf
# set: datadir = /dev/shm/mysql
$ sudo service mysql start
Beware, it's just for testing: after a reboot, your in-memory database is lost!
Another approach: have another instance of MySQL running in a tempfs that uses a RAM Disk. Instructions in this blog post: Speeding up MySQL for testing in Django.
Advantages:
you use exactly the same database that your production server uses
no need to change your default MySQL configuration
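The Django side of that setup could be as small as pointing the tests at the second instance. The host, port, and credentials below are assumptions, since they depend on how the RAM-disk instance was configured:

```python
# Sketch: aim Django at the tmpfs-backed MySQL instance; all values
# here are placeholders for illustration.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mydb',
        'USER': 'whoever',
        'PASSWORD': 'whatever',
        'HOST': '127.0.0.1',
        'PORT': '3307',  # the RAM-disk instance, not the default 3306
    }
}
```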
Extending Anurag's answer, I simplified the process by creating the same test_settings and adding the following to manage.py:
```python
if len(sys.argv) > 1 and sys.argv[1] == "test":
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.test_settings")
else:
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
```
This seems cleaner, since sys is already imported in manage.py and manage.py is only used via the command line, so there is no need to clutter up the settings.
Use the following in your settings.py:

```python
DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'
```
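If you only want that override to apply during test runs, it can be guarded the same way as the earlier answers; a minimal sketch:

```python
import sys

# Switch to the sqlite3 backend (in-memory during tests) only for test runs.
if 'test' in sys.argv:
    DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'
```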
When I try to run
manage.py test
a database password prompt shows up.
Previously, tests would run without me having to enter the db password manually.
I just updated my database to PostgreSQL 8.4. I assume it's some setting I'm forgetting.
How can I configure it to run tests without asking for the password?
Additional Info:
I created the database with the user 'postgres', but am accessing it in Django with the user 'postgis'. I checked the permissions of these users, and they are the same.
When running the tests, the db and tables get created fine (no password requested).
It's only when it installs 'Custom SQL' that the password is requested.
RESOLUTION
As Carl pointed out, the ~/.pgpass file [*nix] or %APPDATA%\postgresql\pgpass.conf (where %APPDATA% refers to the Application Data subdirectory in the user's profile) [Windows] allows you to store credentials per database, so you don't need to enter a password each time.
See the postgres documentation: The Password File
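For reference, each line of that file has the form hostname:port:database:username:password, and on *nix the file must be readable only by you (chmod 600). Entries covering both the real and the test database might look like this; all values are placeholders:

```
localhost:5432:mydb:postgis:secret
localhost:5432:test_mydb:postgis:secret
```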
I checked my configuration, and it looks like this file was auto-created. I updated my password file, and now Django tests run without the need to manually enter a password on each custom SQL installation.
Django tests use a different database: your DATABASE_NAME setting with "test_" prepended. My first guess would be that somewhere in your Postgres authentication config (either in pg_hba.conf or in a ~/.pgpass file), you are allowing access to DATABASE_NAME with no password, but you don't have the same config for test_DATABASE_NAME.
"I assume it's some setting I'm forgetting."
Not trying to make a fool out of you, but sometimes simple solutions are overlooked:
Did you set the DATABASE_PASSWORD setting in your settings.py file?