I have a 'core' Django product that includes default implementations of common tasks, but I want to allow that implementation to be redefined (or customised if that makes it easier).
For example in the core product, I might have a view which allows a user to click a button to resend 'all notifications':
# in core/views.py
... imports etc...
from core.tasks import resend_notifications

def handle_user_resend_request(request, user_id):
    user = get_object_or_404(User, id=user_id)
    if request.method == 'POST':
        for follower in user.followers:
            resend_notifications(follower.id)
    ... etc etc ...

# in core/tasks.py
... imports etc...

def resend_notifications(id):
    send_email(User.objects.get(id=id))
And then in some deployments of this product, perhaps the 'resend_notifications' needs to look like:
# in customer_specific/tasks.py
... imports etc ...

def resend_notifications(id):
    person = User.objects.get(id=id)
    if '@super-hack.email.com' in person.email:
        # This is not a real email, send via the magic portal
        send_via_magic(person)
    else:
        send_email(person)
        # and send via fax for good measure
        send_fax(person)
How do I get the resend_notifications function in the views.py file to point to the customer_specific version?
Should I be defining this in the Django config and sharing access that way? What if the tasks are actually Celery tasks?
NB: The tasks I have are actually defined as Celery tasks (I removed this extra detail because I think the question is more general). I have tried a custom decorator that mutates a global object, but that is definitely not the way to go, for a number of reasons.
PS: I feel like this is a dependency injection question, but that is not a common thing in Django.
In a similar situation I ended up going for a solution like so – I put this on my Organization model in the application (equiv of a GitHub organization).
@property
def forms(self):
    if self.ldap:
        from portal.ldap import forms
    else:
        from portal.users import forms
    return forms
I essentially wanted to use different form classes if the organization the authenticated user belongs to has LDAP configured – and thus the create/invite user forms needed to be different.
I then overwrote get_form_class in the appropriate views like so:
def get_form_class(self):
    return self.request.user.organization.forms.CreateUserForm
I imagine you might want to do something similar in your scenario, wrap your function(s) in a proxy abstraction that determines which version to use – be that based on env vars, settings or the request.
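For example, a minimal sketch of such a proxy (not from the original code; the setting name and fallback path below are placeholders), using Django's import_string to resolve whichever implementation is configured:

# core/proxies.py (hypothetical module)
from django.conf import settings
from django.utils.module_loading import import_string

def resend_notifications(*args, **kwargs):
    # Resolve the configured implementation lazily, falling back to the core one.
    path = getattr(settings, 'RESEND_NOTIFICATIONS_TASK', 'core.tasks.resend_notifications')
    return import_string(path)(*args, **kwargs)

Views would then import this proxy instead of core.tasks directly, and deployments swap the implementation by changing one setting.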
This ended up being solved via a Django settings object that can be reconfigured by the deployment config. It was largely inspired by the technique here: settings.py from django-rest-framework.
For example, I have a settings file like this in my project:
yourproject/settings.py
"""
Settings for <YOUR PROJECT> are all namespaced in the YOUR_PROJECT config option.
For example your project's config file (usually called `settings.py` or 'production.py') might look like this:
YOUR_PROJECT = {
'PROCESS_TASK': (
'your_project.tasks.process_task',
)
}
This module provides the `yourproject_settings` object, that is used
to access settings, checking for user settings first, then falling
back to the defaults.
"""
# This file was effectively borrow from https://github.com/tomchristie/django-rest-framework/blob/8385ae42c06b8e68a714cb67b7f0766afe316883/rest_framework/settings.py
from __future__ import unicode_literals
from django.conf import settings
from django.utils.module_loading import import_string
DEFAULTS = {
'RESEND_NOTIFICATIONS_TASK': 'core.tasks.resend_notifications',
}
# List of settings that may be in string import notation.
IMPORT_STRINGS = (
'RESEND_NOTIFICATIONS_TASK',
)
MANDATORY_SETTINGS = (
'RESEND_NOTIFICATIONS_TASK',
)
def perform_import(val, setting_name):
    """
    If the given setting is a string import notation,
    then perform the necessary import or imports.
    """
    if val is None:
        return None
    if callable(val):
        return val
    if isinstance(val, (list, tuple)):
        return [perform_import(item, setting_name) for item in val]
    try:
        return import_string(val)
    except (ImportError, AttributeError) as e:
        msg = "Could not import '%s' for setting '%s'. %s: %s." % (val, setting_name, e.__class__.__name__, e)
        raise ImportError(msg)
class YourProjectSettings(object):
    """
    A settings object that allows settings to be accessed as properties.
    For example:

        from your_project.settings import yourproject_settings as the_settings
        print(the_settings.RESEND_NOTIFICATIONS_TASK)

    Any setting with string import paths will be automatically resolved
    and return the class, rather than the string literal.
    """
    namespace = 'YOUR_PROJECT'

    def __init__(self, mandatory=None, defaults=None, import_strings=None):
        self.mandatory = mandatory or MANDATORY_SETTINGS
        self.defaults = defaults or DEFAULTS
        self.import_strings = import_strings or IMPORT_STRINGS
        self.__check_settings()

    @property
    def user_settings(self):
        if not hasattr(self, '_user_settings'):
            self._user_settings = getattr(settings, self.__class__.namespace, {})
        return self._user_settings

    def __getattr__(self, attr):
        if attr not in self.defaults and attr not in self.mandatory:
            raise AttributeError("Invalid %s setting: '%s'" % (self.__class__.namespace, attr))
        try:
            # Check if present in user settings
            val = self.user_settings[attr]
        except KeyError:
            # Fall back to defaults
            val = self.defaults[attr]
        # Coerce import strings into classes
        if attr in self.import_strings:
            val = perform_import(val, attr)
        # Cache the result
        setattr(self, attr, val)
        return val

    def __check_settings(self):
        for setting in self.mandatory:
            if setting not in self.user_settings:
                raise RuntimeError(
                    'The "{}" setting is required as part of the configuration for "{}", but has not been supplied.'.format(
                        setting, self.__class__.namespace))

yourproject_settings = YourProjectSettings(MANDATORY_SETTINGS, DEFAULTS, IMPORT_STRINGS)
This allows me to either:
Use the default value (i.e. 'core.tasks.resend_notifications'); OR
Redefine the binding in my config file:
site_config/special.py
... other django settings like DB / DEBUG / Static files etc
YOUR_PROJECT = {
    'RESEND_NOTIFICATIONS_TASK': 'customer_specific.tasks.resend_notifications',
}
... etc. ...
Then in my view function, I access the correct function via the settings:
core/views.py
... imports etc...
from yourproject.settings import yourproject_settings as my_settings
def handle_user_resend_request(request, user_id):
    user = get_object_or_404(User, id=user_id)
    if request.method == 'POST':
        for follower in user.followers:
            my_settings.RESEND_NOTIFICATIONS_TASK(follower.id)
    ... etc etc ...
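As for the Celery angle raised in the question: since the setting only stores a dotted path, the resolved object can just as well be a Celery task, and the view can queue it instead of calling it synchronously. A sketch of that (assuming resend_notifications is decorated as a Celery task, as the NB in the question suggests):

# core/views.py (sketch, assuming the resolved object is a Celery task)
if request.method == 'POST':
    for follower in user.followers:
        # .delay() is Celery's shortcut for apply_async(); the setting still
        # resolves to the customer-specific task if one is configured.
        my_settings.RESEND_NOTIFICATIONS_TASK.delay(follower.id)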
Let's say I have a route that allows clients to create a new user
(pseudocode)
@app.route("POST")
def create_user(user: UserScheme, db: Session = Depends(get_db)) -> User:
    ...
and my UserScheme accepts a field such as an email. I would like to be able to set some settings (for example max_length) globally in a different model Settings. How do I access that inside a scheme? I'd like to access the db inside my scheme.
So basically my scheme should look something like this (the given code does not work):
class UserScheme(BaseModel):
    email: str

    @validator("email")
    def validate_email(cls, value: str) -> str:
        settings = get_settings(db)  # `db` should be set somehow
        if len(value) > settings.email_max_length:
            raise ValueError("Your mail might not be that long")
        return value
I couldn't find a way to pass db to the scheme. I was thinking about validating such fields (the ones that depend on db) inside my route. While that approach sort of works, the error is not raised on the specific field but on the entire form; it should be reported against the correct field so that frontends can display it correctly.
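As a sketch of how the route-level approach could still report the error against the right field (an assumption on my part, not from the original post), the route can raise an HTTPException whose detail list mimics FastAPI's validation-error shape:

# Sketch: reporting the error against the `email` field from inside the route.
from fastapi import HTTPException

settings = get_settings(db)
if len(user.email) > settings.email_max_length:
    raise HTTPException(
        status_code=422,
        detail=[{"loc": ["body", "email"], "msg": "email is too long", "type": "value_error"}],
    )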
One option is to accept arbitrary JSON objects as input, and then construct a UserScheme instance manually inside the route handler:
@app.route(
    "POST",
    response_model=User,
    openapi_extra={
        "requestBody": {
            "content": {
                "application/json": {
                    "schema": UserScheme.schema(ref_template="#/components/schemas/{model}")
                }
            }
        }
    },
)
async def create_user(request: Request, db: Session = Depends(get_db)) -> User:
    settings = get_settings(db)
    user_data = await request.json()
    user_schema = UserScheme(settings=settings, **user_data)
Note that this idea was borrowed from https://stackoverflow.com/a/68815913/2954547, and I have not tested it myself.
In order to facilitate the above, you might want to redesign this class so that the settings object itself is stored as an attribute on the UserScheme model. That way you never need to perform database access or other effectful operations inside the validator, and it also prevents you from instantiating a UserScheme without some kind of sensible settings in place, even if they are fallbacks or defaults.
class SystemSettings(BaseModel):
    ...

def get_settings(db: Session) -> SystemSettings:
    ...

EmailAddress = typing.NewType('EmailAddress', str)

class UserScheme(BaseModel):
    settings: SystemSettings

    if typing.TYPE_CHECKING:
        email: EmailAddress
    else:
        email: str | EmailAddress

    @validator("email")
    def _validate_email(cls, value: str, values: dict[str, typing.Any]) -> EmailAddress:
        if len(value) > values['settings'].max_email_length:
            raise ValueError('...')
        return EmailAddress(value)
The use of typing.NewType isn't necessary here, but I think it's a good tool in situations like this. Note that the typing.TYPE_CHECKING trick is required to make it work, as per https://github.com/pydantic/pydantic/discussions/4823.
I am following the Wagtail docs to create a custom link handler:
my_app/handlers.py
from django.contrib.auth import get_user_model
from wagtail.core.rich_text import LinkHandler
class UserLinkHandler(LinkHandler):
    identifier = 'user'

    @staticmethod
    def get_model():
        return get_user_model()

    @classmethod
    def get_instance(cls, attrs):
        model = cls.get_model()
        return model.objects.get(username=attrs['username'])

    @classmethod
    def expand_db_attributes(cls, attrs):
        user = cls.get_instance(attrs)
        return '<a href="mailto:%s">' % user.email
my_app/wagtail_hooks.py
from wagtail.core import hooks
from my_app.handlers import UserLinkHandler

@hooks.register('register_rich_text_features')
def register_link_handler(features):
    features.register_link_type(UserLinkHandler)
However, the handler does not show up in the admin widget. The expected behaviour is that it should appear as an option in the link type bar.
I've followed the docs exactly; is there something missing?
This is not part of register_link_type's functionality. A link type handler only defines the mapping between the database representation of a link and the final HTML output - it doesn't provide any user interface support for actually inserting those links. As the linked documentation notes:
This example assumes that equivalent front-end functionality has been added to allow users to insert these kinds of links into their rich text editor.
For that, you'll need to consult https://docs.wagtail.org/en/stable/extending/extending_draftail.html.
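As a rough sketch of what the Python side of that can look like (adapted from the extending-Draftail guide; the feature name, control dict and JS file below are assumptions, and a matching client-side Draftail entity source still has to be written):

# my_app/wagtail_hooks.py (sketch; the 'user' entity names and static files are assumptions)
from wagtail.admin.rich_text.editors.draftail import features as draftail_features
from wagtail.core import hooks

@hooks.register('register_rich_text_features')
def register_user_feature(features):
    feature_name = 'user'
    features.register_editor_plugin(
        'draftail',
        feature_name,
        draftail_features.EntityFeature(
            {'type': 'USER', 'description': 'User link'},
            js=['my_app/js/user-chooser.js'],  # custom Draftail entity source, not shown here
        ),
    )
    features.default_features.append(feature_name)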
I want to customize how fields are rendered in Django's new "readonly mode": e.g. foreign keys should be displayed as links.
In general I identified the following options:
Implement custom fields for every model – results in code duplication
Reimplement django’s display_for_field method
Basically I could copy & paste the django.contrib.admin.utils module, insert my changes, override sys.modules['django.contrib.admin.utils'] = myutils but that's ugly because of maintainability in case of Django updates in the future.
So I decided to override only the display_for_field function of django.contrib.admin.utils, using the following approach to avoid duplicating Django code:
Override display_for_field function in django.contrib.admin.utils in settings.py:
from myapp.contrib.admin import utils
utils.override_method()
In myapp.utils.py:
from django.contrib.admin import utils
from django.contrib.admin.utils import *
def display_for_field_mod(value, field, empty_value_display):
    if isinstance(field, models.ForeignKey) and value:
        if field.related_model.__name__.lower() != 'user':
            link_string = 'admin:myapp_' + field.related_model.__name__.lower() + '_change'
            link = reverse(link_string, args=(value.id,))
            return format_html('<a href="{}">{}</a>', link, value)
        else:
            return formats.localize(value)
    else:
        return display_for_field(value, field, empty_value_display)

def override_method():
    utils.display_for_field = display_for_field_mod
But the problem is: display_for_field gets imported in django.contrib.admin.helpers using:
from django.contrib.admin.utils import (
    display_for_field, [...]
)
So, because helpers holds its own reference to the imported function, I cannot override it from outside.
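To illustrate the underlying issue (this toy example is mine, not from the original post): `from module import name` copies the binding into the importing module, so rebinding the original afterwards does not affect the copy.

# Toy illustration: rebinding the original module attribute does not change
# a name that was copied with "from ... import ...".
import types

utils = types.ModuleType('utils')
utils.f = lambda: 'original'

helpers = types.ModuleType('helpers')
helpers.f = utils.f          # simulates "from utils import f" inside helpers

utils.f = lambda: 'patched'  # the monkey-patch
print(helpers.f())           # still prints 'original'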
Am I missing some other obvious possibility? Is there a clean way to achieve this, or is the only option to duplicate/modify Django's original code?
I came across a similar issue. If anyone's still looking for a solution, you just need to override the display_for_field in helpers, e.g.:
from django.contrib.admin import helpers, utils

def override_method():
    helpers.display_for_field = display_for_field_mod
    utils.display_for_field = display_for_field_mod
Only overriding helpers is sufficient, but it's a good idea to override utils as well, just in case.
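A small wiring note (my suggestion, not part of the original answers): rather than calling the patch from settings.py, it can be applied from an AppConfig.ready() hook, which runs once the app registry is loaded. A sketch, assuming the patch lives in myapp/utils.py (the module path is an assumption):

# myapp/apps.py (sketch; module paths are assumptions)
from django.apps import AppConfig

class MyAppConfig(AppConfig):
    name = 'myapp'

    def ready(self):
        # Apply the admin monkey-patch once Django has finished loading apps.
        from myapp.utils import override_method
        override_method()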
Using the following example from the documentation:
def combine_names(apps, schema_editor):
    Person = apps.get_model("yourappname", "Person")
    for person in Person.objects.all():
        person.name = "%s %s" % (person.first_name, person.last_name)
        person.save()

class Migration(migrations.Migration):
    dependencies = [
        ('yourappname', '0001_initial'),
    ]

    operations = [
        migrations.RunPython(combine_names),
    ]
How would I create and run a test against this migration, confirming that the data is migrated correctly?
I was doing some googling to address the same question and found an article that hit the nail on the head for me and seemed less hacky than the existing answers, so I'm putting it here in case it helps anyone else coming through.
The article proposed the following subclass of Django's TestCase:
from django.apps import apps
from django.test import TestCase
from django.db.migrations.executor import MigrationExecutor
from django.db import connection
class TestMigrations(TestCase):

    @property
    def app(self):
        return apps.get_containing_app_config(type(self).__module__).name

    migrate_from = None
    migrate_to = None

    def setUp(self):
        assert self.migrate_from and self.migrate_to, \
            "TestCase '{}' must define migrate_from and migrate_to properties".format(type(self).__name__)
        self.migrate_from = [(self.app, self.migrate_from)]
        self.migrate_to = [(self.app, self.migrate_to)]
        executor = MigrationExecutor(connection)
        old_apps = executor.loader.project_state(self.migrate_from).apps

        # Reverse to the original migration
        executor.migrate(self.migrate_from)

        self.setUpBeforeMigration(old_apps)

        # Run the migration to test
        executor = MigrationExecutor(connection)
        executor.loader.build_graph()  # reload.
        executor.migrate(self.migrate_to)

        self.apps = executor.loader.project_state(self.migrate_to).apps

    def setUpBeforeMigration(self, apps):
        pass
And an example use case that they proposed was:
class TagsTestCase(TestMigrations):

    migrate_from = '0009_previous_migration'
    migrate_to = '0010_migration_being_tested'

    def setUpBeforeMigration(self, apps):
        BlogPost = apps.get_model('blog', 'Post')
        self.post_id = BlogPost.objects.create(
            title="A test post with tags",
            body="",
            tags="tag1 tag2",
        ).id

    def test_tags_migrated(self):
        BlogPost = self.apps.get_model('blog', 'Post')
        post = BlogPost.objects.get(id=self.post_id)

        self.assertEqual(post.tags.count(), 2)
        self.assertEqual(post.tags.all()[0].name, "tag1")
        self.assertEqual(post.tags.all()[1].name, "tag2")
You can use the django-test-migrations package. It is suited for testing data migrations, schema migrations, and migrations' order.
Here's how it works:
from django_test_migrations.migrator import Migrator
# You can specify any database alias you need:
migrator = Migrator(database='default')
old_state = migrator.before(('main_app', '0002_someitem_is_clean'))
SomeItem = old_state.apps.get_model('main_app', 'SomeItem')
# One instance will be `clean`, the other won't be:
SomeItem.objects.create(string_field='a')
SomeItem.objects.create(string_field='a b')
assert SomeItem.objects.count() == 2
assert SomeItem.objects.filter(is_clean=True).count() == 2
new_state = migrator.after(('main_app', '0003_auto_20191119_2125'))
SomeItem = new_state.apps.get_model('main_app', 'SomeItem')
assert SomeItem.objects.count() == 2
# One instance is clean, the other is not:
assert SomeItem.objects.filter(is_clean=True).count() == 1
assert SomeItem.objects.filter(is_clean=False).count() == 1
We also have native integrations for both pytest:
@pytest.mark.django_db
def test_main_migration0002(migrator):
    """Ensures that the second migration works."""
    old_state = migrator.before(('main_app', '0002_someitem_is_clean'))
    SomeItem = old_state.apps.get_model('main_app', 'SomeItem')
    ...
And unittest:
from django_test_migrations.contrib.unittest_case import MigratorTestCase

class TestDirectMigration(MigratorTestCase):
    """This class is used to test direct migrations."""

    migrate_from = ('main_app', '0002_someitem_is_clean')
    migrate_to = ('main_app', '0003_auto_20191119_2125')

    def prepare(self):
        """Prepare some data before the migration."""
        SomeItem = self.old_state.apps.get_model('main_app', 'SomeItem')
        SomeItem.objects.create(string_field='a')
        SomeItem.objects.create(string_field='a b')

    def test_migration_main0003(self):
        """Run the test itself."""
        SomeItem = self.new_state.apps.get_model('main_app', 'SomeItem')
        assert SomeItem.objects.count() == 2
        assert SomeItem.objects.filter(is_clean=True).count() == 1
Full guide: https://sobolevn.me/2019/10/testing-django-migrations
Github: https://github.com/wemake-services/django-test-migrations
PyPI: https://pypi.org/project/django-test-migrations/
EDIT:
These other answers make more sense:
https://stackoverflow.com/a/56212859
https://stackoverflow.com/a/59016744, if you don't mind the extra (dev) dependency
ORIGINAL:
Running your data-migration functions (such as combine_names from the OP's example) through some basic unit-tests, before actually applying them, makes sense to me too.
At first glance this should not be much more difficult than your normal Django unit-tests: migrations are Python modules and the migrations/ folder is a package, so it is possible to import things from them. However, it took some time to get this working.
The first difficulty arises due to the fact that the default migration file names start with a number. For example, suppose the code from the OP's (i.e. Django's) data-migration example sits in 0002_my_data_migration.py, then it is tempting to use
from yourappname.migrations.0002_my_data_migration import combine_names
but that would raise a SyntaxError because the module name starts with a number (0).
There are at least two ways to make this work:
Rename the migration file so it does not start with a number. This should be perfectly fine according to the docs: "Django just cares that each migration has a different name." Then you can just use import as above.
If you want to stick to the default numbered migration file names, you can use Python's import_module (see docs and this SO question).
The second difficulty arises from the fact that your data-migration functions are designed to be passed into RunPython (docs), so they expect two input arguments by default: apps and schema_editor. To see where these come from, you can inspect the source.
Now, I'm not sure this works for every case (please, anyone, comment if you can clarify), but for our case, it was sufficient to import apps from django.apps and get the schema_editor from the active database connection (django.db.connection).
The following is a stripped-down example showing how you can implement this for the OP example, assuming the migration file is called 0002_my_data_migration.py:
from importlib import import_module
from django.test import TestCase
from django.apps import apps
from django.db import connection
from yourappname.models import Person
# Our filename starts with a number, so we use import_module
data_migration = import_module('yourappname.migrations.0002_my_data_migration')
class DataMigrationTests(TestCase):

    def __init__(self, *args, **kwargs):
        super(DataMigrationTests, self).__init__(*args, **kwargs)
        # Some test values
        self.first_name = 'John'
        self.last_name = 'Doe'

    def test_combine_names(self):
        # Create a dummy Person
        Person.objects.create(first_name=self.first_name,
                              last_name=self.last_name,
                              name=None)
        # Run the data migration function
        data_migration.combine_names(apps, connection.schema_editor())
        # Test the result
        person = Person.objects.get(id=1)
        self.assertEqual('{} {}'.format(self.first_name, self.last_name), person.name)
You could add a crude if statement to a prior migration that tests whether the test suite is running, and adds initial data if it is -- that way you can just write a test to check that the objects end up in the final state you want. Just make sure your conditional is compatible with production; here's an example that would work with python manage.py test:
import sys

if 'test' in sys.argv:
    # do steps to update your operations
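A slightly fuller sketch of that idea (my own illustration, reusing the Person example from the question; the migration name and dependencies are placeholders):

# yourappname/migrations/0002_add_test_fixtures.py (sketch; names are placeholders)
import sys

from django.db import migrations

def add_test_fixtures(apps, schema_editor):
    # Only seed data when the test suite is running.
    if 'test' not in sys.argv:
        return
    Person = apps.get_model('yourappname', 'Person')
    Person.objects.create(first_name='John', last_name='Doe')

class Migration(migrations.Migration):
    dependencies = [
        ('yourappname', '0001_initial'),
    ]
    operations = [
        migrations.RunPython(add_test_fixtures, migrations.RunPython.noop),
    ]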
For a more "complete" solution, this older blog post has some good info and more up-to-date comments for inspiration:
https://micknelson.wordpress.com/2013/03/01/testing-django-migrations/#comments
The Django documentation mentions that you can add your own settings to django.conf.settings. So if my project's settings.py defines
APPLES = 1
I can access that with settings.APPLES in my apps in that project.
But if my settings.py doesn't define that value, accessing settings.APPLES obviously won't work. Is there some way to define a default value for APPLES that is used if there is no explicit setting in settings.py?
I'd like best to define the default value in the module/package that requires the setting.
In my apps, I have a separate settings.py file. In that file I have a get() function that does a lookup in the project's settings.py file and, if the setting is not found, returns the default value.
from django.conf import settings

def get(key, default):
    return getattr(settings, key, default)

APPLES = get('APPLES', 1)
Then where I need to access APPLES I have:
from myapp import settings as myapp_settings
myapp_settings.APPLES
This allows an override in the project's settings.py: getattr will check there first and return the value if the attribute is found, or fall back to the default defined in your app's settings file.
How about just:
getattr(app_settings, 'SOME_SETTING', 'default value')
Here are two solutions. For both, you create settings.py files in your applications and fill them with default values.
Configure default value for a single application
Use from MYAPP import settings instead of from django.conf import settings in your code.
Edit YOURAPP/__init__.py:
from django.conf import settings as user_settings
from . import settings as default_settings
class AppSettings:
    def __getattr__(self, name):
        # If the setting you want is filled by the user, let's use it.
        if hasattr(user_settings, name):
            return getattr(user_settings, name)
        # If the setting you want has a default value, let's use it.
        if hasattr(default_settings, name):
            return getattr(default_settings, name)
        raise AttributeError("'Settings' object has no attribute '%s'" % name)

settings = AppSettings()
Configure default values for a whole project
Use from MYPROJECT import settings instead of from django.conf import settings in your code.
Edit MYPROJECT/MYPROJECT/__init__.py
import os, sys, importlib
from . import settings as user_settings
def get_local_apps():
    """Returns the locally installed apps names"""
    apps = []
    for app in user_settings.INSTALLED_APPS:
        path = os.path.join(user_settings.BASE_DIR, app)
        if os.path.exists(path) and app != __name__:
            apps.append(sys.modules[app])
    return apps

class AppSettings:
    SETTINGS_MODULE = 'settings'

    def __getattr__(self, setting_name):
        # If the setting you want is filled by the user, let's use it.
        if hasattr(user_settings, setting_name):
            return getattr(user_settings, setting_name)
        # Let's check every local app loaded by django.
        for app in get_local_apps():
            module_source = os.path.join(app.__path__[0], "%s.py" % self.SETTINGS_MODULE)
            module_binary = os.path.join(app.__path__[0], "%s.pyc" % self.SETTINGS_MODULE)
            if os.path.exists(module_source) or os.path.exists(module_binary):
                module = importlib.import_module("%s.%s" % (app.__name__, self.SETTINGS_MODULE))
                # Let's take the first default value for this setting we can find in any app
                if hasattr(module, setting_name):
                    return getattr(module, setting_name)
        raise AttributeError("'Settings' object has no attribute '%s'" % setting_name)

settings = AppSettings()
This solution may seem easier to set up, but it does not guarantee that the right default value will be returned: if several applications declare the same variable in their settings.py, you cannot be sure which one will provide the default you asked for.
Starting from Mike's answer, I wrapped the default-setting handling into a class with an easy-to-use interface.
Helper module:
from django.conf import settings
class SettingsView(object):
    class Defaults(object):
        pass

    def __init__(self):
        self.defaults = SettingsView.Defaults()

    def __getattr__(self, name):
        return getattr(settings, name, getattr(self.defaults, name))
Usage:
from localconf import SettingsView
settings = SettingsView()
settings.defaults.APPLES = 1
print(settings.APPLES)
This prints the value from django.conf.settings, or the default if it isn't set there. This settings object can also be used to access all the standard setting values.
I recently had the same problem and created a Django app that is designed to be used for exactly such a case. It allows you to define default values for certain settings. It then first checks whether the setting is set in the global settings file. If not, it will return the default value.
I've extended it to also allow for some type checking or pre-processing of the default value (e.g. a dotted class path can be converted to the class itself on load).
The app can be found at: https://pypi.python.org/pypi?name=django-pluggableappsettings&version=0.2.0&:action=display