Django application models on different databases - python

I have two Django apps: call them Main on server A and Tasker on server B.
Main responds to user requests and does a lot of things that can be done quickly.
Tasker, on the other hand, only has a few models for logging and Celery tasks.
On server A, 'tasker' is not included in INSTALLED_APPS as I don't need it there, whereas on server B, it is.
Following Django's documentation, I created a router and defined db_for_read and db_for_write:
class ModelsRouter(object):
    """
    Logging models are on local,
    but updated models are on another server.
    """
    def db_for_read(self, model, **hints):
        if model._meta.app_label == 'tasker':
            return 'tasker'
        return None

    def db_for_write(self, model, **hints):
        if model._meta.app_label == 'tasker':
            return 'tasker'
        return None
On server B, the DATABASES setting contains two keys:

- default, pointing to server A
- tasker, pointing to localhost

The problem I have is that when I run manage.py migrate, the models of tasker are created on server A.
How can I set up the project on server B so that it is aware of the following:
- models of the main app are on server A
- models of tasker are on server B (localhost)?

I managed to solve the problem in the following way:

- I modified ModelsRouter to use the main database configuration if models are NOT from the app tasker.
- On the server where I deployed tasker, I modified DATABASES so that default points to localhost and main points to the other server where main resides.
- On server B, I ran manage.py migrate tasker, as the other models are not needed in that database.

It's working now:

- logging is done in tables on server B
- updates are performed on the other server

The problem I ran into when running manage.py migrate tasker is this:

RuntimeError: Error creating new content types. Please make sure contenttypes is migrated before trying to migrate apps individually.

but I'll manage it.
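For what it's worth, the reason migrate created the tasker tables on server A in the first place is that the router only defines db_for_read and db_for_write, while migrations consult a router's allow_migrate method. A minimal sketch of such a method, assuming the revised aliases from the fix above (default pointing to localhost, main pointing to server A):

class ModelsRouter(object):
    # db_for_read / db_for_write as before ...

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == 'tasker':
            # tasker tables only ever live on the local database
            return db == 'default'
        # everything else belongs on the 'main' database on server A
        return db == 'main'

With this in place, manage.py migrate --database=<alias> should only create the tables that belong on that alias.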

Related

Django: Exclude apps from python manage.py migrate when using multiple databases

QUESTION:
How can I exclude the logs migrations from the default database when using multiple databases in Django?
I want this to be automated, so I started overriding the migrate command.
I am using the default database for all models in my application, and I need a new database, logs, for only one model (the model is in a different app, logs).
I have successfully connected the application to both databases. I am also using a router to control the operations:
class LogRouter:
    route_app_labels = {'logs'}

    def db_for_read(self, model, **hints):
        ...

    def db_for_write(self, model, **hints):
        ...

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """
        Make sure the logs app only appears in the
        'logs' database.
        """
        if app_label in self.route_app_labels:
            return db == 'logs'
        if db != 'default':
            # If the database is not default, do not apply
            # the migrations to the other database.
            return False
        return None
With allow_migrate I am faking the logs migrations in the default database, which updates the django_migrations table with the logs migrations.
Also, with

if db != 'default':
    # If the database is not default, do not apply
    # the migrations to the other database.
    return False

I am faking the default database's migrations in the logs database, and again the django_migrations table is updated with all the default database migrations.
This is a fine solution, but I want to achieve the following:

- The logs migrations should be ignored in the default database, including the django_migrations table.
- The migrations for the default database should be ignored in the logs database, including the django_migrations table.
To achieve this, I tried overriding the migrate command:

from django.core.management.commands import migrate

class Command(migrate.Command):
    def handle(self, *args, **options):
        super(Command, self).handle(*args, **options)
        # This is equal to python manage.py migrate logs --database=logs.
        # It will execute only the logs migrations in the logs database.
        options['app_label'] = options['database'] = 'logs'
        super(Command, self).handle(*args, **options)

With this code I am fixing the logs database, but the default database still tries to execute the logs migrations (it writes them down in the django_migrations table).
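One possible refinement, sketched below and untested: skip the blanket first run and migrate each app explicitly, leaving logs out of the default database entirely. It enumerates the apps that have migrations via Django's MigrationLoader; note that Django still applies each app's migration dependencies, so this only keeps django_migrations clean if nothing on default depends on logs:

from django.core.management.commands import migrate
from django.db.migrations.loader import MigrationLoader

class Command(migrate.Command):
    def handle(self, *args, **options):
        if options.get('app_label'):
            # An explicit app label was given: keep the stock behaviour.
            return super(Command, self).handle(*args, **options)
        # Migrate every app except logs on the default database ...
        loader = MigrationLoader(None, ignore_no_migrations=True)
        for app_label in sorted(loader.migrated_apps):
            if app_label == 'logs':
                continue
            super(Command, self).handle(
                *args, **dict(options, app_label=app_label, database='default'))
        # ... then run only the logs migrations on the logs database.
        super(Command, self).handle(
            *args, **dict(options, app_label='logs', database='logs'))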

Automatically create django model instances at startup with empty database

My Django project requires some model instances to be created at startup if they do not exist.
I currently create the required model instances in an app config:
class MyAppConfig(AppConfig):
    name = 'my_app'

    def ready(self):
        create_required_objects()

def create_required_objects():
    from my_app.models import MyObject
    for name in MyObject.reserved_names:
        if not MyObject.objects.filter(name=name).exists():
            MyObject.objects.create(name=name, not_editable=True)
This works perfectly when the SQLite database is initialized; however, if I clear the database and then try to run the server, I get the following error:
django.db.utils.OperationalError: no such table: my_app_object
I would like to be able to clear the database (preferably by just removing db.sqlite3) and run the server.
Use the post_migrate signal to create the instances after migrations run against the new database, like so:

from django.apps import AppConfig
from django.db.models.signals import post_migrate

def create_required_objects(sender, **kwargs):
    from my_app.models import MyObject  # import here, once the app registry is ready
    for name in MyObject.reserved_names:
        if not MyObject.objects.filter(name=name).exists():
            MyObject.objects.create(name=name, not_editable=True)

class MyAppConfig(AppConfig):
    name = 'my_app'

    def ready(self):
        post_migrate.connect(create_required_objects, sender=self)
This code automatically creates the required objects after the database is migrated.
You can use model_bakery to populate some temp data. You may need to run makemigrations and migrate to set up all the tables in your database; you can follow this workflow:
python manage.py makemigrations
python manage.py migrate
In terms of populating data, you can try the following code:
from model_bakery import baker
from my_app.models import MyObject
baker.make(MyObject)
Add baker.make(MyObject) to your create_required_objects function after installing model_bakery:

pip install model_bakery
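For example, a sketch of wiring baker into the hook from the question; note that baker generates random field values rather than using reserved_names, so treat this as throwaway temp data:

from model_bakery import baker

def create_required_objects():
    from my_app.models import MyObject
    if not MyObject.objects.exists():
        baker.make(MyObject, _quantity=3)  # three rows with auto-generated field values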

Is it safe to re-create a SQLAlchemy object while server is running?

Our application uses Flask + gunicorn. Now we want to make it able to reload its db configuration while it is running, which means it can switch to a new database without restarting the process. With the help of a config center we can dispatch config at runtime, but how can I re-init the global variable db?
db = SQLAlchemy()

def create_app():
    app = Flask(__name__)
    app.config.from_object(dynamic_config)
    db.init_app(app)
And assume that at some point we dispatch a new config; how can db be initialized with it? Or is it safe to just replace it with a new SQLAlchemy() instance, like this:
from models import set_db  # which will set the global db to a new instance
from app import app

def callback(old, new):
    new_db = SQLAlchemy()
    # re-construct the config from old and new
    # now app.config is updated
    new_db.init_app(app)
    set_db(new_db)
Is it OK to do this? My concern is that it could cause thread-safety issues and break in-flight transactions.
Help me with this, many thanks.
If I were you, I would use the SQLALCHEMY_BINDS config, or different db instances with different configs, instead of changing the configuration every time.
I also don't think it is good practice to change the db structure from the application (in case you are doing that).
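For reference, a minimal sketch of the SQLALCHEMY_BINDS approach; the database URIs and the 'reporting' bind key are made-up examples:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'postgresql://localhost/main'
app.config['SQLALCHEMY_BINDS'] = {
    'reporting': 'postgresql://localhost/reporting',
}
db = SQLAlchemy(app)

class Log(db.Model):
    __bind_key__ = 'reporting'  # rows for this model live in the 'reporting' database
    id = db.Column(db.Integer, primary_key=True)
    message = db.Column(db.String(255))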

GAE Django Transaction

I have been developing on the GAE dev_appserver, and my code relies heavily on Django's TransactionMiddleware. I have tested it locally and it works.
After deployment to GAE, however, model saves are committed and not rolled back.
Sample code:
@transaction.commit_on_success
def get(self, request):
    name = request.GET.get('name')
    d = Department(name=name)
    d.save()
    raise Exception('Failed')
Is this because the Django transaction API is not honored by GAE, or is it a problem with my app settings?
FYI, django.middleware.transaction.TransactionMiddleware is currently last in the list of MIDDLEWARE_CLASSES.
According to this website, the Django database backend for Google App Engine does not support Django transactions. You can, however, use the run_in_transaction method from the App Engine SDK.
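For illustration, a rough sketch of that approach with the old App Engine Python SDK; Department here is assumed to be a google.appengine.ext.db model rather than a Django one:

from google.appengine.ext import db

def create_department(name):
    # Department is assumed to be a db.Model subclass defined elsewhere
    d = Department(name=name)
    d.put()
    raise Exception('Failed')  # raising inside the function aborts the transaction

# run_in_transaction rolls back the put() when the exception propagates
db.run_in_transaction(create_department, 'Engineering')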

"Table flask_db.user doesn't exist" error when doing db.session.commit()

I am building my first app with the Flask Python micro-framework, and I have a problem committing my models to the database. When I test my User model on the command line, all works well. But when I do a db.session.commit(), I get MySQL error 1146: "Table doesn't exist."
I'm using a local MySQL database, and there is no error with the login/password.
Maybe I'm doing something wrong in the configuration or something else. So here is my application config file:
from flask import Flask
from flask.ext.sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'mysql://root:admin@localhost/flask_db'
db = SQLAlchemy(app)

from app import views
The error explains it all: while you may have the models for your data, you haven't yet created the tables in the database to store and query them. Simply import your models, then run db.create_all() to generate the tables, and you should be good to go.
It's worth reading the quickstart guide for Flask-SQLAlchemy to get your head around the general flow.
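A minimal sketch of that fix, assuming the package layout from the question (an app package; the models module name is an assumption):

from app import app, db
from app import models  # import the models so their tables are registered

with app.app_context():
    db.create_all()  # creates any missing tables, e.g. flask_db.user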
