I have two models in my Django app, and I want their tables to be stored in separate SQLite files rather than in the default 'db.sqlite3' file.
For example: my models.py has two classes, Train and Bus, and I want them stored in train.db and bus.db.
Of course you could always just use Train.objects.using('train') for your calls, and that would select the correct database (assuming you defined a database called train in your settings.py).
If you don't want to do that: I had a similar problem and adjusted my solution to your case. It is partially based on this blog article, and the Django documentation for database routers is here.
With this solution your current database will not be affected; however, your current data will also not be transferred to the new databases. Depending on your version of Django, you need to include either allow_syncdb (Django <= 1.6) or the right version of allow_migrate.
In settings.py:
DATABASES = {
    'default': {
        'NAME': 'db.sqlite3',
        'ENGINE': 'django.db.backends.sqlite3',
    },
    'train': {
        'NAME': 'train.db',
        'ENGINE': 'django.db.backends.sqlite3',
    },
    'bus': {
        'NAME': 'bus.db',
        'ENGINE': 'django.db.backends.sqlite3',
    },
}
DATABASE_ROUTERS = ['yourapp.database_router.DatabaseAppsRouter']
DATABASE_APPS_MAPPING = {'train': 'train', 'bus': 'bus'}
In a new file called database_router.py:
from django.conf import settings


class DatabaseAppsRouter(object):
    """
    A router to control all database operations on models for different
    databases.

    In case a model is not set in settings.DATABASE_APPS_MAPPING, the router
    will fall back to the `default` database.

    Settings example:
    DATABASE_APPS_MAPPING = {'model_name1': 'db1', 'model_name2': 'db2'}
    """

    def db_for_read(self, model, **hints):
        """Point all read operations to the specific database."""
        return settings.DATABASE_APPS_MAPPING.get(model._meta.model_name, None)

    def db_for_write(self, model, **hints):
        """Point all write operations to the specific database."""
        return settings.DATABASE_APPS_MAPPING.get(model._meta.model_name, None)

    def allow_relation(self, obj1, obj2, **hints):
        """Have no opinion on whether the relation should be allowed."""
        return None

    # Keep only the one of the following three methods that matches your
    # Django version.

    def allow_syncdb(self, db, model):  # if using Django <= 1.6
        """Have no opinion on whether the model should be synchronized with the db."""
        return None

    def allow_migrate(self, db, model):  # if using Django 1.7
        """Have no opinion on whether the migration operation is allowed to run."""
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):  # if using Django >= 1.8
        """Have no opinion on whether the migration operation is allowed to run."""
        return None
(Edit: this is also what Joey Wilhelm suggested)
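To make the routing concrete, here is a minimal plain-Python sketch of the lookup that db_for_read/db_for_write perform above. The mapping mirrors the settings; db_for_model is a hypothetical stand-in for illustration, not a Django API:

```python
# The same mapping as in settings.py.
DATABASE_APPS_MAPPING = {'train': 'train', 'bus': 'bus'}

def db_for_model(model_name, mapping=DATABASE_APPS_MAPPING):
    # Unmapped models return None, so Django falls back to 'default'.
    return mapping.get(model_name, None)
```

A model named train routes to the train database, while any unmapped model falls through to the default one.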
Related
PostgreSQL is my primary database. When an object is created in my original database, I also want to keep a duplicate of it in my secondary database. I read the Django documentation on creating a clone database, but it didn't work. Here is my code:
# replicarouter.py

class PrimaryReplicaRouter:
    def db_for_read(self, model, **hints):
        """Reads go to the primary database."""
        return 'primary'

    def db_for_write(self, model, **hints):
        """Writes always go to primary."""
        return 'primary'

    def allow_relation(self, obj1, obj2, **hints):
        """
        Relations between objects are allowed if both objects are
        in the primary/replica pool.
        """
        db_set = {'primary', 'replica_database'}
        if obj1._state.db in db_set and obj2._state.db in db_set:
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """Allow migrations on all databases."""
        return True
settings.py
DATABASES = {
    'default': {},
    'primary': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'my_db_name',
        'USER': 'postgres',
        'PASSWORD': 'my_db_pass',
        'HOST': 'localhost',
        'PORT': 5432,
    },
    'replica_database': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'my_db_name',
        'USER': 'root',
        'PASSWORD': '',
        'HOST': 'localhost',
        'PORT': 3306,
    },
}
DATABASE_ROUTERS = ['my_root_folder_name.dbrouters.AuthRouter','my_root_folder_name.replicarouter.PrimaryReplicaRouter']
Right now all new objects are created in my primary database. I want to keep a clone of every new object in my replica_database: whenever an object is added to the primary database, a clone of it should also be added to the replica.
Finally I found a solution. After reading the Django multiple-database documentation: we can use multiple databases, but Django itself won't keep clone objects for us. Let me explain a little. Assume you have two models, Teacher and Student. You can use two separate databases for them, but you can't use another database to keep clones of your Teacher and Student objects. So here we use Django signals to keep clone objects in our replica database: a signal is triggered and creates a clone object whenever an object is created in our model. Here is my code:
settings.py
DATABASES = {
    'default': {
        'NAME': 'primary_database',
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'USER': 'root',
        'PASSWORD': '',
    },
    # Alias 'replica1' matches the using='replica1' calls and the
    # migrate --database=replica1 command below.
    'replica1': {
        'NAME': 'replica1_database',
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'USER': 'root',
        'PASSWORD': '',
    },
}
models.py:
from django.db.models.signals import post_save
from django.dispatch import receiver
class Contact(models.Model):
name = models.CharField(blank=True, null=True, max_length=100)
#receiver(post_save, sender=Contact, dispatch_uid="clone_objects")
def replica1_databse(sender, instance, created, **kwargs):
if created: #cerate clone object in our replica1 database
obj = instance
obj.save(using='replica1')
else: #updating clone object in our replica1 database
obj = Contact.objects.using('replica1').update(
name=instance.name)
Here I am triggering a signal to create a clone object in my replica1 database.
Now run python manage.py makemigrations contact and python manage.py migrate contact; these two commands apply the migrations to your default database. This is the most important step ----> you have to run python manage.py migrate --database=replica1, which applies the migrations to your replica1 database.
I think it is also a good idea to keep a backup database to guard against unexpected situations such as the server going down.
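Stripped of the Django machinery, the clone-on-save idea above boils down to mirroring every write into a second store. A plain-Python sketch, with dicts standing in for the two databases (save_with_replica is an illustrative name, not part of Django):

```python
def save_with_replica(record, primary, replica):
    # Write to the primary store, then copy the same record into the
    # replica store; this is the role the post_save handler plays above.
    primary[record['id']] = dict(record)
    replica[record['id']] = dict(record)
```

In the Django version, the post_save signal performs the second write for you.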
I have two Postgres database connections, and when using the non-default one, ORM calls fail in views, but raw queries do not. I am using Docker, the containers are connected, and I am using Django's own runserver command. Using the ORM in a Django command or the Django shell works fine, but not in a view.
As a side note, both databases actually belong to Django projects; the main project reads some data directly from the other project's database through its own unmanaged model.
Python: 3.9.7
Django: 3.2.10
Postgres: 13.3 (main project), 12.2 (side project)
# settings
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'USER': 'pguser',
        'PASSWORD': 'pguser',
        'NAME': 'mainproject',
        'HOST': 'project-db',  # docker container
        'PORT': '5432',
    },
    'external': {
        'ENGINE': 'django.db.backends.postgresql',
        'USER': 'pguser',
        'PASSWORD': 'pguser',
        'NAME': 'sideproject',
        'HOST': 'side-db',  # docker container, attached to same network
        'PORT': '5432',
    },
}
# My unmanaged model
class MyTestModel(models.Model):
    class Meta:
        # table does not exist in 'default', but it does exist in 'external'
        db_table = 'my_data_table'
        managed = False

# my_data_table is very simple: it has an integer id field as the primary
# key (works out of the box with Django) and otherwise only normal fields
# like IntegerField or CharField. Even if this model is left empty, it
# still returns the PK field normally.
# My custom command in the main project
# python manage.py mycommand
# ...
    def handle(self, *args, **options):
        # This works fine and data from 'external' is populated into MyTestModel
        data = MyTestModel.objects.using('external').all()
        for x in data:
            print(x, vars(x))
# My simple view in the main project
# (the same query in views.py, in a random view class)
class MyView(TemplateView):
    def get(self, request, *args, **kwargs):
        # with raw() this works
        data = MyTestModel.objects.using('external').raw('select * from my_data_table')
        for x in data:
            print(x, vars(x))

        # This does not work:
        # throws ProgrammingError: relation "my_data_table" does not exist
        data = MyTestModel.objects.using('external').all()
        for x in data:
            print(x, vars(x))

        return super().get(request, *args, **kwargs)
So somehow runserver and views do not generate the query correctly when using the ORM. It cannot be a connection error, because running the command, or the view with .raw(), works.
Now the funny thing: if I change db_table to something that exists in both databases, say django_content_type, the ORM's filter() and all() work in the view too, and they actually return data from the correct database. So if the main project has 50 content types and the side project (external) has 100, it returns the 100 content types from external.
I have tried everything: rebuilding Docker, creating new tables in the database directly, force-reloading the postgres user, and making sure the ownership and permissions are all correct (they should be fine anyway, because the command side works). I even tried another database.
I know I didn't post my full settings and local settings, which could have helped solve this case.
But I noticed that I had Django Silk installed locally, which captures the request and tries to analyze the database queries it makes. It looks like it was loaded too early, or it doesn't like external databases. In any case, disabling Django Silk in the installed apps and removing its middleware removed the problem.
I'm trying to use two DBs for my Django project. The first one is for authentication etc.; the second should hold data sent by the user through a form.
I added the second DB to my settings.py file, but I keep getting errors; the most recent one is (1146, "Table 'dataset.main_SomeModel' doesn't exist").
Indeed, it looks like my Django project can't interact with the DB, since there is no table there.
Am I doing something wrong? Maybe this is the wrong way to use two DBs here?
Here is settings.py; the second DB, called dataset, is the one I'm trying to use:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'dataset': {
        'NAME': 'dataset',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'root',
        'PASSWORD': 'password goes here',
    },
}
Here is the model:
class SomeModel(models.Model):
    data = models.CharField(max_length=100)
    num = models.FloatField()

    def save(self):  # ALL the signature
        super(SomeModel, self).save(using=dataset)
And here is the form:
class DataForm(forms.ModelForm):
    class Meta:
        model = SomeModel
        fields = ("data", "num")

    def save(self, commit=True):
        send = super(DataForm, self).save(commit=False)
        if commit:
            send.save()
        return send
Since I added using=dataset, shouldn't the data be sent to the dataset DB? Or am I doing something else wrong? Any advice is appreciated!
Edit: I tried migrating the second database using manage.py migrate --database="dataset" but I get the error The connection dataset doesn't exist.
You are missing quotes (').
Try this:
super(SomeModel, self).save(using='dataset')
You can look up this: Multi DB Save
I am using Django REST framework and want to handle multiple databases. I am using the functions using(alias) and switch_db(alias) to switch between databases manually whenever I want to get, post or update data.
I am facing a problem while posting and updating data, i.e. whenever serializer.is_valid() is called.
serializer.is_valid() first checks for db_alias in the models.py file. If I have not specified db_alias under Meta, it selects the default database for validation. If I specify db_alias in the model, it selects that database for validation.
But I do not want to specify db_alias in the model, since my use case is to store data in different databases based on some logic in my view file. So I want to select the database dynamically from the view and store the data there.
I have almost implemented this, but I face a problem when my model has a ReferenceField: in that case serializer.is_valid() goes to the default database to validate the reference field.
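The "pick the database in the view" step can at least be isolated in a small helper. A hedged sketch (pick_alias and the alias handling are hypothetical; the hard part, making serializer.is_valid() honor the chosen alias for ReferenceField validation, is exactly what remains unsolved here):

```python
def pick_alias(org_id, configured_aliases, default='default'):
    # Use the organisation's own database alias when one is configured,
    # otherwise fall back to the default connection.
    return org_id if org_id in configured_aliases else default
```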
Required details: I am using mongoengine (0.9.0), Document, and DocumentSerializer.
My files are as below:
model.py:
class ngroup(Document):
    groupname = StringField(max_length=100, required=True)
    description = StringField(max_length=100, required=False)
    parent = ReferenceField('ngroup', null=True)
    created_ts = DateTimeField(default=datetime.now)
    modified_ts = DateTimeField(default=datetime.now)
    is_deleted = BooleanField(default=False)
serializer.py:
from device_management.models import ngroup
from rest_framework_mongoengine.serializers import DocumentSerializer


class ngroupSerializer(DocumentSerializer):
    class Meta:
        model = ngroup

    def setOrgId(self, orgid):
        self.orgid = orgid

    def create(self, validated_data):
        ngroup_data = ngroup(**validated_data).switch_db(self.orgid)
        ngroup_data.save()
        return ngroup_data

    def update(self, instance, validated_data):
        ngroup_data = ngroup.objects.using(self.orgid).get(id=instance.id)
        ngroup_data = ngroup_data.switch_db(self.orgid)
        ngroup_data = ngroup_data.update(**validated_data)
        return validated_data

    def to_internal_value(self, data):
        print "data:", data
        return super(DocumentSerializer, self).to_internal_value(data)
view.py:
def create(self, request, format=None):
    orgid = str(request.user.orgid.id)
    data = request.data
    serializer = ngroupSerializer(data=data)
    if serializer.is_valid():
        try:
            serializer.save()
        except Exception as e:
            log.error("create", extra={'extra': {'error': str(e), 'message': strings.DATA_VALIDATION_ERROR}})
            return response.errorResponse(message=strings.SERIALIZATION_ERROR_MSG, error=str(e), rstatus=status.HTTP_400_BAD_REQUEST)
        return response.successResponse(res_data=serializer.data, message=strings.POST_SUCCESS_MSG, rstatus=status.HTTP_201_CREATED)
    log.error("create", extra={'extra': {'error': serializer.errors, 'message': strings.DATA_VALIDATION_ERROR}})
    return response.errorResponse(message=strings.DATA_VALIDATION_ERROR, error=serializer.errors, rstatus=status.HTTP_400_BAD_REQUEST)
settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': 'mydb',
        'USER': 'admin',
        'PASSWORD': 'admin123',
        'HOST': '127.0.0.1',
        'PORT': 27017,
        'DBTYPE': "mongo",
    },
    '586e47c784413825f2b5bc49': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': 'mydb1',
        'USER': 'admin',
        'PASSWORD': 'admin123',
        'HOST': '127.0.0.1',
        'PORT': 27017,
        'DBTYPE': "mongo",
    },
    # Enter the super-user organisation here. This DB is always the same
    # as the default DB.
    '58996fb28441384430dc8ae6': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': 'mydb',
        'USER': 'admin',
        'PASSWORD': 'admin123',
        'HOST': '127.0.0.1',
        'PORT': 27017,
        'DBTYPE': "mongo",
    },
}
pip freeze (installed versions):
Django==1.5.11
django-browserid==2.0.2
django-classy-tags==0.8.0
django-missing==0.1.18
django-mongo-auth==0.1.3
django-mongodb-engine==0.6.0
django-mongoengine==0.2.1
django-redis-sessions==0.5.6
django-rest-framework-mongoengine==3.3.0
django-sekizai==0.10.0
django-websocket-redis==0.4.7
djangorestframework==3.1.2
djangorestframework-jwt==1.9.0
djangotoolbox==1.8.0
gevent==1.1.2
greenlet==0.4.10
httplib2==0.9.2
mongoengine==0.9.0
oauthlib==2.0.1
pika==0.10.0
Pygments==2.1.3
PyJWT==1.4.2
pymongo==2.8
python-dateutil==2.6.0
python-openid==2.2.5
pytz==2016.10
redis==2.10.5
requests==2.12.3
requests-oauthlib==0.7.0
rest-condition==1.0.3
six==1.10.0
tweepy==3.5.0
twilio==5.7.0
I have overridden create in the serializer to take care of the database when calling serializer.save(), but how do I handle serializer.is_valid()?
My project has been stuck at this point. Any help will be greatly appreciated.
This is not an exact solution to the above problem, but we have two options.
1) Do not go for serializer.is_valid() or serializer.save().
Directly create the ngroup:
def my_create(self, validated_data):
    gateway = Gateway(**validated_data).switch_db(self.orgid)
    gateway.save()
    return gateway
2) Another solution is to use django-mongodb-engine with Django models and ModelSerializers instead of documents and document serializers.
I have tried the following with django-mongodb-engine and they work well:
-> JWT authentication
-> custom user
-> foreign key
-> embedded model
-> list of embedded model
-> dict field
-> Routers for switching between databases (manual switching of the DB is not required)
I can also use middleware classes to specify at runtime, per request, which database to use.
Reference Link: Django Authenticate Backend Multiple Databases
Husain,
Unfortunately, you're mixing incompatible projects together. Mongoengine, django-mongoengine and Django-REST-Framework-Mongoengine are alternatives to django-mongodb-engine; they are not meant to be used together.
As far as I know, the django-mongodb-engine project has been dead for two years, or even longer, to be honest. At the same time, the Mongoengine stack is working in production, though development is not too active. I really want to create a proper Django database backend out of Mongoengine to make it a first-class citizen in the Django world, and it seems like the Django folks are looking in that direction, too.
You might also want to look into this post.
This is my second attempt. I tried to switch the database connection in the view's create(). It didn't work for me:
settings.py
# We define 2 Mongo databases - default (project) and project2
MONGODB_DATABASES = {
    "project": {
        "name": "project",
        "host": "localhost",
        "port": 27017,
        "tz_aware": True,  # if you use timezones in django (USE_TZ = True)
    },
    "project2": {
        "name": "project2",
        "host": "localhost",
        "port": 27017,
        "tz_aware": True,  # if you use timezones in django (USE_TZ = True)
    },
}

mongoengine.register_connection(alias='default', name=MONGODB_DATABASES["project"]["name"], host="localhost")
mongoengine.register_connection(alias='project2', name=MONGODB_DATABASES["project2"]["name"], host="localhost")
connection = mongoengine.connect(db="project", alias='default')
views.py
class AuthorViewSet(MongoModelViewSet):
    lookup_field = 'id'
    serializer_class = AuthorSerializer

    def create(self, request, format=None):
        global Author
        mongoengine.connection.disconnect(alias='project')
        mongoengine.connect('project2', alias='project2')
        return super(AuthorViewSet, self).create(request, format)

    def get_queryset(self):
        return Author.objects.all()
I have a series of integration-level tests that are being run as a management command in my Django project. These tests are verifying the integrity of a large amount of weather data ingested from external sources into my database. Because I have such a large amount of data, I really have to test against my production database for the tests to be meaningful. What I'm trying to figure out is how I can define a read-only database connection that is specific to that command or connection object. I should also add that these tests can't go through the ORM, so I need to execute raw SQL.
The structure of my test looks like this
class Command(BaseCommand):
    help = 'Runs Integration Tests and Query Tests against Prod Database'

    def handle(self, *args, **options):
        suite = unittest.TestLoader().loadTestsFromTestCase(TestWeatherModel)
        ret = unittest.TextTestRunner().run(suite)
        if len(ret.failures) != 0:
            sys.exit(1)
        else:
            sys.exit(0)


class TestWeatherModel(unittest.TestCase):
    def testCollectWeatherDataHist(self):
        wm = WeatherManager()
        wm.CollectWeatherData()
        self.assertTrue(wm.weatherData is not None)
And the WeatherManager.CollectWeatherData() method would look like this:
def CollectWeatherData(self):
    cur = connection.cursor()
    cur.execute(<Raw SQL Query>)
    self.weatherData = cur.fetchall()
    cur.close()
I want to somehow idiot-proof this, so that someone else (or me) can't come along later and accidentally write a test that would modify the production database.
You can achieve this by hooking into Django's connection_created signal, and
then making the transaction read-only.
The following works for PostgreSQL:
from django.apps import AppConfig
from django.db.backends.signals import connection_created


class MyappConfig(AppConfig):
    def ready(self):
        def connection_created_handler(connection, **kwargs):
            with connection.cursor() as cursor:
                cursor.execute('SET default_transaction_read_only = true;')

        connection_created.connect(connection_created_handler, weak=False)
This can be useful for some specific Django settings (e.g. to run development
code with runserver against the production DB), where you do not want to
create a real read-only DB user.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'USER': 'myusername',
        'PASSWORD': 'mypassword',
        'HOST': 'myhost',
        'OPTIONS': {
            'options': '-c default_transaction_read_only=on'
        }
    }
}
Source: https://nejc.saje.info/django-postgresql-readonly.html
Man, once again, I should read the docs more carefully before I post questions here. I can define a read-only connection to my production database in the settings file, and then, straight from the docs:
If you are using more than one database, you can use django.db.connections to obtain the connection (and cursor) for a specific database. django.db.connections is a dictionary-like object that allows you to retrieve a specific connection using its alias:
from django.db import connections
cursor = connections['my_db_alias'].cursor()
# Your code here...
If you add a serializer for your model, you can specify in the serializer that it works in read-only mode:
class AccountSerializer(serializers.ModelSerializer):
    class Meta:
        model = Account
        fields = ('id', 'account_name', 'users', 'created')
        read_only_fields = ('account_name',)
from http://www.django-rest-framework.org/api-guide/serializers/#specifying-read-only-fields