I'm trying to use two DBs for my Django project. The first one is for authentication etc.; the second should hold data sent by the user through a form.
I added the second DB to my settings.py file, but I keep getting errors, the most recent one being (1146, "Table 'dataset.main_SomeModel' doesn't exist")
Indeed, it looks like my Django project can't interact with the DB, since there is no table there.
Am I doing something wrong? Maybe this is the wrong way to use two DBs here?
Here is settings.py; the second DB, called dataset, is the one I'm trying to use:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'dataset': {
        'NAME': 'dataset',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'root',
        'PASSWORD': 'password goes here'
    }
}
Here is the model:
class SomeModel(models.Model):
    data = models.CharField(max_length=100)
    num = models.FloatField()

    def save(self, *args, **kwargs):  # full signature elided here
        super(SomeModel, self).save(using='dataset')
And here is the form:
class DataForm(forms.ModelForm):
    class Meta:
        model = SomeModel
        fields = ("data", "num")

    def save(self, commit=True):
        send = super(DataForm, self).save(commit=False)
        if commit:
            send.save()
        return send
Since I added the line using='dataset', shouldn't the data be sent to the dataset DB? Or am I doing something else wrong? Any advice is appreciated!
Edit: I tried migrating the second database using manage.py migrate --database="dataset", but I get the error The connection dataset doesn't exist.
You are missing a quote (').
Try this:
super(SomeModel, self).save(using='dataset')
You can also look up Multi DB Save.
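As a side note, the same routing can be done from the form instead of hard-coding it in the model, since Model.save() accepts the using argument directly. A minimal sketch (this variant is not from the original answer):

class DataForm(forms.ModelForm):
    class Meta:
        model = SomeModel
        fields = ("data", "num")

    def save(self, commit=True):
        send = super(DataForm, self).save(commit=False)
        if commit:
            # Route the INSERT to the second database explicitly.
            send.save(using='dataset')
        return send

Either way, the table itself only exists after manage.py migrate --database=dataset succeeds against that connection.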
PostgreSQL is my primary database. Whenever an object is created in my original database, I want a duplicate object to be kept in my secondary database as well. I read the Django documentation on creating a clone database, but it didn't work. Here is my code:
# replicarouter.py
class PrimaryReplicaRouter:
    def db_for_read(self, model, **hints):
        """
        Reads go to primary.
        """
        return 'primary'

    def db_for_write(self, model, **hints):
        """
        Writes always go to primary.
        """
        return 'primary'

    def allow_relation(self, obj1, obj2, **hints):
        """
        Relations between objects are allowed if both objects are
        in the primary/replica pool.
        """
        db_set = {'primary', 'replica_database'}
        if obj1._state.db in db_set and obj2._state.db in db_set:
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """
        All models may migrate to any database.
        """
        return True
settings.py
DATABASES = {
    'default': {},
    'primary': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'my_db_name',
        'USER': 'postgres',
        'PASSWORD': 'my_db_pass',
        'HOST': 'localhost',
        'PORT': 5432,
    },
    'replica_database': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'my_db_name',
        'USER': 'root',
        'PASSWORD': '',
        'HOST': 'localhost',
        'PORT': 3306,
    }
}
DATABASE_ROUTERS = ['my_root_folder_name.dbrouters.AuthRouter','my_root_folder_name.replicarouter.PrimaryReplicaRouter']
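For reference, the AuthRouter named in DATABASE_ROUTERS is not shown in the question. A minimal sketch of what such a router usually looks like, assuming auth-related apps should live on the primary database (this follows the router example in the Django docs, not the asker's actual code):

class AuthRouter:
    """Hypothetical router pinning auth-related apps to 'primary'."""
    route_app_labels = {'auth', 'contenttypes'}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'primary'
        return None

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'primary'
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.route_app_labels:
            return db == 'primary'
        return None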
Right now all new objects are created in my primary database. I want to keep a clone of every new object in my replica_database: whenever an object is added to my primary database, a clone of it should also be added to my replica_database.
Finally I found a solution. After reading the Django multiple-databases documentation: we can use multiple databases, but Django itself won't keep a clone object for us. Let me explain a little. Assume you have two models, Teacher and Student. You can use two separate databases for Teacher and Student, but you can't use another database to keep clones of your Teacher and Student objects. So here we will use Django signals to keep clone objects in our replica database: a signal is triggered and creates a clone object whenever an object is created on our model. Here is my code:
settings.py
DATABASES = {
    'default': {
        'NAME': 'primary_database',
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'USER': 'root',
        'PASSWORD': '',
    },
    'replica1_database': {
        'NAME': 'replica1_database',
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'USER': 'root',
        'PASSWORD': '',
    },
}
models.py:
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver

class Contact(models.Model):
    name = models.CharField(blank=True, null=True, max_length=100)

@receiver(post_save, sender=Contact, dispatch_uid="clone_objects")
def replica1_database(sender, instance, created, **kwargs):
    if created:  # create a clone object in the replica1 database
        instance.save(using='replica1_database')
    else:  # update the clone object in the replica1 database
        Contact.objects.using('replica1_database').filter(id=instance.id).update(
            name=instance.name)
Here I am triggering a signal to create a clone object in my replica1_database.
Now run python manage.py makemigrations contact and python manage.py migrate contact; these two commands apply the migrations to your default database. This is the most important step: you also have to run python manage.py migrate --database=replica1_database, which applies the migrations to your replica1 database.
I think it is also a good idea to keep a backup database, to avoid unexpected situations such as the server going down.
I created a Django application and connected it to SQL Server with a trusted connection. When I migrate, Django creates a new schema name (the trusted-connection username) in the database and adds it as a prefix on all tables. By my IT department's convention, all tables should be created with the 'dbo' prefix.
The most interesting part is: when I access the database from SSMS (also with a trusted connection) and create a new table, I do not have this issue; it is created as 'dbo.table_name'.
Does anybody know how I can fix it? See below for a better example and some code.
Summary:
Django creating: 'my_username.table_name'
I need: 'dbo.table_name'
My django settings.py
DATABASES = {
    'default': {
        'ENGINE': 'sql_server.pyodbc',
        'NAME': 'database_name',
        'HOST': 'database_host',
        'USER': '',
        'OPTIONS': {
            'driver': "ODBC Driver 17 for SQL Server",
            'Trusted_Connection': 'Yes',
        }
    }
}
One of my models (tables), as an example:
class Sap_module(models.Model):
    sap_module = models.CharField(max_length=2, unique=True)
    available = models.BooleanField(default=True)

    def __str__(self):
        return self.sap_module
I am using Django REST Framework and want to handle multiple databases. I am using the functions using(alias) and switch_db(alias) for manually switching between databases whenever I want to GET, POST or update data.
I am facing a problem while posting and updating data, i.e. whenever serializer.is_valid() is called.
serializer.is_valid() first checks for db_alias in the models.py file. If I have not specified db_alias under Meta, it selects the default database for validation. If I specify db_alias in the model, it selects that database for validation.
But I do not want to specify db_alias in the model, since my use case is to store data in different databases based on some logic in my view file. So I want to select the database dynamically from the view and store data in it.
I have almost implemented this, but I am facing a problem when my model has a ReferenceField. In that case serializer.is_valid() goes to the default database to validate the reference field.
Required details: I am using mongoengine (0.9.0), Document, and DocumentSerializer.
My files are as below:
model.py:
class ngroup(Document):
    groupname = StringField(max_length=100, required=True)
    description = StringField(max_length=100, required=False)
    parent = ReferenceField('ngroup', null=True)
    created_ts = DateTimeField(default=datetime.now)   # pass the callable, not datetime.now()
    modified_ts = DateTimeField(default=datetime.now)
    is_deleted = BooleanField(default=False)
serializer.py:
from device_management.models import ngroup
from rest_framework_mongoengine.serializers import DocumentSerializer
from mongoengine import EmbeddedDocumentField, ReferenceField, StringField, ObjectIdField, IntField, BooleanField, FloatField, DateTimeField, ListField

class ngroupSerializer(DocumentSerializer):
    class Meta:
        model = ngroup

    def setOrgId(self, orgid):
        self.orgid = orgid

    def create(self, validated_data):
        ngroup_data = ngroup(**validated_data).switch_db(self.orgid)
        ngroup_data.save()
        return ngroup_data

    def update(self, instance, validated_data):
        ngroup_data = ngroup.objects.using(self.orgid).get(id=instance.id)
        ngroup_data = ngroup_data.switch_db(self.orgid)
        ngroup_data.update(**validated_data)
        return validated_data

    def to_internal_value(self, data):
        print "data:", data
        return super(ngroupSerializer, self).to_internal_value(data)
view.py:
def create(self, request, format=None):
    orgid = str(request.user.orgid.id)
    data = request.data
    serializer = ngroupSerializer(data=data)
    serializer.setOrgId(orgid)  # pass the target DB alias to the serializer
    if serializer.is_valid():
        try:
            serializer.save()
        except Exception as e:
            log.error("create", extra={'extra': {'error': str(e), 'message': strings.DATA_VALIDATION_ERROR}})
            return response.errorResponse(message=strings.SERIALIZATION_ERROR_MSG, error=str(e), rstatus=status.HTTP_400_BAD_REQUEST)
        return response.successResponse(res_data=serializer.data, message=strings.POST_SUCCESS_MSG, rstatus=status.HTTP_201_CREATED)
    log.error("create", extra={'extra': {'error': serializer.errors, 'message': strings.DATA_VALIDATION_ERROR}})
    return response.errorResponse(message=strings.DATA_VALIDATION_ERROR, error=serializer.errors, rstatus=status.HTTP_400_BAD_REQUEST)
settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': 'mydb',
        'USER': 'admin',
        'PASSWORD': 'admin123',
        'HOST': '127.0.0.1',
        'PORT': 27017,
        'DBTYPE': "mongo",
    },
    '586e47c784413825f2b5bc49': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': 'mydb1',
        'USER': 'admin',
        'PASSWORD': 'admin123',
        'HOST': '127.0.0.1',
        'PORT': 27017,
        'DBTYPE': "mongo",
    },
    # Enter the super-user organisation here. This DB is always the same as the default DB.
    '58996fb28441384430dc8ae6': {
        'ENGINE': 'django_mongodb_engine',
        'NAME': 'mydb',
        'USER': 'admin',
        'PASSWORD': 'admin123',
        'HOST': '127.0.0.1',
        'PORT': 27017,
        'DBTYPE': "mongo",
    },
}
pip freeze (installed versions):
Django==1.5.11
django-browserid==2.0.2
django-classy-tags==0.8.0
django-missing==0.1.18
django-mongo-auth==0.1.3
django-mongodb-engine==0.6.0
django-mongoengine==0.2.1
django-redis-sessions==0.5.6
django-rest-framework-mongoengine==3.3.0
django-sekizai==0.10.0
django-websocket-redis==0.4.7
djangorestframework==3.1.2
djangorestframework-jwt==1.9.0
djangotoolbox==1.8.0
gevent==1.1.2
greenlet==0.4.10
httplib2==0.9.2
mongoengine==0.9.0
oauthlib==2.0.1
pika==0.10.0
Pygments==2.1.3
PyJWT==1.4.2
pymongo==2.8
python-dateutil==2.6.0
python-openid==2.2.5
pytz==2016.10
redis==2.10.5
requests==2.12.3
requests-oauthlib==0.7.0
rest-condition==1.0.3
six==1.10.0
tweepy==3.5.0
twilio==5.7.0
I have overridden create in the serializer to take care of the database when serializer.save() is called, but how do I handle serializer.is_valid()?
My project has been stuck at this point. Any help will be greatly appreciated...
This is not an exact solution to the above problem, but we have two options.
1) Do not go through serializer.is_valid() or serializer.save().
Directly create the document:
def my_create(self, validated_data):
    gateway = Gateway(**validated_data).switch_db(self.orgid)
    gateway.save()
    return gateway
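If you bypass the serializer like this, you can still get schema validation from mongoengine itself before saving, since Document.validate() raises a ValidationError on bad data (save() runs the same validation by default anyway). A minimal sketch; catch the error in the view to build your error response:

from mongoengine.errors import ValidationError  # catch this in the view

def my_create(self, validated_data):
    gateway = Gateway(**validated_data).switch_db(self.orgid)
    gateway.validate()  # raises ValidationError instead of filling serializer.errors
    gateway.save()
    return gateway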
2) Another solution is to use django-mongodb-engine with Django models and ModelSerializers instead of documents and document serializers.
I have tried the following with django-mongodb-engine and it all works well:
-> JWT authentication
-> custom user
-> foreign key
-> embedded model
-> list of embedded models
-> dict field
-> routers for switching between databases (manually switching DBs is not required)
I can also use middleware classes to specify at runtime, for each request, which database to use.
Reference Link: Django Authenticate Backend Multiple Databases
Husain,
Unfortunately, you're mixing incompatible projects together. The mongoengine, django-mongoengine and Django-REST-Framework-Mongoengine projects are alternatives to django-mongodb-engine; they are not meant to be used together.
As far as I know, the django-mongodb-engine project has been dead for 2 years, or even longer, to be honest. At the same time, the mongoengine stack is working in production, though development is not too active. I really want to create a proper Django database backend out of mongoengine to make it a first-class citizen in the Django world, and it seems like the Django guys are looking in that direction, too.
You might also want to look into this post.
This is my second attempt. I tried to switch the database connection in the view's create(). It didn't work for me:
settings.py
# We define 2 Mongo databases - default (project) and project2
MONGODB_DATABASES = {
    "project": {
        "name": "project",
        "host": "localhost",
        "port": 27017,
        "tz_aware": True,  # if you use timezones in django (USE_TZ = True)
    },
    "project2": {
        "name": "project2",
        "host": "localhost",
        "port": 27017,
        "tz_aware": True,  # if you use timezones in django (USE_TZ = True)
    }
}

mongoengine.register_connection(alias='default', name=MONGODB_DATABASES["project"]["name"], host="local")
mongoengine.register_connection(alias='project2', name=MONGODB_DATABASES["project2"]["name"], host="local")
connection = mongoengine.connect(db="project", alias='default')
views.py
class AuthorViewSet(MongoModelViewSet):
    lookup_field = 'id'
    serializer_class = AuthorSerializer

    def create(self, request, format=None):
        global Author
        mongoengine.connection.disconnect(alias='project')
        mongoengine.connect('project2', alias='project2')
        return super(AuthorViewSet, self).create(request, format)

    def get_queryset(self):
        return Author.objects.all()
I have a series of integration-level tests that are being run as a management command in my Django project. These tests are verifying the integrity of a large amount of weather data ingested from external sources into my database. Because I have such a large amount of data, I really have to test against my production database for the tests to be meaningful. What I'm trying to figure out is how I can define a read-only database connection that is specific to that command or connection object. I should also add that these tests can't go through the ORM, so I need to execute raw SQL.
The structure of my test looks like this
import sys
import unittest

from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = 'Runs Integration Tests and Query Tests against Prod Database'

    def handle(self, *args, **options):
        suite = unittest.TestLoader().loadTestsFromTestCase(TestWeatherModel)
        ret = unittest.TextTestRunner().run(suite)
        if len(ret.failures) != 0:
            sys.exit(1)
        else:
            sys.exit(0)
class TestWeatherModel(unittest.TestCase):
    def testCollectWeatherDataHist(self):
        wm = WeatherManager()
        wm.CollectWeatherData()
        self.assertTrue(wm.weatherData is not None)
And the WeatherManager.CollectWeatherData() method would look like this:
from django.db import connection

def CollectWeatherData(self):
    cur = connection.cursor()
    cur.execute(<Raw SQL Query>)
    self.weatherData = cur.fetchall()
    cur.close()
I want to somehow idiot-proof this, so that someone else (or me) can't come along later and accidentally write a test that would modify the production database.
You can achieve this by hooking into Django's connection_created signal, and then making the transaction read-only.
The following works for PostgreSQL:
from django.apps import AppConfig
from django.db.backends.signals import connection_created

class MyappConfig(AppConfig):
    def ready(self):
        def connection_created_handler(connection, **kwargs):
            with connection.cursor() as cursor:
                cursor.execute('SET default_transaction_read_only = true;')

        connection_created.connect(connection_created_handler, weak=False)
This can be useful for some specific Django settings (e.g. to run development code with runserver against the production DB), where you do not want to create a real read-only DB user.
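If you only want the handler active under a particular configuration, one option is to gate it on a custom settings flag. A minimal sketch, where DB_READ_ONLY and the app name 'myapp' are hypothetical:

from django.apps import AppConfig
from django.conf import settings
from django.db.backends.signals import connection_created

class MyappConfig(AppConfig):
    name = 'myapp'  # hypothetical app name

    def ready(self):
        # DB_READ_ONLY is a project-specific flag, not a built-in Django setting.
        if getattr(settings, 'DB_READ_ONLY', False):
            def connection_created_handler(connection, **kwargs):
                with connection.cursor() as cursor:
                    cursor.execute('SET default_transaction_read_only = true;')
            connection_created.connect(connection_created_handler, weak=False)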
Alternatively, you can make the connection read-only at the settings level, passing the option straight through to PostgreSQL:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'mydb',
        'USER': 'myusername',
        'PASSWORD': 'mypassword',
        'HOST': 'myhost',
        'OPTIONS': {
            'options': '-c default_transaction_read_only=on'
        }
    }
}
Source: https://nejc.saje.info/django-postgresql-readonly.html
Man, once again, I should read the docs more carefully before I post questions here. I can define a readonly connection to my production database in the settings file, and then straight from the docs:
If you are using more than one database, you can use django.db.connections to obtain the connection (and cursor) for a specific database. django.db.connections is a dictionary-like object that allows you to retrieve a specific connection using its alias:
from django.db import connections
cursor = connections['my_db_alias'].cursor()
# Your code here...
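Applied to the command in the question, CollectWeatherData could then target a dedicated read-only alias. A sketch, where 'prod_readonly' is a hypothetical alias defined in DATABASES and mapped to a SELECT-only database user:

from django.db import connections

def CollectWeatherData(self):
    # 'prod_readonly' is assumed to exist in settings.DATABASES.
    cur = connections['prod_readonly'].cursor()
    cur.execute("SELECT 1")  # replace with the real weather query
    self.weatherData = cur.fetchall()
    cur.close()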
If you add a serializer for your model, you can specify in the serializer which fields are read-only:
class AccountSerializer(serializers.ModelSerializer):
    class Meta:
        model = Account
        fields = ('id', 'account_name', 'users', 'created')
        read_only_fields = ('account_name',)
from http://www.django-rest-framework.org/api-guide/serializers/#specifying-read-only-fields
I have a database with some data already in it, and I want to use it in my new Django app. Is there any way to use the data from that database in my Django app? I don't want to make any changes to my old database; I only want to use its data. Can anybody suggest the better approach to do this?
While searching I also found the command inspectdb,
which can generate a models.py file from a database, but there are some issues with it: it doesn't map foreign keys in models.py, the classes in models.py need rearranging, and some more. So I am searching for some other alternative.
You can access data from a legacy database using connection.cursor() from the django.db module.
Say you have two databases:
DATABASES = {
    'default': {
        'NAME': 'new_database',
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'USER': '',
        'PASSWORD': ''
    },
    'old': {
        'NAME': 'old_database',
        'ENGINE': 'django.db.backends.mysql',
        'USER': '',
        'PASSWORD': ''
    }
}
...
from django.db import connections
cursor = connections['old'].cursor()
cursor.execute("SELECT...")
cursor.fetchall()
refer to docs:
Executing custom SQL directly
Multiple databases
But if you want to modify data in your old database, it is a better idea to create a models.py file and use it as always. Whether to use inspectdb or not is up to you. For example, you could generate the models using inspectdb in a separate temporary project, run dumpdata to create JSON files, and load that data into your active project.
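If you do go the models route, one pattern worth knowing is unmanaged models, so that migrate never tries to create or alter the legacy tables. A minimal sketch, with hypothetical table and column names:

from django.db import models

class LegacyCustomer(models.Model):
    # Field and table names here are made up; match them to the real schema.
    name = models.CharField(max_length=100)
    email = models.CharField(max_length=200)

    class Meta:
        managed = False          # migrate will not touch this table
        db_table = 'customers'   # existing table in the old database

Queries then hit the legacy connection explicitly, e.g. LegacyCustomer.objects.using('old').all().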