Django database tables being created under incorrect schema - python

I created a Django application and connected it to SQL Server with a trusted connection. When I migrate, Django creates a new schema (named after the trusted-connection username) in the database and adds it as a prefix to every table. By my IT department's convention, all tables should be created under the 'dbo' schema.
The most interesting part is: when I access the database from SSMS (also with a trusted connection) and create a new table, I do not have this issue; it is created as 'dbo.table_name'.
Does anybody know how I can fix this? See below for a fuller example and some code.
Summary:
Django creating: 'my_username.table_name'
I need: 'dbo.table_name'
My django settings.py
DATABASES = {
    'default': {
        'ENGINE': 'sql_server.pyodbc',
        'NAME': 'database_name',
        'HOST': 'database_host',
        'USER': '',
        'OPTIONS': {
            'driver': "ODBC Driver 17 for SQL Server",
            'Trusted_Connection': 'Yes',
        },
    }
}
One of my models (tables) as an example:
class Sap_module(models.Model):
    sap_module = models.CharField(max_length=2, unique=True)
    available = models.BooleanField(default=True)

    def __str__(self):
        return self.sap_module
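
Editorial note (not from the original post): SQL Server creates unqualified objects in the default schema of the database user that the connection maps to, which is why the migrated tables land under 'my_username' instead of 'dbo'. A commonly suggested fix is to have the default schema of that Windows login changed to dbo. A minimal sketch, assuming the login is DOMAIN\my_username (a hypothetical name) and that you, or your DBA, are allowed to run this DDL:

# One-off script (the same statement can be run directly in SSMS): point the
# database user's default schema at dbo so that Django's unqualified
# CREATE TABLE statements are created as dbo.table_name.
from django.db import connection

with connection.cursor() as cursor:
    cursor.execute(r"ALTER USER [DOMAIN\my_username] WITH DEFAULT_SCHEMA = dbo;")

After that, re-running the migrations should create new tables as dbo.table_name; any tables already created under the username schema would still need to be moved or recreated by the DBA.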

Related

Using Django select_related across multiple Databases

I have two database connections in my Django project; below are the database settings:
# settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'mysql_db_name',
        'USER': 'mysql_user',
        'PASSWORD': 'mysql_db_pass',
        'HOST': 'localhost',
        'PORT': '3306',
    },
    'postgres_db': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgresql_db_name',
        'USER': 'postgresql_user',
        'PASSWORD': 'postgresql_db_pass',
        'HOST': 'localhost',
        'PORT': '5432',
    },
}
And these are the models:
# sales/models.py
from django.db import models
from django.contrib.auth.models import User  # Django's auth user model; this lives in MySQL


class Batch(models.Model):  # this table is in PostgreSQL
    class params:
        db = 'postgres_db'

    # ForeignKeyAcrossDb is a custom subclass of models.ForeignKey (defined elsewhere)
    created_by = ForeignKeyAcrossDb(User, db_constraint=False, on_delete=models.DO_NOTHING)
ForeignKeyAcrossDb, which is inherited from models.ForeignKey, is working fine.
The foreign key across multiple databases works as expected, so that is not the issue; my concern is using select_related().
If I run the ORM query
data = Batch.objects.filter(created_by=request.user.pk)
it works fine. But I want to run this query using select_related(), like this:
data = Batch.objects.select_related('created_by').filter(created_by=request.user.pk)
This gives an error
ProgrammingError at /
relation "auth_user" does not exist
LINE 1: ...r"."date_joined" FROM "sales_batch" INNER JOIN "auth_user...
Which is pretty obvious as auth_user is in MySQL and select_related() is looking in PostgreSQL for the join to work.
How can I use select_related() spanning multiple databases?
Any help would be appreciated.
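
Editorial note (not from the original post): select_related() is implemented as a SQL JOIN, so it cannot span two database connections. A common workaround is to fetch the related users in one extra query against the database that holds auth_user and attach them in Python. A minimal sketch, assuming the aliases from the settings above and that created_by behaves like a normal ForeignKey:

# Pull the batches from PostgreSQL, then the related users from MySQL in a
# single extra query, and populate the relation manually instead of JOINing.
batches = list(Batch.objects.filter(created_by=request.user.pk))
user_ids = {b.created_by_id for b in batches}
users = User.objects.using('default').in_bulk(user_ids)  # {pk: User}
for b in batches:
    b.created_by = users.get(b.created_by_id)  # fills the FK cache, no cross-DB JOIN

Depending on how ForeignKeyAcrossDb and the database router are written, prefetch_related('created_by') may achieve the same thing, since it issues a separate query per relation rather than a JOIN, but that is router-dependent.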

Django: how to create, update and keep clone objects in another database for backup purposes?

PostgreSQL is my primary database. If any object is created in my original database, I want a duplicate of it to be kept in my secondary database as well. I read the Django documentation on setting up a replica database, but it didn't work. Here is my code:
# replicarouter.py
class PrimaryReplicaRouter:
    def db_for_read(self, model, **hints):
        """
        Reads go to the primary.
        """
        return 'primary'

    def db_for_write(self, model, **hints):
        """
        Writes always go to the primary.
        """
        return 'primary'

    def allow_relation(self, obj1, obj2, **hints):
        """
        Relations between objects are allowed if both objects are
        in the primary/replica pool.
        """
        db_set = {'primary', 'replica_database'}
        if obj1._state.db in db_set and obj2._state.db in db_set:
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """
        All models may be migrated to any database.
        """
        return True
settings.py
DATABASES = {
    'default': {},
    'primary': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'my_db_name',
        'USER': 'postgres',
        'PASSWORD': 'my_db_pass',
        'HOST': 'localhost',
        'PORT': 5432,
    },
    'replica_database': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'my_db_name',
        'USER': 'root',
        'PASSWORD': '',
        'HOST': 'localhost',
        'PORT': 3306,
    }
}
DATABASE_ROUTERS = ['my_root_folder_name.dbrouters.AuthRouter','my_root_folder_name.replicarouter.PrimaryReplicaRouter']
Right now all new objects are created in my primary database. I want to keep a clone of every new object in my replica_database: whenever an object is added to the primary database, a clone of it should also be added to replica_database.
Finally I found a solution. After reading the Django multiple-databases documentation, it turns out you can use multiple databases, but there is no built-in way to keep clone objects in a second one. To explain a little: assume you have two models, Teacher and Student. You can put Teacher and Student in two separate databases, but you cannot point another database at them to keep clones of the Teacher and Student objects. So here we use Django signals to keep clone objects in the replica_database: the signal is triggered and creates a clone whenever an object is created on the model. Here is my code:
settings.py
DATABASES = {
    'default': {
        'NAME': 'primary_database',
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'USER': 'root',
        'PASSWORD': '',
    },
    'replica1_database': {
        'NAME': 'replica1_database',
        'ENGINE': 'django.db.backends.mysql',
        'HOST': 'localhost',
        'USER': 'root',
        'PASSWORD': '',
    },
}
models.py:
from django.db import models
from django.db.models.signals import post_save
from django.dispatch import receiver


class Contact(models.Model):
    name = models.CharField(blank=True, null=True, max_length=100)


@receiver(post_save, sender=Contact, dispatch_uid="clone_objects")
def replica1_database(sender, instance, created, **kwargs):
    if created:  # create a clone object in the replica1 database
        instance.save(using='replica1_database')
    else:  # update the clone object in the replica1 database
        Contact.objects.using('replica1_database').filter(pk=instance.pk).update(
            name=instance.name)
Here the post_save signal is triggered to create a clone object in my replica1_database.
Now run python manage.py makemigrations contact and python manage.py migrate contact; these two commands apply the migrations to your default database. This is the most important step: you also have to run python manage.py migrate --database=replica1_database, which applies the migrations to your replica1_database.
I think it is also a good idea to keep a backup database to avoid any unexpected situation, such as the server going down.

Can't use multiple Databases in my django project

I'm trying to use two DBs for my django project. The first one is for authentication etc, the second should hold data sent by the user through a form.
I added the second DB to my settings.py file, but I keep getting errors; the most recent one is (1146, "Table 'dataset.main_SomeModel' doesn't exist").
Indeed, it looks like my Django project can't interact with that DB, since there is no table there.
Am I doing something wrong? Maybe this is the wrong way to use two DBs here?
Here is settings.py; the second DB, called dataset, is the one I'm trying to use:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'dataset': {
        'NAME': 'dataset',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'root',
        'PASSWORD': 'password goes here',
    }
}
Here is the model:
class SomeModel(models.Model):
    data = models.CharField(max_length=100)
    num = models.FloatField()

    def save(self, *args, **kwargs):  # rest of the signature omitted in the original
        super(SomeModel, self).save(using='dataset')
And here is the form:
class DataForm(forms.ModelForm):
    class Meta:
        model = SomeModel
        fields = ("data", "num")

    def save(self, commit=True):
        send = super(DataForm, self).save(commit=False)
        if commit:
            send.save()
        return send
Since I added the line using="dataset", shouldn't the data be sent to the dataset DB? Or am I doing something else wrong? Any advice is appreciated!
Edit: I tried migrating the second database using manage.py migrate --database="dataset", but I get the error The connection dataset doesn't exist.
You are missing a quote (').
Try this:
super(SomeModel, self).save(using='dataset')
You can look at Multi DB Save.
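
Editorial note (not part of the original answer): the (1146, "Table 'dataset.main_SomeModel' doesn't exist") error usually just means the dataset database was never migrated. Overriding save() only routes writes; for migrate to create the table on the second connection, the usual approach is a database router with allow_migrate. A minimal sketch, assuming the model lives in an app called main (guessed from the error message) and using a hypothetical dotted path for the router:

# settings.py (sketch)
DATABASE_ROUTERS = ['myproject.routers.DatasetRouter']  # hypothetical path

# myproject/routers.py (sketch)
class DatasetRouter:
    """Send the 'main' app's models to the 'dataset' connection."""

    def db_for_read(self, model, **hints):
        return 'dataset' if model._meta.app_label == 'main' else None

    def db_for_write(self, model, **hints):
        return 'dataset' if model._meta.app_label == 'main' else None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label == 'main':
            return db == 'dataset'   # main's tables only exist on 'dataset'
        return db == 'default'       # everything else stays on the default DB

With the router in place, python manage.py migrate --database=dataset should create main_SomeModel in the MySQL database, and the save() override becomes unnecessary because writes are routed automatically.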

Create custom Database Connection in Django

I am creating a project in Django where I want to use a mixture of MySQL and ArangoDB. I am building an API based on Django REST Framework.
The Process for ArangoDB
User calls an API and request enters in the VIEW
View validates the request.data via serializers
If data is valid, .save() is called which saves the data in ArangoDB.
I won't use models, as there is no schema (that's why I'm using NoSQL).
The Problem
How can I create a global connection with ArangoDB that I can use everywhere else?
Will the connection with ArangoDB be active or I need to re-connect?
What should I do?
Even without a schema, you should create models to work with. Take a look at this example.
Also, you have to set similar DATABASES settings:
DATABASES = {
    'default': {
        'ENGINE': 'arangodb_driver',
        'HOST': 'localhost',
        'PORT': '8529',
        'NAME': 'some_user',
        'USER': 'root',
        'PASSWORD': 'some_password',
    }
}
And to install the driver via:
pip install git+git://github.com/pablotcarreira/django-arangodb
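
Editorial note (not from the original answer): for the "global connection" part of the question, if you prefer not to use an experimental Django backend, a common pattern is a small module that builds one python-arango client at import time and is reused everywhere. The module and database names below are made up for illustration:

# arango_client.py (hypothetical module): build the handle once, import it everywhere.
from arango import ArangoClient

_client = ArangoClient(hosts='http://localhost:8529')

# The handle is a thin wrapper over an HTTP session, so sharing one
# module-level object across views avoids reconnecting on every request.
db = _client.db('my_database', username='root', password='some_password')

A DRF view (or a serializer's .save()) can then do from arango_client import db and call something like db.collection('items').insert(validated_data).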

Is there any method in django to use database without creating models.py file?

I have a database with some data already in it, and I want to use it in my new Django app. Is there any way to use the data from this database in my Django app? I don't want to make any changes to my old database; I only want to use its data. Please suggest what the better approach would be.
While searching I also found the command inspectdb,
which can generate a models.py file from a database, but there are some issues with it: it doesn't map the foreign keys in models.py, we need to rearrange the classes in the models.py file, and so on. So I am searching for some other alternative.
You could access data from the legacy database using connection.cursor() from the django.db module.
If you have two databases
DATABASES = {
    'default': {
        'NAME': 'new_database',
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'USER': '',
        'PASSWORD': ''
    },
    'old': {
        'NAME': 'old_database',
        'ENGINE': 'django.db.backends.mysql',
        'USER': '',
        'PASSWORD': ''
    }
}
...
from django.db import connections
cursor = connections['old'].cursor()
cursor.execute("SELECT...")
cursor.fetchall()
refer to docs:
Executing custom SQL directly
Multiple databases
But if you want to modify data in your old database, it is a better idea to create a models.py file and use it as usual. Whether or not to use inspectdb is up to you. For example, you could generate the models with inspectdb in a separate temporary project, run dumpdata to create JSON files, and then load that data into your active project.
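
Editorial note (not from the original answer): the usual way to map a legacy table without letting Django touch its schema is an unmanaged model: managed = False tells migrate to leave the table alone, and db_table pins the existing table name. The model, field, and table names below are made up:

# legacy/models.py (sketch): wrap an existing table without managing its schema.
from django.db import models


class OldCustomer(models.Model):  # hypothetical legacy table
    name = models.CharField(max_length=100)
    email = models.CharField(max_length=254)

    class Meta:
        managed = False          # migrations will never create/alter/drop this table
        db_table = 'customers'   # the table name that already exists in the old DB

Combined with the 'old' alias above, OldCustomer.objects.using('old').all() then gives normal ORM access without modifying the legacy schema.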
