How to create an SQLite table in Django automatically? - python

I'm using Django 2.0. My task is to write a large dataset that, after being analyzed, will be dropped every day. I've decided to write that data to SQLite using database routers that generate the file automatically, but it does not create the model's table and throws an OperationalError because the table does not exist. What would be a good solution for this (if someone has had a similar situation)?
Thanks in advance!

Elaborating on @SuperStew's suggestion, something like this should work:
# in settings.py
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'mydatabase',
    }
}

db_path = DATABASES['default']['NAME']
if not os.path.isfile(db_path):
    open(db_path, 'wb').close()
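Note that creating the file only gives you an empty SQLite database; the model's table still has to exist before you can write to it. A minimal sketch of one way to create it automatically, assuming the daily dataset lives behind a separate database alias (here called 'analytics', an illustrative name) that your router points the model at:

# e.g. at the start of the daily job, or in a custom management command
from django.core.management import call_command

# Creates the routed model's table in the secondary database if it is missing;
# run_syncdb also covers apps that have no migrations.
call_command('migrate', database='analytics', run_syncdb=True)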

Related

Can't use multiple Databases in my django project

I'm trying to use two DBs for my Django project. The first one is for authentication etc., the second should hold data sent by the user through a form.
I added the second DB to my settings.py file, but I keep getting errors, the most recent one being (1146, "Table 'dataset.main_SomeModel' doesn't exist")
Indeed, it looks like my Django project can't interact with the DB, since there is no table there.
Am I doing something wrong? Maybe this is the wrong way to use two DBs here?
Here is settings.py; the second DB, called dataset, is the one I'm trying to use:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'dataset': {
        'NAME': 'dataset',
        'ENGINE': 'django.db.backends.mysql',
        'USER': 'root',
        'PASSWORD': 'password goes here'
    }
}
Here is the model:
class SomeModel(models.Model):
    data = models.CharField(max_length=100)
    num = models.FloatField()

    def save(self):  # ALL the signature
        super(SomeModel, self).save(using='dataset')
And here is the form:
class DataForm(forms.ModelForm):
    class Meta:
        model = SomeModel
        fields = ("data", "num")

    def save(self, commit=True):
        send = super(DataForm, self).save(commit=False)
        if commit:
            send.save()
        return send
Since I added the line using='dataset', shouldn't the data be sent to the dataset DB? Or am I doing something else wrong? Any advice is appreciated!
Edit: I tried migrating the second database using manage.py migrate --database="dataset" but I get the error The connection dataset doesn't exist.
You are missing a quote '.
Try this:
super(SomeModel, self).save(using='dataset')
You can also look up Multi DB Save.
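The (1146, "Table ... doesn't exist") error means the table was never created in the second database, so besides fixing the quote you need migrations to run against the dataset alias. A minimal sketch of a database router that sends this model there (the module path and the app label 'main' are illustrative assumptions, not from the original question):

# routers.py
class DatasetRouter:
    """Route models from the form app to the 'dataset' database."""
    route_app_labels = {'main'}  # assumed app label of SomeModel

    def db_for_read(self, model, **hints):
        return 'dataset' if model._meta.app_label in self.route_app_labels else None

    def db_for_write(self, model, **hints):
        return 'dataset' if model._meta.app_label in self.route_app_labels else None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.route_app_labels:
            return db == 'dataset'
        return None

# settings.py
# DATABASE_ROUTERS = ['myproject.routers.DatasetRouter']  # module path is hypothetical

With the router in place, manage.py migrate --database=dataset should create main_SomeModel in the MySQL database.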

Create custom Database Connection in Django

I am creating a project in Django where I want to use a mixture of MySQL and ArangoDB. I am creating an API based on Django REST Framework.
The Process for ArangoDB
User calls an API and the request enters the VIEW
View validates the request.data via serializers
If the data is valid, .save() is called, which saves the data in ArangoDB.
I won't use Models as there is no schema (that's why I'm using NoSQL).
The Problem
How can I create a global connection with ArangoDB that I can use everywhere else?
Will the connection with ArangoDB stay active, or do I need to re-connect?
What should I do?
Regardless of there being no schema, you should create models to work with. Take a look at this example.
You also have to set similar DATABASES settings:
DATABASES = {
    'default': {
        'ENGINE': 'arangodb_driver',
        'HOST': 'localhost',
        'PORT': '8529',
        'NAME': 'some_user',
        'USER': 'root',
        'PASSWORD': 'some_password',
    }
}
And install the driver via:
pip install git+git://github.com/pablotcarreira/django-arangodb
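If you would rather not wire ArangoDB into DATABASES at all, another option is a module-level client that every view imports, since python-arango talks to the server over HTTP and the client object can be reused. A minimal sketch, assuming the python-arango package and illustrative connection details:

# arango_client.py (hypothetical module)
from arango import ArangoClient

# Created once at import time and reused everywhere; no persistent socket
# is held open, so there is nothing to "keep alive" between requests.
client = ArangoClient(hosts='http://localhost:8529')
db = client.db('my_database', username='root', password='some_password')

# usage, e.g. inside a DRF view or serializer.save():
# from myproject.arango_client import db
# db.collection('events').insert(validated_data)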

Is there any method in django to use database without creating models.py file?

I have a database with some data already in it and I want to use it in my new Django app. Is there any way to use the data of that database in my Django app? Actually, I don't want to make any changes to my old database; I only want to use its data. Can anybody please suggest the better approach to do this?
While searching I also found the command inspectdb,
which can generate a models.py file from the database, but there are some issues with it: it doesn't map the foreign keys in models.py, we need to rearrange our classes in the models.py file, and more. So I am searching for some other alternative.
You could access data from the legacy database using connection.cursor() from the django.db module.
If you have two databases:
DATABASES = {
    'default': {
        'NAME': 'new_database',
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'USER': '',
        'PASSWORD': ''
    },
    'old': {
        'NAME': 'old_database',
        'ENGINE': 'django.db.backends.mysql',
        'USER': '',
        'PASSWORD': ''
    }
}
...
from django.db import connections
cursor = connections['old'].cursor()
cursor.execute("SELECT...")
cursor.fetchall()
refer to docs:
Executing custom SQL directly
Multiple databases
But if you want to modify data in your old database it is a better idea to create a models.py file and use it as usual. Whether to use inspectdb or not is up to you. For example, you could generate the models using inspectdb in a separate temporary project, run dumpdata to create JSON files, and then load that data into your active project.
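If you mainly need to read the legacy tables through the ORM without letting Django manage them, unmanaged models are a middle ground between raw cursors and a full inspectdb dump. A minimal sketch, with an assumed legacy table old_customers and illustrative column names:

# models.py
from django.db import models

class LegacyCustomer(models.Model):
    name = models.CharField(max_length=100)
    email = models.CharField(max_length=254)

    class Meta:
        managed = False            # Django will never create, alter or drop this table
        db_table = 'old_customers'

# usage: query the old database explicitly
# LegacyCustomer.objects.using('old').filter(name__startswith='A')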

Force strict sql mode in django

I'm trying to force my Django project to always use strict sql_mode. Is there a better way than putting the following in manage.py? It seems overly complicated.
def set_strict_sql_mode(sender, **kwargs):
    from django.conf import settings
    if settings.DATABASES['default']['ENGINE'] == 'django.db.backends.mysql':
        from django.db import connection
        cursor = connection.cursor()
        cursor.execute('SET session sql_mode=traditional')

from django.core.signals import request_started
request_started.connect(set_strict_sql_mode)
Actually, asking proved to be a good rubber duck. Just after asking, I found the custom database OPTIONS one can supply in the DATABASES settings, like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'OPTIONS': {
            'sql_mode': 'traditional',
        }
    }
}
Hope it helps anyone!
You can also try adding the option below to your DATABASES entry:
'OPTIONS': {
    'init_command': "SET sql_mode='STRICT_TRANS_TABLES'",
},
It works.
https://docs.djangoproject.com/en/1.11/ref/databases/#mysql-sql-mode
Setting sql_mode
From MySQL 5.7 onwards and on fresh installs of MySQL 5.6, the default
value of the sql_mode option contains STRICT_TRANS_TABLES. That option
escalates warnings into errors when data are truncated upon insertion,
so Django highly recommends activating a strict mode for MySQL to
prevent data loss (either STRICT_TRANS_TABLES or STRICT_ALL_TABLES).
If you need to customize the SQL mode, you can set the sql_mode
variable like other MySQL options: either in a config file or with the
entry 'init_command': "SET sql_mode='STRICT_TRANS_TABLES'" in the
OPTIONS part of your database configuration in DATABASES.
If you need to union querysets, you can use Python's itertools.chain instead of union.
from itertools import chain
gobj_list = user.groups.all()
robj_list = [obj.roles.all() for obj in gobj_list]
ret_list = chain(*robj_list)

How to use schemas in Django?

I would like to use PostgreSQL schemas with Django. How can I do this?
Maybe this will help.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'OPTIONS': {
            'options': '-c search_path=your_schema'
        },
        'NAME': 'your_name',
        'USER': 'your_user',
        'PASSWORD': 'your_password',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}
I get the answer from the following link:
http://blog.amvtek.com/posts/2014/Jun/13/accessing-multiple-postgres-schemas-from-django/
I've been using:
db_table = '"schema"."tablename"'
in the past without realising that it only works for read-only operations. When you try to add a new record it fails, because the sequence would be something like "schema.tablename"_column_id_seq.
db_table = 'schema\".\"tablename'
does work so far. Thanks.
As mentioned in the following ticket:
https://code.djangoproject.com/ticket/6148, we could set search_path for the django user.
One way to achieve this is to set the search_path via the psql client, like:
ALTER USER my_user SET SEARCH_PATH TO path;
The other way is to modify the Django app, so that if we rebuild the database, Django won't dump all the tables into the public schema.
To achieve this, you could override the DatabaseWrapper defined in django.db.backends.postgresql_psycopg2.base
Create the following directory:
app/pg/
├── __init__.py
└── base.py
Here's the content of base.py
from django.db.backends.postgresql_psycopg2.base import DatabaseWrapper


class DatabaseWrapper(DatabaseWrapper):
    def __init__(self, *args, **kwargs):
        super(DatabaseWrapper, self).__init__(*args, **kwargs)

    def _cursor(self):
        cursor = super(DatabaseWrapper, self)._cursor()
        cursor.execute('SET search_path = path')
        return cursor
In settings.py, add the following database configuration:
DATABASES = {
    'default': {
        'ENGINE': 'app.pg',
        'NAME': 'db',
        'USER': 'user',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
It's a bit more complicated than tricky escaping. Have a look at Ticket #6148 in Django for perhaps a solution or at least a patch. It makes some minor changes deep in the django.db core but it will hopefully be officially included in django.
After that it's just a matter of saying
db_schema = 'whatever_schema'
in the Meta class or for a global change set
DATABASE_SCHEMA = 'default_schema_name'
in settings.py
UPDATE: 2015-01-08
The corresponding issue in django has been open for 7 years and the patch there will not work any more.
The correct answer to this should be...
At the moment you can't use postgreSQL schemas in django out of the box.
I just developed a package for this problem: https://github.com/ryannjohnson/django-schemas.
After some configuration, you can simply call set_db() on your models:
model_cls = UserModel.set_db(db='db_alias', schema='schema_name')
user_on_schema = model_cls.objects.get(pk=1)
The package uses techniques described in https://stackoverflow.com/a/1628855/5307109 and https://stackoverflow.com/a/18391525/5307109, then wraps them for use with Django models.
I've had some success just saying
db_table = 'schema\".\"tablename'
in the Meta class, but that's really ugly. And I've only used it in limited scenarios - it may well break if you try something complicated. And as said earlier, it's not really supported...
There is no explicit Django support for PostgreSQL schemas.
When using Django (0.95), we had to add a search_path to the Django database connector for PostgreSQL, because Django didn't support specifying the schema that the tables managed by the ORM used.
Taken from:
http://nxsy.org/using-postgresql-schemas-with-sqlalchemy-and-elixir
The general response is to use SQLAlchemy to construct the SQL properly.
Oh, and here's another link with some suggestions about what you can do with the Django base, extending it to try to support your schema:
http://news.ycombinator.com/item?id=662901
I know that this is a rather old question, but a different solution is to alter the SEARCH_PATH.
Example
Let's say you have three tables.
schema1.table_name_a
schema2.table_name_b
table_name_c
You could run the command:
SET SEARCH_PATH to public,schema1,schema2;
And refer to the tables by their table names only in django.
See 5.7.3. The Schema Search Path
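If you want Django itself to issue that SET for every connection it opens (rather than changing the role or the server configuration), one option is the connection_created signal. A minimal sketch, reusing the schemas from the example above; where you register the receiver (for example an AppConfig.ready()) is up to you:

# e.g. in a signals module imported from AppConfig.ready()
from django.db.backends.signals import connection_created

def set_search_path(sender, connection, **kwargs):
    # Only touch PostgreSQL connections.
    if connection.vendor == 'postgresql':
        with connection.cursor() as cursor:
            cursor.execute('SET search_path TO public, schema1, schema2')

connection_created.connect(set_search_path)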
For a SQL Server database:
db_table = "[your_schema].[your_table]"
