Create database automatically in Django using settings.py or models.py - python

Current flow:
1) mysql> CREATE DATABASE db_name [create_specification];
2) change database info in settings.py
3) python manage.py syncdb (assuming we have ready-made models)
Isn't there a way to do the same without step 1? Maybe by putting the database name and specifications somewhere in settings.py, so that I don't have to manually configure the db every time I move this project to another server.
EDIT -
WHY I want to dodge the first step:
In my case, different columns of different tables have different collation types. So during development, whenever I recreate the database, I need to manually change the configuration of individual columns/tables, which is frustrating.

All you need is a database user/password with the CREATE DATABASE grant; all other data is already in settings. You can connect a custom command to the pre_syncdb signal to gather this data.
Take a look at createsuperuser, which is raised on the post_syncdb signal, to learn about this.
EDITED
syncdb is no longer available. Since 1.9 you should use the pre_migrate signal.
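A minimal sketch of that idea, assuming MySQL: the helper below only builds the CREATE DATABASE statement (with an explicit collation, which is the asker's pain point); actually executing it requires a server-level connection made with the credentials from settings (e.g. via PyMySQL), and the `run_on_server` name in the wiring comment is hypothetical, not a Django API.

```python
# Sketch: build a CREATE DATABASE statement from settings-style values so a
# pre_migrate receiver can create the database before migrations run.
def create_database_sql(name, charset="utf8mb4", collation="utf8mb4_unicode_ci"):
    """Build a CREATE DATABASE statement with an explicit collation."""
    return (
        'CREATE DATABASE IF NOT EXISTS `{}` '
        'CHARACTER SET {} COLLATE {}'.format(name, charset, collation)
    )

# Wiring sketch (e.g. inside an AppConfig.ready()); run_on_server is a
# hypothetical helper that opens a server-level connection without selecting
# a database and executes the statement:
# from django.db.models.signals import pre_migrate
# pre_migrate.connect(lambda sender, **kw: run_on_server(create_database_sql("db_name")))
```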

I don't think it's possible to dodge step one, at least if you're using a database backend other than SQLite in Django.
A note from docs: https://docs.djangoproject.com/en/dev/intro/tutorial01/#database-setup
If you’re using PostgreSQL or MySQL, make sure you’ve created a
database by this point. Do that with “CREATE DATABASE database_name;”
within your database’s interactive prompt.
If you’re using SQLite, you don’t need to create anything beforehand -
the database file will be created automatically when it is needed.
If you move your project to a server like Heroku, database creation is automated, e.g. when using PostgreSQL.
I'm wondering why you want to dodge the first step; however, if you're desperate, you might still want to try going the direct Python way via psycopg2:
Creating a postgresql DB using psycopg2
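The linked approach boils down to something like this sketch. It assumes psycopg2 is installed; the `connect` parameter is only an injection point so the helper can be exercised without a live server.

```python
def create_postgres_db(name, connect=None, **conn_kwargs):
    """Create a PostgreSQL database via a server-level connection.

    `connect` defaults to psycopg2.connect; it is injectable so the helper
    can be tested without a running server.
    """
    if connect is None:
        import psycopg2  # imported lazily; assumes psycopg2 is installed
        connect = psycopg2.connect
    conn = connect(dbname="postgres", **conn_kwargs)  # maintenance database
    conn.autocommit = True  # CREATE DATABASE cannot run inside a transaction
    cur = conn.cursor()
    cur.execute('CREATE DATABASE "{}"'.format(name))  # name comes from trusted settings
    cur.close()
    conn.close()
```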

Granting the whole application database-creation permissions just to spare a minor manual initialization at deployment time sounds like a bad idea.
You should rather create a separate script that automates deployment for you. Fabric seems to have become the standard tool for that.

Done via django_extensions:
1) ./manage.py sqlcreate | ./manage.py dbshell
2) ./manage.py migrate
As long as the user in settings.py is already able to connect to the DB server and has creation permissions, this should work.
For that, I suggest providing a DATABASE_URL envvar and using dj_database_url.
At least it is working for me on Postgres.
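For reference, dj-database-url essentially decomposes a DATABASE_URL into Django's DATABASES dict. A minimal stdlib sketch of that idea (not the library itself, and it assumes a postgres:// URL):

```python
from urllib.parse import urlparse

def parse_database_url(url):
    """Minimal sketch of what dj-database-url does with a DATABASE_URL."""
    parts = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql",  # assumes a postgres:// URL
        "NAME": parts.path.lstrip("/"),
        "USER": parts.username,
        "PASSWORD": parts.password,
        "HOST": parts.hostname,
        "PORT": parts.port or 5432,
    }
```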

You can use SQLite while you are still in the development stage.

Related

DJANGO development server using TestCase database

GOAL: Run the Django development server using the TestCase database.
Case: Django produces this database when running a TestCase. Now I'm filling the database using django-autofixture. It would be really nice to start the Django test server using this database, so I can check out how the website presents it. Unfortunately, I can't find anywhere how to do this.
Writing the test database to SQLite would make sense, but I don't see an option for this.
Any hints are appreciated! Thanks!
The TestCase database only lives within the context of a test. It is created, migrated, and loaded with fixtures before your tests run. The default behavior is that after your test suite runs (fail or succeed), the database is dropped.
I recommend just loading your fixtures via the django-admin testserver yourfixture.json command.
You can do it if you really want to, but I think it is not a good idea.
You have the option to provide a --keepdb argument to your test command. This will keep your test database after your test cases run. The name of that db will be your actual db's name prefixed with test_. You can then connect to that database via the database settings.
See the docs on test --keepdb and the test database,
and look at the database settings.
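A sketch of the settings tweak that points runserver at the kept test database. The helper name and the USE_TEST_DB environment switch are made up for illustration; the only real convention is the test_ prefix that --keepdb leaves behind.

```python
import copy
import os

def use_keepdb_database(databases, enabled=None):
    """Return a copy of DATABASES renamed to the test_-prefixed keepdb database."""
    if enabled is None:
        enabled = os.environ.get("USE_TEST_DB") == "1"  # hypothetical switch
    dbs = copy.deepcopy(databases)
    if enabled:
        dbs["default"]["NAME"] = "test_" + dbs["default"]["NAME"]
    return dbs

# In settings.py, after defining DATABASES:
# DATABASES = use_keepdb_database(DATABASES)
```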

Django: How to disable Database status check at startup?

As far as I know, Django apps can't start if any of the databases set in settings.py are down at the start of the application. Is there any way to make Django "lazy-load" the initial database connection?
I have two databases configured, and one of them is a little unstable and can sometimes be down for a few seconds, but it's only used for some specific use cases of the application. As you can imagine, I don't want the whole application to fail to start because of that. Is there any solution for this?
I'm using Django 1.6.11, and we also use Django South for database migrations (in case that's somehow related).
I do not know how safe it is, but you can do the following:
In your settings.py, start with an empty DATABASES:
DATABASES = {}
In your apps.py, use the ready() hook to establish your database connections:
from django.apps import AppConfig
from django.conf import settings

class YourAppConfig(AppConfig):
    def ready(self):
        settings.DATABASES.update({
            # your database connections
        })
I can't put my hand in the fire for this one, but Django will try to reinitialize the database connection when needed.
I did some more tests, and the problem only happens when you are using the development server (python manage.py runserver). In that case, it forces a connection with the database.
Using an actual WSGI server it doesn't happen, as @Alasdair pointed out.
@JohnMoutafis, in the end I didn't test your solution, but it could work.

Changing Database in run time and making the changes reflect in Django in run time

I am developing a cloud-based data analysis tool, and I am using Django (1.10) for that.
I have to add columns to existing tables, create new tables, and change the data type of columns (part of the data-cleaning activity) at run time, and I can't figure out a way to update/reflect those changes, at run time, in the Django models, because those changes will be needed in the further analysis process.
I have looked into inspectdb and syncdb, but all of these options would require taking the portal offline and then making those changes, which I don't want.
Please can you suggest a solution or a workaround for how to achieve this?
Also, is there a way in which I can select which database I want to work with, from the list of databases on my MySQL server, after running Django?
Django's ORM might not be the right tool for you if you need to change your schema (or db) online - the schema is defined in Python modules and loaded once when Django's web server starts.
You can still use Django's templates, forms and other libraries and write your own custom DB access layer that manipulates a DB dynamically using python.
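Such a custom access layer can be as plain as raw DDL through a DB-API connection. sqlite3 is used here only to keep the sketch self-contained; a MySQL driver works the same way, and the add_column helper is illustrative, not a library API.

```python
import sqlite3

def add_column(conn, table, column, sql_type):
    """Add a column at run time with raw DDL, bypassing the ORM entirely."""
    conn.execute('ALTER TABLE "{}" ADD COLUMN "{}" {}'.format(table, column, sql_type))

# Demo against an in-memory database:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (id INTEGER PRIMARY KEY)")
add_column(conn, "samples", "score", "REAL")
columns = [row[1] for row in conn.execute("PRAGMA table_info(samples)")]
```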

Can I use an external database table for the login process in Django?

So I'm starting a new Django project that essentially requires the login & registration process be routed through an EXTERNAL & ALREADY created database.
Is it possible to have the User model use an EXTERNAL database table ONLY when Django is:
Logging in a user, to check if the login is valid
Registering a user, inserting data for that user in the external database
I would like for the rest of the Django server to use a local database.
If so, could someone either provide examples or guide me to documentation on the subject?
The easiest way to use multiple databases with Django is database routing. By default Django sticks to a single database; however, if you want to implement a more interesting database routing system, you can define and install your own database routers.
Database routers are installed using the DATABASE_ROUTERS setting. You have to specify this setting in your settings.py file.
What you have to do is write an AuthRouter as described in the Django documentation: Django Multiple Databases.
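The docs' AuthRouter looks roughly like this; the "auth_db" alias is whatever name you give the external database in DATABASES:

```python
class AuthRouter:
    """Route auth and contenttypes models to the external 'auth_db' database."""

    route_app_labels = {"auth", "contenttypes"}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "auth_db"
        return None  # no opinion; fall through to other routers

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "auth_db"
        return None

    def allow_relation(self, obj1, obj2, **hints):
        # Allow relations when either object is in the routed apps.
        if (obj1._meta.app_label in self.route_app_labels
                or obj2._meta.app_label in self.route_app_labels):
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Keep auth/contenttypes tables only in auth_db.
        if app_label in self.route_app_labels:
            return db == "auth_db"
        return None
```

Then install it with DATABASE_ROUTERS = ["path.to.AuthRouter"] in settings.py.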
"Yes, but"
What you are looking for in the docs is called "database router".
There is even an example for the auth app in the docs there.
But there is a serious drawback to consider with this approach:
We cannot have cross-database relationships in the models. If the auth tables are in a separate database, this means that any other app that needs a foreign key to the User model is going to run into problems. You might be able to "fake" the relationships using a db that doesn't enforce relationship checks (SQLite or MyISAM/MySQL).
Out of the box, such apps are: session, authtoken, and admin (and probably more).
Alternatively, a single-sign-on solution might do a better job: django-sso, or django-mama-cas + django-cas-ng, or the commercial Stormpath.

Django app as database web based UI

I'm planning to develop a web UI, something like iSQL*Plus for Oracle, but for most common databases. It just takes an input query from the user and returns the result, with save options, etc.
So, for connecting to external databases, which is the advisable way:
1. using Django models and raw SQL, or
2. modules outside Django - sqlalchemy + MySQLdb, psycopg...?
Going through the Django documentation, my understanding is that db connections have to be in settings.py and I cannot add them from user input. Is my understanding correct or not?
I'm new to Django, not to Python.
An ORM (something like Django's models or sqlalchemy) is a really helpful abstraction that maps tabular data in the database to the objects it is being used to model in your code. It won't help with connecting to databases provided by the user, since you won't know the schema of the database you're connecting to, nor what you are going to receive back from a query.
With Django, the database defined in settings.py is used to store information related to your app, such as user credentials and migrations, as well as whatever else you define in your models.py files. So definitely don't try to change that dynamically, as it is being used to store the state of your application for all users.
If you need to connect to external databases and run user-supplied queries, you can do that inside a view using the appropriate database driver. So psycopg2 for postgres would be fine.
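A sketch of such a view helper: open a short-lived connection per request with the appropriate driver and return columns plus rows. sqlite3 stands in for the real driver here to keep the sketch runnable; for Postgres you would call psycopg2.connect with the user-supplied connection details instead.

```python
import sqlite3

def run_query(dsn, sql):
    """Open a short-lived connection, run one query, return (columns, rows)."""
    conn = sqlite3.connect(dsn)
    try:
        cur = conn.execute(sql)
        columns = [c[0] for c in cur.description]
        rows = cur.fetchall()
    finally:
        conn.close()  # never keep user-database connections around
    return columns, rows
```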
