One Django instance with one database vs. different Django instances - Python

I am building software that will store and manage doctors' data. I would like to ask something that came to me about how I want to deploy my project. At first I wanted to have a separate Django instance for each client. Each client would have his own database and run separately from the others. But I am not sure I can automate this procedure, because each instance needs a different database, a different password, and a different DB username, and I am not sure that can be automated when creating every new instance of my Django project. Or can it be? Imagine the following scenario:
User pays -> after successful payment -> DB (MySQL) is created ->
Django instance is created ->
somehow the Django settings file is updated with the DB credentials -> syncdb is run
and this procedure must be automated. Or is it better to have one database, with the separation done in the models (with foreign keys etc.)? Or is one unified database not a good solution (security-wise)? What do you think?

Maybe you can have a look at multi-tenant data architecture:
https://github.com/bernardopires/django-tenant-schemas
http://msdn.microsoft.com/en-us/library/aa479086.aspx
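If you do stay with the one-instance-per-client design, the provisioning step in the question can be scripted. A minimal sketch of the credentials part, assuming a naming convention of my own invention (`clinic_<slug>`) — the actual `CREATE DATABASE`/`CREATE USER` statements would still be run against MySQL with an admin connection, and `syncdb` invoked via `manage.py` with the new settings:

```python
import secrets

def make_client_db_settings(client_slug, host="127.0.0.1", port="3306"):
    """Build a per-client Django DATABASES entry with fresh credentials.

    The returned dict can be written into that client's settings file
    (or an environment file) before running syncdb for the new instance.
    """
    name = f"clinic_{client_slug}"  # naming convention is an assumption
    return {
        "ENGINE": "django.db.backends.mysql",
        "NAME": name,
        "USER": name,
        "PASSWORD": secrets.token_urlsafe(24),  # random per-client password
        "HOST": host,
        "PORT": port,
    }
```

This is only the "generate credentials" step of the pipeline; the point is that nothing in it requires manual work per client.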

Related

Handle connections to user defined DB in Django

I have a pretty simple model. The user defines the URL and database name for his own Postgres server. My Django backend fetches some info from the client DB to do some calculations and analytics and to draw some graphs.
How should I handle connections? Create a new one when a client opens a page, or keep connections alive all the time? (about 250-300 possible clients)
Can I use the Django ORM or something like SQLAlchemy? Or even the psycopg library directly?
Has anyone tackled such a problem before?
Thanks
In your case, I would rather go with Django's internal implementation and follow the Django ORM, as you will not need to worry about handling connections and the different exceptions that may arise from your own implementation of a DAO layer in your code.
As per your requirement you need to access the user's database, but there is still overhead for individual users to create the DB and set up something to connect with your codebase. So I think sticking with Django will be the sounder choice.
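On the connection-handling part of the question: with 250-300 client databases, one reasonable middle ground is to open a connection lazily the first time a client's page needs it and cache it, reopening only if it has gone away. A driver-agnostic sketch — the cache class is an assumption of mine, not a Django or psycopg2 API; you would pass in `psycopg2.connect` (whose connections expose a truthy `.closed` once closed) as the factory:

```python
class ClientConnectionCache:
    """Lazily open and reuse one connection per client DSN (sketch)."""

    def __init__(self, connect):
        self._connect = connect  # e.g. psycopg2.connect
        self._conns = {}

    def get(self, dsn):
        conn = self._conns.get(dsn)
        # reopen if we have never connected, or the connection was closed
        if conn is None or getattr(conn, "closed", False):
            conn = self._conns[dsn] = self._connect(dsn)
        return conn
```

A production version would also need a size bound and thread safety, but the lazy-open-and-reuse idea is the core of it.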

Changing Database in run time and making the changes reflect in Django in run time

I am developing a Cloud based data analysis tool, and I am using Django(1.10) for that.
I have to add columns to existing tables, create new tables, and change the data types of columns (part of a data-cleaning activity) at run time, and I can't figure out a way to update/reflect those changes in the Django models at run time, because those changes will be required in the further analysis process.
I have looked into 'inspectdb' and 'syncdb', but all of these options would require taking the portal offline and then making those changes, which I don't want.
Please can you suggest a solution or a workaround for how to achieve this?
Also, is there a way in which I can select which database I want to work with, from the list of databases on my MySQL server, after Django is running?
Django's ORM might not be the right tool for you if you need to change your schema (or DB) online - the schema is defined in Python modules and loaded once when Django's web server starts.
You can still use Django's templates, forms, and other libraries and write your own custom DB access layer that manipulates the DB dynamically in Python.
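As a concrete illustration of such a custom access layer, here is a minimal sketch using the standard-library sqlite3 driver (the same approach works against MySQL with its driver); the helper names are made up for the example, and a real version must validate the identifiers before interpolating them:

```python
import sqlite3

def add_column(conn, table, column, sql_type):
    """ALTER TABLE at run time; identifiers should be validated upstream."""
    conn.execute(f'ALTER TABLE "{table}" ADD COLUMN "{column}" {sql_type}')

def list_columns(conn, table):
    """Introspect the live schema instead of relying on static Django models."""
    return [row[1] for row in conn.execute(f'PRAGMA table_info("{table}")')]
```

Because the schema is read back from the database on each call, no restart or `inspectdb` round-trip is needed after a change.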

Can I use an external database table for the login process in Django?

So I'm starting a new Django project that essentially requires the login & registration process be routed through an EXTERNAL & ALREADY created database.
Is it possible to have the User model use an EXTERNAL database table ONLY when Django is:
Logging in a user, to check if the login is valid
Registering a user, inserting data for that user in the external database
I would like for the rest of the Django server to use a local database.
If so, could someone either provide examples or guide me to documentation on the subject?
The easiest way to use multiple databases with Django is database routing. By default Django sticks to a single database; however, if you want to implement a more interesting database routing system, you can define and install your own database routers.
Database routers are installed using the DATABASE_ROUTERS setting. You have to specify this setting in your settings.py file
What you have to do is write an AuthRouter as described in the Django documentation: Django Multiple Databases.
"Yes, but"
What you are looking for in the docs is called "database router".
There is even an example for the auth app in the docs there.
But there is a serious drawback to consider with this approach:
We cannot have cross-database relationships in the models. If the auth tables are in a separate database, this means that any other app that needs a foreign key to the User model is going to run into problems. You might be able to "fake" the relationships using a DB that doesn't enforce relationship checks (SQLite or MyISAM/MySQL).
Out of the box, such apps are: sessions, authtoken, and admin (and probably more).
Alternatively, a single-sign-on solution might do a better job: django-sso, or django-mama-cas + django-cas-ng, or the commercial Stormpath.
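For reference, the router both answers point at is plain Python, along the lines of the example in the Django docs. A sketch — the `users_db` alias is an assumption and must match an entry in your DATABASES setting:

```python
class AuthRouter:
    """Route auth/contenttypes models to the external user database."""

    route_app_labels = {"auth", "contenttypes"}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "users_db"
        return None  # fall through to the default database

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "users_db"
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # keep auth tables only in the external database
        if app_label in self.route_app_labels:
            return db == "users_db"
        return None
```

You install it with `DATABASE_ROUTERS = ["path.to.AuthRouter"]` in settings.py; the cross-database foreign-key caveat described in this answer still applies.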

Django app as database web based UI

I'm planning to develop a web UI, something like Oracle's iSQL*Plus, but for the most common databases. It would just take an input query from the user and return the result, with save options, etc.
So, for connecting to external databases, what is the advisable way:
1. using Django models and raw SQL, or
2. modules outside Django - SQLAlchemy + MySQLdb, psycopg, ...?
Going through the Django documentation, my understanding is that DB connections have to be in settings.py and I cannot add them from user input. Is my understanding true or not?
I'm new to django not to python.
An ORM (something like Django's models or SQLAlchemy) is a really helpful abstraction for mapping tabular data in the database to the objects it is being used to model in your code. It won't help with connecting to databases provided by the user, since you won't know the schema of the database you're connecting to, nor what you are going to receive back from a query.
With Django, the database defined in settings.py is used to store information related to your app, such as user credentials and migrations, as well as whatever else you define in your models.py files. So definitely don't try to change that dynamically, as it is being used to store the state of your application for all users.
If you need to connect to external databases and run user-supplied queries, you can do that inside a view using the appropriate database driver. So psycopg2 for postgres would be fine.
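A minimal sketch of that view-level approach, using the standard-library sqlite3 driver so it stays self-contained (for Postgres you would swap in `psycopg2.connect`); the function name and row limit are illustrative choices, and a real tool would also need to sandbox or restrict the user's SQL:

```python
import sqlite3

def run_user_query(db_path, sql, max_rows=100):
    """Open a connection per request, run the user's query, and return
    (column_names, rows) for the template to render."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(sql)
        # cur.description is None for statements that return no rows
        columns = [d[0] for d in cur.description] if cur.description else []
        return columns, cur.fetchmany(max_rows)
    finally:
        conn.close()
```

Opening a fresh connection per request is the simplest correct choice here, since each request may target a different user-supplied database.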

Synchronizing two databases via mappings in Python

I have a live Django 1.6 project that uses a PostgreSQL database. I've been developing an API for the project to support a mobile app, making use of South migrations as I go along.
As a result of this, I have a new database structure and code base that is quite far removed from my production database, although many of the fields in the production database are still present in the new one.
So as it stands, I have the mobile app running off the new API and the website running off the old database.
How can I go about synchronizing data between the two databases while the front-end for the new website is being developed?
i.e. When a user makes use of the app and then later goes to the website, I'd like their data to be available and vice-versa.
Solutions I've tried:
I cannot use the API to update the new database because it would trigger activation emails for users that have already been activated.
I've tried to use peewee to create a synchronization script where each field in one database is mapped to a field in the other database. This has been effective for tables where the schemas are similar, but I've had trouble when it comes to keeping foreign-key relationships intact.
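The field-mapping part of such a script can be kept driver-agnostic, and for the foreign-key problem one common trick is to maintain an old-id -> new-id map per referenced table and translate keys through it as rows are copied. A sketch, where the map shapes are assumptions about the two schemas rather than anything peewee-specific:

```python
def map_row(old_row, field_map, fk_maps=None):
    """Translate one old-schema row (a dict) into new-schema columns.

    field_map: {old_column: new_column}
    fk_maps:   {new_column: {old_id: new_id}} for foreign-key columns,
               built up as the referenced tables are copied first
    """
    fk_maps = fk_maps or {}
    new_row = {}
    for old_col, new_col in field_map.items():
        value = old_row[old_col]
        if new_col in fk_maps and value is not None:
            value = fk_maps[new_col][value]  # remap FK to the new DB's id
        new_row[new_col] = value
    return new_row
```

The key design point is copy order: parent tables go first so their id maps exist before any child rows that reference them are translated.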
