Synchronizing two databases via mappings in Python

I have a live Django 1.6 project that uses a PostgreSQL database. I've been developing an API for the project to support a mobile app, using South migrations as I go.
As a result, I have a new database structure and code-base that is quite far removed from my production database, although many of the fields in the production database are still present in the new one.
So as it stands, I have the mobile app running off the new API and the website running off the old database.
How can I go about synchronizing data between the two databases while the front-end for the new website is being developed?
i.e. when a user makes use of the app and later goes to the website, I'd like their data to be available, and vice versa.
Solutions I've tried:
I cannot use the API to update the new database because it would trigger activation emails for users that have already been activated.
I've tried to use peewee to create a synchronization script where each field in one database is mapped to a field in the other database. This has been effective for tables where the schemas are similar, but I've had trouble keeping foreign key relationships intact.
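One way to keep the foreign keys intact is to copy parent tables first while recording an old-pk → new-pk map, then remap each child row's FK through that map. A minimal peewee sketch of the idea (peewee 3 syntax; all model and table names are hypothetical stand-ins for the real schemas):

```python
from peewee import CharField, ForeignKeyField, Model, PostgresqlDatabase

old_db = PostgresqlDatabase("website_db")  # production database (assumed name)
new_db = PostgresqlDatabase("api_db")      # new API database (assumed name)

class OldUser(Model):
    email = CharField()
    class Meta:
        database = old_db
        table_name = "auth_user"

class NewUser(Model):
    email = CharField()
    class Meta:
        database = new_db
        table_name = "auth_user"

class OldProfile(Model):
    user = ForeignKeyField(OldUser)
    bio = CharField()
    class Meta:
        database = old_db

class NewProfile(Model):
    user = ForeignKeyField(NewUser)
    bio = CharField()
    class Meta:
        database = new_db

# Copy parents first, remembering how primary keys were reassigned.
user_pk_map = {}
for old in OldUser.select():
    user_pk_map[old.id] = NewUser.create(email=old.email).id

# Children are copied afterwards, with each FK remapped through the map above.
for old in OldProfile.select():
    NewProfile.create(user=user_pk_map[old.user_id], bio=old.bio)
```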

Related

Customize the Django ContentTypes, AdminLog, and Migration models

I am working on a Django project that requires 3 different database servers with different engines: MySQL, PostgreSQL, and SQLite.
SQLite holds all the client-side settings tables, tables with non-shared data [the app has a feature to not upload some data to the server], and configuration tables. It sits on the client machine, where a tweaked C# app creates a Django server environment and shows the Django site in a browser control, like a native application.
PostgreSQL holds tables that can be accessed by different clients on the network, along with tables holding the shared data. These maintain integrity with the SQLite data through ContentTypes.
Django's content-type table is created in both SQLite and PostgreSQL. Due to heavy use of the other database-managing tables, we need them in a MySQL database on a separate server, so it would be great to have django_content_type, django_admin_log, and django_migrations on that MySQL server.
The ContentTypes table is also used a lot, and it is difficult to maintain content types from two different database tables, so we want it in a single separate table with the values from both databases. And once that one moves, why not the other two: django_migrations and django_admin_log.
There is also a requirement to store raw SQL DDL queries in the database to track changes made outside of the Django ORM, so it makes sense to add rows to django_migrations along with a new column to store the raw SQL.
Finally, django_admin_log only logs admin-site changes, and it lacks data in the message column when a row is added or deleted. We need it to store the JSON of the values added to the particular table on insert, and on delete the instance [JSON of the row] in the message column for the deleted record. It should also log transactions on instances in both databases.
I am looking to [customize/extend/create my own and override] these models in a Django app to get the behavior above, as a portable app I can reuse in future projects of this kind.
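Relocating django_content_type and django_admin_log to the MySQL server is a job for a database router (see the AuthRouter sketch under the login question below for the shape of one); note that django_migrations itself is maintained per-database by the migration framework and cannot be relocated by a router. The JSON-logging requirement is a separate piece. One possible approach, sketched here with a hypothetical TrackedModel and a hard-coded placeholder user, is a pair of signal receivers that serialize the whole row into change_message:

```python
from django.contrib.admin.models import ADDITION, DELETION, LogEntry
from django.contrib.contenttypes.models import ContentType
from django.core import serializers
from django.db.models.signals import post_delete, post_save
from django.dispatch import receiver

from myapp.models import TrackedModel  # hypothetical model to audit

@receiver(post_save, sender=TrackedModel)
def log_addition(sender, instance, created, **kwargs):
    if created:
        LogEntry.objects.create(
            user_id=1,  # placeholder: real code must resolve the acting user
            content_type=ContentType.objects.get_for_model(sender),
            object_id=instance.pk,
            object_repr=str(instance),
            action_flag=ADDITION,
            # Store the full row as JSON instead of a bare message.
            change_message=serializers.serialize("json", [instance]),
        )

@receiver(post_delete, sender=TrackedModel)
def log_deletion(sender, instance, **kwargs):
    LogEntry.objects.create(
        user_id=1,  # placeholder
        content_type=ContentType.objects.get_for_model(sender),
        object_id=instance.pk,
        object_repr=str(instance),
        action_flag=DELETION,
        change_message=serializers.serialize("json", [instance]),
    )
```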

Changing the database at run time and reflecting the changes in Django at run time

I am developing a cloud-based data analysis tool, and I am using Django (1.10) for that.
I have to add columns to existing tables, create new tables, and change column data types (as part of data cleaning) at run time, and I can't figure out a way to update/reflect those changes, at run time, in the Django models; those changes will be required in the further analysis process.
I have looked into 'inspectdb' and 'syncdb', but both of these options would require taking the portal offline and then making those changes, which I don't want.
Could you please suggest a solution or a workaround to achieve this?
Also, is there a way to select which database I want to work with, from the list of databases on my MySQL server, after Django is running?
Django's ORM might not be the right tool for you if you need to change your schema (or DB) online - the schema is defined in Python modules and loaded once when Django's web server starts.
You can still use Django's templates, forms and other libraries and write your own custom DB access layer that manipulates a DB dynamically using python.
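A minimal sketch of such a layer using SQLAlchemy reflection (1.4+ API); the connection URL and table names are placeholders, and tables are re-reflected on each read so columns added at run time stay visible:

```python
from sqlalchemy import MetaData, Table, create_engine, select, text

# Placeholder URL; any SQLAlchemy-supported MySQL driver works here.
engine = create_engine("mysql+pymysql://user:password@localhost/analysis")

def add_column(table_name: str, column_name: str, sql_type: str = "TEXT"):
    # Runtime DDL issued directly, bypassing Django's migration machinery.
    # Identifiers cannot be bound parameters, so validate them upstream.
    with engine.begin() as conn:
        conn.execute(
            text(f"ALTER TABLE {table_name} ADD COLUMN {column_name} {sql_type}")
        )

def read_table(table_name: str):
    # Re-reflect on each call so columns added since startup are picked up.
    table = Table(table_name, MetaData(), autoload_with=engine)
    with engine.connect() as conn:
        return [dict(row._mapping) for row in conn.execute(select(table))]
```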

Can I use an external database table for the login process in Django?

So I'm starting a new Django project that essentially requires the login & registration process be routed through an EXTERNAL & ALREADY created database.
Is it possible to have the User model use an EXTERNAL database table ONLY when Django is:
Logging in a user, to check if the login is valid
Registering a user, inserting data for that user in the external database
I would like for the rest of the Django server to use a local database.
If so, could someone either provide examples or guide me to documentation on the subject?
The easiest way to use multiple databases with Django is database routing. By default Django sticks to a single database; however, if you want to implement a more interesting routing system, you can define and install your own database routers.
Database routers are installed using the DATABASE_ROUTERS setting, which you specify in your settings.py file.
What you have to do is write an AuthRouter as described in the Django documentation on multiple databases.
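A condensed sketch of that router, assuming the external database is configured in DATABASES under a hypothetical "users_db" alias:

```python
# settings.py
DATABASE_ROUTERS = ["myproject.routers.AuthRouter"]  # path to wherever the class lives

# myproject/routers.py
class AuthRouter:
    """Route auth and contenttypes models to the external user database."""
    route_app_labels = {"auth", "contenttypes"}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "users_db"
        return None

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return "users_db"
        return None

    def allow_relation(self, obj1, obj2, **hints):
        # Allow relations when either object belongs to the routed apps.
        if (obj1._meta.app_label in self.route_app_labels
                or obj2._meta.app_label in self.route_app_labels):
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Keep auth/contenttypes tables only in the external database.
        if app_label in self.route_app_labels:
            return db == "users_db"
        return None
```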
"Yes, but"
What you are looking for in the docs is called "database router".
There is even an example for the auth app in the docs there.
But there is a serious drawback to consider with this approach:
You cannot have cross-database relationships in the models. If the auth tables are in a separate database, any other app that needs a foreign key to the User model is going to run into problems. You might be able to "fake" the relationships using a DB that doesn't enforce relationship checks (SQLite or MyISAM/MySQL).
Out of the box, such apps are: session, authtoken, and admin (and probably more).
Alternatively, a single-sign-on solution might do a better job: django-sso, or django-mama-cas + django-cas-ng, or the commercial Stormpath.

Django app as a web-based database UI

I'm planning to develop a web UI, something like Oracle's iSQL*Plus, but for the most common databases: just take an input query from the user and return the result, with options to save it, etc.
So, for connecting to external databases, what is the advisable way:
1. using Django models and raw SQL, or
2. using modules outside Django - sqlalchemy+mysqldb, psycopg..?
Going through the Django documentation, my understanding is that DB connections have to be in settings.py and cannot be added from user input. Is my understanding correct or not?
I'm new to Django, not to Python.
An ORM (something like Django's models or SQLAlchemy) is a really helpful abstraction for mapping tabular data in the database to the objects it's being used to model in your code. It won't help with connecting to databases provided by the user, since you won't know the schema of the database you're connecting to, nor what you are going to receive back from a query.
With Django, the database defined in settings.py is used to store information related to your app, such as user credentials and migrations, as well as whatever else you define in your models.py files. So definitely don't try to change that dynamically, as it is being used to store the state of your application for all users.
If you need to connect to external databases and run user-supplied queries, you can do that inside a view using the appropriate database driver. So psycopg2 for postgres would be fine.
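A bare-bones sketch of such a view using psycopg2; the form field names, template name, and DSN are illustrative, and since the whole point of the tool is running user-supplied SQL, access must be restricted to trusted users:

```python
import psycopg2
from django.shortcuts import render

def run_query(request):
    # Both values come straight from the user; this tool's purpose is
    # executing arbitrary SQL, so gate this view behind authentication.
    dsn = request.POST["dsn"]      # e.g. "dbname=demo user=demo host=db.example.com"
    query = request.POST["query"]
    conn = psycopg2.connect(dsn)
    try:
        with conn.cursor() as cur:
            cur.execute(query)
            # description is None for statements that return no rows (DDL etc.)
            columns = [col[0] for col in cur.description] if cur.description else []
            rows = cur.fetchall() if cur.description else []
    finally:
        conn.close()
    return render(request, "results.html", {"columns": columns, "rows": rows})
```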

Multiple databases (PG and Mongo) in a Python model package

I'm running into an architectural issue regarding how to organize a certain project.
The project is to create a package of models and the related database connections, to be used in multiple different web applications.
I wish to create a model hierarchy with the following models:
User
Application
Log
There are more models but these three exemplify the basic underlying problem.
The User model has a many to many relationship with Application.
And an Application has a one to many relationship with Log.
User * - * Application
Application 1 - * Log
The problem I am running into is that I want User and Application to live in a Postgres database and the logs to live in a Mongo datastore.
The main reason I have for putting Logs into a different data store is that I don't need to do any updates for that model, just inserts. I will not need immediate consistency on the inserts, and I want to use the aggregation framework and map-reduce jobs on a weekly basis. Also, the number of log documents will trump the number of users/applications, on a scale of millions to one. The server I will mostly use this model in communicates over websockets, so the amount of traffic and inserts will be high, and I don't really need Postgres for it. (But please feel free to (nicely) steer me in a different direction if you feel so obliged.)
Now, I am using Python with SQLAlchemy for the modeling, engine, and migrations.
And while I haven't started yet, I plan on using MongoKit for the Mongo model(s).
So far I have the SQLAlchemy engine set up with the models and migrations for the User and Application models.
How do I set up the entire package so that, when I include this repo as a dependency of a Flask application, I can use both engines and connection pools?
Because I can already see that having a property on Application for retrieving the latest logs may become an issue.
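One common shape for such a package is a pair of module-level factories plus a configure() call that each consuming app makes once at startup. A sketch using SQLAlchemy and pymongo directly (MongoKit is a thin layer over pymongo); every name and URL here is a placeholder:

```python
# mymodels/__init__.py -- shared model package (sketch)
from pymongo import MongoClient
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

Session = sessionmaker()  # unbound until configure() is called
_state = {}

def configure(pg_url, mongo_url, mongo_db="logs"):
    """Called once by each consuming app, e.g. in its Flask app factory."""
    _state["engine"] = create_engine(pg_url, pool_size=10)
    Session.configure(bind=_state["engine"])
    _state["mongo"] = MongoClient(mongo_url)[mongo_db]

def log_collection():
    return _state["mongo"]["logs"]

# Usage from a Flask app:
#   from mymodels import configure, Session, log_collection
#   configure("postgresql://user:pass@host/app", "mongodb://host:27017")
#   session = Session()                     # SQLAlchemy work
#   log_collection().insert_one({...})      # Mongo insert
```

The "latest logs" concern then resolves itself: rather than an ORM relationship, Application gets a plain method that queries the Mongo collection by application id, since neither ORM can express a cross-store relationship.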
