I have a Django app with several database backends, each connected to a different PostgreSQL instance. One of them is not guaranteed to always be online; it can even be offline when the application starts up.
Can I somehow configure Django to use lazy connections? I would like to:
Try querying
return "sorry, try again later" if database is offline
or return the results if database is online
Is this possible?
The original confusion was that Django seems to try to connect to its databases on startup. This is actually not true: Django does not connect to a database until some app tries to access it.
Since my web application uses the auth and sites apps, it looks like it connects on startup. But it's not tied to startup; it's tied to the fact that those apps access the database "early".
If you define a second (non-default) database backend, Django will not try to connect to it unless the application actually queries it.
So the solution was trivial. Originally I had one database that hosted both the auth/sites data and the "real" data exposed to users. I wanted the "real" database connection to be volatile, so I defined a separate PostgreSQL backend for it and switched the default backend to sqlite.
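For reference, the relevant part of settings.py ended up looking roughly like this (the "realdata" alias and the credentials are placeholders, not my exact config):

# sqlite keeps auth/sites data; "realdata" is the volatile PostgreSQL instance
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',  # BASE_DIR as in the default settings.py
    },
    'realdata': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'realdata',
        'HOST': 'db.example.com',
        'PORT': '5432',
        'USER': 'app',
        'PASSWORD': '...',
    },
}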
Now, when accessing the "real" database through a query, I can easily wrap it in try/except and hand "Sorry, try again later" over to the user.
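Something along these lines (a sketch; MyModel and the view are placeholders for my actual code):

from django.db import OperationalError
from django.http import HttpResponse
from django.shortcuts import render
from myapp.models import MyModel  # hypothetical model living on the volatile db

def real_data_view(request):
    try:
        # .using() routes the query to the non-default, volatile backend;
        # list() forces evaluation so the connection attempt happens here
        rows = list(MyModel.objects.using('realdata').all())
    except OperationalError:
        return HttpResponse("Sorry, try again later", status=503)
    return render(request, "real_data.html", {"rows": rows})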
Related
I'm a newbie to Django and I'm almost at the stage of deploying a web server.
I was just having some doubts about Django's database. Currently I'm using the default sqlite3 database to store all the user models as well as the info models. I'm thinking of using AWS to deploy my web server.
So when I get to that stage, should I continue with sqlite, or should I switch to one of AWS's databases or something like Firebase? If I continue with sqlite, where and how exactly will the information be stored? And if I switch to something like PostgreSQL, where will the information be stored, and will it be secure and fast (even if I manage to get thousands of users)?
Thanks so much, this question might be really basic but I'm super confused.
sqlite is a flat-file database: it saves your data in a plain file inside your project. This is fine in a local environment, but when deploying you need to consider that the server and the database are on the same machine and use the same disk. That means if you accidentally remove the machine (and its disk) used to serve the application, the database itself will be deleted with all its records.
You will also face problems if you try to scale your servers: every server will have its own copy of the database, and syncing all those files will be a huge headache.
If your data is not that important, then you can keep using sqlite. But if you are expecting high traffic and a complex db structure, I would recommend you consider a db engine like MySQL, or look at the databases offered by Amazon here:
https://aws.amazon.com/products/databases/
For Django, you will need to change the database adapter in your settings when using a different db like mysql, sqlite or anything else:
https://docs.djangoproject.com/en/3.0/ref/databases/
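For example, assuming the mysqlclient driver is installed, moving from sqlite to MySQL is mostly a matter of swapping the ENGINE and connection details (all values below are placeholders):

# settings.py - same structure, different adapter
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',  # was django.db.backends.sqlite3
        'NAME': 'mydb',
        'USER': 'dbuser',
        'PASSWORD': '...',
        'HOST': 'mydb.example.rds.amazonaws.com',  # e.g. an RDS endpoint
        'PORT': '3306',
    }
}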
I have a pretty simple model: the user supplies a url and database name for their own Postgres server. My Django backend fetches some info from the client DB to run some calculations and analytics and draw some graphs.
How should I handle connections? Create a new one when a client opens a page, or keep connections alive all the time? (There are about 250-300 possible clients.)
Can I use the Django ORM or something like SQLAlchemy? Or even the psycopg library directly?
Has anyone tackled such a problem before?
Thanks
In your case, I would go with Django's internal implementation and the Django ORM, as you will not need to worry about handling connections and the different exceptions that may arise from rolling your own DAO model in your code.
As per your requirement, you need to access a database per user, and there is already overhead for individual users in creating a db and setting up something to connect with your codebase. So I think sticking with Django will be the more sensible choice.
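If you go with the ORM, one possible sketch is to register each client's database under a runtime alias (the helper and naming are my own, and mutating connections.databases at runtime is a known but unofficial pattern):

from django.db import connections

def query_client_db(host, db_name, sql, params=None):
    # Register the client's database under a per-client alias, then let
    # Django's connection handler open/close the actual connection lazily
    alias = 'client_%s' % db_name
    if alias not in connections.databases:
        connections.databases[alias] = {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': db_name,
            'HOST': host,
            'PORT': '5432',
            'USER': 'readonly',   # placeholder credentials
            'PASSWORD': '...',
        }
    with connections[alias].cursor() as cursor:
        cursor.execute(sql, params)
        return cursor.fetchall()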
I have a Flask application which connects to an initial database which simply holds login data (user credentials, passwords etc.) - this is working fine.
But each user gets their own database when they create an account (necessary for security purposes and individual access).
Once the user logs in, the app queries the user's login credentials for their unique database url.
How do I take this database url and connect to the second database "on the fly", so to say? Is it even possible?
Using python, flask and sqlalchemy.
Just manage each sqlalchemy object within each request. You won't be able to make use of flask-sqlalchemy for this; you'll have to use plain sqlalchemy. Wrap the following in a function that takes your engine args and returns a session (see the sketch after the snippet). Performance will also be worse (though probably not enough to really care), but that's unavoidable.
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine(config.engine_str, **config.engine_args)
Session = sessionmaker(bind=engine)
session = Session()
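Functionized, that might look like this (names are my own):

def make_session(engine_str, engine_args=None):
    # One throwaway engine + session per request, built from the db url
    # stored against the logged-in user
    engine = create_engine(engine_str, **(engine_args or {}))
    return sessionmaker(bind=engine)()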
You should also rethink having a database for each user. It's very likely that it's unnecessary.
So I'm starting a new Django project that essentially requires the login & registration process be routed through an EXTERNAL & ALREADY created database.
Is it possible to have the User model use an EXTERNAL database table ONLY when Django is:
Logging in a user, to check if the login is valid
Registering a user, inserting data for that user in the external database
I would like for the rest of the Django server to use a local database.
If so, could someone either provide examples or guide me to documentation on the subject?
The easiest way to use multiple databases with Django is database routing. By default Django sticks to a single database; however, if you want to implement a more interesting routing scheme, you can define and install your own database routers.
Database routers are installed using the DATABASE_ROUTERS setting, which you specify in your settings.py file.
What you have to do is write an AuthRouter as described in the Django documentation: Django Multiple Databases.
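A minimal router along the lines of the docs example might look like this (the 'users_db' alias is an assumption for your external database):

class AuthRouter:
    route_app_labels = {'auth', 'contenttypes'}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'users_db'
        return None

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'users_db'
        return None

    def allow_relation(self, obj1, obj2, **hints):
        if (obj1._meta.app_label in self.route_app_labels or
                obj2._meta.app_label in self.route_app_labels):
            return True
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        if app_label in self.route_app_labels:
            return db == 'users_db'
        return None

# settings.py
DATABASE_ROUTERS = ['path.to.AuthRouter']  # dotted path to the class above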
"Yes, but"
What you are looking for in the docs is called a "database router".
There is even an example for the auth app in the docs there.
But there is a serious drawback to consider with this approach:
You cannot have cross-database relationships in your models. If the auth tables are in a separate database, any other app that needs a foreign key to the User model is going to run into problems. You might be able to "fake" the relationships using a db that doesn't enforce relationship checks (SQLite or MyISAM/MySQL).
Out of the box, such apps include: sessions, authtoken, and admin (and probably more).
Alternatively, a single-sign-on solution might do a better job: django-sso, or django-mama-cas + django-cas-ng, or the commercial Stormpath.
I'm planning to develop a web UI, something like Oracle's iSQL*Plus, but for the most common databases: take an input query from the user and return the result, with save options, etc.
So, for connecting to external databases, what is the advisable way:
1. Using Django models and raw sql, or
2. Using modules outside Django - sqlalchemy + MySQLdb, psycopg...?
Going through the Django documentation, my understanding is that db connections have to be defined in settings.py and I cannot add them from user input. Is my understanding correct?
I'm new to Django, not to Python.
An ORM (something like Django's models or sqlalchemy) is a really helpful abstraction for mapping tabular data in the database to the objects it models in your code. It won't help with connecting to databases provided by the user, since you won't know the schema of the database you're connecting to, nor what you'll get back from a query.
With Django, the database defined in settings.py is used to store information related to your app, such as user credentials and migrations, as well as whatever else you define in your models.py files. So definitely don't try to change that dynamically, as it holds the state of your application for all users.
If you need to connect to external databases and run user-supplied queries, you can do that inside a view using the appropriate database driver; psycopg2 for Postgres would be fine.
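A rough sketch of such a view (the connection parameters and POST field names are assumptions; running user-supplied SQL is the point of this app, which is why the query goes through as-is):

import psycopg2
from django.http import JsonResponse

def run_query(request):
    # Connect to the external database the user pointed us at, per request
    conn = psycopg2.connect(
        host=request.POST['host'],
        dbname=request.POST['dbname'],
        user=request.POST['user'],
        password=request.POST['password'],
    )
    try:
        with conn.cursor() as cur:
            cur.execute(request.POST['query'])
            rows = cur.fetchall()
    finally:
        conn.close()
    return JsonResponse({'rows': rows})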