Can I somehow work with remote databases (if they support it) using the Django ORM?
I understand that settings.py spells out the local database, but I periodically need to connect to various external databases and run commands against them, such as loading a dump.
If you can connect to the database remotely, then you can simply specify its host/port in settings.py exactly as you would for a local one.
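A minimal sketch of what that looks like in settings.py; the engines, hosts, and credentials here are placeholders, not values from the question:

    # settings.py -- hosts and credentials are placeholders.
    DATABASES = {
        # The usual local database stays the "default" connection.
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "local_db",
            "USER": "me",
            "PASSWORD": "secret",
            "HOST": "localhost",
            "PORT": "5432",
        },
        # A remote database is configured the same way; only HOST/PORT differ.
        "external": {
            "ENGINE": "django.db.backends.mysql",
            "NAME": "legacy_db",
            "USER": "remote_user",
            "PASSWORD": "secret",
            "HOST": "db.example.com",  # remote host instead of localhost
            "PORT": "3306",
        },
    }

You can then route ORM queries with MyModel.objects.using("external"), and commands such as "manage.py dumpdata --database=external" or "manage.py loaddata --database=external" run against that connection, which covers the dump loading mentioned in the question.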
I haven't been able to find any documentation on whether it's possible to access SQLite3 (using Python) when the SQLite database is hosted externally:
I have my SQLite3 database hosted on my VPS (alongside some other stuff that doesn't really matter) rather than having it as a local file with my Python program.
Therefore, is it possible for me to connect to the SQLite database which is hosted on my VPS, or will the SQLite DB have to be hosted locally for me to be able to do this?
The reason I want it accessible from my VPS is that I want to run the program on multiple computers and have them all share access to the same database. If this isn't possible, are there any other options that would allow me to do this?
If you want a database server that external, possibly remote, applications interact with over a client-server protocol, switch to PostgreSQL, MariaDB, etc.
see: How to connect to SQLite3 database server?
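A minimal sketch of what that client-server alternative looks like from Python, assuming a PostgreSQL server running on the VPS; the host and credentials are placeholders, and the third-party psycopg2 package is required:

    # Any number of computers can run this against the same server.
    import psycopg2  # pip install psycopg2-binary

    conn = psycopg2.connect(
        host="my-vps.example.com",  # the VPS now runs a database *server*
        port=5432,
        dbname="appdb",
        user="appuser",
        password="secret",
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT 42")
        print(cur.fetchone())  # (42,)
    conn.close()

Every machine that runs the program connects the same way, and the server coordinates concurrent access, which is exactly what a file-based SQLite database cannot do.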
I am trying to deploy a Django REST API on Heroku. Normally I wouldn't have any issues with this but for this app, I am using a legacy database that exists on AWS. Is it possible for me to continue to use this remote database after deploying Django to Heroku? I have the database credentials all set up in settings.py so I would assume that it should work but I am not sure.
It should not pose any problem to connect to a database on AWS.
But be sure that the database on AWS is configured to accept external access, so that Heroku can connect.
And I would suggest that you take the credentials out of the source code and put them in the Config Vars that Heroku provides (environment variables).
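A sketch of what that looks like, assuming variable names such as DB_HOST that you would choose yourself:

    # settings.py -- read credentials from the environment instead of
    # hard-coding them; the variable names here are placeholders.
    import os

    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.mysql",  # or whichever engine the legacy DB uses
            "NAME": os.environ["DB_NAME"],
            "USER": os.environ["DB_USER"],
            "PASSWORD": os.environ["DB_PASSWORD"],
            "HOST": os.environ["DB_HOST"],  # the AWS endpoint
            "PORT": os.environ.get("DB_PORT", "3306"),
        }
    }

Then set the values once with "heroku config:set DB_HOST=... DB_NAME=..." and they never appear in the repository.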
Will it work? I think yes, provided you configure your project and database for external access.
Should you want it? How many queries does an average page execute? Some applications make tens of queries for every endpoint, and the added network latency compounds: 30 queries at 100 ms of round trip each already means 3 seconds of waiting for every request.
We are developing a B2B application with Django. For each client, we launch a new virtual server machine and a database, so each client has a separate installation of our application. (We do so because, by the nature of our application, one client may require heavy use of resources at certain times, and we do not want one client's state to affect the others.)
Each of these installations is bound to a central repository. If we update the application code, when we push to the master branch, all installations detect this, pull the latest version of the code and restart the application.
If we update the database schema, on the other hand, we currently need to run migrations manually by connecting to each DB instance one by one (settings.py reads the database settings from an external file which is not in the repo; we add this file manually upon installation).
Can we automate this process? i.e. given a list of databases, is it possible to run migrations on these databases with a single command?
If we update the application code, when we push to the master branch, all installations detect this, pull the latest version of the code and restart the application.
I assume that you have some sort of automation to pull the code and restart the web server. You can just add the migration to this automation process. Each server's settings.py would read the database details from the external file and run the migration for you.
So the flow should be something like:
Pull the code
Migrate
Collect static files
Restart the web server
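A sketch of that flow as a plain Python deploy hook; the paths, branch, and restart command are placeholders for whatever your current automation uses:

    # deploy.py -- run on each installation after a push is detected.
    import subprocess

    STEPS = [
        ["git", "pull", "origin", "master"],                    # pull the code
        ["python", "manage.py", "migrate", "--noinput"],        # apply migrations
        ["python", "manage.py", "collectstatic", "--noinput"],  # collect static files
        ["sudo", "systemctl", "restart", "gunicorn"],           # restart (placeholder service)
    ]

    for step in STEPS:
        subprocess.run(step, check=True)  # abort the deploy on the first failure

Because settings.py already reads each installation's database details from the external file, the migrate step automatically targets the right database on every server.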
First, I'd look very hard for a way to launch a script on the client side that does what masnun suggests.
Second, if that does not work, then I'd try the following:
Configure on your local machine all client databases in the settings variable DATABASES
Make sure you can connect to all the client databases, this may need some fiddling
Then you run the "manage.py migrate" command with the extra flag --database=mydatabase (where "mydatabase" is the alias provided in the configuration) for EACH client database
I have not tried this, but I don't see why it wouldn't work ...
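A sketch of what that would look like, with placeholder aliases, hosts, and credentials; each entry in DATABASES mirrors one client's external settings file:

    # settings.py on your local machine
    DATABASES = {
        "default": {},  # no local default needed for this maintenance task
        "client_a": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "appdb",
            "USER": "admin",
            "PASSWORD": "secret",
            "HOST": "client-a.example.com",
            "PORT": "5432",
        },
        "client_b": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "appdb",
            "USER": "admin",
            "PASSWORD": "secret",
            "HOST": "client-b.example.com",
            "PORT": "5432",
        },
    }

    # migrate_all.py -- run migrate once per alias with a single script
    import subprocess

    for alias in ["client_a", "client_b"]:
        subprocess.run(
            ["python", "manage.py", "migrate", "--database", alias],
            check=True,
        )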
I am a newbie to Python and Django, but this time I need a fast solution. I've got problems with the hosting where my Django application is deployed, so I need to migrate to another server, but I have no SSH or Telnet access to the server, only an FTP connection. I need to export the data from the Django database. I wanted to write a script and put it somewhere in the Django application for the data export, but when I upload my modifications to the server, the behavior does not change (as if nothing changed). Also, when I remove .pyc files from Django (for example views.pyc), there are no changes, and when I remove a .py file (for example views.py), nothing changes either.
As far as I have read about Django, it is possible that the server is running with the "--noreload" option.
So the question is: is there any way to dump the database using only FTP and Django/Python?
(remote connection via MySQL is disabled)
FTP stands for "file transfer protocol", not "remote shell", so no, you cannot use FTP to execute a command, program, script, or anything else. But why don't you just ask your hosting provider how to get a dump of your data?
I have the following setup:
a restricted system that contains an SQLite database and is able to run Python.
a usual PC system.
My aim is to write an application (preferably Java) for the PC system that connects to the SQLite database on the remote system to read, alter, etc. its tables. Unfortunately, I'm not able to install a web server on the remote system because system restrictions deny this intention. So I have been asking myself if it is possible to connect to the database anyway. I thought of something like a Python connection wrapper that redirects all database calls. I hope someone can give me a hint for solving this problem.
SQLite is not a "DB server" but a "DB file". You can't connect to it remotely because it is not a server (you would have to write your own server). You can copy the file with the data to another computer and use it, but then you get two separate databases. If you can share the folder with that file over the network, you can use it on all computers, but there can be problems with concurrent writing: SQLite is not designed to work with many users at the same time.
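For completeness, here is a minimal sketch of the kind of server you would have to write yourself, using only the Python standard library (so nothing has to be installed on the restricted system). It accepts one SQL statement per HTTP POST and returns the rows as JSON; the port and database path are placeholders, and there is no authentication whatsoever, so treat it as an illustration, not a production design:

    import json
    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer

    DB_PATH = "data.sqlite3"  # placeholder path to the existing database file

    class SQLiteHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            # The request body is a single SQL statement.
            length = int(self.headers["Content-Length"])
            sql = self.rfile.read(length).decode("utf-8")
            conn = sqlite3.connect(DB_PATH)
            try:
                rows = conn.execute(sql).fetchall()
                conn.commit()
                body = json.dumps(rows).encode("utf-8")
                self.send_response(200)
            except sqlite3.Error as exc:
                body = str(exc).encode("utf-8")
                self.send_response(400)
            finally:
                conn.close()
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("0.0.0.0", 8765), SQLiteHandler).serve_forever()

A Java application on the PC can then talk to the database with ordinary HTTP POST requests, though the single SQLite file still limits concurrent writers.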