We are developing a B2B application with Django. For each client, we launch a new virtual server machine and a database, so each client has a separate installation of our application. (We do this because, by the nature of our application, one client may require heavy resource use at certain times, and we do not want one client's state to affect the others.)
Each of these installations is bound to a central repository. When we update the application code and push to the master branch, all installations detect this, pull the latest version of the code and restart the application.
If we update the database schema, on the other hand, we currently have to run the migrations manually, connecting to each database instance one by one (each installation's settings.py reads its database settings from an external file that is not in the repo; we add this file manually upon installation).
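For reference, the relevant part of our settings.py looks roughly like this (simplified; the file name db_config.json, its keys and the Postgres engine are just illustrative):

    # settings.py (simplified): credentials come from a file that is not in the repo
    import json
    from pathlib import Path

    BASE_DIR = Path(__file__).resolve().parent.parent

    # db_config.json is dropped onto each server manually at installation time
    with open(BASE_DIR / "db_config.json") as f:
        _db = json.load(f)

    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": _db["name"],
            "USER": _db["user"],
            "PASSWORD": _db["password"],
            "HOST": _db["host"],
            "PORT": _db.get("port", "5432"),
        }
    }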
Can we automate this process? I.e., given a list of databases, is it possible to run the migrations on all of them with a single command?
If we update the application code, when we push to the master branch, all installations detect this, pull the latest version of the code and restart the application.
I assume you have some sort of automation in place to pull the code and restart the web server. You can just add the migration to this automation process. Each server's settings.py reads the database details from the external file, so running the migration there will apply it to that server's database for you.
So the flow should be something like:
Pull the code
Migrate
Collect Static
Restart the web server
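A minimal sketch of what that automation step could look like, assuming a project checkout at /srv/app and a systemd-managed service called gunicorn (both placeholders; adapt the paths, service name and commands to your setup):

    #!/usr/bin/env python3
    # deploy.py (sketch): run on each installation after a push to master
    import subprocess

    PROJECT_DIR = "/srv/app"     # placeholder: wherever the checkout lives
    SERVICE_NAME = "gunicorn"    # placeholder: whatever serves the Django app

    def run(*cmd):
        # Stop the deploy immediately if any step fails
        subprocess.run(cmd, cwd=PROJECT_DIR, check=True)

    run("git", "pull", "origin", "master")                    # pull the code
    run("python", "manage.py", "migrate", "--noinput")        # migrate
    run("python", "manage.py", "collectstatic", "--noinput")  # collect static
    run("sudo", "systemctl", "restart", SERVICE_NAME)         # restart the web server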
First, I'd look really hard for a way to launch a script on the client side that does what masnun suggests.
Second, if that does not work, then I'd try the following:
Configure all the client databases on your local machine in the DATABASES setting
Make sure you can connect to all the client databases; this may need some fiddling
Then run the "manage.py migrate" process with the extra flag --database=mydatabase (where "mydatabase" is the alias defined in DATABASES) for EACH client database
I have not tried this, but I don't see why it wouldn't work ...
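Untested, but roughly what I have in mind; the aliases, engine and connection details below are placeholders:

    # settings.py on your local machine: one alias per client (values are placeholders)
    DATABASES = {
        "default": {},  # Django wants this key; it can stay empty if unused
        "client_a": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "client_a_db",
            "USER": "app",
            "PASSWORD": "secret",
            "HOST": "client-a.example.com",
            "PORT": "5432",
        },
        "client_b": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "client_b_db",
            "USER": "app",
            "PASSWORD": "secret",
            "HOST": "client-b.example.com",
            "PORT": "5432",
        },
    }

    # migrate_all.py: the equivalent of "manage.py migrate --database=..." per client
    import django
    from django.conf import settings
    from django.core.management import call_command

    django.setup()  # assumes DJANGO_SETTINGS_MODULE points at the settings above

    for alias in settings.DATABASES:
        if alias == "default":
            continue
        print(f"Migrating {alias} ...")
        call_command("migrate", database=alias, interactive=False)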
Related
I am developing a Django application locally in Atom, and then, when I'm ready, I sync those changes to GitHub and run a git push command on PythonAnywhere. The problem is that, at some point, those changes pushed from my local development environment overwrote all my live users with just the local dummy users I've been using for testing and development.
Basically, I'm testing the login and registration system locally, logging in and registering with lots of dummy emails. Once I was happy it was working, I synced the Django code I'd changed to GitHub (with the desktop app) and then ran a git push command in a PythonAnywhere (my server) console. The SQLite DB is included in those updates/syncs. Is that correct, or should it just be totally ignored?
I just realised that one (perhaps all?) of those pushes has overwritten my SQLite DB, and there were perhaps 30 or so actual users who had signed up on the website whose data is no longer registered on the site. I managed to find a trace of them in the Django admin logs, and I've found the version history of the SQLite DB on GitHub, but my question is: how do I avoid this happening?
What is the workflow to avoid this situation in the future? And is there a command I can run in a shell to get those users back into my 'live' database from the backed-up SQLite file?
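To give an idea of what I mean, this is the kind of thing I'm imagining, though I have no idea if it's right: point a second database alias at the backed-up file (saved here as backup.sqlite3, just a name I made up) and copy the users across:

    # settings.py: a second alias pointing at the backed-up SQLite file
    DATABASES["backup"] = {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "backup.sqlite3",  # wherever the restored backup is saved
    }

    # then, e.g. from "manage.py shell", dump the users out of the backup
    # and load them into the live database
    from django.core.management import call_command

    with open("users.json", "w") as out:
        call_command("dumpdata", "auth.User", database="backup", stdout=out)
    call_command("loaddata", "users.json", database="default")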
I know this is a simple question, but I'm new to development, and while I'm slowly getting there with troubleshooting the code, versioning, Git and workflow are tricky things to get my head around.
I have a Flask (Python) application running on Passenger which works absolutely fine when I test it and I'm the only user.
But as soon as I attempt to have several concurrent connections, each client waits forever for a response. I have tried it with 50 concurrent users, which seems like it should easily be supported.
The app is very simple, reading and writing to a SQLite database once or twice. (Concurrent access of SQLite by this small number of users is not a problem.)
What am I missing?
In the Passenger docs it makes the following suggestion:
Passenger also supports the magic file 'tmp/always_restart.txt'. If this file exists, Passenger will restart your application after every request. This way you do not have to invoke the restart command often.
Activate this mechanism by creating the file:
$ mkdir -p tmp
$ touch tmp/always_restart.txt
This is great for development because it means you only need to save your Python files for the latest version of the app to be available to clients.
But it's terrible for production, because every client request restarts the Python app. This is a major overhead for the server, so users are likely to time out before receiving a response.
Delete the file tmp/always_restart.txt and your concurrency limits will shoot up.
I have a Python Django application that I published to Heroku by connecting it to GitHub. I want some people to be able to add information to the database from the website. If I make changes to the code, push to GitHub and deploy the branch, the database goes back to how it was at first. How can I update the code for the app without changing the database?
If you host your database on a separate server, like with Amazon RDS or Heroku Postgres, and configure your code to connect to this remote host, you should have sufficient decoupling to avoid what you are talking about.
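A minimal sketch of that setup in settings.py, assuming a Heroku Postgres add-on that exposes a DATABASE_URL environment variable and the dj-database-url package:

    # settings.py (sketch): read the remote database from the DATABASE_URL env var,
    # so the data lives outside the dyno and survives every deploy
    import dj_database_url

    DATABASES = {
        "default": dj_database_url.config(
            default="sqlite:///db.sqlite3",  # local fallback when DATABASE_URL is unset
            conn_max_age=600,
        )
    }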
I am a newbie to Python and Django, but this time I need a fast solution. I've got a problem with the hosting where my Django application is deployed, so I need to migrate to another server, but I have no SSH or Telnet access to the server, only an FTP connection. I need to export the data from the Django database. I wanted to write a script and put it somewhere in the Django application to export the data, but when I put my modification on the server, the behaviour does not change (as if nothing changed). Also, when I remove .pyc files from Django (for example views.pyc) there are no changes, and when I remove the .py file (for example views.py) nothing changes either.
As far as I have read about Django, it is possible that the server is running with the option "--noreload".
So the question is: is there any possible way to dump the database only via FTP and Django/Python?
(remote connection via MySQL is disabled)
FTP stands for "File Transfer Protocol", not "remote shell", so no, you cannot use FTP to execute a command / program / script / whatever. But why don't you just ask your hosting provider how to get a dump of your data?
I'm pretty new to Python, but I have been running a few programs locally using Komodo Edit and then uploading the results manually to my website's MySQL database.
I'm looking into letting Python do this on its own, but as I understand it, I have to open my MySQL database to anyone, regardless of whether they are running scripts on my server, if I'm to do this.
I'm guessing this is for security reasons, but I don't know how vulnerable this could make my site. Is it a bad idea to do it this way, or would it be better to run my Python program from the server itself? (I've never run Python code from my server, and my Python code, too, might be insecure.)
If you have access to the entire server (i.e. not just the hosting directory, as is common on some shared hosting setups) and can SSH into the server, then your safest (though not easiest) option is to place the script on the server outside of the web hosting folder. This will stop anyone from remotely accessing the script, and will let you connect to the DB without enabling remote connections.
You could enable remote connections if your hosting setup allows it (I'm not sure whether any hosting companies disable or prevent it, though you may have to enable it from the start when you create the database). Just select a nice strong password. Then you can use your script locally, and you'd be as secure as your password.
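As a rough sketch (assuming the pymysql package; the host, credentials and table below are made up), the same script works for either option; only the host changes:

    # upload_results.py (sketch): push locally computed results into the MySQL database
    # In practice, read the host/credentials from a config file or environment
    # variables rather than hard-coding them like this.
    import pymysql

    connection = pymysql.connect(
        host="db.example.com",  # use "127.0.0.1" if the script runs on the server itself
        user="myuser",
        password="a-long-random-password",
        database="mysite",
    )

    try:
        with connection.cursor() as cursor:
            cursor.execute(
                "INSERT INTO results (name, score) VALUES (%s, %s)",
                ("example", 42),
            )
        connection.commit()
    finally:
        connection.close()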