I am a newbie to Python and Django, but this time I need a fast solution. I've got a problem with the hosting where my Django application is deployed, so I need to migrate to another server, but I have no SSH or Telnet access to the server, only an FTP connection. I need to export the data from the Django database. I wanted to write a script and put it somewhere in the Django application to export the data, but when I upload my modification to the server, the behavior does not change (as if nothing had changed). Also, when I remove .pyc files from Django (for example views.pyc), nothing changes, and when I remove a .py file (for example views.py), nothing changes either.
As far as I have read about Django, it is possible that the server is running with the "--noreload" option.
So the question is: is there any possible way to dump the database using only FTP and Django/Python?
(Remote MySQL connections are disabled.)
FTP stands for "file transfer protocol", not "remote shell", so no, you cannot use FTP to execute a command, program, or script. But why don't you just ask your hosting provider how to get a dump of your data?
We are developing a b2b application with django. For each client, we launch a new virtual server machine and a database. So each client has a separate installation of our application. (We do so because by the nature of our application, one client may require high use of resources at certain times, and we do not want one client's state to affect the others)
Each of these installations is bound to a central repository. If we update the application code, when we push to the master branch, all installations detect this, pull the latest version of the code and restart the application.
If we update the database schema, on the other hand, we currently need to run migrations manually by connecting to each db instance one by one (the settings.py file reads the database settings from an external file which is not in the repo; we add this file manually upon installation).
Can we automate this process? i.e. given a list of databases, is it possible to run migrations on these databases with a single command?
If we update the application code, when we push to the master branch,
all installations detect this, pull the latest version of the code and
restart the application.
I assume that you have some sort of automation to pull the code and restart the web server. You can just add the migration to this automation process. Each server's settings.py would read the database details from the external file and run the migration for you (see the sketch after the list below).
So the flow should be something like:
Pull the code
Migrate
Collect Static
Restart the web server
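As a minimal sketch of that automation step, assuming the application lives in /srv/myapp and runs under a systemd unit named "myapp" (both names are illustrative), something like this could be triggered whenever a push to master is detected:

    # deploy.py - illustrative sketch of the pull/migrate/collectstatic/restart flow.
    # APP_DIR and the "myapp" systemd unit are hypothetical; settings.py is assumed
    # to read the database details from the external file, as described above.
    import subprocess

    APP_DIR = "/srv/myapp"  # hypothetical install location

    def run(*cmd):
        # Run a command inside APP_DIR and raise if it fails.
        subprocess.run(cmd, cwd=APP_DIR, check=True)

    run("git", "pull", "origin", "master")                    # pull the code
    run("python", "manage.py", "migrate", "--noinput")        # migrate
    run("python", "manage.py", "collectstatic", "--noinput")  # collect static
    run("sudo", "systemctl", "restart", "myapp")              # restart the web server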
First, I'd look really hard for a way to launch a script on the client side that does what masnun suggests.
Second, if that does not work, then I'd try the following:
Configure all the client databases on your local machine in the DATABASES settings variable
Make sure you can connect to all the client databases, this may need some fiddling
Then run the "manage.py migrate" process with the extra flag --database=mydatabase (where "mydatabase" is the alias provided in the configuration) for EACH client database
I have not tried this, but I don't see why it wouldn't work ...
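The loop itself would be short; a sketch, assuming each client database has been added to DATABASES under its own alias (the aliases here are placeholders):

    # migrate_all.py - sketch: run migrations against every configured database.
    # Assumes DJANGO_SETTINGS_MODULE is set and settings.DATABASES contains one
    # entry per client, e.g. "default", "client_a", "client_b" (placeholders).
    import django
    from django.conf import settings
    from django.core.management import call_command

    django.setup()

    for alias in settings.DATABASES:
        print("Migrating %s ..." % alias)
        call_command("migrate", database=alias, interactive=False)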
I'm working on a Python application that requires database connections. I developed my application with sqlite3, but it started showing the error "the database is locked", so I decided to use a MySQL database instead, and it works well with no errors.
The only problem is that I need to ask every user of my application to install a MySQL server on their PC (AppServ, for example).
So can I make MySQL be like sqlite3, part of the Python library, so that I can produce a Python script that can be converted into an exe file by pyInstaller.exe, with no need for users to install a MySQL server?
Update:
After reviewing the code, I found an opened connection that was not closed correctly; it now works fine with sqlite3. Thank you, everybody.
It depends (more "depends" in the answer).
If you need to share the data between the users of your application, you need a MySQL database server set up somewhere that your application can access. Performance can then really depend on the network, and on how heavily the application uses the database. The application itself only needs to know how to "speak" with the database server - a Python MySQL driver such as MySQLdb or pymysql.
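To illustrate that last point, the application side is just the driver; a minimal pymysql sketch, with a placeholder host and credentials:

    import pymysql  # pip install pymysql

    # Host and credentials below are placeholders.
    conn = pymysql.connect(host="db.example.com", user="appuser",
                           password="secret", database="appdb")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT VERSION()")
            print(cur.fetchone())
    finally:
        conn.close()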
If you don't need to share the data between users, then sqlite may be an option. Or maybe not - it depends on what you want to store, what for, and what you need to do with the data.
So, more questions than answers, probably it was more suitable for a comment. At least, think about what I've said.
Also see:
https://stackoverflow.com/questions/1009438/which-database-should-i-use-for-my-desktop-application
Python Desktop Application Database
Python Framework for Desktop Database Application
Hope that helps.
If your application is a stand-alone system such that each user maintains their own private database, then you have no alternative but to install MySQL on each system that runs the application. You cannot bundle MySQL into your application in a way that avoids a separate installation.
There is an embedded version of MySQL that you can build into your application (thanks, Carsten, in the comments, for pointing this out). More information is here: http://mysql-python.blogspot.com/. It may take some effort to get this working (on Windows you apparently need to build it from source code) and will take some more work to get it packaged up when you generate your executable, but this might be a MySQL solution for you.
I've just finished updating a web application using SQLite which had begun reporting "database is locked" errors as usage scaled up. By rewriting the database code with care, I was able to produce a system that can handle moderate to heavy usage (in the context of a 15-person company) reliably while still using SQLite - you have to be careful to keep your connections around for the minimum time necessary and always call .close() on them. If your application is really single-user, you should have no problem supporting it with SQLite - and that's doubly true if it's single-threaded.
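A sketch of the pattern I mean - open the connection as late as possible and guarantee it is closed (the table and path here are illustrative):

    import sqlite3
    from contextlib import closing

    DB_PATH = "app.db"  # illustrative path

    def fetch_user(user_id):
        # Hold the connection only for the duration of this one query;
        # lingering open connections are what cause "database is locked".
        with closing(sqlite3.connect(DB_PATH, timeout=10)) as conn:
            cur = conn.execute("SELECT name FROM users WHERE id = ?", (user_id,))
            return cur.fetchone()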
I have the following setup:
a restricted system that contains an SQLite database and is able to use Python.
a usual PC system.
My aim is to write an application (preferably Java) for the PC system that connects to the SQLite database on the remote system to read, alter, etc. its tables. Unfortunately, I'm not able to install a web server on the remote system because system restrictions deny this intention. So I have been asking myself if it is possible to connect to the database anyway. I thought of something like a Python connection wrapper that redirects all database calls. I hope someone can give me a hint for solving this problem.
SQLite is not a "db server" but a "db file". You can't connect to it remotely because it is not a server (you would have to write your own server). You can copy the file with the data to another computer and use it, but then you get two separate databases. If you can share the folder with that file over the network, you can use it on all computers, but there can be problems with concurrent writing: SQLite is not designed to work with many users at the same time.
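If writing your own tiny server is acceptable, Python's standard library is enough (no web server installation required); a deliberately naive sketch with no authentication or input validation, so treat it as an illustration only:

    # sqlite_http.py - illustrative "SQLite over HTTP" wrapper, stdlib only.
    # Executes whatever SQL is POSTed and returns the rows as JSON. There is
    # NO authentication or input checking; do not expose this as-is on an
    # untrusted network. DB_PATH is a placeholder.
    import json
    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer

    DB_PATH = "data.db"

    class QueryHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers["Content-Length"])
            sql = self.rfile.read(length).decode("utf-8")
            conn = sqlite3.connect(DB_PATH)
            try:
                rows = conn.execute(sql).fetchall()
                conn.commit()
            finally:
                conn.close()
            body = json.dumps(rows).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("0.0.0.0", 8080), QueryHandler).serve_forever()

A Java client could then talk to it with plain HTTP requests.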
I'm writing a web application in Python (on Apache server on a Linux system) that needs to connect to a Postgres database. It therefore needs a valid password for the database server. It seems rather unsatisfactory to hard code the password in my Python files.
I did wonder about using a .pgpass file, but it would need to belong to the www-data user, right? By default, there is no /home/www-data directory, which is where I would have expected to store the .pgpass file. Can I just create such a directory and store the .pgpass file there? And if not, then what is the "correct" way to enable my Python scripts to connect to the database?
No matter what approach you use, other apps running as www-data will be able to read your password and log in as you to the database. Using peer auth won't help you out, it'll still trust all apps running under www-data.
If you want your application to be able to isolate its data from other applications, you'll need to run it as a separate user ID. The main approaches to this are:
Use the apache suexec module to run scripts as a separate user;
Use fast-cgi (fcgi) or scgi to run the cgi as a different user; or
Have the app run its own minimal HTTP server and have Apache reverse proxy for it
Of these, by far the best option is usually to use scgi/fcgi. It lets you easily run your app as a different unix user but avoids the complexity and overhead of reverse proxying.
Install the application and its config files in its own directory different from the static files directory and only readable by the application user.
Set another user to run the application and use the WSGIDaemonProcess directive.
All of that and much more is clearly described on the mod_wsgi site, in the Quick Configuration Guide, Configuration Guidelines and Configuration Directives.
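For orientation only, the relevant directives might look something like this - the user, group and paths are placeholders, and the mod_wsgi documentation linked above is the authoritative reference:

    # Illustrative only - names and paths are placeholders.
    WSGIDaemonProcess myapp user=myappuser group=myappgroup python-path=/opt/myapp
    WSGIProcessGroup myapp
    WSGIScriptAlias / /opt/myapp/myapp/wsgi.py

    <Directory /opt/myapp/myapp>
        Require all granted
    </Directory>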
I'm pretty new to Python, but I have been running a few programs locally using Komodo Edit and then uploading the results manually to my website's MySQL database.
I'm looking into letting Python do this on its own, but as I understand it, I would have to open my MySQL database to anyone, regardless of whether they are running scripts on my server, if I'm to do this.
I'm guessing this is for security reasons, but I don't know how vulnerable this could make my site. Is it a bad idea to do it this way, or would it be better to run my Python program from the server itself? (I've never run Python code from my server, and my Python code, too, might be insecure.)
If you have access to the entire server (i.e. not just the hosting directory, as is common on some shared hosting setups) and can SSH into the server, then your safest (though not easiest) option is to place the script on the server outside of the web hosting folder. This will stop anyone from remotely accessing the script and will let you connect to the db without enabling remote connections.
You could enable remote connections if your hosting setup allows it (I'm not sure whether any hosting companies disable or prevent it, though you may have to enable it from the start when you create the database). Just select a nice strong password. Then you can use your script locally, and you'd be as secure as your password.
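If you do go that route, the local script connects directly over the network; a minimal sketch for uploading results, with placeholder host, account and rows:

    import pymysql  # pip install pymysql

    # Placeholders throughout - use your real host, a dedicated account with
    # minimal privileges, and a strong password.
    conn = pymysql.connect(host="mysql.example.com", user="uploader",
                           password="a-long-random-password", database="site")
    try:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO results (run_id, value) VALUES (%s, %s)",
                [(1, 3.14), (2, 2.72)],  # illustrative rows
            )
        conn.commit()
    finally:
        conn.close()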