I know you can easily import a YAML file into a Django database (to populate the database before starting the project, for instance), but how can I do the opposite, i.e. save the complete database into a single .yaml file?
I read there is a way to export one single table into a file:
from django.core import serializers

YAMLSerializer = serializers.get_serializer("yaml")
yaml_serializer = YAMLSerializer()
with open("file.yaml", "w") as out:
    yaml_serializer.serialize(SomeModel.objects.all(), stream=out)
but I need to do it for the complete database (which has many tables with complex relations between them).
I could write a script to do that myself, but I don't want to redo something that has probably already been done, and I wouldn't know the best way to structure the output so that Django has no difficulty reading it back afterwards.
So far, I've been working with the SQLite3 database engine.
Any ideas?
You need the dumpdata management command.
pip install pyyaml
python manage.py dumpdata --format=yaml > /path/to/dump_file.yaml
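If you prefer to drive this from Python (from a cron job or deployment script, say) instead of the shell, the same management commands are available through call_command; here is a minimal sketch, where the dump file path is an assumption:

from django.core.management import call_command

# Dump every installed app's data as YAML (mirrors the shell command above)
with open("dump_file.yaml", "w") as out:
    call_command("dumpdata", format="yaml", stdout=out)

# ...and load it back into a fresh database later:
call_command("loaddata", "dump_file.yaml")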
I know the above question is incomplete. Let me explain briefly. I downloaded the repository from this GitHub link https://github.com/datacharmer/test_db and, as per the instructions in the readme file, I tried to create a database locally, but I got a syntax error when I ran the command mysql < employees.sql. I tried from both the Windows CLI and the MySQL CLI. Can someone help me create a MySQL database using the above GitHub data?
Thanks in advance
The SQL queries in that repository are quite destructive, many starting with a DROP DATABASE, which can bite you if you're not paying attention.
Do yourself a favour and create the databases manually. Look at the .sql files and you will see the CREATE DATABASE and CREATE TABLE statements. Run them one by one. This will help you become accustomed to how to create databases and tables in MySQL.
At the bottom of the .sql files, you'll see lines that look like this:
SELECT 'LOADING departments' as 'INFO';
source load_departments.dump ;
What you are interested in is the file name that comes after source. Open those files and you will see the INSERT statements that populate the tables you created in the previous step. This will help you become accustomed to inserting records into your tables.
Once this is done, you will have a new database with tables and data for you to work with. Do not trust just any SQL or bash script with the contents of your database. Ever.
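If you later want to script those manual steps, here is a minimal sketch using mysql-connector-python that runs one statement at a time; the credentials are assumptions, and the statements are paraphrased from the repository's employees.sql and load_departments.dump:

import mysql.connector

# Connect without selecting a database so we can create one first
# (host/user/password are assumptions; use your own credentials)
cnx = mysql.connector.connect(host="localhost", user="root", password="secret")
cur = cnx.cursor()

cur.execute("CREATE DATABASE IF NOT EXISTS employees")
cur.execute("USE employees")
cur.execute(
    "CREATE TABLE IF NOT EXISTS departments ("
    "  dept_no CHAR(4) NOT NULL,"
    "  dept_name VARCHAR(40) NOT NULL,"
    "  PRIMARY KEY (dept_no))"
)

# An INSERT in the style of load_departments.dump
cur.execute(
    "INSERT INTO departments (dept_no, dept_name) VALUES (%s, %s)",
    ("d001", "Marketing"),
)
cnx.commit()
cur.close()
cnx.close()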
I already have a MySQL database with a lot of data, whose tables and migrations are written in SQL. Now I want to use the same MySQL database in Django so that I can use its data. I expect there will be no need to make migrations, since I am not going to write the models again in Django. What changes/modifications will I have to make, e.g. to middlewares? Can anyone please help me with this?
From what I know, there is no 100% automatic way to achieve that.
You can use the following command
python manage.py inspectdb
It will generate a list of unmanaged models that you can export to a models.py file and integrate into your Django project.
However, it is not magical and there are a lot of edge cases, so the generated models should be inspected manually before being integrated.
More info here: https://docs.djangoproject.com/en/3.0/ref/django-admin/#django-admin-inspectdb
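For illustration, the output for a simple table looks roughly like the sketch below; the table and field names here are hypothetical:

# Example of what inspectdb emits for one table
from django.db import models

class Customer(models.Model):
    name = models.CharField(max_length=100, blank=True, null=True)
    email = models.CharField(max_length=254, blank=True, null=True)

    class Meta:
        managed = False  # Django will not create, alter, or drop this table
        db_table = 'customer'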
I am using Flask-SQLAlchemy to create a db, which in turn creates an app.db file to store tables and data. For the backup it should be simple to just copy app.db somewhere on the server. But suppose the app is writing data to app.db while we make the copy; then we might end up with an inconsistent app.db file.
How do we maintain the consistency of the backup? I could implement locks to do so, but I was wondering about standard, well-established solutions for database backup and how this is implemented in Python.
SQLite has the backup API for this, but it was not exposed by the built-in Python driver until Python 3.7 (sqlite3.Connection.backup()).
You could
use the APSW library for the backup; or
execute the .backup command in the sqlite3 command-line shell; or
run BEGIN IMMEDIATE to prevent other connections from writing to the DB, and copy the file(s).
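On Python 3.7+ the first option no longer needs a third-party library; a minimal sketch using the built-in driver, where the file names are assumptions:

import sqlite3

# Take a consistent snapshot of a live database via the backup API;
# pages are copied safely even if other connections are writing.
src = sqlite3.connect("app.db")
dst = sqlite3.connect("app_backup.db")
try:
    with dst:
        src.backup(dst)
finally:
    src.close()
    dst.close()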
I also wanted to create backups of my SQLite file, which stores the data from my Flask app.
Since my Flask site doesn't have too many users and the database doesn't carry more than 4 small-to-medium-size tables, I did the following:
#Using Linux cmd
cd pathOfMyDb
nano backupDbRub.sh # backupDbRub.sh is the name of the script that will be called periodically
The nano text editor will open a file in which you can write the following:
#!/bin/bash
# Builds a query like: .backup "2020.01.31-12.00.00_site.db"
current_time=$(date "+%Y.%m.%d-%H.%M.%S")
sep='"'
file_name='_site.db"'
functionB='.backup '
queryWrite="$functionB$sep$current_time$file_name"
echo "$queryWrite"
sqlite3 yoursqlitefile.db "$queryWrite"
Basically, this script builds a .backup command ($queryWrite) that copies the current database to a file stamped with the current date and time. Now in Unix you will have to modify the crontab file.
Write in the Unix terminal
crontab -e
In this new file, as explained in this post, you should add the periodicity (in my case, for testing purposes, I wanted to run it every minute to check that it was working):
* * * * * /backupDbRub.sh
Note that the copies will appear in the folder you chose in the $queryWrite sentence, so you can find them there.
In case you want to test that the script works, go to the folder where the script lives and run
./backupDbRub.sh
This should print the 'echo' sentence from your file. If the script does not run, it may be missing execute permissions, which you can add with 'chmod +x backupDbRub.sh'.
I am building a database interface using Python's peewee module. I am trying to figure out how to insert data into an existing database where I do not know the schema.
My idea is to use playhouse.reflection.Introspector to find out the database schema, then use that information to create class objects which can then be inserted into the existing database.
So far I've gotten to:
from playhouse.reflection import Introspector

introspector = Introspector.from_database(database)
models = introspector.generate_models()  # dict mapping table names to Model classes
I don't know where to go from there.
1) Can I create database objects in this manner? What is the next step?
2) Is there an easier way to do this?
peewee includes an introspection tool called pwiz that can (basically) introspect a database and produce model definitions. It is run as a command-line script and dumps the model definitions to stdout, so invocation is like any other Unix tool. Here is an example from the docs:
python -m pwiz -e postgresql my_postgres_db > mymodels.py
From there edit mymodels.py to get what you need.
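For a database with a single user table, the generated mymodels.py looks roughly like the sketch below; the table and column names are hypothetical:

from peewee import *

database = PostgresqlDatabase('my_postgres_db')

class BaseModel(Model):
    class Meta:
        database = database

class User(BaseModel):
    email = CharField()
    username = CharField()

    class Meta:
        table_name = 'user'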
You could do this on the fly, but it would require a few steps and is hackish (not to mention pointless if you really don't know anything about the schema); see the sketch after this list:
Run pwiz as an OS command
Read its output to pick out the model names
Import whatever you find
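A minimal sketch of that on-the-fly approach, where the database name is an assumption:

import importlib.util
import subprocess

# Run pwiz and capture the generated model definitions from stdout
source = subprocess.run(
    ["python", "-m", "pwiz", "-e", "postgresql", "my_postgres_db"],
    capture_output=True, text=True, check=True,
).stdout

with open("mymodels.py", "w") as fh:
    fh.write(source)

# Import the freshly generated module
spec = importlib.util.spec_from_file_location("mymodels", "mymodels.py")
mymodels = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mymodels)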
BUT
If you really don't know the schema to start with then you have no idea what the semantics of the database are anyway, which means whatever you find is literally meaningless. Unless you at least know some schema/table/column names you are hunting for (in which case you do know something about the schema) there isn't really much you can do with regard to inserting data (not in a sane way), though you could certainly dump data from the db. But if you just wanted a database dump then pg_dump would have been easier.
I suspect this is actually an X-Y problem. What problem is it you are trying to solve by using this technique? What effect is it supposed to achieve within the context of your system?
If you want to create a GUI, check out the sqlite_web project. It uses Peewee to create a web-based SQLite database manager.
I have a requirement where I need to insert Postgres data into MySQL. Suppose I have a user table in Postgres and a user table in MySQL as well. I tried to do something like this:
gts = 'cd ' + js_browse[0].js_path  # gts prints the correct folder name: /usr/local/myfolder_name
os.system(gts)
gts_home = 'export GTS_HOME=' + js_browse[0].js_path
os.system(gts_home)
tt = gts + ' && sh bin/admin.sh User --input-dir /tmp/import'
# inside /tmp/import I store my postgres user table data
# bin is the folder inside myfolder_name
In MySQL, if I use the commands directly, it works perfectly fine:
cd /usr/local/myfolder_name
bin/admin.sh User -account=1 user=hamid -create
I am unable to store data inside MySQL this way. Any help will be appreciated.
You don't really give us much information, and why would you go from Postgres to MySQL?
But you can use one of these tools; I have seen people speak well of them:
pg2mysql or pgs2sql
Hope it works out.
PostgreSQL provides the ability to dump data in CSV format using the COPY command.
The easiest path for you will be to spend time once copying the schema objects from PostgreSQL to MySQL; you can use pg_dump -s for this on the PostgreSQL side. IMHO, moving the schemas over properly will be the biggest challenge.
Then you should import the CSV-formatted data dumps into MySQL; check this for reference. Scrolling down to the comments, you'll find recipes for Windows as well. Something like this should do the trick (adjust the parameters accordingly):
LOAD DATA LOCAL INFILE 'C:/test.csv'
INTO TABLE tbl_temp_data
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
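If you would rather script the PostgreSQL side of that CSV dump, here is a minimal sketch with psycopg2; the connection parameters and table name are assumptions:

import psycopg2

# Dump one table to the CSV file consumed by LOAD DATA above
conn = psycopg2.connect(dbname="mydb", user="postgres",
                        password="secret", host="localhost")
cur = conn.cursor()
with open("test.csv", "w") as f:
    cur.copy_expert("COPY my_user_table TO STDOUT WITH (FORMAT csv)", f)
cur.close()
conn.close()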