I have googled tutorials/examples on how to retrieve data from a PostgreSQL database which already has tables, but most of them use Models. That is, they are tutorials about making a DB from scratch.
But what about a situation where a database already exists? How is retrieval achieved then?
Is it achieved by using psycopg2.cursor?
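For context, this is the kind of direct-cursor access I mean (a minimal sketch; the connection string and the people table are just placeholders):
import psycopg2

# Connect to the existing database (placeholder credentials)
conn = psycopg2.connect("dbname=mydb user=myuser password=secret host=localhost")
cur = conn.cursor()

# Read rows from one of the existing tables (the table name is an example)
cur.execute("SELECT id, name FROM people")
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()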
I have a PostgreSQL database with existing tables. I wish to:
Create a set of Python models (plain classes, SQLAlchemy models or other) based on the existing database
Then manage changes in these models with a migrations tool.
The second part, I think, is easy to achieve as long as I manage to get my initial schema created. How can the first part be achieved?
So, for anyone willing to use SQLAlchemy, I found these two solutions:
Straight with SQLAlchemy reflection and automapping (a sketch of this follows below)
With sqlacodegen
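Roughly, the automap approach looks like this (a minimal sketch, assuming SQLAlchemy 1.4+ and an existing users table; the URL and names are placeholders):
from sqlalchemy import create_engine
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session

# Reflect the existing schema and generate mapped classes from it
engine = create_engine("postgresql://user:password@localhost/mydb")
Base = automap_base()
Base.prepare(autoload_with=engine)

# Mapped classes are exposed by table name (here, a table called "users")
User = Base.classes.users

session = Session(engine)
for user in session.query(User).limit(5):
    print(user.id)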
I need to dynamically create database tables depending on user requirements. So, apart from a few predefined databases, all other databases should be created at runtime after taking the table characteristics (like number of columns, primary key, etc.) from the user.
I read a bit of the docs and know about django.db.connection, but all the examples there only cover adding data to a database, not creating tables. (ref: https://docs.djangoproject.com/en/4.0/topics/db/sql/#executing-custom-sql-directly)
So is there any way to create tables without models in Django? This condition is a must, so if it is not possible with Django, which other framework should I look at?
Note: I am not good at writing questions, so ask if any other info is needed.
Thanks!
You can use inspectdb to automatically generate models from the legacy database. You can read about it here.
Or you can use SQL directly, although you will then have to process the tables yourself in Python. Check it here.
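If you go the raw SQL route, creating a table at runtime can look roughly like this (a minimal sketch; the table name and columns are placeholders built from the user's input, and since identifiers cannot be passed as query parameters you must validate them yourself):
from django.db import connection

# Create a table at runtime from user-supplied characteristics (placeholder schema).
# Identifiers cannot be parameterised, so sanitise table/column names before use.
def create_dynamic_table(table_name, columns_sql):
    with connection.cursor() as cursor:
        cursor.execute(f"CREATE TABLE {table_name} ({columns_sql})")

# Example: a table with an integer primary key and a text column
create_dynamic_table("user_table_1", "id serial PRIMARY KEY, label varchar(100)")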
I'm making an application that will fetch data from an (external) PostgreSQL database with multiple tables.
Any idea how I can use inspectdb only on a SINGLE table? (I only need that table)
Also, the data in the database would be changing continuously. How do I manage that? Do I have to run inspectdb continuously? But what will happen to junk values then?
I think you have misunderstood what inspectdb does. It creates a model for an existing database table. It doesn't copy or replicate that table; it simply allows Django to talk to that table, exactly as it talks to any other table. There's no copying or auto-fetching of data; the data stays where it is, and Django reads it as normal.
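As for limiting it to one table: inspectdb accepts table names as arguments, so you can generate a model for just the table you need. Roughly (assuming a reasonably recent Django; the my_table name and its columns are placeholders):
# Run from the project directory:
#   python manage.py inspectdb my_table > myapp/models.py
#
# The generated model looks something like this:
from django.db import models

class MyTable(models.Model):
    name = models.CharField(max_length=100)
    created_at = models.DateTimeField()

    class Meta:
        managed = False          # Django will not create or alter this table
        db_table = 'my_table'    # maps the model to the existing table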
In my project (based on PostgreSQL) I need to export a few models as an SQLite dump. It must be made 'on demand', e.g. on a user request.
I can prepare an appropriate database manually, but I would like to avoid duplicating the schema information. I dream about a solution like 'dumpdata app-name', but producing SQLite instead of JSON/XML/YAML.
Is there such a solution?
P.S. For those who find this too broad: it's not a broad question. There are only two possibilities: either such a snippet, helper, etc. exists, or it does not and it has to be written individually. I can't find it on my own, so I'm asking for help.
To sum up the details (some people could not figure them out and might put my question 'on hold'):
there is a Django project with a main PostgreSQL database
I'm already processing requests from users (through an API)
one of the requests is "make a dump of some models (tables) in SQLite format"
I can prepare a temporary SQLite database and manually fill it with data (roughly like the sketch below)
I'm looking for a powerful and universal tool (solution) which will do such an export automatically (from some of my Django models to SQLite)
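For reference, the manual approach I mean looks roughly like this (a sketch only; the 'export' database alias, the file path and MyModel are placeholders):
# settings.py: add a second database alias pointing at a temporary SQLite file
DATABASES['export'] = {
    'ENGINE': 'django.db.backends.sqlite3',
    'NAME': '/tmp/export.sqlite3',
}

# in the view handling the request: create the tables, then copy the rows over
from django.core.management import call_command
from myapp.models import MyModel

call_command('migrate', database='export', run_syncdb=True)
for obj in MyModel.objects.all():
    obj.save(using='export')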
I would like to use Pyramid and SQLAlchemy with an already existing MySQL database.
Is it possible to automatically create the models from the MySQL tables? I do not want to write them all by hand.
This could be done either by retrieving the tables and their structure from the server, or by using a MySQL "CREATE TABLE ..." script which contains all the tables.
Thanks in advance,
Linus
In SQLAlchemy you can reflect your database like this:
from sqlalchemy import create_engine, MetaData

# Connect to the existing database and load its schema into the MetaData object
engine = create_engine(uri)
meta = MetaData()
meta.reflect(bind=engine)
Then meta.tables is a dictionary of your tables, keyed by table name.
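For example, you can then query one of the reflected tables like this (a sketch assuming SQLAlchemy 1.4+ and an existing users table):
from sqlalchemy import select

# Look up a reflected table by name and run a simple query against it
users = meta.tables['users']
with engine.connect() as conn:
    for row in conn.execute(select(users).limit(5)):
        print(row)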
By the way, it is described here: http://docs.sqlalchemy.org/en/latest/core/reflection.html
To generate code based on the database tables there are packages such as https://pypi.python.org/pypi/sqlacodegen and http://turbogears.org/2.0/docs/main/Utilities/sqlautocode.html, but I haven't used them.