I have a PostgreSQL database with existing tables. I wish to:
Create a set of Python models (plain classes, SQLAlchemy models, or other) based on the existing database
Then manage changes in these models with a migration tool.
The second part I think is easy to achieve as long as I manage to get my initial schema created. How can this be achieved?
If you are willing to use SQLAlchemy, I found these two solutions (a sketch of the first follows below):
Directly with SQLAlchemy reflection and automapping
With sqlacodegen
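A minimal sketch of the reflection/automap route, assuming SQLAlchemy 1.4+ and a hypothetical connection string (on older versions the call is Base.prepare(engine, reflect=True)):

from sqlalchemy import create_engine
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.orm import Session

# Hypothetical DSN; point it at your existing database.
engine = create_engine("postgresql://user:secret@localhost/mydb")

# Reflect every table and generate one mapped class per table.
Base = automap_base()
Base.prepare(autoload_with=engine)

# Assumes the schema contains a table named "users".
User = Base.classes.users
with Session(engine) as session:
    print(session.query(User).first())

sqlacodegen does the same job offline: run sqlacodegen postgresql://user:secret@localhost/mydb and it prints declarative model classes you can save to a file and then put under a migration tool such as Alembic.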
I need to dynamically create database tables depending on user requirements. So apart from a few predefined databases, all other databases should be created at runtime, after taking the table characteristics (like number of columns, primary key, etc.) from the user.
I read a bit of the docs and know about django.db.connection, but all the examples there are only for adding data to a database, not for creating tables. (ref: https://docs.djangoproject.com/en/4.0/topics/db/sql/#executing-custom-sql-directly)
So is there any way to create tables without models in Django? This condition is a must; if it's not possible with Django, which other framework should I look at?
Note: I am not good at writing questions, so ask if any other info is needed.
Thanks!
You can use inspectdb to automatically generate the models from the legacy database. You can read about it here.
Or you can use SQL directly, although you will then have to process the tables in Python. Check it here.
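For the raw-SQL route, here is a minimal sketch; create_table and its arguments are hypothetical names, and since identifiers cannot be passed as query parameters, validate them strictly before interpolating:

from django.db import connection

def create_table(table_name, columns, primary_key):
    # columns: list of (name, sql_type) pairs collected from the user,
    # e.g. [('title', 'varchar(100)'), ('qty', 'integer')]
    # WARNING: table/column names cannot be parameterized; whitelist them
    # (e.g. require str.isidentifier()) before building the statement.
    col_sql = ", ".join("%s %s" % (name, sql_type) for name, sql_type in columns)
    ddl = "CREATE TABLE %s (%s, PRIMARY KEY (%s))" % (table_name, col_sql, primary_key)
    with connection.cursor() as cursor:
        cursor.execute(ddl)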
I already have a MySQL database with a lot of data, whose tables and migrations are written in SQL. Now I want to use that same MySQL database in Django so that I can use its data. I expect there will be no need to run migrations, since I am not going to rewrite the models in Django; beyond that, what changes/modifications will I have to make (for example, in middleware)? Can anyone please help me with this?
From what I know, there is no 100% automatic way to achieve that.
You can use the following command:
python manage.py inspectdb
It will generate a list of unmanaged models that you can export to a models.py file and integrate into your Django project.
However, it is not magical and there are a lot of edge cases, so the generated models should be manually inspected before being integrated.
More info here: https://docs.djangoproject.com/en/3.0/ref/django-admin/#django-admin-inspectdb
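For the record, the round trip looks like this; the model below only illustrates the shape inspectdb emits, not your actual tables:

python manage.py inspectdb > models.py

# models.py (excerpt of typical inspectdb output)
from django.db import models

class LegacyCustomer(models.Model):
    name = models.CharField(max_length=255, blank=True, null=True)

    class Meta:
        managed = False              # Django will not create or alter this table
        db_table = 'legacy_customer'

Because of managed = False, makemigrations/migrate will leave these tables alone; remove that line for any table you want Django to take over.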
I am using MongoDB as the database for my project, and I want to store a list of dictionaries as an array of objects inside a MongoDB document, from Django. For example: {"id":1,"products":[{...},{...},{...},{...}..]}. How do I define my model, and how do I query this? Please help.
Have a look at Djongo to start with, maybe; it's a database connector for MongoDB that uses the default Django ORM.
So you will define your models just like you do with MySQL, SQLite, etc., but it enables some extra features.
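One of those extra features is embedding. A rough sketch based on Djongo's documented ArrayField (the model and field names here are hypothetical, and the exact API can vary between Djongo versions, so verify against its docs):

from djongo import models

class Product(models.Model):
    name = models.CharField(max_length=100)
    price = models.FloatField()

    class Meta:
        abstract = True  # embedded in another document; no collection of its own

class Order(models.Model):
    products = models.ArrayField(model_container=Product)  # the array of objects

Djongo also supports filtering into embedded fields; check its documentation for the exact query syntax in your version.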
I want to use flask-peewee as the ORM for a relational DB (MySQL), but my problem is handling changes in the structure of models, like adding new attributes to a model (which means new columns in the DB).
I want to know if I can do this automatically, without writing SQL manually.
It looks like the Peewee module does support migrations.
http://peewee.readthedocs.org/en/latest/peewee/playhouse.html#schema-migrations
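A minimal sketch of that API (the database, table, and column names are hypothetical):

from peewee import CharField, MySQLDatabase
from playhouse.migrate import MySQLMigrator, migrate

db = MySQLDatabase('my_db')      # hypothetical connection settings
migrator = MySQLMigrator(db)

# Adds the new column without hand-writing any ALTER TABLE.
migrate(
    migrator.add_column('some_table', 'new_attribute', CharField(default='')),
)

Note that playhouse generates and runs the SQL for you, but you still declare each change yourself; it does not diff your models against the live schema.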
We developed https://github.com/keredson/peewee-db-evolve for our company's use; it sounds like it may be helpful for you.
Rather than manually writing migrations, db-evolve calculates the diff between the existing schema and your defined models. It then previews and applies the non-destructive SQL commands needed to bring your schema into line. We've found it to be a much more robust model for schema management. (For example, switching between arbitrary branches with different schema changes is trivial this way, versus virtually impossible with manually authored migrations.)
Example:
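(A minimal sketch based on the project's README; the model and database are hypothetical:)

import peeweedbevolve  # imported for its side effects: it hooks into peewee
from peewee import CharField, Model, PostgresqlDatabase

db = PostgresqlDatabase('my_db')   # hypothetical connection

class SomeModel(Model):
    some_field = CharField(null=True)

    class Meta:
        database = db

db.evolve()  # shows the schema diff, asks for confirmation, then applies it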
Think of it as a non-destructive version of Peewee's create_tables(). (In fact we use it for exactly that all the time, to build the schema from scratch in tests.)
I've written a simple migration engine for Peewee: https://github.com/klen/peewee_migrate
What is the best way to migrate MySQL tables to Google Datastore and create Python models for them?
I have a PHP+MySQL project that I want to migrate to a Python+GAE project. So far the big obstacle is migrating the tables and creating the corresponding models. Each table is about 110 columns wide. Creating a model for such a table manually is a bit tedious, let alone creating a loader and importing a generated CSV representation of the table.
Is there a more efficient way for me to do the migration?
In general, generating your models automatically shouldn't be too difficult. Suppose you have a CSV file for each table, with lines consisting of (field name, data type); then something like this would do the job:
import csv

# Maps MySQL types to Datastore property classes
type_map = {
    'char': 'StringProperty',
    'text': 'TextProperty',
    'int': 'IntegerProperty',
    # ...
}

def generate_model_class(classname, definition_file):
    # Emit the source of a db.Model subclass from a (field, type) CSV file.
    ret = ["class %s(db.Model):" % (classname,)]
    with open(definition_file) as f:
        for fieldname, sql_type in csv.reader(f):
            ret.append("  %s = db.%s()" % (fieldname, type_map[sql_type]))
    return "\n".join(ret)
Once you've defined your schema, you can bulk-load directly from the DB; there's no need for intermediate CSV files. See my blog post on the subject.
AppRocket can replicate MySQL ⇌ GAE; alternatively, there's GAE's built-in remote API from Google.
In your shoes, I'd write a one-shot Python script to read the existing MySQL schema (with MySQLdb) and generate a models.py to match (then do some manual checks and edits on the generated code, just in case). That's assuming a data model with about 110 properties per entity is something you're happy with and want to preserve, of course; it might be worth taking the opportunity to break things up a bit (indeed, you may have to if your current approach also relies on joins or other SQL features GAE doesn't give you), but that of course requires more manual work.
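A minimal sketch of such a one-shot script, reading column names and types out of information_schema with MySQLdb (the credentials, database name, and type mapping are all hypothetical):

import MySQLdb

# Rough MySQL -> Datastore property mapping; extend as needed.
TYPE_MAP = {'varchar': 'StringProperty', 'text': 'TextProperty', 'int': 'IntegerProperty'}

conn = MySQLdb.connect(db='legacy_db', user='me', passwd='secret')  # hypothetical
cursor = conn.cursor()
cursor.execute(
    "SELECT table_name, column_name, data_type"
    " FROM information_schema.columns"
    " WHERE table_schema = %s"
    " ORDER BY table_name, ordinal_position",
    ('legacy_db',))

current_table = None
for table, column, data_type in cursor.fetchall():
    if table != current_table:
        print("\nclass %s(db.Model):" % table.title().replace('_', ''))
        current_table = table
    print("  %s = db.%s()" % (column, TYPE_MAP.get(data_type, 'StringProperty')))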
Once the data model is in place, bulk loading can happen, typically via intermediate CSV files (there are several ways you can generate those).
You don't need to migrate; Google Cloud SQL lets you keep your MySQL database: http://code.google.com/apis/sql/ :)
You could migrate them to Django models first.
In particular, use:
python manage.py inspectdb > models.py
Then edit models.py until satisfied. You might have to put ForeignKeys in, adjust the lengths of CharFields, etc.
I've converted several legacy databases to Django like this with good success.
Django models, however, are different from GAE models (which I'm not very familiar with), so this may not be terribly helpful; I don't know!