Django dynamic creation of table in database - python

I'm trying to find a way to dynamically create a table in a database (SQLite3). What I'm trying to do is get a file from the user and, based on the rows in the file, create a table with the same number of columns. Everything I found is from 6-10 years ago, so it doesn't work anymore.

I suspect that you are not finding many examples because Django abstracts away raw SQL.
Have you looked at using raw SQL queries, specifically executing custom SQL directly?
from django.db import connection

table_name = 'Your_Name'
with connection.cursor() as cursor:
    cursor.execute(f"create table if not exists {table_name} ( id integer PRIMARY KEY )")
This will create a table called 'Your_Name' with a single column called id. You will then need to read the CSV (there is an example of how to do that here); if you follow that example, you could add the DDL into views.py.
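As a sketch of the whole idea, the snippet below builds the CREATE TABLE statement from the CSV header and then loads the rows. It uses the standard-library sqlite3 module so it is self-contained; in a Django view you would obtain the cursor from django.db.connection instead. The table name, file contents, and helper name are made up for illustration:

```python
import csv
import io
import sqlite3

def create_table_from_csv(conn, table_name, csv_file):
    """Create a table with one TEXT column per CSV header field,
    plus an integer primary key, then load the remaining rows."""
    reader = csv.reader(csv_file)
    header = next(reader)
    # Quote identifiers so column names cannot clash with SQL keywords.
    cols = ", ".join(f'"{name}" TEXT' for name in header)
    conn.execute(
        f'CREATE TABLE IF NOT EXISTS "{table_name}" '
        f"( id INTEGER PRIMARY KEY, {cols} )"
    )
    # Insert the data rows using bound parameters, never string pasting.
    placeholders = ", ".join("?" for _ in header)
    col_names = ", ".join(f'"{name}"' for name in header)
    conn.executemany(
        f'INSERT INTO "{table_name}" ({col_names}) VALUES ({placeholders})',
        reader,
    )

conn = sqlite3.connect(":memory:")
create_table_from_csv(conn, "uploaded", io.StringIO("name,email\nBob,bob@x.com\n"))
```

Note that the table name itself still ends up in the SQL string, so it should be generated server-side (or validated against a whitelist), not taken verbatim from the uploaded file.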

Related

migrate autonumber columns from access to sqlite

I'm trying to migrate the database for an existing application from access to SQLite. The application uses Autonumbers to generate unique IDs in Access and some tables reference rows from other tables by these unique IDs.
What's a good way to migrate these and keep this functionality intact?
From what I've read, SQLite uses Auto indexing for this. How would I create the links between the tables? Do I have to search the other tables for the row with that unique ID and replace the reference with the SQL generated ID?
example:
table 1, has a column linkedID with a row with the value {7F99297A-DE91-4BD6-9ED8-FC13D668CDA2}, which is linked to a row in table 2 with primaryKey {7F99297A-DE91-4BD6-9ED8-FC13D668CDA2}.
Well, there is not really an automated way to do this.
But here is what I do to migrate data:
I set up a linked table in Access and double-check that the linked table works (you need to install the ODBC driver).
Assuming you have a working linked table?
Then you can do this in VBA to export the Access table to SQLite:
Dim LocalTable As String ' name of local table link
Dim ServerTable As String ' name of table on SQLite
LocalTable = "Table1"
ServerTable = "TableSLite"
Dim strCon As String
strCon = CurrentDb.TableDefs("Test1").Connect
' above is a way to borrow a working connection string from a valid
' working linked table (I hate connection strings in code)
Debug.Print strCon
DoCmd.TransferDatabase acExport, "ODBC Database", strCon, acTable, LocalTable, ServerTable
Debug.Print "done export of " & LocalTable
That will get you the table in SQLite. But there is no DDL (data definition language) command in SQLite to THEN change that id column from Access into a PK with auto increment.
However, assuming you have, say, "DB Browser for SQLite"?
Then simply export the table(s) as per above.
Now, in DB Browser, open up the table, choose Modify, and simply check the AI (auto increment) and PK settings; in fact, if you check the AI box, the PK usually selects itself for you. Do this after the above export (and you should consider closing down Access, since you had/have linked tables).
So, for just a few tables, the above is not really hard.
However, the above export (transfer) of data does not set the PK and auto increment for you.
If you need to do this with code, and this is not a one time export/transfer, then I don't have a good solution.
Unfortunately, SQLite does NOT allow an ALTER TABLE command to set a PK or set auto increment (if that were possible, then after an export you could execute the DDL command in SQLite, or send the command from your client software, to make this alteration).
I am not sure if SQLite can spit out the "create table" command that exists for a given table (but I think it can). So you might export the schema, get the DDL command, modify that command, drop the table, re-run the create table command (with the correct PK and auto increment), and THEN use an export or append query in Access.
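The drop-and-recreate route can be sketched with Python's built-in sqlite3 module. The table and column names below are invented; the two points the sketch shows are that sqlite_master does hold the original CREATE TABLE text, and that you copy the rows into a rebuilt table that has a proper auto-increment PK:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Simulate a table as it arrives from the Access export: the old
# id column is plain TEXT, not a real primary key.
conn.execute("CREATE TABLE Table1 (id TEXT, name TEXT)")
conn.execute(
    "INSERT INTO Table1 VALUES "
    "('{7F99297A-DE91-4BD6-9ED8-FC13D668CDA2}', 'example row')"
)

# SQLite stores the original CREATE TABLE text in sqlite_master,
# so you can inspect and modify it:
ddl = conn.execute(
    "SELECT sql FROM sqlite_master WHERE type='table' AND name='Table1'"
).fetchone()[0]

# Rebuild the table with a proper auto-increment primary key,
# copy the data across, then swap the tables.
conn.executescript("""
    CREATE TABLE Table1_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        name TEXT
    );
    INSERT INTO Table1_new (name) SELECT name FROM Table1;
    DROP TABLE Table1;
    ALTER TABLE Table1_new RENAME TO Table1;
""")
```

In a real migration you would also have to rewrite the old GUID references in the child tables to the newly generated integer ids before dropping the old column.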
But the transfer of the table(s) in question can be done quite easily as per above; the result just does not set nor include the PK setting(s) for you.
However, if this is a one-time export? Then the above works well for exporting the tables, and it even does the dirty work of figuring out the correct data types to be used.
You just have to open up the tables in a tool like DB Browser afterwards and set the PK and auto increment.
I do the above quite often to transfer Access tables to SQLite tables, but it does then require some extra steps to set up the PK and auto increment.
Another possible way, if this has to be done more than one time?
I would export as per above, and then add the PK (and auto increment).
I would then grab, say, the 8 tables' create table commands from SQLite, and save those create table commands in the client software.
Then you execute the correct create table command, and then do an append query from Access. So it really depends on whether this is a one-time export, or whether this process of creating the table(s) in SQLite is to occur over and over.

Upload data to Exasol from python Dataframe

I wonder if there's any way to upload a dataframe and create a new table in Exasol. import_from_pandas assumes the table already exists. Do we need to run SQL separately to create the table? For other databases, to_sql can just create the table if it doesn't exist.
Yes. As you mentioned, import_from_pandas requires an existing table, so you need to create the table before writing to it. You can run a SQL create table ... script via connection.execute before using import_from_pandas. Also, to_sql needs a table, since based on the documentation it is translated to a SQL insert command.
Pandas to_sql can create a new table if it does not exist, but it needs an SQLAlchemy connection, which is not supported for Exasol out of the box. However, there is a SQLAlchemy dialect for Exasol you could use (I haven't tried it yet): sqlalchemy-exasol.
Alternatively, I think you have to use a create table statement and then populate the table via pyexasol's import_from_pandas.
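As a sketch of that create-then-import flow, the helper below derives an Exasol CREATE TABLE statement from a list of column names and pandas dtype strings. The type mapping, helper name, and table name are assumptions for illustration, not an official pyexasol feature; the commented lines show where pyexasol's connection.execute and import_from_pandas would come in:

```python
# Hypothetical mapping from pandas dtype names to Exasol column types.
TYPE_MAP = {
    "int64": "DECIMAL(18,0)",
    "float64": "DOUBLE",
    "bool": "BOOLEAN",
    "object": "VARCHAR(2000000)",
}

def create_table_sql(table, dtypes):
    """dtypes: list of (column_name, pandas_dtype_string) pairs,
    e.g. taken from df.dtypes.items() with str() applied."""
    cols = ", ".join(
        f'"{name}" {TYPE_MAP.get(dt, "VARCHAR(2000000)")}'
        for name, dt in dtypes
    )
    return f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})'

sql = create_table_sql("MY_TABLE", [("id", "int64"), ("email", "object")])
# connection.execute(sql)                        # pyexasol connection
# connection.import_from_pandas(df, "MY_TABLE")  # then bulk-load the rows
```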

Python Alter Statement in psycopg2 not updating Peewee

Working on a Postgres DB within Python using psycopg2, with Peewee as the ORM. I created the initial tables using Peewee, and I needed to perform an ALTER statement:
import psycopg2

conn = psycopg2.connect("dbname=test")
cur = conn.cursor()
cur.execute("ALTER TABLE Test_Table ADD COLUMN filename VARCHAR(100)")
conn.commit()
After this is executed, I do a select * from Test_Table and the new column is present.
However, when I do a select using the Peewee ORM, the filename column does not exist on Test_Table.
What do I need to do for that ALTER statement to show up in Peewee?
Peewee models are not dynamically-created based on the state of the database schema. They are declarative.
So if you are adding a column to your database, you would add a corresponding field instance to the model class. Typically this is done offline (i.e., not while your application is running).
Refer here for docs on Peewee's schema migration utilities:
http://docs.peewee-orm.com/en/latest/peewee/playhouse.html#migrate

python sqlalchemy built temporary table sub-select

I'm working on a very complex raw sql statement that has to be run with sqlalchemy. One part of the statement is a JOIN with a temporary table that is filled with data from a csv file.
The select is looking as follows:
(SELECT * FROM (VALUES ('1','xyz#something.com','+99123456798')) AS t (id, email, phone))
To prevent any SQL injections, I cannot simply copy-paste everything from the CSV into the select.
I know that sqlalchemy has the option of inserting values with :x and then passing the actual value in the execute method, but I have A LOT of values and substituting them by hand is impractical.
Is there a way to build this temporary table from the csv with the necessary sql injection protection using sqlalchemy?
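One way to keep the bind-parameter protection without typing thousands of placeholders is to generate them. The helper below is a sketch (the name values_clause is made up): it produces the VALUES sub-select text plus the matching parameter dict, which you would then pass to connection.execute together with sqlalchemy.text() wrapped around the full query.

```python
def values_clause(rows, columns):
    """Build a parameterised VALUES sub-select plus the matching
    bind-parameter dict, one :name per cell, so the driver does
    the escaping. Names like id0/email0 are generated per row."""
    params = {}
    tuples = []
    for i, row in enumerate(rows):
        names = [f"{col}{i}" for col in columns]
        params.update(dict(zip(names, row)))
        tuples.append("(" + ", ".join(f":{n}" for n in names) + ")")
    sql = ("(SELECT * FROM (VALUES " + ", ".join(tuples)
           + ") AS t (" + ", ".join(columns) + "))")
    return sql, params

# rows would come from csv.reader over the uploaded file:
sql, params = values_clause(
    [("1", "xyz#something.com", "+99123456798")],
    ["id", "email", "phone"],
)
# connection.execute(sqlalchemy.text(full_query), params)
```

Be aware that drivers cap the number of bind parameters per statement, so a very large CSV may still need to be loaded into a real temporary table in batches.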

Sqlalchemy - copy data from a database to another database with exactly same schema

I have 2 MySQL databases, connected with SQLAlchemy, in 2 Docker machines.
They both have the same schema and data inside.
Now I want to copy the data from one of the databases to the other, to merge the 2 MySQL databases into 1. Is it possible to do so?
Yes, you can:
Run this on the database you want to copy over:
mysqldump --no-create-info {yourdbname} > {yourdbname}.sql
And then on the database you want to import this:
mysql {yourdbname} < {yourdbname}.sql
You might run into an issue with the primary keys and/or duplicate records. Sadly, from that point on you will have to generate SQL to export your data with an altered primary key, while keeping your foreign key relations intact.
For this second, more complex issue, a stored procedure is required to query the information schema:
Select the table names from the information schema
Select the primary key using the information schema
Select max(primary key) for every table.
Use the information schema to create SELECT queries in which you add the previous max(primary key) to the primary key, and write the output to the data file.
For the foreign keys, you will also have to add the same max(primary key) value from the related table.
It will be a bit more code to write, but a 50-80 line stored procedure should do it.
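The steps above can be sketched in miniature with Python's sqlite3 module standing in for MySQL (table names and data are invented): every copied primary key is shifted by the target's max(id), and the parent table's offset is applied to the child table's foreign key as well.

```python
import sqlite3

# Two toy databases with the same schema: a parent and a child with an FK.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
schema = """
    CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE book (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
"""
src.executescript(schema)
dst.executescript(schema)
src.executescript("""
    INSERT INTO author VALUES (1, 'Ann');
    INSERT INTO book VALUES (1, 1, 'First');
""")
dst.executescript("""
    INSERT INTO author VALUES (1, 'Bob');
    INSERT INTO book VALUES (1, 1, 'Other');
""")

# Offset every copied primary key by max(id) in the target table,
# and apply the parent table's offset to the foreign key column too.
author_off = dst.execute("SELECT max(id) FROM author").fetchone()[0]
book_off = dst.execute("SELECT max(id) FROM book").fetchone()[0]
for id_, name in src.execute("SELECT id, name FROM author"):
    dst.execute("INSERT INTO author VALUES (?, ?)", (id_ + author_off, name))
for id_, a_id, title in src.execute("SELECT id, author_id, title FROM book"):
    dst.execute("INSERT INTO book VALUES (?, ?, ?)",
                (id_ + book_off, a_id + author_off, title))
```

In MySQL the same logic lives in the stored procedure, with the table and key names pulled from information_schema instead of being hard-coded.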
