I use the SQLAlchemy engine to insert data into a PostgreSQL table. I want to insert a list into one row, as if the list were a single string containing many values.
query = text('INSERT INTO table (list_id, list_name) VALUES ({}, {}) RETURNING'.format(my_list,'list_name'))
result_id = self.engine.execute(query)
When I tried to execute my code I received this error:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.SyntaxError) syntax error at or near "["
LINE 1: ...INTO table (list_id, list_name) VALUES (['str1... ^
[SQL: INSERT INTO table (list_id, list_name) VALUES (['str1', 'str1', 'str1'], 'list_name') RETURNING id]
I tried representing my list as str(my_list), but the result was the same. I also tried str(['str1', 'str1', 'str1']).replace('[', '{').replace(']', '}').
My table query:
CREATE TABLE api_services (
id SERIAL PRIMARY KEY,
list_id VARCHAR,
list_name VARCHAR(255) NOT NULL
);
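Since list_id is a VARCHAR, one approach is to serialize the list into a single string yourself and pass it as a bound parameter, instead of formatting the Python repr of the list into the SQL text. A minimal sketch below uses sqlite3 purely so it runs standalone; with SQLAlchemy the same idea would be `text("INSERT ... VALUES (:list_id, :list_name)")` with a params dict.

```python
import sqlite3

# sqlite3 stands in for PostgreSQL here so the sketch runs standalone.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE api_services ("
    "id INTEGER PRIMARY KEY, list_id VARCHAR, list_name VARCHAR(255) NOT NULL)"
)

my_list = ['str1', 'str1', 'str1']

# Serialize the list into one string, then bind it as a parameter --
# never interpolate the Python repr of a list into the SQL string.
list_as_str = ','.join(my_list)
cur = conn.execute(
    "INSERT INTO api_services (list_id, list_name) VALUES (?, ?)",
    (list_as_str, 'list_name'),
)
conn.commit()

row = conn.execute(
    "SELECT list_id FROM api_services WHERE id = ?", (cur.lastrowid,)
).fetchone()
print(row[0])  # str1,str1,str1
```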
I have the following values in a list:
list = ['hostname', 'ip-address', 'public-ip-address']
I need to alter the table, adding any columns from the list that don't already exist.
Is it possible to add the new columns in a single PostgreSQL query? I'm trying:
def alter_table_system(self, keys: list[str]):
    try:
        fields = ', '.join(['%s'] * len(keys))
        alter_table = "ALTER TABLE IF EXISTS tbl_pa_system ADD COLUMN IF NOT EXISTS (%s)" % (fields)
        self.cursor.execute(alter_table)
        self.conn.commit()
    except (Exception, psycopg2.Error) as error:
        print(error)
How can I check the existing table and its columns, and add the new columns from the list?
I receive the error:
syntax error at or near "("
LINE 1: ... IF EXISTS tbl_pa_system ADD COLUMN IF NOT EXISTS (%s, %s, %...
I'm using PostgreSQL 11.
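Two things seem to be going wrong here: PostgreSQL does not accept a parenthesized column list after ADD COLUMN (each column needs its own `ADD COLUMN IF NOT EXISTS name type` clause, and several clauses can be comma-separated), and `%s` placeholders are for values only, never identifiers. A sketch of building the statement safely is below; it validates names against a whitelist pattern (with psycopg2 >= 2.7, `psycopg2.sql.Identifier` would be the library route). Note the hyphens in the original list ('ip-address') are not valid unquoted identifiers, so underscored names are assumed here.

```python
import re

def build_alter(table: str, keys: list[str], coltype: str = "VARCHAR") -> str:
    """Build one ALTER TABLE with a separate ADD COLUMN clause per key.

    Identifiers cannot be sent as %s parameters, so validate them here;
    psycopg2.sql.Identifier is the safer library alternative.
    """
    for name in [table, *keys]:
        if not re.fullmatch(r"[a-z_][a-z0-9_]*", name):
            raise ValueError(f"unsafe identifier: {name!r}")
    clauses = ', '.join(f"ADD COLUMN IF NOT EXISTS {k} {coltype}" for k in keys)
    return f"ALTER TABLE IF EXISTS {table} {clauses}"

sql = build_alter("tbl_pa_system", ["hostname", "ip_address", "public_ip_address"])
print(sql)
```

The generated statement can then be passed to `cursor.execute()` as-is, with no parameter tuple.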
I am retrieving data from a DBF file and need to generate a SQL script to load the data into a database. The values are stored in a tuple, and before I create the SQL script I want to strip each item of the tuple. For example, I am getting this:
INSERT INTO my_table (col1,col2,col3) VALUES('Value 1 ', 'TESTE123', ' ADAD ')
And I need to get this:
INSERT INTO my_table (col1,col2,col3) VALUES('Value 1', 'TESTE123', 'ADAD')
For that I am trying with this code:
with dbf.Table(filename) as table:
    for record in table:
        fields = dbf.field_names(table)
        fields = ','.join(fields)
        place_holders = ','.join(['?'] * len(fields))
        values = tuple(record.strip())
        sql = "insert into %s (%s) values(%s)" & ('my_table', fields, values)
And I am getting the following error:
dbf.FieldMissingError: 'STRIP' no such field in table
What do you propose?
dbf.Record is not a str, and doesn't have string methods.
If every field in the record is text (e.g. Character or Memo, not Numeric or Date) then you can:
values = [v.strip() for v in record]
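Putting the answer's suggestion together with proper placeholders might look like the sketch below. A plain tuple stands in for a `dbf.Record` (the dbf module is not assumed to be installed), and sqlite3 is used so the insert actually runs; note also that the placeholder count should come from the number of fields, not from `len(fields)` after the names have been joined into one string.

```python
import sqlite3

# A plain tuple stands in for a dbf.Record; per the answer above,
# strip each field value individually.
record = ('Value 1   ', 'TESTE123', '   ADAD   ')
field_names = ['col1', 'col2', 'col3']

values = [v.strip() for v in record]

# Build the statement with ? placeholders and pass the values separately,
# instead of formatting the tuple into the SQL string.
fields = ','.join(field_names)
place_holders = ','.join(['?'] * len(field_names))
sql = f"insert into my_table ({fields}) values ({place_holders})"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (col1 TEXT, col2 TEXT, col3 TEXT)")
conn.execute(sql, values)
row = conn.execute("SELECT * FROM my_table").fetchone()
print(row)  # ('Value 1', 'TESTE123', 'ADAD')
```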
I am trying to insert None from Python into a SQL table, but it is being inserted as the string 'null'.
I am using the below set of code:
query_update = f'''update table set Name = '{name}',Key = '{Key}' where Id = {id} '''
stmt.execute(query_update)
conn.commit()
I am getting values in Python for the variable Key, and I am trying to update those values into the column "Key". Both columns are nvarchar columns.
Sometimes we get the None value as the string 'Null' in the variable Key, like below:
Key = 'Null'
So when I insert the above value into SQL it is stored as a string instead of NULL, because I had to put quotes around the value in the script while updating from Python.
Can anyone help me avoid inserting a string when I want NULL in SQL from Python?
The problem is that your Null values are actually strings and not "real" NULLs.
If you want to insert NULL, your key should be equal to None.
You can convert it as follows:
Key = Key if Key != 'Null' else None
I guess the problem here is that you are placing quotes around the placeholders. Have a look at how I think the same can be done:
query_update = 'update table set Name = {name}, Key = {Key} where Id = {id}'
query_update = query_update.format(name='myname', Key='mykey', id=123)
stmt.execute(query_update)
conn.commit()
Don't use dynamic SQL. Use a proper parameterized query:
# test data
name = "Gord"
key = None # not a string
id = 1
query_update = "update table_name set Name = ?, Key = ? where Id = ?"
stmt.execute(query_update, name, key, id)
conn.commit()
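The two fixes above combine naturally: convert the 'Null' sentinel string to a real None, then let the driver bind it. The sketch below demonstrates this end-to-end with sqlite3 so it runs standalone (the thread's `stmt.execute(query, ...)` style looks like pyodbc, which binds parameters the same way); the table and column names are stand-ins.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (Id INTEGER PRIMARY KEY, Name TEXT, Key_ TEXT)")
conn.execute("INSERT INTO t (Id, Name, Key_) VALUES (1, 'old', 'old')")

key = 'Null'                          # string sentinel coming from upstream
key = key if key != 'Null' else None  # convert to a real None

# With bound parameters, None is sent as SQL NULL,
# never as the text 'None' or 'null'.
conn.execute("UPDATE t SET Name = ?, Key_ = ? WHERE Id = ?", ("Gord", key, 1))
conn.commit()

row = conn.execute("SELECT Name, Key_ FROM t WHERE Id = 1").fetchone()
print(row)  # ('Gord', None)
```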
I'm having issues writing WKB (Well-Known Binary) into a SQLite database/GeoPackage (they are the same type of database).
Here is the CREATE statement used to generate a polyline dataset:
CREATE TABLE line
(OBJECTID INTEGER primary key autoincrement not null,
Shape MULTILINESTRING)
The WKB is as follows:
bytearray(b'\x01\x05\x00\x00\x00\x02\x00\x00\x00\x01\x02\x00\x00\x00\x04\x00\x00\x00\x96C\x8b\x9a\xd0#`\xc18\xd6\xc5=\xd2\xc5RA\x93\xa9\x825\x02=`\xc1\xb0Y\xf5\xd1\xed\xa6RAZd;W\x913`\xc1 Zd\xfb\x1c\xc0RA\xaa\x82Q%p/`\xc18\xcd;\x92\x19\xaeRA\x01\x02\x00\x00\x00\x03\x00\x00\x00z\xc7)TzD`\xc1\xb8\x8d\x06\xb8S\x9fRA\xbb\xb8\x8d:{"`\xc1X\xec/\xb7\xbb\x9eRA\x00\x91~Eo"`\xc1\x00\xb3{F\xff]RA')
When I go to insert the binary data into the column, I do not get an error; it just inserts NULL.
sql = """INSERT INTO line (Shape)
VALUES(?)"""
val = [binaryfromabove]
cur.execute(sql, val)
I've also tried using sqlite3.Binary():
sql = """INSERT INTO line (Shape)
VALUES(?)"""
val = [sqlite3.Binary(binaryfromabove)]
cur.execute(sql, val)
The row that is inserted is always NULL.
Any ideas what is going on?
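One thing worth ruling out is the Python binding itself. The minimal check below shows that sqlite3 does store a bytearray bound via `?` as a BLOB in a plain table; if this works but the GeoPackage insert still yields NULL, the cause is more likely on the GeoPackage side (for instance, its geometry columns expect a GeoPackage binary header in front of the raw WKB) than in the parameter binding. That diagnosis is speculative; the table here is a stand-in.

```python
import sqlite3

# Truncated sample of the WKB from the question, enough to test binding.
wkb = bytearray(b'\x01\x05\x00\x00\x00\x02\x00\x00\x00')

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE line "
    "(OBJECTID INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, Shape BLOB)"
)
# sqlite3.Binary (an alias for memoryview) wraps the bytearray for binding.
conn.execute("INSERT INTO line (Shape) VALUES (?)", [sqlite3.Binary(wkb)])
row = conn.execute("SELECT typeof(Shape), Shape FROM line").fetchone()
print(row[0])  # blob
```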
I tried to insert row values for the code column in the statements table as a foreign key from the companies table. I took the following steps:
Creating Tables
cur.execute("CREATE TABLE IF NOT EXISTS companies (code INT NOT NULL PRIMARY KEY, short_name VARCHAR(255) NOT NULL, long_name VARCHAR(255) NOT NULL)")
cur.execute("CREATE TABLE IF NOT EXISTS statements (statement_id SERIAL NOT NULL PRIMARY KEY, statement_name VARCHAR(255) NOT NULL, code INT NOT NULL, FOREIGN KEY (code) REFERENCES companies_list (code))")
What the code column in the companies table contains (i.e.):
code |
-----------
113
221
344
The next step is inserting the wanted data into the statements table, as below:
statement_name = ["balance_sheet", "income_statement", "cash_flow"]
code = "SELECT code FROM companies_list WHERE code IS NOT NULL"
statements = [tuple((t,)) for t in zip(statement_name, code)]
query = "INSERT INTO statements (statement_name, code) VALUES %s"
cur.executemany(query, statements)
I got the following error:
psycopg2.DataError: invalid input syntax for integer: "S"
LINE 1: ...ents (statement_name, code) VALUES ('balance_sheet', 'S')
The final result I want to get is like below:
statement_id | statement_name   | code
-------------+------------------+-----
           1 | balance_sheet    |  113
           2 | income_statement |  113
           3 | cash_flow        |  113
           4 | balance_sheet    |  221
           5 | income_statement |  221
           6 | cash_flow        |  221
The error arises from this line:
code = "SELECT code FROM companies_list WHERE code IS NOT NULL"
This does not perform an actual query, it assigns the SQL select statement string to the code variable. The next line then zips the statement names with code which, because code is a string (an iterable), results in the first 3 characters of code being zipped with the items from statement_name, the result being:
[(('balance_sheet', 'S'),), (('income_statement', 'E'),), (('cash_flow', 'L'),)]
So that's where the 'S' is coming from - it's the first character of "SELECT" in the code string. 'S' is a string, not an integer as defined in the schema for the statements table, hence the error.
You can see the queries generated with cursor.mogrify():
>>> statement_name = ["balance_sheet", "income_statement", "cash_flow"]
>>> code = "SELECT code FROM companies_list WHERE code IS NOT NULL"
>>> statements = [tuple((t,)) for t in zip(statement_name, code)]
>>> query = "INSERT INTO statements (statement_name, code) VALUES %s"
>>> for args in statements:
... print(cur.mogrify(query, args))
...
INSERT INTO statements (statement_name, code) VALUES ('balance_sheet', 'S')
INSERT INTO statements (statement_name, code) VALUES ('income_statement', 'E')
INSERT INTO statements (statement_name, code) VALUES ('cash_flow', 'L')
One way of fixing this is to execute the query contained in code to get a list of company codes, then use that to construct the INSERT query:
import itertools
cur.execute("SELECT code FROM companies_list WHERE code IS NOT NULL")
codes = [row[0] for row in cur.fetchall()]
query = 'INSERT INTO statements (statement_name, code) VALUES (%s, %s)'
args = itertools.product(statement_name, codes)
cur.executemany(query, args)
Here itertools.product() is used to form the Cartesian product of the statement names and the company codes. This is mimicking database join functionality, so if the statement types are available in your database, it might be better to do it in SQL rather than Python.
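The SQL-side alternative mentioned in the last sentence can be sketched with a CROSS JOIN, which does in SQL what `itertools.product()` does in Python. The sketch below uses sqlite3 so it runs standalone (the same `INSERT ... SELECT ... CROSS JOIN` works in PostgreSQL); the `statement_names` helper table is a hypothetical stand-in for wherever the statement types live in the database.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE companies_list (code INTEGER PRIMARY KEY);
    INSERT INTO companies_list (code) VALUES (113), (221), (344);

    -- Hypothetical table holding the statement types.
    CREATE TABLE statement_names (statement_name TEXT);
    INSERT INTO statement_names
    VALUES ('balance_sheet'), ('income_statement'), ('cash_flow');

    CREATE TABLE statements (
        statement_id INTEGER PRIMARY KEY AUTOINCREMENT,
        statement_name TEXT NOT NULL,
        code INTEGER NOT NULL REFERENCES companies_list (code)
    );

    -- CROSS JOIN forms the Cartesian product, like itertools.product().
    INSERT INTO statements (statement_name, code)
    SELECT s.statement_name, c.code
    FROM statement_names s CROSS JOIN companies_list c
    WHERE c.code IS NOT NULL;
""")
count = conn.execute("SELECT COUNT(*) FROM statements").fetchone()[0]
print(count)  # 9
```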