Unexpected TIMESTAMP format in SQL Server - python

I'm trying to populate my time column, which has the TIMESTAMP datatype, with an INSERT command from my Python script. This is my current code for the insert:
usercount= ("INSERT INTO UserNum(Amount, Network) \
VALUES ('%s', '%s')" % \
(num_user, nets_id["name"]))
I haven't included a TIMESTAMP value in the INSERT because I believe it gets generated automatically.
But when I look at the UserNum table, the time column is populated with values such as AAAAAAAAE5U=. Is there something I'm doing wrong?
Any help with this would be much appreciated!

You do not need to quote your arguments as strings; Python's string formatting handles that.
Try this:
usercount = "INSERT INTO UserNum(Amount, Network) VALUES (%s, %s)" % (num_user, nets_id["name"])
but be sure that the values are of the same type as the columns in your DB.
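As an aside: in SQL Server a TIMESTAMP column is actually ROWVERSION, an auto-generated binary counter rather than a date/time, which is why it displays as a value like AAAAAAAAE5U=; if you want the insert time, use a DATETIME column with a default. Independently of that, letting the driver bind the values avoids quoting issues and SQL injection. A minimal sketch, assuming a DB-API driver such as pymssql with an open connection conn:

cursor = conn.cursor()
cursor.execute(
    "INSERT INTO UserNum (Amount, Network) VALUES (%s, %s)",
    (num_user, nets_id["name"]))  # values are escaped and bound by the driver
conn.commit()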

Related

Getting column is of type jsonb[] but expression is of type text[] when trying to insert with psycopg2

I'm using execute_values to insert the contents of many tables into other tables. I already registered an adapter from dict to Json when I hit another error, but I don't know how to handle this one:
psycopg2.ProgrammingError: column "rules" is of type jsonb[] but expression is of type text[]
LINE 1: ...UES (1,'tturxvrtgvvsrqgzsedcoyqujakyepjordrbbjdw',ARRAY['{"i...
The only way I found to handle it was something like this issue, but I would need to treat each column ...
Maybe there is a way to create a new adapter, but I couldn't figure out from the docs how to do that.
from psycopg2.extensions import register_adapter
from psycopg2.extras import Json, execute_values

register_adapter(dict, Json)
execute_values(
    dest_cursor,
    f'''
    INSERT INTO {t} VALUES %s ON CONFLICT DO NOTHING;
    ''',
    records,
)
Is there an automatic way to deal with that, like register_adapter?
I found the answer through this issue: I learned I could use the template parameter, like so:
execute_values(
    dest_cursor,
    f'''
    INSERT INTO {t} VALUES %s ON CONFLICT DO NOTHING;
    ''',
    records,
    template="(%s, %s, %s::jsonb[], %s, %s)",  # one %s per column, with a cast where needed
)
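For reference, here is a self-contained sketch of that pattern (the table and columns below are hypothetical; psycopg2 is assumed): register_adapter turns each dict into JSON text, psycopg2 adapts the Python list to a text[] array literal, and the ::jsonb[] cast in the template converts it on the server side.

import psycopg2
from psycopg2.extensions import register_adapter
from psycopg2.extras import Json, execute_values

register_adapter(dict, Json)  # adapt Python dicts to JSON text

conn = psycopg2.connect("dbname=test")  # hypothetical connection string
cur = conn.cursor()

# Hypothetical target table: accounts(id integer, slug text, rules jsonb[])
records = [(1, "some-slug", [{"id": 1}, {"id": 2}])]

execute_values(
    cur,
    "INSERT INTO accounts VALUES %s ON CONFLICT DO NOTHING",
    records,
    template="(%s, %s, %s::jsonb[])")  # cast the adapted text[] to jsonb[]
conn.commit()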

Using a variable for a table name with python sql cursor

I am using a Python SQL cursor to access my database dynamically, and I am in a situation where I want to use a variable in place of a table name. So far all of my attempts have resulted in syntax errors, although I think I am doing things as expected, unless a table name as a variable is handled differently from a value as a variable.
Here is what I currently have:
cursor.execute("INSERT INTO %s (word=%s,item_id=%s,word_tag=%s,unstemmed_word=%s, word_position=%s, TF=%s, normalized_term_frequency=%s, sentence=%s,anthology_id=%s) "%(table_name, stemmedWord,fle.split()[0], str(word[1]), uniqeWord, word_pos, TF, normalized_term_frequency, sentence, fle.split()[1].split(".")[0]))
and I have also tried this:
cursor.execute("INSERT INTO %s (word,item_id,word_tag,unstemmed_word, word_position, TF, normalized_term_frequency, sentence,anthology_id) values(%s, %s,%s, %s, %s, %s, %s, %s, %s)",(table_name, stemmedWord,fle.split()[0], str(word[1]), uniqeWord, word_pos, TF, normalized_term_frequency, sentence, fle.split()[1].split(".")[0]))
You cannot dynamically bind object names, only values. You'll have to resort to string manipulation for the table's name. E.g.:
sql = "INSERT INTO {} (word=%s,item_id=%s,word_tag=%s,unstemmed_word=%s, word_position=%s, TF=%s, normalized_term_frequency=%s, sentence=%s,anthology_id=%s)".format(table_name)
cursor.execute(sql % (stemmedWord,fle.split()[0], str(word[1]), uniqeWord, word_pos, TF, normalized_term_frequency, sentence, fle.split()[1].split(".")[0]))
If you are on python >= 3.6 this is probably better:
cursor.execute(f'INSERT INTO {table_name} (word="{stemmedWord}",item_id={fle.split()[0]},word_tag={str(word[1])},unstemmed_word="{oword_posrmuniqeWord}", word_position=word_pos, TF={TF}, normalized_term_frequency={normalized_term_frequency}, sentence="{sentence}",anthology_id={fle.split()[1].split(".")[0])}'
but I think your syntax errors are coming from two things:
you have provided a string to split fle on (correction: split defaults to whitespace, so that is OK!);
you haven't quoted what seem to be obvious strings in your SQL fields.
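Separately, since the table name has to be spliced in as a plain string, it is worth validating it before it ever reaches the SQL. A minimal sketch (the whitelist and helper below are hypothetical, assuming a %s-style DB-API driver such as MySQLdb or psycopg2):

ALLOWED_TABLES = {"words_index", "words_index_test"}  # hypothetical whitelist

def insert_word_row(cursor, table_name, values):
    # Interpolate only the validated table name; bind every value via the driver.
    if table_name not in ALLOWED_TABLES:
        raise ValueError("unexpected table name: %s" % table_name)
    sql = ("INSERT INTO {} (word, item_id, word_tag, unstemmed_word, word_position, "
           "TF, normalized_term_frequency, sentence, anthology_id) "
           "VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)").format(table_name)
    cursor.execute(sql, values)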

Successful connection to MS SQL DB on Azure using pymssql, INSERT statement executing with no error message but 0 rows inserted

I'm using pymssql to connect to a MS SQL DB on Azure and insert records from a CSV file. I've verified that the connection is successful, and that the list used for the executemany parameter contains all of the data in the correct format and with the correct number of values. However, when I run the script 0 rows are inserted - but no error is thrown.
I looked around and it seems like most others that have experienced something similar were missing the commit(), but that isn't the issue here.
Here is the code. Any help is greatly appreciated.
import csv

with open('file.csv') as csvfile:
    data = csv.reader(csvfile)
    next(data)  # skip the header row
    dicts = ({'col1': line[0], 'col2': line[1], 'col3': line[2],
              'col4': int(line[3]), 'col5': int(line[4]), 'col6': float(line[5])}
             for line in data)
    to_db = ((i['col1'], i['col2'], i['col3'], i['col4'], i['col5'], i['col6']) for i in dicts)
    cursor.executemany(
        'INSERT INTO myTable VALUES (%s, %s, %s, %d, %d, %f)',
        to_db)
    print str(cursor.rowcount) + " rows inserted"
conn.commit()
Edit: If I execute the query using cursor.execute() and include the values explicitly in the query then I can successfully insert rows into the database (see below for example).
cursor.execute("INSERT INTO myTable VALUES ('4/18/2016','test','test',0,0,0.0)")
But if I use the cursor.executemany(operation, parameters) syntax and pass a list of the values as the parameter, then it results in an incorrect syntax error.
cursor.executemany("INSERT INTO myTable VALUES(%s,%s,%s,%d,%d,%f)",list_of_values)
I was just reading in the module reference that only %s and %d are supported. So I'm thinking that might be the issue. But how do I pass a float?
Using the float placeholder (%f) was in fact the issue. Only %s and %d are supported, but they are purely placeholders and do not affect formatting the way they typically do in Python, so really only %s is needed. The working code is as follows:
cursor.executemany("INSERT INTO myTable VALUES(%s,%s,%s,%s,%s,%s)",list_of_values)

Python variables in MySQL: INSERT INTO %s

This question has been asked before (here), but no answer is working for me, and unfortunately I am not allowed to add a comment because I'm new here.
I didn't know what else to do other than ask the question again, sorry for that - please tell me the right way to handle it.
I want to insert Python variables into a MySQL table named by a Python variable.
I figured out, to create the table by:
curs.execute ("""CREATE TABLE IF NOT EXISTS %s LIKE table""" %(today))
I also figured out to insert values like this:
curs.execute (
""" INSERT INTO table (column)
VALUES (%s) """,
(variable))
Now I tried
today = "table_name"
variable = "name"
curs.execute (
""" INSERT INTO %s (column)
VALUES (%s) """,
( table, variable ))
I'll get this error:
(1064, "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''table_name' (column …' at line 1")
I also tried:
today = "table_name"
variable = "name"
curs.execute (
""" INSERT INTO %s (column)
VALUES (%s) """
% ( table, variable ))
I'll get this error:
(1054, "Unknown column 'name' in 'field list'")
I guess there's something wrong with the strings …
Thank you for answers!
Try replacing the %s with ? and let sqlite handle the insertion. (This also helps prevent SQL injection attacks in web applications.)
table_name = "today"
variable = "name"
curs.execute ("INSERT INTO ? (column) VALUES (?)",(table_name, variable))
Try:
today = "table_name"
variable = "name"
curs.execute (
""" INSERT INTO :table_name (column)
VALUES (:variable_name) """,
{'table_name': today, 'variable_name': variable})
You have muddled your variables in your third, non-working code snippet. Instead try:
table_name = "today"
variable = "name"
curs.execute (
""" INSERT INTO %s (column)
VALUES (%s) """,
( table_name, variable ))
This will insert into the table named 'today' one row with the value 'name' in the column named 'column'.
For this to work you will need to have previously created a table that has this column available.
So your create code needs to change:
"""create table "today" ("column" varchar2(16BYTE), *add extra columns here if you wish*)"""

Insert python list into Postgres database

I am having trouble formatting the list for insertion using psycopg.
Here is a sample of the code I am trying to run.
Basically I am just reading data from one table and trying to insert it into another table.
Code:
cur.execute("""select data from \"Table1\" where lat=-20.004189 and lon=-63.848004""")
rows = cur.fetchall()
print rows
cur.execute("""INSERT INTO \"%s\" (data) VALUES (ARRAY%s)""" % (args.tableName,rows)))
The result returned by first select query is like this:
[([6193, 3975, 4960, 5286, 3380, 970, 3328, 3173, 2897, 2457, 2443, 2674, 2172, 2740, 3738, 4907, 3691, 4234, 3651, 3215],)]
When I try to insert this into another table I get the following format error.
cur.execute(cur.mogrify("""INSERT INTO \"%s\" (data) VALUES (%s)""" % (args.tableName,rows)))
psycopg2.ProgrammingError: syntax error at or near "["
LINE 1: INSERT INTO "DUMMY1km" (data) VALUES ([([6193, 3975, 4960, 5...
I tried cur.mogrify, but it does not seem to help.
Please let me know if anyone has a workaround for this issue.
Thanks
Adi
I don't think mogrify is needed here. Use executemany and pass rows as the second argument.
cur.executemany(
    """INSERT INTO "%s" (data) VALUES (%%s)""" % (args.tableName),
    rows)
Using parametrized arguments helps prevent SQL injection.
The table name cannot be parametrized, so we do have to use string interpolation to place the table name in the SQL query. %%s escapes the percent sign and becomes %s after string interpolation.
By the way (as a_horse_with_no_name has already pointed out), you can use the INSERT INTO ... SELECT form of INSERT to perform both SQL queries as one:
cur.execute(
    """INSERT INTO "%s" (data)
       SELECT data FROM "Table1"
       WHERE lat=-20.004189 AND lon=-63.848004""" % (args.tableName))
Per the question in the comments, if there are multiple fields, then the SQL becomes:
cur.executemany(
    """INSERT INTO {t} (lat, lon, data1, data2)
       VALUES (%s, %s, %s, %s)""".format(t=args.tableName),
    rows)
(If you use the format method, then you don't have to escape all the other %ss.)
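For completeness, a small self-contained sketch of the copy-one-array-column case (assumes psycopg2, an integer[] data column, and the table names from the question; the rows returned by fetchall are already tuples, so they can be passed straight to executemany):

import psycopg2

conn = psycopg2.connect("dbname=test")  # hypothetical connection string
cur = conn.cursor()

cur.execute(
    """SELECT data FROM "Table1" WHERE lat = %s AND lon = %s""",
    (-20.004189, -63.848004))
rows = cur.fetchall()  # e.g. [([6193, 3975, ...],)]

# Each row is a 1-tuple holding a Python list, which psycopg2 adapts to a
# Postgres array, so a single %s placeholder is all that is needed.
cur.executemany(
    """INSERT INTO "DUMMY1km" (data) VALUES (%s)""",
    rows)
conn.commit()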
