I am trying to print all values in a column from MySQL, but each value is printed with an extra comma at the end.
def numbers():
    db = getDB()
    cur = db.cursor()
    sql = "SELECT mobile_number FROM names"
    cur.execute(sql)
    result = cur.fetchall()
    for x in result:
        print(x)
It looks like this in the shell:
(0123456789,)
(9876543210,)
A couple more options:
# validate that the results contain exactly one column
for [x] in result:
    print(x)
and
# using argument unpacking
for x in result:
    print(*x)
Basically, as Ben's comment says, the comma doesn't technically exist in your data; it appears because each row is a tuple. But if you still want to remove it, try this:
result = [row[0] for row in result]
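A minimal, self-contained illustration of that comprehension, with hard-coded sample rows standing in for a real cursor result:

```python
# Sample data shaped like cur.fetchall(): a list of one-element tuples
result = [(123456789,), (987654321,)]

# Each row is a tuple, so indexing [0] extracts the single column value
values = [row[0] for row in result]
print(values)  # [123456789, 987654321]
```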
Related
I have a list of strings that I need to pass to an SQL query.
listofinput = []
for i in input:
    listofinput.append(i)
if len(listofinput) > 1:
    listofinput = format(tuple(listofinput))
sql_query = f"""SELECT * FROM countries
where
name in {listofinput};
"""
This works when I have multiple values, but it fails with just one value: listofinput stays ['USA'] for one value, while it becomes ('USA', 'Germany') for multiple. I also need to do this for thousands of inputs; what is the best optimized way to achieve this? name in my table countries is an indexed column.
You can just convert to a tuple and then, if the second-to-last character is a comma, remove it.
listofinput = format(tuple(input))
if listofinput[-2] == ",":
    listofinput = f"{listofinput[:-2]})"
sql_query = f"""SELECT * FROM countries
where name in {listofinput};"""
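To see why the check targets the second-to-last character, look at the repr of a one-element tuple:

```python
# format(tuple) returns the tuple's repr, which ends with ",)" for one element
single = format(tuple(['USA']))
print(single)      # ('USA',)
print(single[-2])  # ,

# Slicing off the last two characters and re-adding ")" removes the comma
fixed = f"{single[:-2]})"
print(fixed)       # ('USA')
```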
Change if(len(listofinput)>1): to if(len(listofinput)>=1):
This might work.
Remove the condition if(len(listofinput)>1). Because if you don't convert to a tuple, your query will end up like this:
... where name in ['USA']
or
... where name in []
and in [...] is not acceptable in SQL; only in (...) is acceptable.
You can remove format() too:
listofinput = tuple(listofinput)
Final Code:
listofinput = []
for i in input:
    listofinput.append(i)
listofinput = tuple(listofinput)
sql_query = f"""SELECT * FROM countries
WHERE
name IN {listofinput};
"""
Yes, a tuple with one element requires a trailing ",".
To circumvent the problem, you can build the parenthesized string yourself in the single-element case by changing your code to the below:
listofinput = []
for i in input:
    listofinput.append(i)
if len(listofinput) > 1:
    listofinput = format(tuple(listofinput))
else:
    # quote the single value so the SQL reads ('USA'), not (USA)
    listofinput = f"('{listofinput[0]}')"
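Rather than pasting the tuple's repr into the SQL text, which breaks for one element and leaves the query open to SQL injection, most DB-API drivers let you build one placeholder per value and pass the list as parameters. A sketch using sqlite3 with a hypothetical in-memory table so it is self-contained; with psycopg2 or MySQL the placeholder is %s instead of ?:

```python
import sqlite3

# Hypothetical sample table standing in for the asker's countries table
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE countries (name TEXT)")
cur.executemany("INSERT INTO countries VALUES (?)",
                [("USA",), ("Germany",), ("France",)])

listofinput = ["USA"]  # works the same for one value or thousands

# One placeholder per value; the driver quotes each value safely
placeholders = ", ".join("?" for _ in listofinput)
sql_query = f"SELECT * FROM countries WHERE name IN ({placeholders})"
cur.execute(sql_query, listofinput)
rows = cur.fetchall()
print(rows)  # [('USA',)]
```

This sidesteps the trailing-comma problem entirely, and since name is indexed, the IN lookup stays efficient for large lists.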
import sqlite3
conn = sqlite3.connect('contacten.db')
c = conn.cursor()
#c.execute("""CREATE TABLE mail (
# mail text
# )""")
#c.execute("INSERT INTO mail VALUES ('test#gmail.com')")
conn.commit()
c.execute("SELECT * FROM mail")
print(c.fetchall())
conn.commit()
conn.close()
This is the code I've written, but as a result I get:
[('test#gmail.com',), ('test1#gmail.com',), ('test2#gmail.com',)]
In this list there is one comma too many, like this: ,). Does anyone know how to get rid of this extra comma?
The commas are there for a good reason: your result is a list of tuples. This is a consequence of how sqlite3 represents the result set; the data itself doesn't contain commas:
result = c.fetchall()
print(result)
=> [('test#gmail.com',), ('test1#gmail.com',), ('test2#gmail.com',)]
That's because each row can have more than one field. In your case you only have one field, but Python can't just remove the comma, because if we did you'd end up with a list of elements between brackets, not a list of tuples (see here to understand why).
Of course, if you're certain that the result will only have one field per row, you can simply get rid of the tuples by extracting the one (and only) field from each row at the moment of printing the result:
result = c.fetchall()
print([f[0] for f in result])
=> ['test#gmail.com', 'test1#gmail.com', 'test2#gmail.com']
Using Python's zip:
>>> emails = [('test#gmail.com',), ('test1#gmail.com',), ('test2#gmail.com',)]
>>> emails, = zip(*emails)
>>> type(emails)
<type 'tuple'>
>>> emails
('test#gmail.com', 'test1#gmail.com', 'test2#gmail.com')
>>> list(emails)
['test#gmail.com', 'test1#gmail.com', 'test2#gmail.com']
import sqlite3
conn = sqlite3.connect('contacten.db')
c = conn.cursor()
#c.execute("""CREATE TABLE mail (
# mail text
# )""")
#c.execute("INSERT INTO mail VALUES ('test#gmail.com')")
#c.execute("INSERT INTO mail VALUES ('test1#gmail.com')")
#c.execute("INSERT INTO mail VALUES ('test2#gmail.com')")
conn.commit()
c.execute("SELECT * FROM mail")
#print(c.fetchall())
for item in c.fetchall():
    print(item[0])
conn.commit()
conn.close()
I am using the below code to delete a row from sqlite table.
def deleteFromTable(item):
    conn = sqlite3.connect("lite.db")
    cur = conn.cursor()
    cur.execute("DELETE FROM store WHERE item=?", (item,))
    conn.commit()
    conn.close()
Why do I need to use a comma after item, as in (item,), when passing the argument?
('string') evaluates to a string, but ('string',) evaluates to a tuple; that's why you need the comma. execute expects its parameters as a sequence, so a single parameter must be wrapped in a one-element tuple.
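A quick check in the interpreter makes the difference visible:

```python
# Parentheses alone just group an expression; the comma is what makes a tuple
a = ('item')
b = ('item',)
print(type(a))  # <class 'str'>
print(type(b))  # <class 'tuple'>
```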
I am trying to use Python to generate a script that generates an unload command in Redshift. I am not an expert Python programmer. I need to generate all columns for the unload list, and if a column has a specific name, I need to wrap it with a function. The challenge I am facing is that a "," is appended after the last item in the dictionary as well. Is there a way I can avoid the last comma? Any help would be appreciated.
import psycopg2
from psycopg2.extras import RealDictCursor
try:
    conn = psycopg2.connect("dbname='test' port='5439' user='scott' host='something.redshift.amazonaws.com' password='tiger'")
except:
    print "Unable to connect to the database"
conn.cursor_factory = RealDictCursor
cur = conn.cursor()
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
try:
    cur.execute("SELECT column from pg_table_def where schema_name='myschema' and table_name ='tab1' ")
except:
    print "Unable to execute select statement from the database!"
result = cur.fetchall()
print "unload mychema.tab1 (select "
for row in result:
    for key, value in row.items():
        print "%s," % (value)
print ") AWS Credentials here on..."
conn.close()
Use the join function on the list of values in each row:
print ",".join(row.values())
Briefly, the join function is called on a string which we can think of as the "glue", and takes a list of "pieces" as its argument. The result is a string of "pieces" "held together" by the "glue". Example:
>>> glue = "+"
>>> pieces = ["a", "b", "c"]
>>> glue.join(pieces)
"a+b+c"
(since row.values() returns a list, you don't actually need the comprehension, so it's even simpler than I wrote it at first)
In fact, this worked better.
columns = []
for row in result:
    if (row['column_name'] == 'mycol1') or (row['column_name'] == 'mycol2'):
        columns.append("func_sha1(" + row['column_name'] + "||" + salt + ")")
    else:
        columns.append(row['column_name'])
print selstr + ",".join(columns) + " ) TO s3://"
Thanks for your help, Jon
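The loop above can be sketched as a self-contained example. The sample rows stand in for the cursor results, and salt and selstr are hypothetical stand-ins for variables defined elsewhere in the asker's script:

```python
# Hypothetical stand-ins for cur.fetchall() output and script variables
result = [{'column_name': 'mycol1'},
          {'column_name': 'country'},
          {'column_name': 'mycol2'}]
salt = "'some_salt'"
selstr = "unload myschema.tab1 (select "

columns = []
for row in result:
    if row['column_name'] in ('mycol1', 'mycol2'):
        # wrap the sensitive columns in the hashing function
        columns.append("func_sha1(" + row['column_name'] + "||" + salt + ")")
    else:
        columns.append(row['column_name'])

# join() puts commas *between* items, so no trailing comma can appear
print(selstr + ", ".join(columns) + " ) TO s3://")
```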
Hello everyone, I currently have this:
import feedparser

d = feedparser.parse('http://store.steampowered.com/feeds/news.xml')
for i in range(10):
    print d.entries[i].title
    print d.entries[i].date
How would I go about making it so that the title and date are on the same line? Also, it doesn't need to print; I just have that in there for testing. I would like to dump this output into a MySQL db with the title and date. Any help is greatly appreciated!
If you want to print on the same line, just add a comma:
print d.entries[i].title, # <- comma here
print d.entries[i].date
To insert to MySQL, you'd do something like this:
to_db = []
for i in range(10):
    to_db.append((d.entries[i].title, d.entries[i].date))

import MySQLdb
conn = MySQLdb.connect(host="localhost", user="me", passwd="pw", db="mydb")
c = conn.cursor()
c.executemany("INSERT INTO mytable (title, date) VALUES (%s, %s)", to_db)
conn.commit()
Regarding your actual question: if you want to join two strings with a comma you can use something like this:
print d.entries[i].title + ', ' + str(d.entries[i].date)
Note that I have converted the date to a string using str.
You can also use string formatting instead:
print '%s, %s' % (d.entries[i].title, str(d.entries[i].date))
Or in Python 2.6 or newer use str.format.
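For example, the str.format version of the same line, shown here with Python 3 print syntax and hypothetical sample values in place of the feed entry:

```python
# stand-ins for d.entries[i].title and str(d.entries[i].date)
title = "Steam News"
date = "2017-01-01"

line = '{}, {}'.format(title, date)
print(line)  # Steam News, 2017-01-01
```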
But if you want to store this in a database it might be better to use two separate columns instead of combining both values into a single string. You might want to consider adjusting your schema to allow this.