Consider the following piece of code:
for count in range(28, -1, -1):
    crsr.execute("SELECT board, win1, win2 FROM positions WHERE balls = ?", (count,))
    response = crsr.fetchall()
    print count, len(response)
    for possibility in response:
        internal = possibility[0]
        player = count & 1
        victor = 1 - player
        opponent = 2 - player
        victory = possibility[opponent]
        if victory:
            crsr.execute("UPDATE positions SET result = ? WHERE board = ?", (victor, internal))
        else:
            subsequent = derive((internal, count))
            for derived in subsequent:
                external = reduce(derived[0])
                crsr.execute("SELECT result FROM positions WHERE board = ?", (external,))
                colour = crsr.fetchall()
                if colour[0][0] == player:
                    victor = player
                    break
            crsr.execute("UPDATE positions SET result = ? WHERE board = ?", (victor, internal))
Consider the line:
response = crsr.fetchall()
Whenever there are as many as 10^7 rows in response, the above statement raises a memory error, even on a system with 8 GB of RAM.
So, I decided to change the following piece of code:
for count in range(28, -1, -1):
    crsr.execute("SELECT board, win1, win2 FROM positions WHERE balls = ?", (count,))
    response = crsr.fetchall()
    print count, len(response)
    for possibility in response:
        internal = possibility[0]
to:
for count in range(28, -1, -1):
    crsr.execute("SELECT COUNT(board) FROM positions WHERE balls = ?", (count,))
    sum = crsr.fetchall()
    total = sum[0][0]
    print count, total
    crsr.execute("SELECT board, win1, win2 FROM positions WHERE balls = ?", (count,))
    for possibility in range(total):
        response = crsr.fetchone()
        internal = response[0]
Now the line:
response = crsr.fetchone()
uses the crsr variable to fetch the next row of the SQLite3 SELECT query on every iteration of possibility in range(total).
There are already other crsr statements in the same 'for' loop:
crsr.execute("UPDATE positions SET result = ? WHERE board = ?", (victor, internal))
with that statement occurring twice, and
crsr.execute("SELECT result FROM positions WHERE board = ?", (external,))
with that statement occurring once.
So, since the crsr variable from the line response = crsr.fetchone() is consumed on every iteration of possibility in range(total), will it not conflict with the other crsr statements already in the same 'for' loop?
We cannot create other cursor variables for executing the different SQLite3 queries, because crsr is defined via crsr = connection.cursor() for a specific database file (spline.db, in this particular case) as soon as it is initialized.
So, I would like to know whether there are any alternative solutions that are efficient enough.
A result set is part of the cursor object, so whenever you call execute(), any previous query on the same cursor object is aborted. The only way to avoid this is to use fetchall() to read all result rows before the next query is executed.
To be able to execute multiple simultaneous queries, you must use multiple cursors. Simply call connection.cursor() multiple times.
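A small, self-contained sketch (using an in-memory database and a made-up table t, not the poster's spline.db) shows both behaviors: a second execute() on the same cursor discards the pending rows, while a second cursor leaves the first one's result set intact:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t(x)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(5)])

c = conn.cursor()
c.execute("SELECT x FROM t ORDER BY x")
first = c.fetchone()                  # (0,) -- first row of the result set
c.execute("SELECT COUNT(*) FROM t")   # aborts the pending SELECT on this cursor
count = c.fetchone()                  # (5,) -- the remaining rows of the old query are gone

# Two cursors do not interfere with each other's result sets:
reader = conn.cursor()
other = conn.cursor()
reader.execute("SELECT x FROM t ORDER BY x")
other.execute("SELECT COUNT(*) FROM t")
rows = [r[0] for r in reader]         # still all five values: [0, 1, 2, 3, 4]
```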
Please note that you must not modify a table that you are still reading from (even if you are using multiple cursors); changed rows might be skipped or read twice by the read cursor. If you cannot use fetchall(), put the results of the first query into a temporary table:
crsr1.execute("CREATE TEMP TABLE temp_pos(board, win1, win2)")
for count in ...:
    crsr1.execute("INSERT INTO temp_pos SELECT board, win1, win2 ...")
    crsr1.execute("SELECT board, win1, win2 FROM temp_pos")
    for row in crsr1:
        if ...:
            crsr2.execute("UPDATE positions ...")
        else:
            crsr2.execute("SELECT ... FROM positions ...")
            ...
    crsr1.execute("DELETE FROM temp_pos")
crsr1.execute("DROP TABLE temp_pos")
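A runnable version of the same pattern (the positions table here holds made-up sample data and a placeholder update rule, only the column names match the question): snapshot the rows into a temp table, then read from the snapshot while updating the real table through a second cursor.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions(board TEXT PRIMARY KEY, "
             "win1 INTEGER, win2 INTEGER, result INTEGER)")
conn.executemany("INSERT INTO positions VALUES (?, ?, ?, NULL)",
                 [("a", 0, 1), ("b", 1, 0), ("c", 0, 0)])

crsr1 = conn.cursor()
crsr2 = conn.cursor()

# Copy the rows of interest into a temp table ...
crsr1.execute("CREATE TEMP TABLE temp_pos(board, win1, win2)")
crsr1.execute("INSERT INTO temp_pos SELECT board, win1, win2 FROM positions")
# ... then iterate over the snapshot while writing to the real table.
crsr1.execute("SELECT board, win1, win2 FROM temp_pos")
for board, win1, win2 in crsr1:
    if win1 or win2:
        crsr2.execute("UPDATE positions SET result = ? WHERE board = ?",
                      (1 if win1 else 2, board))
crsr1.execute("DROP TABLE temp_pos")
conn.commit()

results = dict(conn.execute("SELECT board, result FROM positions"))
# results == {'a': 2, 'b': 1, 'c': None}
```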
How can I make the loop print a unique code every time, even when the same value comes up again?
import random

import psycopg2

FullChar = 'CEFLMPRTVWXYK0123456789'
total = 1000
count = 10
count = int(count)
entries = []
bcd = ""
flg = ""
rll = ""

try:
    conn = psycopg2.connect(host="192.168.13.10", database="postgres",
                            port="5432", user="postgres", password="potatona1")
    cursor = conn.cursor()

    def inputDatabase(data):
        postgres_insert_query = """INSERT INTO unique_code(unique_code, barcode, flag, roll) VALUES (%s,%s,%s,%s)"""
        cursor.executemany(postgres_insert_query, data)
        conn.commit()

    for i in range(5):
        for x in range(total):  # number of codes to print
            unique_code = ''.join(random.sample(FullChar, count - 1))
            unique_code = ''.join(random.sample(unique_code, len(unique_code)))
            entry = (unique_code, bcd, flg, rll)
            entries.append(entry)
        inputDatabase(entries)
        print(i)

    count = cursor.rowcount
    print(count, "Record inserted successfully into mobile table")
except (Exception, psycopg2.DatabaseError) as error:
    print(error)
    conn.rollback()
Example:
If the code MTY9X4L2E shows up again (a duplicate), the loop stops
and I get this message:
duplicate key value violates unique constraint "unique_code_pkey"
To keep track of unique values, use a set.
unique_codes = set()
...
for i in range(5):
    for x in range(total):  # number of codes to print
        unique_code = ''.join(random.sample(FullChar, count - 1))
        unique_code = ''.join(random.sample(unique_code, len(unique_code)))
        if unique_code in unique_codes:
            # The unique code has already been used.
            # Do something?
            pass
        else:
            # Add the code to the set of used codes.
            unique_codes.add(unique_code)
It's not very clear what those loops are doing; unique_code gets overwritten in every iteration of the inner loop.
The example code has another problem: the entries list is never cleared, so the second iteration of the outer loop will cause a duplicate key error because entries contains not only the new data but also the data from the previous iteration. entries should be cleared or reinitialised after each call to inputDatabase.
inputDatabase(entries)
entries.clear()
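Putting both fixes together — a set to skip duplicate codes and entries.clear() after each batch — might look like the sketch below. The database call is stubbed out with a plain list so the example is self-contained; code generation follows the question's random.sample approach.

```python
import random

FullChar = 'CEFLMPRTVWXYK0123456789'
total = 1000
count = 10

inserted = []            # stand-in for the real inputDatabase/psycopg2 insert

unique_codes = set()
entries = []
for i in range(5):
    for x in range(total):
        unique_code = ''.join(random.sample(FullChar, count - 1))
        if unique_code in unique_codes:
            continue                      # skip instead of violating the PK
        unique_codes.add(unique_code)
        entries.append((unique_code, "", "", ""))
    inserted.extend(entries)              # inputDatabase(entries)
    entries.clear()                       # don't re-insert the previous batch
```

Because duplicates are filtered before the insert and each batch is cleared afterwards, every code reaches the database exactly once.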
I am trying to check if a certain number exists in a database using python.
When it detects the number I am looking for, then I want it to do variable += 1. I do not have any specific code, but here is some example code of what I want it to do.
import pyodbc

one = 0
conn = pyodbc.connect(r'DSN=MACCD')
cursor = conn.cursor()
cursor.execute('SELECT first,second,third,fourth,fifth FROM ExampDatabase')
if "1 is detected in the database":
    one += 1
print(one)
Reading the pyodbc docs, you could try this:
import pyodbc

conn = pyodbc.connect(...)
cursor = conn.cursor()
cursor.execute(...)

# whatever number you are looking for
my_target = ...

target_counter = 0
row_counter = 0
while True:
    row = cursor.fetchone()
    if row is None:
        break
    row_counter += 1
    print(row_counter, row)
    if my_target in row:
        target_counter += 1
print('rows returned:', row_counter)
print('targets found:', target_counter)
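Alternatively, the counting can be pushed into SQL with COUNT(*) and an IN test over the columns, so only a single row comes back. The sketch below uses the standard-library sqlite3 module (same DB-API shape as pyodbc) with made-up sample data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ExampDatabase(first, second, third, fourth, fifth)")
conn.executemany("INSERT INTO ExampDatabase VALUES (?, ?, ?, ?, ?)",
                 [(1, 2, 3, 4, 5),    # contains a 1
                  (6, 7, 1, 8, 9),    # contains a 1
                  (6, 7, 8, 9, 0)])   # does not

cursor = conn.cursor()
# Count the rows in which the target value appears in any of the columns.
cursor.execute("""SELECT COUNT(*) FROM ExampDatabase
                  WHERE ? IN (first, second, third, fourth, fifth)""", (1,))
matching_rows = cursor.fetchone()[0]   # 2
```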
I have a Python script which reads an Excel column row by row and returns all the values as strings.
I want to write another script which puts these values into a SQL db. I've already written a connect method:
def db_connect():
    adr = 'some_addr'
    uid = 'some_uid'
    pwd = 'pwd'
    port = port
    dsn_tns = cx_Oracle.makedsn(adr, port, SID)
    db = cx_Oracle.connect('username', 'pass', dsn_tns)
    cur = db.cursor()
    cur.execute('update TABLE set ROW = 666 where ANOTHER_ROW is null')
    db.commit()
This method does an update, but it sets 666 for ALL rows. How can I give each row its own value, as a kind of iteration in SQL? For example, first row of output == 1, second == 23, third == 888.
If I understand correctly what you are trying to do here, it should be done in two phases: first select all rows to update (based on the chosen condition), then update each of these rows iteratively.
It cannot be done in a single query (or with a single condition that does not change across queries), because SQL works on sets; that's why each time your query is executed you update the whole table, and in the end only the result of the last query remains.
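The two-phase approach can be sketched with the standard-library sqlite3 module standing in for cx_Oracle (the table and column names here are made up; only the pattern matters): select the target rows first, then issue one UPDATE per row, keyed by its primary key.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t(id INTEGER PRIMARY KEY, val INTEGER)")
conn.executemany("INSERT INTO t(val) VALUES (?)", [(None,), (None,), (None,)])

cur = conn.cursor()
# Phase 1: select the rows to update.
rows = cur.execute("SELECT id FROM t WHERE val IS NULL").fetchall()
# Phase 2: update each selected row with its own value.
new_values = [1, 23, 888]               # e.g. the values from the question
for (pk,), value in zip(rows, new_values):
    cur.execute("UPDATE t SET val = ? WHERE id = ?", (value, pk))
conn.commit()

updated = [v for (v,) in conn.execute("SELECT val FROM t ORDER BY id")]
# updated == [1, 23, 888]
```

The WHERE clause names a specific row each time, so successive updates no longer overwrite the whole set.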
You can use the "rownum" expression, as in:
cur.execute("update TABLE set ROW = rownum where ANOTHER_ROW is null")
This will start with the value 1 and increment up by one for each row updated.
If you want more control over the value to set, you can also do the following in PL/SQL (untested):
cur.execute("""
    declare
        t_NewValue number;
        cursor c_Data is
            select ROW, ANOTHER_ROW
            from TABLE
            where ANOTHER_ROW is null
            for update;
    begin
        t_NewValue := 1;
        for row in c_Data loop
            update TABLE set ROW = t_NewValue
            where current of c_Data;
            t_NewValue := t_NewValue + 1;
        end loop;
    end;""")
This gives you the most control. You can use whatever logic you require to control what the new value should be.
Please take a look at another method, which uses the values read from Excel:
adr = 'some_addr'
uid = 'some_uid'
pwd = 'pwd'
port = port
dsn_tns = cx_Oracle.makedsn(adr, port, SID)
db = cx_Oracle.connect('username', 'pass', dsn_tns)
cur = db.cursor()
cells = excel.read_from_cell()
indices_and_statuses = []
stat = execute_script(some_js)
for req_id in cells:
    indices_and_statuses.append((cells.index(req_id), stat))
    cur.execute("""update TABLE set ROW = """ + "'" + req_id + "'" + """ where ANOTHER_ROW is null""")
db.commit()
db.close()
And if you put print(req_id) inside this for loop, you will see that req_id changes with each iteration, but in the DB only the last req_id is saved.
I am using cursors to get results from the GAE Full Text Search API. The problem is that the cursor remains the same in each iteration:
cursor = search.Cursor()
files_options = search.QueryOptions(
    limit=5,
    cursor=cursor,
    returned_fields='state'
)
files_dict = {}
query = search.Query(query_string=text_to_search, options=files_options)
index = search.Index(name='title')
while cursor != None:
    results = index.search(query)
    cursor = results.cursor
The cursor never becomes None, even when the search returns only 18 results.
The problem is that you're getting the same 5 results over and over again. Every time you do results = index.search(query) inside your loop, you retrieve the first five results, because your query options specify a limit of 5 and an empty cursor. You need to create a new query starting at the new cursor on every iteration.
cursor = search.Cursor()
index = search.Index(name='title')
while cursor != None:
    options = search.QueryOptions(limit=5, cursor=cursor, returned_fields='state')
    results = index.search(search.Query(query_string=text_to_search, options=options))
    cursor = results.cursor
Take a look at the introduction section of this page: https://developers.google.com/appengine/docs/python/search/queryclass
I'm fairly new to Python. Here's a script I have that gathers info from our MySQL server hosting our Helpdesk tickets, and will pop up a message box (using EasyGUI's "msgbox()" function) whenever a new ticket arrives.
The issue is that I want my program to continue processing after the popup, regardless of whether the user clicks "OK" or not, even if that means message boxes could keep popping up over each other and must be dismissed one by one; that would be fine with me.
I looked into threading, and either it doesn't work or I did something wrong and need a good guide. Here's my code:
import MySQLdb
import time
from easygui import *

# Connect
db = MySQLdb.connect(host="MySQL.MyDomain.com", user="user", passwd="pass", db="db")
cursor = db.cursor()

# Before-and-after arrays to compare; a change means a new ticket arrived
IDarray = ([0,0,0])
IDarray_prev = ([0,0,0])

# Compare the latest 3 tickets since more than 1 may arrive in my time interval
cursor.execute("SELECT id FROM Tickets ORDER BY id DESC limit 3;")
numrows = int(cursor.rowcount)
for x in range(0,numrows):
    row = cursor.fetchone()
    for num in row:
        IDarray_prev[x] = int(num)
cursor.close()
db.commit()

while 1:
    cursor = db.cursor()
    cursor.execute("SELECT id FROM Tickets ORDER BY id DESC limit 3;")
    numrows = int(cursor.rowcount)
    for x in range(0,numrows):
        row = cursor.fetchone()
        for num in row:
            IDarray[x] = int(num)
    if(IDarray != IDarray_prev):
        cursor.execute("SELECT Subject FROM Tickets ORDER BY id DESC limit 1;")
        subject = cursor.fetchone()
        for line in subject:
            # -----------------------------------------
            # STACKOVERFLOW, HERE IS THE MSGBOX LINE!!!
            # -----------------------------------------
            msgbox("A new ticket has arrived:\n"+line)
    # My time interval -- Checks the database every 8 seconds:
    time.sleep(8)
    IDarray_prev = IDarray[:]
    cursor.close()
    db.commit()
You can use Python GTK+. It offers non-modal message dialogs via set_modal(False), so a popup does not block the rest of your program.