UPDATE statement on Access database fails silently under pyodbc - python

I have a problem with a simple UPDATE statement. I wrote a Python tool which creates a lot of UPDATE statements, and after creating them I want to execute them on my Access database, but it doesn't work. This is one statement, for example:
UPDATE FCL_B_COVERSHEET_A SET BRANCH = 0 WHERE OBJ_ID = '1220140910132011062005';
The statement syntax is not the problem. I tested it and it works.
This next code snippet shows the initialization for the connect object.
import pyodbc

strInputPathMDB = "C:\\Test.mdb"
DRV = '{Microsoft Access Driver (*.mdb)}'
con = pyodbc.connect('Driver={0};Dbq={1};Uid={2};Pwd={3};'.format(DRV, strInputPathMDB, "administrator", ""))
After that I wrote a method which executes one SQL statement:
def executeSQLStatement(conConnection, strSQL):
    arcpy.AddMessage(strSQL)
    cursor = conConnection.cursor()
    cursor.execute(strSQL)
    conConnection.commit()
If I execute this code, everything seems to work - no error message or anything like that - but the data is not updated, and I don't know what I'm doing wrong ...
for strSQL in sqlStateArray:
    executeSQLStatement(con, strSQL)
con.close()
I hope you understand what my problem is. Thanks for your help.
Chris

The issue here was that the .mdb file was in the root folder of the C: drive. Root folders often restrict normal users to read-only access, so the database file was being opened as read-only. Moving the .mdb file to a public folder solved the problem.
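For anyone debugging a similar silent failure, one cheap check is cursor.rowcount right after the execute: with most ODBC drivers it reports how many rows the UPDATE actually touched, so a statement that runs but changes nothing becomes visible. A minimal sketch along those lines (the connection string and statement are simply the ones from the question):
import pyodbc

con = pyodbc.connect('Driver={Microsoft Access Driver (*.mdb)};Dbq=C:\\Test.mdb;Uid=administrator;Pwd=;')
cursor = con.cursor()
cursor.execute("UPDATE FCL_B_COVERSHEET_A SET BRANCH = 0 WHERE OBJ_ID = '1220140910132011062005'")
# 0 here means the UPDATE ran but modified no rows (e.g. no matching OBJ_ID,
# or the change never reached a file that was opened read-only).
print("rows affected:", cursor.rowcount)
con.commit()
con.close()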

Related

Code works on Jupyter notebook but not as a .py script

Simplified example of my code, please ignore syntax errors:
import numpy as np
import pandas as pd
import pymysql.cursors
from datetime import date, datetime

connection = pymysql.connect(host=,
                             user=,
                             password=,
                             db=,
                             cursorclass=pymysql.cursors.DictCursor)
df1 = pd.read_sql()
df2 = pd.read_sql()
df3 = pd.read_sql()
np.where(a=1, b, c)
df1.append([df2, df3])
path = r'C:\Users\\'
df.to_csv(path+'a.csv')
On Jupyter Notebook it outputs the CSV file like it's supposed to. However, if I download the .py and run it with python, it will only output a CSV the first time I run it after restarting my computer. Other times it will just run and nothing happens. Why this is happening is blowing my mind.
I think you have added the path wrongly
If you change your path to df.to_csv(path+'\a.csv') then it will be correct
It's hard to say without knowing what your actual code is, but one thought is that the connection you have to your DB is never closed, and is somehow locking the DB so you are unable to make another connection.
The first connection would end, of course, when you restart your computer.
To see if this is the issue, you could use the MySQL command SHOW PROCESSLIST, which lists the current connections for you; if, after running the script the first time, one of the processes is still the connection you just made from your machine, that could be the issue. Here are the docs on the command: https://dev.mysql.com/doc/refman/8.0/en/show-processlist.html
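If it is more convenient, you could also run that check from Python; here is a rough sketch, where the connection parameters are placeholders you would fill in yourself:
import pymysql.cursors

# Placeholder credentials: substitute your own host/user/password/db.
conn = pymysql.connect(host='localhost', user='me', password='secret',
                       db='mydb', cursorclass=pymysql.cursors.DictCursor)
try:
    with conn.cursor() as cur:
        cur.execute("SHOW PROCESSLIST")
        for row in cur.fetchall():
            # Each row describes one open connection (Id, User, Host, db, Command, Time, ...).
            print(row)
finally:
    conn.close()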
Alternatively, you could wrap the DB connection code in a try/except block with some print statements for good measure, to ascertain whether or not that's an issue, like so:
try:
    print("Right before connection")
    connection = pymysql.connect(host=,
                                 user=,
                                 password=,
                                 db=,
                                 cursorclass=pymysql.cursors.DictCursor)
    print("Right after connection")
except Exception as e:
    print("The Exception is: {}".format(str(e)))
Also, you should most definitely print the objects that you're trying to write to CSV, to see if they're still valid the second time around (i.e. make sure you've actually populated those variables and they're not just Nones)
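On the first point, it may also be worth making sure the connection is always released even if something fails partway through the script; a minimal sketch of that idea (again with placeholder credentials and a stand-in query):
import pandas as pd
import pymysql.cursors

# Placeholder credentials: substitute your own values.
connection = pymysql.connect(host='localhost', user='me', password='secret',
                             db='mydb', cursorclass=pymysql.cursors.DictCursor)
try:
    df1 = pd.read_sql("SELECT 1", connection)  # stand-in for the real queries
    # ... the rest of the processing / to_csv calls ...
finally:
    # Always close, so a crashed run cannot leave a dangling connection open.
    connection.close()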

Calling python script in web2py framework over webserver

I have NGINX, uWSGI and web2py installed on the server. The web2py application performs only one function: it accesses the database and prints the rows of a table.
def fetch():
    import psycopg2
    conn = psycopg2.connect(database="postgres",
                            user="postgres",
                            password="qwerty",
                            host="127.0.0.1")
    cur = conn.cursor()
    cur.execute("SELECT id, name from TEST")
    rows = cur.fetchall()
    conn.close()
    return rows
When the function is called locally, the table contents are returned.
But when I'm trying to call the function from remote machine I get an internal error 500.
One more interesting thing: when the function looks like this:
def hello():
    return 'hello'
The string 'hello' is returned. But as soon as I add an import directive to it, an error page is generated.
Can anyone please suggest the proper application syntax/logic?
My guess is that your MySQL service doesn't allow remote access. Could you check your MySQL configuration?
vim /etc/mysql/my.cnf
Comment out the following lines.
#bind-address = 127.0.0.1
#skip-networking
If there is no skip-networking line in your configuration file, just add it and comment it out.
And then restart the mysql service.
service mysql restart
Forgive the stupid question, but have you checked whether the module is available on your server?
When you say that the error appears in your hello function as soon as you try to import, is it the same import psycopg2 directive?
Try this:
Assuming that fetch() is defined in controllers/default.py,
open the folder views/default and create a new file called fetch.html
and paste this inside:
{{extend 'layout.html'}}
{{=rows}}
fetch.html is a view, or a template if you prefer.
Modify fetch() to return a dictionary with rows for the view to print (a full sketch of the controller follows below):
return dict(rows=rows)
This is very basic, though; you can find more information about the basic steps in the book -> http://www.web2py.com/books/default/chapter/29/03/overview#Postbacks
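Putting the question's code and this answer together, the controller might look roughly like this (a sketch only; the connection settings are the ones from the question):
# controllers/default.py (sketch)
def fetch():
    import psycopg2
    conn = psycopg2.connect(database="postgres",
                            user="postgres",
                            password="qwerty",
                            host="127.0.0.1")
    cur = conn.cursor()
    cur.execute("SELECT id, name from TEST")
    rows = cur.fetchall()
    conn.close()
    # Returning a dict lets web2py render it with views/default/fetch.html
    return dict(rows=rows)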

Dump data from malformed SQLite in Python

I have a malformed database. When I try to get records from any of two tables, it throws an exception:
DatabaseError: database disk image is malformed
I know that through commandline I can do this:
sqlite3 ".dump" base.db | sqlite3 new.db
Can I do something like this from within Python?
As far as I know you cannot do that (alas, I might be mistaken), because the sqlite3 module for Python is very limited.
The only workaround I can think of involves calling the OS command shell (e.g. terminal, cmd, ...) (more info) via Python's call command:
Combine it with the info from here to do something like this:
This was done on a Windows XP machine:
Unfortunately I can't test it on a Unix machine right now - I hope it will help you:
from subprocess import check_call

def sqliterepair():
    check_call(["sqlite3", "C:/sqlite-tools/base.db", ".mode insert", ".output C:/sqlite-tools/dump_all.sql", ".dump", ".exit"])
    check_call(["sqlite3", "C:/sqlite-tools/new.db", ".read C:/sqlite-tools/dump_all.sql", ".exit"])
    return
The first argument calls sqlite3.exe. Because it is in my system path variable, I don't need to specify the path or the ".exe" suffix.
The other arguments are chained into the sqlite3 shell.
Note that the ".exit" argument is required so the sqlite shell will exit. Otherwise check_call() will never complete, because the outer cmd shell or terminal will be suspended.
Of course the dump file should be removed afterwards...
EDIT: Much shorter solution (credit goes to OP (see comment))
import os
os.system("sqlite3 C:/sqlite-tools/base.db .dump | sqlite3 C:/sqlite-tools/target.db")
Just tested this: it works. Apparently I was wrong in the comments.
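If you would rather avoid going through a shell for the pipe, here is a roughly equivalent sketch using subprocess (the paths are simply the ones used above):
import subprocess

# Pipe the dump of the corrupted database straight into a new one, without a shell.
dump = subprocess.Popen(["sqlite3", "C:/sqlite-tools/base.db", ".dump"],
                        stdout=subprocess.PIPE)
subprocess.check_call(["sqlite3", "C:/sqlite-tools/target.db"],
                      stdin=dump.stdout)
dump.stdout.close()
dump.wait()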
If I understood properly, what you want is to duplicate an SQLite database in Python. Here is how I would do it:
import sqlite3

# oldDB = path to the corrupted db,
# newDB = path to the new db
def duplicateDB(oldDB, newDB):
    con = sqlite3.connect(oldDB)
    script = ''.join(con.iterdump())
    con.close()

    con = sqlite3.connect(newDB)
    con.executescript(script)
    con.close()
    print "duplicated %s into %s" % (oldDB, newDB)
In your example, call duplicateDB('base.db', 'new.db'). The iterdump function is equivalent to dump.
Note that if you use Python 3, you will need to change the print statement.

Python sqlite3.OperationalError: no such table:

I am trying to store data about pupils at a school. I've done a few tables before, such as one for passwords and Teachers which I will later bring together in one program.
I have pretty much copied the create table function from one of these and changed the values for the pupil's information. It works fine in the other programs, but I keep getting:
sqlite3.OperationalError: no such table: PupilPremiumTable
when I try to add a pupil to the table, it occurs on the line:
cursor.execute("select MAX(RecordID) from PupilPremiumTable")
I look in the folder and there is a file called PupilPremiumTable.db and the table has already been created before, so I don't know why it isn't working.
Here is some of my code; if you need more, feel free to tell me. As I said, it worked before, so I have no clue why it isn't working or even what isn't working:
with sqlite3.connect("PupilPremiumTable.db") as db:
    cursor = db.cursor()
    cursor.execute("select MAX(RecordID) from PupilPremiumTable")
    Value = cursor.fetchone()
    Value = str('.'.join(str(x) for x in Value))
    if Value == "None":
        Value = int(0)
    else:
        Value = int('.'.join(str(x) for x in Value))
    if Value == 'None,':
        Value = 0
    TeacherID = Value + 1
    print("This RecordID is: ", RecordID)
You are assuming that the current working directory is the same as the directory your script lives in. It is not an assumption you can make. Your script is opening a new database in a different directory, one that is empty.
Use an absolute path for your database file. You can base it on the absolute path of your script:
import os.path
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
db_path = os.path.join(BASE_DIR, "PupilPremiumTable.db")
with sqlite3.connect(db_path) as db:
You can verify what the current working directory is with os.getcwd() if you want to figure out where instead you are opening the new database file; you probably want to clean up the extra file you created there.
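As a quick check, something like this debugging sketch prints where the script is actually looking and whether the database file exists there:
import os

print("working directory:", os.getcwd())
print("script directory:", os.path.dirname(os.path.abspath(__file__)))
# If this prints False, sqlite3.connect() will quietly create a brand-new, empty database here.
print("db exists in cwd:", os.path.exists("PupilPremiumTable.db"))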
I had the same problem and here's how I solved it.
I killed the server by pressing Ctrl+C
I deleted the __pycache__ folder. You'll find this folder in your project folder.
I deleted the sqlite db.
I made migrations with python manage.py makemigrations <app_name> where <app_name> is the specific app that contains the model that's causing the error. In my case it was the mail app so I ran python manage.py makemigrations app.
I migrated in the normal way.
Then I started the server and it was all solved.
I believe the issue is as Jorge Cardenas said:
Maybe you are loading views or queries to the database but you haven't granted enough time for Django to migrate the models to the DB. That's why the "table doesn't exist".
This solution is based on this youtube video
First, you need to check if that table 100% exist in the database. You can use sqlite viewer for that: https://inloop.github.io/sqlite-viewer/.
If the table exists, then you can write your table name in '', for example:
Select * from 'TableName'
Whatever your query is, I am just using Select * as an example.
I had to face the same issue, and there are a couple of approaches, but the one below is the one I think is most probable.
Maybe you are loading views or queries to the database but you haven't granted enough time for Django to migrate the models to the DB. That's why the "table doesn't exist".
Make sure you use this sort of initialization in your view's code:
class RegisterForm(forms.Form):
    def __init__(self, *args, **kwargs):
        super(RegisterForm, self).__init__(*args, **kwargs)
A second approach is to clean the previous migrations, delete the database, and start the migration process over.
I had the same issue when I was following the Flask blog tutorial. I had initialized the database once, and it started giving me the sqlite3.OperationalError; I tried to initialize it again, and it turned out I had lots of errors in my schema and db.py file. I fixed them, initialized again, and it worked.
Adding this worked for me:
import os.path
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
db_dir = (BASE_DIR + '\\PupilPremiumTable.db')
Note the need for \\ before PupilPremiumTable.db for the code to work.

SQLite Insert command in Python script Doesn't work on web

I'm trying to use an SQLite insert operation in a Python script. It works when I execute it manually on the command line, but when I try to access it on the web it won't insert anything into the database. Here is my function:
def insertdb(unique_id, number_of_days):
    conn = sqlite3.connect('database.db')
    print "Opened database successfully"
    conn.execute("INSERT INTO IDENT (ID_NUM,DAYS_LEFT) VALUES (?,?)", (unique_id, number_of_days))
    conn.commit()
    print "Records created successfully"
    conn.close()
When it is executed on the web, it only shows the output "Opened database successfully" but does not seem to insert the value into the database. What am I missing? Is this a server configuration issue? I have checked the database permissions on writing and they are correctly set.
The problem is almost certainly that you're trying to create or open a database named database.db in whatever happens to be the current working directory, and one of the following is true:
The database exists and you don't have permission to write to it. So, everything works until you try to do something that requires write access (like committing an INSERT).
The database exists, and you have permission to write to it, but you don't have permission to create new files in the directory. So, everything works until sqlite needs to create a temporary file (which it almost always will when executing an INSERT).
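One way to take the working directory out of the equation entirely (a minimal sketch, assuming the database is meant to live next to the script) is to build an absolute path before connecting:
import os
import sqlite3

# Resolve database.db relative to this script instead of the web server's working directory.
BASE_DIR = os.path.dirname(os.path.abspath(__file__))
conn = sqlite3.connect(os.path.join(BASE_DIR, 'database.db'))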
Meanwhile, you don't mention what web server/container/etc. you're using, but apparently you have it configured to just swallow all errors silently, which is a really, really bad idea for any debugging. Configure it to report the errors in some way. Otherwise, you will never figure out what's going on with anything that goes wrong.
If you don't have control over the server configuration, you can at least wrap all your code in a try/except and manually log exceptions to some file you have write access to (ideally via the logging module, or just open and write if worst comes to worst).
Or, you can just do that with dumb print statements, as you're already doing:
def insertdb(unique_id, number_of_days):
    conn = sqlite3.connect('database.db')
    print "Opened database successfully"
    try:
        conn.execute("INSERT INTO IDENT (ID_NUM,DAYS_LEFT) VALUES (?,?)", (unique_id, number_of_days))
        conn.commit()
        print "Records created successfully"
    except Exception as e:
        print e  # or, better, traceback.print_exc()
    conn.close()
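And if you do have somewhere writable for a log file, the same idea with the logging module mentioned above might look roughly like this (the log path is only an example; point it anywhere the web server user can write):
import logging
import sqlite3
import traceback

# Example log location; use any file the web server user is allowed to write to.
logging.basicConfig(filename='/tmp/insertdb.log', level=logging.INFO)

def insertdb(unique_id, number_of_days):
    conn = sqlite3.connect('database.db')
    logging.info("Opened database successfully")
    try:
        conn.execute("INSERT INTO IDENT (ID_NUM,DAYS_LEFT) VALUES (?,?)", (unique_id, number_of_days))
        conn.commit()
        logging.info("Records created successfully")
    except Exception:
        logging.error("Insert failed:\n%s", traceback.format_exc())
    finally:
        conn.close()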
