Using an sqlite3 database with WAL enabled - Python

I'm trying to modify the two database files used by Google Drive (snapshot.db and sync_config.db) to redirect my sync folder via a script. While I can open the files in certain SQLite browsers (not all of them), I can't get Python to execute a query. I just get the message: sqlite3.DatabaseError: file is encrypted or is not a database
Apparently Google is using a write-ahead logging (WAL) configuration on the databases, and according to sqlite.org it can be turned off by running PRAGMA journal_mode=DELETE; against the database. But I can't figure out how to run that against the database if Python can't read it in the first place.
Here's what I have (I tried executing the PRAGMA command, committing, and then reopening, but it didn't work):
import sqlite3

snapShot = r'C:\Documents and Settings\user\Local Settings\Application Data\Google\Drive\snapshot.db'
sync_conf = r'C:\Documents and Settings\user\Local Settings\Application Data\Google\Drive\sync_config.db'
sync_folder_path = r'H:\Google Drive'

conn = sqlite3.connect(snapShot)
cursor = conn.cursor()
#cursor.execute('PRAGMA journal_mode=DELETE;')
#conn.commit()
#conn = sqlite3.connect(snapShot)
#cursor = conn.cursor()

# parameterized query, so the \\?\ extended-path prefix needs no SQL escaping
query = "UPDATE local_entry SET filename = ? WHERE filename = ?"
new_path = '\\\\?\\' + sync_folder_path
old_path = '\\\\?\\C:\\Users\\admin\\Google Drive'
print query
cursor.execute(query, (new_path, old_path))
conn.commit()

Problem solved. I just downloaded the latest version of SQLite from http://www.sqlite.org/download.html and overwrote the old .dll in my python27/DLLs directory. It works fine now.
What a nuisance.
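For reference, once the SQLite library is new enough to read WAL databases, switching the journal mode back is a one-liner. A sketch against the same snapshot.db path as above:
import sqlite3

conn = sqlite3.connect(r'C:\Documents and Settings\user\Local Settings\Application Data\Google\Drive\snapshot.db')
# the pragma returns the resulting journal mode as a single row
print conn.execute('PRAGMA journal_mode=DELETE;').fetchone()  # -> (u'delete',)
conn.close()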

I don't think the journal_mode pragma should keep sqlite3 from being able to open the db at all. Perhaps you're using an excessively old version of the sqlite3 lib? What version of Python are you using, and what version of the sqlite3 library?
import sqlite3
print sqlite3.version          # version of the sqlite3 bindings module
print sqlite3.sqlite_version   # version of the SQLite library itself (WAL needs SQLite 3.7.0+)

Related

Correct Format for Absolute Path for SQLite Database in Python

I have a small Python application that references a folder on my Raspberry Pi. I'm using a hard-coded link to the .db file and want to update it to an absolute path built at runtime. However, my attempts have failed and the path isn't correct. How do I format this so the database link works anywhere the application is installed? The code creates a new database.db file if one doesn't exist, but I want code that will always work even without the /home/name/code/... location specified. Thanks!
(I read this, and it didn't help me.)
import sqlite3

#Connect to database
sqliteConnection = sqlite3.connect('/home/name/code/bot/database.db')
cursor = sqliteConnection.cursor()
I'm trying to do something more like this, but it isn't working:
import os
import sqlite3

#Connect to database
dbdir = os.path.dirname(__file__)
dbpath = os.path.join(dbdir, "..", "database.db")
sqliteConnection = sqlite3.connect(dbpath)
Dillon Davis got me on the right track. I'm not sure if this is the most elegant way to handle this, but it works:
import os
import sqlite3

# dirname() strips the trailing script name; rstrip('filename.py') only removes characters, not a suffix
dbdir = os.path.dirname(os.path.abspath(__file__))
dbpath = os.path.join(dbdir, "database.db")
sqliteConnection = sqlite3.connect(dbpath)
cursor = sqliteConnection.cursor()
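A sketch of the same idea using pathlib, which sidesteps the string handling entirely:
import sqlite3
from pathlib import Path

# resolve the database path relative to this script, wherever it is installed
dbpath = Path(__file__).resolve().parent / "database.db"
sqliteConnection = sqlite3.connect(str(dbpath))
cursor = sqliteConnection.cursor()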

Importing .bak MySQL database with Python using pymssql

The title is pretty self-explanatory.
I've tried the following code :
import _mssql
conn = _mssql.connect(server='', user='', password='', database='')
conn.execute_non_query("IF EXISTS (SELECT 0 FROM sys.databases WHERE name = 'mydb') BEGIN ALTER DATABASE mydb MODIFY NAME = mydb_old END")
conn.execute_non_query("RESTORE DATABASE mydb FROM DISK='C:\mydb.bak'")
But I get the following error: No module named '_mssql'.
I have version 2.2.2 of pymssql and I'm using Python 3.9.
I'm just trying to write to and read from this database, and the .bak file is all I have, nothing else. I'm quite new to SQL, so I might be going about it the wrong way?
Thank you for your time.
As mentioned in the comments:
MySQL <> MS SQL; a .bak file restored through pymssql is a Microsoft SQL Server backup, not a MySQL database.
And I had to fill in the server='', user='', etc. arguments with real values.
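For reference, a sketch of what I believe the corrected import looks like in pymssql 2.2, where the low-level module lives inside the package; the server, user, and password values are placeholders:
from pymssql import _mssql  # pymssql 2.2 no longer installs a top-level _mssql module

conn = _mssql.connect(server='localhost', user='sa', password='secret', database='master')
conn.execute_non_query(r"RESTORE DATABASE mydb FROM DISK='C:\mydb.bak'")
conn.close()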

MySQL - LOAD DATA LOCAL

Running a MySQL (8.0) database on an Ubuntu (20.04) VPS. My current objective is to load a .CSV file into a table automatically via a Python script. The script itself is theoretically correct and should work; what fails is getting the data from the CSV into the table.
dbupdate.py:
import mysql.connector

db = mysql.connector.connect(
    host="localhost",
    user="root",
    passwd="********",
    db="Rack_Info"
)

sqlLoadData = "LOAD DATA LOCAL INFILE '/home/OSA_ADVA_Dashboard/Processed_CSV/DownloadedCSV.csv' INTO TABLE BerT FIELDS TERMINATED BY ',' ENCLOSED BY '*' IGNORE 1 LINES;"

try:
    curs = db.cursor()
    curs.execute(sqlLoadData)
    db.commit()
    print("SQL execution complete")
    resultSet = curs.fetchall()
except IOError:
    print("Error incurred: ")
    db.rollback()

db.close()
print("Data loading complete.\n")
I have consulted the official documentation and enabled local_infile on both the server and the client, configuring it both in my.cnf and in SQL.
The my.cnf file:
#
# The MySQL database server configuration file.
#
# You can copy this to one of:
# - "/etc/mysql/my.cnf" to set global options,
# - "~/.my.cnf" to set user-specific options.
#
# One can use all long options that the program supports.
# Run program with --help to get a list of available options and with
# --print-defaults to see which it would actually understand and use.
#
# For explanations see
# http://dev.mysql.com/doc/mysql/en/server-system-variables.html
#
# * IMPORTANT: Additional settings that can override those from this file!
# The files must end with '.cnf', otherwise they'll be ignored.
#
!includedir /etc/mysql/conf.d/
!includedir /etc/mysql/mysql.conf.d/
[client]
local_infile=1
[mysql]
local_infile=1
[mysqld]
local_infile=1
I have restarted both the PHP and MySQL services, as well as the server itself, to no avail. I'm at a loss as to what to do. Any help would be much appreciated.
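For what it's worth, recent versions of mysql-connector-python also require a client-side opt-in before LOAD DATA LOCAL works at all. A sketch, assuming mysql-connector-python 8.0 and the connection settings from the question:
import mysql.connector

db = mysql.connector.connect(
    host="localhost",
    user="root",
    passwd="********",
    db="Rack_Info",
    allow_local_infile=True   # client-side opt-in for LOAD DATA LOCAL
)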
If I'm not mistaken, PHP has its own config file where you have to enable LOAD DATA LOCAL INFILE.
I investigated the php.ini file and uncommented the load data lines; still nothing.
It turns out one of mysqld's variables, secure_file_priv, was pointing at an empty/default directory. All I had to do was point it at the directory where my files were located. All working now.
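A quick way to confirm that variable from Python (a sketch, reusing the connection settings from the question):
import mysql.connector

db = mysql.connector.connect(host="localhost", user="root", passwd="********", db="Rack_Info")
curs = db.cursor()
curs.execute("SHOW VARIABLES LIKE 'secure_file_priv'")
print(curs.fetchone())   # e.g. ('secure_file_priv', '/var/lib/mysql-files/')
db.close()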

SQLITE3 not creating database

import sqlite3
conn = sqlite3.connect("test.db")
cursor = conn.cursor()
It should create the database, but it does not. Any help?
This code will create an sqlite db file called "test.db" in the same directory you are running your script from.
For example, if you have your python file in:
/home/user/python_code/mycode.py
And you run it from:
/home/user/
With:
python python_code/mycode.py # or python3
It will create an "empty" sqlite db file at
/home/user/test.db
If you can't find the test.db file, make sure you pass it the full path of where you want it to be located.
i.e.
conn = sqlite3.connect("/full/path/to/location/you/want/test.db")
I had the same problem: my .db file wasn't appearing because I forgot to add test.db at the end of the path; see line 2 below.
import sqlite3
databaseFile = "/home/user/test.db" #don't forget the test.db
conn = sqlite3.connect(databaseFile)
cursor = conn.cursor()
I suspect the DB will not be created on disk until you create at least one table in it. Just calling conn.cursor() is not sufficient.
Console sqlite3 utility behaves this way, too.
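If that's the case, a minimal write will force the file to appear on disk. A sketch, using a hypothetical demo table:
import sqlite3

conn = sqlite3.connect("test.db")
cursor = conn.cursor()
# any write, such as creating a table, makes SQLite materialize the file
cursor.execute("CREATE TABLE IF NOT EXISTS demo (id INTEGER PRIMARY KEY)")
conn.commit()
conn.close()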

Fast MySQL Import

While writing a script to convert raw data for MySQL import, I have so far worked with a temporary text file, which I later imported manually using the LOAD DATA INFILE ... command.
Now I have included the import command in the Python script:
import mysql.connector

db = mysql.connector.connect(user='root', password='root',
                             host='localhost',
                             database='myDB')
cursor = db.cursor()
# `values` is a reserved word in MySQL, so the table name needs backticks
query = """
LOAD DATA INFILE 'temp.txt' INTO TABLE myDB.`values`
FIELDS TERMINATED BY ',' LINES TERMINATED BY ';';
"""
cursor.execute(query)
cursor.close()
db.commit()
db.close()
This works, but temp.txt has to be in the database directory, which isn't suitable for my needs.
The next approach is to skip the temporary file and insert the rows directly:
from datetime import datetime
import mysql.connector

db = mysql.connector.connect(user='root', password='root',
                             host='localhost',
                             database='myDB')
sql = "INSERT INTO `values`(`timestamp`,`id`,`value`,`status`) VALUES(%s,%s,%s,%s)"
cursor = db.cursor()
# `lines` holds the raw input lines read earlier (not shown in the question)
for line in lines:
    mode, year, julian, time, *values = line.split(",")
    del values[5]
    date = datetime.strptime(year + julian, "%Y%j").strftime("%Y-%m-%d")
    time = datetime.strptime(time.rjust(4, "0"), "%H%M").strftime("%H:%M:%S")
    timestamp = "%s %s" % (date, time)
    for i, value in enumerate(values[:20], 1):
        args = (timestamp, str(i + 28), value, mode)
        cursor.execute(sql, args)
db.commit()
This works as well, but takes around four times as long, which is too much. (The same for loop was used in the first version to generate temp.txt.)
My conclusion is that I need a file and the LOAD DATA INFILE command to get the speed back. To be free to place the text file anywhere, the LOCAL option seems useful. But with MySQL Connector (1.1.7) there is the known error:
mysql.connector.errors.ProgrammingError: 1148 (42000): The used command is not allowed with this MySQL version
So far I've seen that using MySQLdb instead of MySQL Connector can be a workaround. Activity on MySQLdb, however, seems low, and Python 3.3 support will probably never come.
Is LOAD DATA LOCAL INFILE the way to go and if so is there a working connector for python 3.3 available?
EDIT: After development the database will run on a server, script on a client.
I may have missed something important, but can't you just specify the full filename in the first chunk of code?
LOAD DATA INFILE '/full/path/to/temp.txt'
Note the path must be a path on the server.
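If the server and the script run on the same machine (as in this development setup), one option is to let Python expand the path first. A sketch:
import os

# expand temp.txt to an absolute path; forward slashes keep the MySQL string literal simple
filepath = os.path.abspath('temp.txt').replace('\\', '/')
query = ("LOAD DATA INFILE '%s' INTO TABLE myDB.`values` "
         "FIELDS TERMINATED BY ',' LINES TERMINATED BY ';'" % filepath)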
To use LOAD DATA LOCAL INFILE with any file the client can read, you have to set the LOCAL_FILES client flag when creating the connection:
import mysql.connector
from mysql.connector.constants import ClientFlag

db = mysql.connector.connect(client_flags=[ClientFlag.LOCAL_FILES], <other arguments>)
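Putting the flag together with the original query, a sketch using the credentials and paths from the question:
import mysql.connector
from mysql.connector.constants import ClientFlag

db = mysql.connector.connect(user='root', password='root',
                             host='localhost', database='myDB',
                             client_flags=[ClientFlag.LOCAL_FILES])
cursor = db.cursor()
# with LOCAL the client sends the file, so it can live anywhere the script can read
cursor.execute("LOAD DATA LOCAL INFILE 'temp.txt' INTO TABLE myDB.`values` "
               "FIELDS TERMINATED BY ',' LINES TERMINATED BY ';'")
db.commit()
cursor.close()
db.close()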
