Hi,
Can I download a file stored in a bytea column of my database using Python?
I'm trying to do it with psycopg2. I uploaded a .txt file, but when I try to retrieve it to my local machine, it just saves a .txt file with non-readable data. The file starts like this: "U0ZUUCBhZGRy....", so it looks like the same bytes the DB stores.
A screenshot of the column in DBeaver: example_dbeaver_column
This is the code I used.
import psycopg2

connection = psycopg2.connect(dbname=dbname,
                              host=host,
                              port=port,
                              user=user,
                              password=password)
# get cursor
cursor = connection.cursor()
query = "select t.file from my_table t where t.file_name = 'credentials.txt'"
cursor.execute(query)
data = cursor.fetchall()
file_binary = data[0][0].tobytes()
with open('my_text.txt', 'wb') as file:
    file.write(file_binary)
Any ideas on how I can solve this problem?
Thanks for your help
I found it!
The data in the column is base64-encoded, so I need to decode it with the base64 library.
import base64
import psycopg2

connection = psycopg2.connect(dbname=dbname,
                              host=host,
                              port=port,
                              user=user,
                              password=password)
# get cursor
cursor = connection.cursor()
query = "select t.file from my_table t where t.file_name = 'credentials.txt'"
cursor.execute(query)
data = cursor.fetchall()
file_binary = data[0][0].tobytes()
with open('my_text.txt', 'wb') as file:
    file.write(base64.b64decode(file_binary))
I am getting the error below:
query = command % processed_params
TypeError: not all arguments converted during string formatting
I am trying to pull data from SQL Server and then insert it into Snowflake.
Below is my code:
import pyodbc
import sqlalchemy
import snowflake.connector
driver = 'SQL Server'
server = 'tanmay'
db1 = 'testing'
tcon = 'no'
uname = 'sa'
pword = '123'
cnxn = pyodbc.connect(driver='{SQL Server}',
                      host=server, database=db1, trusted_connection=tcon,
                      user=uname, password=pword)
cursor = cnxn.cursor()
cursor.execute("select * from Admin_tbldbbackupdetails")
rows = cursor.fetchall()
#for row in rows:
# #data = [(row[0], row[1],row[2], row[3],row[4], row[5],row[6], row[7])]
print (rows[0])
cnxn.commit()
cnxn.close()
connection = snowflake.connector.connect(user='****',password='****',account='*****')
cursor2 = connection.cursor()
cursor2.execute("USE WAREHOUSE FOOD_WH")
cursor2.execute("USE DATABASE Test")
sql1="INSERT INTO CN_RND.Admin_tbldbbackupdetails_ip"
"(id,dbname, dbpath, backupdate, backuptime, backupStatus, FaildMsg, Backupsource)"
"values (?,?,?,?,?,?,?,?)"
cursor2.execute(sql1,*rows[0])
It's obviously a string-formatting error.
You missed providing a parameter for a %s placeholder.
If you cannot fix it, step back and try another approach.
Use another script to achieve the same result and get back to your bug tomorrow :-)
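For what it's worth, a minimal sketch of a fix, assuming cursor2 and rows are set up as in the question: the three adjacent string literals on separate lines are separate Python statements, so sql1 never contained the column list or the VALUES clause, and the Snowflake connector's default paramstyle is pyformat (%s), not qmark (?).
# Parenthesize the literals so they join into one string, and use %s placeholders.
sql1 = ("INSERT INTO CN_RND.Admin_tbldbbackupdetails_ip "
        "(id, dbname, dbpath, backupdate, backuptime, backupStatus, FaildMsg, Backupsource) "
        "VALUES (%s, %s, %s, %s, %s, %s, %s, %s)")
cursor2.execute(sql1, tuple(rows[0]))  # pass the row as one parameter sequence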
My script is doing pretty much the same:
1. Connect to SQL Server
2. fetchmany the rows
3. Multipart upload to S3
4. COPY INTO the Snowflake table
Details are here: Snowpipe-for-SQLServer
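As a rough sketch of the fetchmany step (the batch size and the write_batch_to_s3_part helper are hypothetical; the S3 upload and COPY INTO details are in the linked post):
# Hypothetical batching loop; write_batch_to_s3_part stands in for whatever
# writes each batch to a part file bound for S3.
batch = cursor.fetchmany(10000)
while batch:
    write_batch_to_s3_part(batch)
    batch = cursor.fetchmany(10000)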
I am trying to fetch data in Python from a MySQL database using a username that has read-only permission. I am using the mysql.connector package to connect to the database.
It connects to the database properly, as I checked with the following:
connection = mysql.connector.connect(host=HOSTNAME, user=USERNAME, passwd=PASSWORD, db=DATABASE, port=PORT)
print(connection.cmd_statistics())
But when I try to fetch data from the database using a cursor, it returns 'None'.
My code is:
cursor = connection.cursor()
try:
    query1 = 'SELECT * FROM table_name'
    result = cursor.execute(query1)
    print(result)
finally:
    connection.close()
And the output is:
None
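For context, cursor.execute() in mysql.connector returns None by design; the rows have to be retrieved with a fetch call afterwards. A minimal sketch of the same try block with that change:
try:
    cursor.execute('SELECT * FROM table_name')
    rows = cursor.fetchall()  # execute() returns None; fetch the rows here
    print(rows)
finally:
    connection.close()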
This works with Python 3.6.5 and MySQL Workbench 8.0; I have not tried other Python versions.
import _mysql_connector

avi = _mysql_connector.MySQL()
avi.connect(host='127.0.0.1', user='root', port=3306, password='root', database='hr_table')
avi.query("select * from hr_table.countries")
row = avi.fetch_row()
while row:
    print(row)
    row = avi.fetch_row()
avi.free_result()
avi.close()
I'm trying to follow the method for inserting a pandas data frame into SQL Server that is mentioned here, as it appears to be the fastest way to import lots of rows.
However, I am struggling to figure out the connection parameters.
I am not using a DSN; I have a server name, a database name, and I am using a trusted connection (i.e. Windows login).
import sqlalchemy
import urllib
server = 'MYServer'
db = 'MyDB'
cxn_str = "DRIVER={SQL Server Native Client 11.0};SERVER=" + server +",1433;DATABASE="+db+";Trusted_Connection='Yes'"
#cxn_str = "Trusted_Connection='Yes',Driver='{ODBC Driver 13 for SQL Server}',Server="+server+",Database="+db
params = urllib.parse.quote_plus(cxn_str)
engine = sqlalchemy.create_engine("mssql+pyodbc:///?odbc_connect=%s" % params)
conn = engine.connect().connection
cursor = conn.cursor()
I'm just not sure what the correct way to specify my connection string is. Any suggestions?
I have been working with pandas and SQL Server for a while, and the fastest way I found to insert a lot of data into a table is this:
You can create a temporary CSV using:
df.to_csv('new_file_name.csv', sep=',', encoding='utf-8')
Then use pyodbc and the BULK INSERT Transact-SQL statement:
import pyodbc
conn = pyodbc.connect(DRIVER='{SQL Server}', Server='server_name', Database='Database_name', trusted_connection='yes')
cur = conn.cursor()
cur.execute("""BULK INSERT table_name
FROM 'C:\\Users\\folders path\\new_file_name.csv'
WITH
(
CODEPAGE = 'ACP',
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)""")
conn.commit()
cur.close()
conn.close()
Then you can delete the file:
import os
os.remove('new_file_name.csv')
It took only a second to load a lot of data at once into SQL Server. I hope this gives you an idea.
Note: don't forget to account for the index column that to_csv writes; that was my mistake when I started to use this, lol.
Connection string parameter values should not be enclosed in quotes, so you should use Trusted_Connection=Yes instead of Trusted_Connection='Yes'.
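Applied to the question's code, a minimal sketch (driver, server, and database names taken from the question):
import sqlalchemy
import urllib

server = 'MYServer'
db = 'MyDB'
# No quotes around Yes in Trusted_Connection
cxn_str = ("DRIVER={SQL Server Native Client 11.0};SERVER=" + server +
           ",1433;DATABASE=" + db + ";Trusted_Connection=Yes")
params = urllib.parse.quote_plus(cxn_str)
engine = sqlalchemy.create_engine("mssql+pyodbc:///?odbc_connect=%s" % params)
conn = engine.connect().connection
cursor = conn.cursor()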
Is there any way of sending either JSON, XML, or CSV data to a local MySQL server?
I'm new to MySQL, and wasn't able to find anything online.
Any of those formats will work, as I have code that can convert my data into whichever format I require, i.e. JSON, XML, or CSV.
Any help is appreciated!
1) JSON: how do you store JSON data in a MySQL DB using Python?
If your JSON has the following format and you want to store the key/value pairs in a MySQL table, you can follow the first example.
Example: 1
JSON format
{
"first_key" : 10,
"second_key" : 20
}
Python core script for JSON:
import sys
import json
import MySQLdb

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='',
            db='myjson_db'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

myjson = json.loads(jdata)
db = dbconnect()
cursor = db.cursor()
sql = """INSERT INTO my_table (array_key, array_value) VALUES (%s, %s)"""
for array_key, array_value in myjson.items():
    cursor.execute(sql, (array_key, array_value))
db.commit()
If you want to store the data in only one column, you can follow the second example:
Example: 2
import sys
import MySQLdb

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='',
            db='myjson_db'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

db = dbconnect()
cursor = db.cursor()
sql = """INSERT INTO my_table (json_column) VALUES (%s)"""
cursor.execute(sql, (jdata,))  # insert the raw JSON string as a single value
db.commit()
2) XML: how do you store XML data in a MySQL DB using Python?
XML data
<?xml version="1.0" encoding="UTF-8" ?>
<data>
    <first_key>10</first_key>
    <second_key>20</second_key>
</data>
Next, install the xml2json converter (a Python script that converts XML to JSON, available here) and import xml2json in our core script.
Python core script for XML:
import sys
import MySQLdb
import xml2json

# Convert the XML to a JSON string; the storage logic is the same as in
# examples 1 and 2.
xml_data = xml2json.xml2json(xmldata)

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='',
            db='myxml_db'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

db = dbconnect()
cursor = db.cursor()
sql = """INSERT INTO my_table (xml_data) VALUES (%s)"""
cursor.execute(sql, (xml_data,))
db.commit()
3) CSV: how do you store CSV data in a MySQL DB using Python?
import sys
import csv
import MySQLdb

def dbconnect():
    try:
        db = MySQLdb.connect(
            host='localhost',
            user='root',
            passwd='',
            db='mycsv_db'
        )
    except Exception as e:
        sys.exit("Can't connect to database")
    return db

db = dbconnect()
cursor = db.cursor()
with open('my_csv_file.csv') as csv_file:
    csv_data = csv.reader(csv_file)
    for row in csv_data:
        cursor.execute('INSERT INTO my_csv_table (csv_first_column, csv_second_column) '
                       'VALUES (%s, %s)', row)
db.commit()
I'm unaware of any way of inserting JSON, XML, or CSV into a MySQL database directly.
You can parse the data in a script that inserts it into a database using a module such as MySQL-python.
My Python isn't great, but hopefully this example should suffice:
#!/usr/bin/python
# Import the MySQL module to interact with the database.
import MySQLdb
# Import the json module to convert the JSON into a Python data structure.
import json

# Convert the JSON to a usable format.
data = json.loads(yourjson)

# Connect to the MySQL server.
db = MySQLdb.connect(host='yourhost',
                     user='youruser',
                     passwd='yourpassword',
                     db='yourschema')

# Create an object to handle SQL statements.
cur = db.cursor()

# Attempt to execute the SQL statement; if it fails, revert any changes.
try:
    cur.execute('INSERT INTO table_name SET col1 = %s, col2 = %s',
                (data['foo'], data['bar']))
    db.commit()
except:
    db.rollback()
I have an SQL database and am wondering what command you use to just get a list of the table names within that database.
To be a bit more complete:
import MySQLdb

connection = MySQLdb.connect(
    host='localhost',
    user='myself',
    passwd='mysecret')                # create the connection

cursor = connection.cursor()          # get the cursor
cursor.execute("USE mydatabase")      # select the database
cursor.execute("SHOW TABLES")         # execute 'SHOW TABLES' (but data is not returned)
now there are two options:
tables = cursor.fetchall() # return data from last query
or iterate over the cursor:
for (table_name,) in cursor:
    print(table_name)
SHOW tables
show tables will help. Here is the documentation.
It is also possible to obtain the tables of a specific schema by executing a single query with the driver below.
python3 -m pip install PyMySQL
import pymysql
# Connect to the database
conn = pymysql.connect(host='127.0.0.1',user='root',passwd='root',db='my_database')
# Create a Cursor object
cur = conn.cursor()
# Execute the query: To get the name of the tables from a specific database
# replace only the my_database with the name of your database
cur.execute("SELECT table_name FROM information_schema.tables WHERE table_schema = 'my_database'")
# Read and print tables
for table in [tables[0] for tables in cur.fetchall()]:
    print(table)
output:
my_table_name_1
my_table_name_2
my_table_name_3
...
my_table_name_x