Snowflake connection string provided by Azure Key Vault - python

Here is my Snowflake connection in Python:
import snowflake.connector
ctx = snowflake.connector.connect(user='someuser@somedomain.com',
                                  password='somepassword',
                                  account='someaccount',
                                  warehouse='somewarehouse',
                                  database='somedb',
                                  schema='someschema',
                                  authenticator='someauth')
It works fine, but now I need to store my connection details in Azure Key Vault, and as far as I understand they will come back as a string, which I will then need to feed into snowflake.connector.connect().
So I tried to convert connection parameters into string:
connection_string = "user=someuser@somedomain.com;password=somepassword;account=someaccount;authenticator=someauth;warehouse=somewarehouse;database = somedb"
ctx = snowflake.connector.connect(connection_string)
but got back error message:
TypeError Traceback (most recent call last)
<ipython-input-19-ca89ef96ad7d> in <module>
----> 1 ctx = snowflake.connector.connect(connection_string)
TypeError: Connect() takes 0 positional arguments but 1 was given
I also tried extracting python dictionary from string with ast library and feeding it into snowflake.connector.connect(), but got back the same error.
So is there way to solve it? Am I missing something conceptually?

Please check whether the following suggestions and references help:
The Snowflake connector takes its parameters as separate keyword arguments, so the string coming back from Key Vault needs to be split back into individual values, e.g. conn_params = connection_string.split(';'), before it can be used with the connector.
Note: for the ACCOUNT parameter, use your account identifier; it does not include the snowflakecomputing.com domain name/suffix, since Snowflake appends that automatically when creating the connection.
So assign the connection variables first and then pass them to the connector, as sketched below.
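For instance, a minimal sketch (assuming the secret is stored exactly as the ";"-separated "key=value" string from the question, and using a placeholder vault URL and secret name for the azure-keyvault-secrets part):
import snowflake.connector
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Placeholder vault URL and secret name; replace with your own
client = SecretClient(
    vault_url="https://<your-vault-name>.vault.azure.net",
    credential=DefaultAzureCredential(),
)
connection_string = client.get_secret("snowflake-connection").value
# e.g. "user=someuser@somedomain.com;password=somepassword;account=someaccount;authenticator=someauth;warehouse=somewarehouse;database = somedb"

# Split the string back into individual keyword arguments,
# stripping stray whitespace (e.g. "database = somedb")
params = dict(part.split("=", 1) for part in connection_string.split(";") if part)
params = {k.strip(): v.strip() for k, v in params.items()}

ctx = snowflake.connector.connect(**params)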
Alternatively, include the driver and server in the connection string by saving the complete connection string, including account name, username, password, database and warehouse details, in Azure Key Vault.
Sample connection string format:
"jdbc:snowflake://<accountname>.snowflakecomputing.com/?user=<username>&db=<database>&warehouse=<warehouse>&role=<myRole>"
References:
Configuring the JDBC Driver — Snowflake Documentation
ODBC connection string to Snowflake for Access Pass Thru Query - Stack Overflow
azure-docs/connector-snowflake.md at main · MicrosoftDocs/azure-docs · GitHub
connect-snowflake-using-python-pyodbc-odbc-driver

Related

Parameterize pyodbc connect string using keyword arguments from my qodbc API?

Help me to understand if I can parameterize my connection string for a pyodbc connection to this qodbc interface for QuickBooks:
pyodbc.connect(r'DSN=qremote_dsn;IPAddress=192.168.0.50;Port=4500;RemoteDSN=login_dsn;OpenMode=F;OLE DB Services=-2;', autocommit=True)
I have several different DSNs, RemoteDSNs and servers which I'd like to loop over. Several SO posts (here and here) point to this code.google documentation suggesting I can use strings, keywords, or both with pyodbc's connect function.
pyodbc works with many different ODBC APIs, so how do I determine whether any of the pyodbc keywords will map to my required qodbc keywords?
My search for the qodbc keywords in the pyodbc documentation returns no results. Must I conclude f-strings are my only option?
Must I conclude f-strings are my only option?
Not at all.
pyodbc is built to deal with any ODBC driver, so it does not identify which keywords are "legal" and which ones aren't. As explained here, there are a few keywords that are specified by the DBAPI spec, and a few keywords reserved for pyodbc's internal use, and those "are not passed to the odbc driver", implying that other keywords are passed to the ODBC driver.
Example: When I use this connect call ...
cnxn = pyodbc.connect(
    driver="ODBC Driver 17 for SQL Server",
    server="192.168.0.179,49242",
    database="myDb",
    uid="sa", pwd="_whatever_",
    trusted_connection="no"
)
... the ODBC trace shows that this is the connection string passed to the driver
[ODBC][2404][1589493655.363466][SQLDriverConnectW.c][290]
Entry:
Connection = 0xf7d9c0
Window Hdl = (nil)
Str In = [driver=ODBC Driver 17 for SQL Server;server=192.168.0.179,49242;database=myDb;uid=sa;pwd=_whatever_;trusted_connection=no;][length = 122 (SQL_NTS)]
Note that trusted_connection is specific to SQL Server.
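Applying the same pass-through to the qodbc case, a hedged sketch (assuming qodbc accepts its keywords this way, which the trace above suggests but I have not verified; the DSN values are illustrative) could loop over the targets like this, keeping keywords with spaces such as "OLE DB Services" in the string portion:
import pyodbc

# Illustrative targets; replace with your real DSNs, addresses and RemoteDSNs
targets = [
    {"DSN": "qremote_dsn", "IPAddress": "192.168.0.50", "Port": "4500", "RemoteDSN": "login_dsn"},
    {"DSN": "qremote_dsn2", "IPAddress": "192.168.0.51", "Port": "4500", "RemoteDSN": "other_dsn"},
]

for t in targets:
    # Keywords pyodbc does not recognize are appended to the connection string
    cnxn = pyodbc.connect("OpenMode=F;OLE DB Services=-2;", autocommit=True, **t)
    # ... run queries against this connection ...
    cnxn.close()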

how to insert python logs in postgresql table?

I need to insert the logs from my test case into a table in a PostgreSQL database.
I was able to connect to the DB, but I can't figure out how to insert this line's result into the table. I have tried the below but it doesn't work:
import logging
import psycopg2
from io import StringIO
from config import config

params = config()
conn = psycopg2.connect(**params)
print(conn)
curr = conn.cursor()
try:
    if not hw.has_connection():
        logging.error('Failure: Unable to reach website! ==> ' + str(driver.find_element_by_xpath('//span[@jsselect="heading" and @jsvalues=".innerHTML:msg"]').text))
        return
    elif hw.is_websiteReachable():
        logging.info("Success: website is reachable!")
        curr.execute("""INSERT INTO db_name(logs) VALUES (%s)""", ("Success: website is reachable!"))
        conn.commit()
except:
    logging.error("Failure: Unable to reach website!")
    return
I am a total beginner at this. I have searched but couldn't find a clear example or guide about it. The above code throws the exception even though the website is reachable. Sorry if I sound dumb.
It looks like you're incorrectly constructing your SQL statement. Instead of INSERT INTO db_name(table_name) ... it should be INSERT INTO table_name(column_name) .... If you've correctly connected to the appropriate database in your connection settings, you usually don't have to specify the database name each time you write your SQL.
Therefore I would recommend the following modification (assuming your table is called logs and it has a column named message):
# ...
sql = 'INSERT INTO logs(message) VALUES (%s);'
msg = 'Success: website is reachable!'
curr.execute(sql, (msg,))
conn.commit()
You can read the psycopg2 docs here for more information as well, if that would help with passing named parameters to your SQL queries in Python.
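For instance, named parameters can be passed as a mapping (a small illustration using the same assumed logs table):
sql = "INSERT INTO logs (message) VALUES (%(msg)s);"
curr.execute(sql, {"msg": "Success: website is reachable!"})
conn.commit()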
You can check a good solution that I personally use in my server-side projects. You just need to give a connection string to the CRUD object and everything else is handled for you. For Postgres you can use:
'postgresql+psycopg2://username:password@host:port/database'
or
'postgresql+pg8000://username:password@host:port/database'
for more details check SQLAlchemy Engine Configuration.
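As a rough sketch of that approach (credentials are placeholders, and the logs/message table and column names are assumptions carried over from the earlier answer):
from sqlalchemy import create_engine, text

# Placeholder credentials; substitute your own user, password, host, port and database
engine = create_engine('postgresql+psycopg2://username:password@localhost:5432/mydb')

with engine.begin() as conn:
    # Parameterized insert into the assumed logs(message) table
    conn.execute(
        text("INSERT INTO logs (message) VALUES (:msg)"),
        {"msg": "Success: website is reachable!"},
    )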

pypyodbc error 'Associated statement is not prepared'

I am trying to create an 'upsert' function for pypyodbc SQL Server. I have validated that the query built up will run in SSMS with the desired outcome, but when trying to execute and commit with pypyodbc I receive the following error: 'HY007', '[HY007] [Microsoft][ODBC SQL Server Driver]Associated statement is not prepared'.
Here is the upsert function:
def sql_upsert(sql_connection, table, key_field, key_value, **kwargs):
    keys = ["{key}".format(key=k) for k in kwargs]
    values = ["'{value}'".format(value=v) for v in kwargs.values()]
    update_columns = ["{key} = '{value}'".format(key=k, value=v) for k, v in kwargs.items()]
    sql = list()
    # update
    sql.append("UPDATE {table} SET ".format(table=table))
    sql.append(", ".join(update_columns))
    sql.append(" WHERE {} = '{}'".format(key_field, key_value))
    sql.append(" IF @@ROWCOUNT=0 BEGIN ")
    # insert
    sql.append("INSERT INTO {table} (".format(table=table))
    sql.append(", ".join(keys))
    sql.append(") VALUES (")
    sql.append(", ".join(values))
    sql.append(")")
    sql.append(" END")
    query = "".join(sql)
    print(query)
The function builds up a query string in a format based on this other thread How to insert or update using single query?
Here is an example of the output:
UPDATE test SET name='john' WHERE id=3012
IF @@ROWCOUNT=0 BEGIN
INSERT INTO test(name) VALUES('john')
END
The error message you cited is produced by the ancient "SQL Server" ODBC driver that ships as part of Windows. A more up-to-date driver version like "ODBC Driver 17 for SQL Server" should produce a meaningful error message.
If you look here or here you'll see people complaining about this over a decade ago.
Apparently SQL Server's ODBC driver returns that error when you're executing two statements that fail due to a field value being too long, or perhaps due to foreign key violations.
Use SSMS to see which statement causes this problem, or better still, stop using ODBC and use pymssql.
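As a hedged sketch (server, database and credentials are placeholders; the test table with id/name columns is taken from the example output above), the same UPDATE/INSERT pattern can be run through the newer driver with parameter markers instead of string formatting, which usually surfaces the underlying error:
import pypyodbc

# Placeholder connection details
cnxn = pypyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myserver;Database=mydb;Uid=myuser;Pwd=mypassword;"
)
crsr = cnxn.cursor()
sql = ("UPDATE test SET name = ? WHERE id = ? "
       "IF @@ROWCOUNT = 0 BEGIN INSERT INTO test (name) VALUES (?) END")
crsr.execute(sql, ("john", 3012, "john"))
cnxn.commit()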
This error may also appear when you haven't granted the correct permissions on the stored procedure.
Go to SQL Server, right-click your stored procedure, then Properties > Permissions.
Add the required users and roles that are going to execute the stored procedure.
This may help resolve the issue.

Pushing dataframe to postgres using sqlalchemy and psycopg2

I am trying to write dataframes to Postgres. The DBAPI used for that is psycopg2.
localconn = 'postgresql+psycopg2://postgres/PSWD@localhost:5432/localPG'
localEng = engine.create_engine(localconn)
df.to_sql("DUMMY", localEng)
But it's throwing the error: (psycopg2.OperationalError) could not translate host name postgres to address: Name or service not known
localPG is the database name.
Where am I going wrong?
The format you have written is wrong; use the following:
localEng = create_engine('postgresql+psycopg2://[user]:[pass]@[host]:[port]/[schema]', echo=False)
and of course, you should replace each parameter in brackets with the corresponding database credentials.
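Putting it together for the original example (using the values from the question: user postgres, password PSWD, host localhost, port 5432, database localPG), a minimal sketch would be:
import pandas as pd
from sqlalchemy import create_engine

localEng = create_engine('postgresql+psycopg2://postgres:PSWD@localhost:5432/localPG', echo=False)
df = pd.DataFrame({'col1': [1, 2, 3]})  # example dataframe
df.to_sql("DUMMY", localEng, if_exists='replace', index=False)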

SQLObject throws: Unknown database 'dbname?charset=utf8'

I've got a third-party Python script; it looks like it has to connect to a MySQL database by means of the SQLObject package.
Even though I've provided a correct DSN, the script throws:
sqlobject.dberrors.OperationalError: Unknown database 'dbname?charset=utf8'
I've traced the problem to this piece of code
ar['charset'] = 'utf8'
conn = connectionForURI(uri, **ar)
which calls this function.
And it connects fine when ar['charset'] = 'utf8' is commented out, so that no query string is provided.
I have this issue on Windows,
MySQL 5.5.25
Python 2.7.2
MySQL-python 1.2.5
SQLObject 3.0.0a1dev-20150327
What exactly is going on there, and how is it supposed to be fixed? Does the problem lie in the dependencies or in the script itself?
I have done some research and found out that recent versions of SQLObject use the following code to extract connection parameters from the URI. Unfortunately, the urlparse function works in such a way that the path used as the DB name is parsed together with the query string.
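A minimal illustration of the effect (this is not SQLObject's actual code, just a demonstration of how splitting the URI without removing the query string glues it onto the DB name):
uri = 'mysql://user:pass@localhost/dbname?charset=utf8'
db_name = uri.rsplit('/', 1)[-1]  # query string not separated out
print(db_name)  # 'dbname?charset=utf8' -- the value in the error message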
As a workaround for this issue, I could suggest passing the DB encoding parameter explicitly to the connection object, as follows:
conn = connectionForURI(uri)
conn.dbEncoding = 'utf-8'
It might help, but it's worth making a pull request to fix the DB name extraction from the URI.
UPD: Older versions like 2.x use different code to parse the connection URL, which works well.
Comma, not question mark:
db = MySQLdb.connect(host=DB_HOST, user=DB_USER, passwd=DB_PASS,
                     db=DB_NAME, charset="utf8", use_unicode=True)
If you can't get past the 3rd party software, abandon it.
