The following works:
import pyodbc
pyodbc.connect('DRIVER={FreeTDS};Server=my.db.server;Database=mydb;UID=myuser;PWD=mypwd;TDS_Version=8.0;Port=1433;')
The following fails:
import sqlalchemy
sqlalchemy.create_engine("mssql://myuser:mypwd#my.db.server:1433/mydb?driver=FreeTDS& odbc_options='TDS_Version=8.0'").connect()
The error message for above is:
DBAPIError: (Error) ('08001', '[08001] [unixODBC][FreeTDS][SQL Server]Unable to connect to data source (0) (SQLDriverConnectW)') None None
Can someone please point me in the right direction? Is there a way I can simply tell sqlalchemy to pass a specific connect string through to pyodbc?
Please Note: I want to keep this DSN-less.
The example by @Singletoned would not work for me with SQLAlchemy 0.7.2. From the SQLAlchemy docs for connecting to SQL Server:
If you require a connection string that is outside the options presented above, use the odbc_connect keyword to pass in a urlencoded connection string. What gets passed in will be urldecoded and passed directly.
So to make it work I used:
import urllib
quoted = urllib.quote_plus('DRIVER={FreeTDS};Server=my.db.server;Database=mydb;UID=myuser;PWD=mypwd;TDS_Version=8.0;Port=1433;')
sqlalchemy.create_engine('mssql+pyodbc:///?odbc_connect={}'.format(quoted))
This should apply to Sybase as well.
NOTE: In Python 3 the urllib module has been split into parts and renamed. Thus, this line in Python 2.7:
quoted = urllib.quote_plus
has to be changed to this line in Python 3:
quoted = urllib.parse.quote_plus
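Putting it together, a minimal Python 3 version of the snippet above might look like this (same placeholder server details as before):

import urllib.parse
import sqlalchemy

# URL-encode the raw ODBC connection string (quote_plus now lives in urllib.parse)
quoted = urllib.parse.quote_plus(
    'DRIVER={FreeTDS};Server=my.db.server;Database=mydb;'
    'UID=myuser;PWD=mypwd;TDS_Version=8.0;Port=1433;'
)
engine = sqlalchemy.create_engine('mssql+pyodbc:///?odbc_connect={}'.format(quoted))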
I'm still interested in a way to do this in one line within the sqlalchemy create_engine statement, but I found the following workaround detailed here:
import pyodbc, sqlalchemy

def connect():
    # the creator callable must return a raw DBAPI connection
    return pyodbc.connect('DRIVER={FreeTDS};Server=my.db.server;Database=mydb;UID=myuser;PWD=mypwd;TDS_Version=8.0;Port=1433;')

engine = sqlalchemy.create_engine('mssql://', creator=connect)
UPDATE: Addresses a concern I raised in my own comment about not being able to pass arguments to the connect string. The following is a general solution if you need to dynamically connect to different databases at runtime. I only pass the database name as a parameter, but additional parameters could easily be used as needed:
import os

import pyodbc
import sqlalchemy


class Creator:
    def __init__(self, db_name='MyDB'):
        """Initialization procedure to receive the database name"""
        self.db_name = db_name

    def __call__(self):
        """Defines a custom creator to be passed to sqlalchemy.create_engine
        http://stackoverflow.com/questions/111234/what-is-a-callable-in-python#111255"""
        if os.name == 'posix':
            return pyodbc.connect('DRIVER={FreeTDS};'
                                  'Server=my.db.server;'
                                  'Database=%s;'
                                  'UID=myuser;'
                                  'PWD=mypassword;'
                                  'TDS_Version=8.0;'
                                  'Port=1433;' % self.db_name)
        elif os.name == 'nt':
            # use development environment
            return pyodbc.connect('DRIVER={SQL Server};'
                                  'Server=127.0.0.1;'
                                  'Database=%s_Dev;'
                                  'UID=user;'
                                  'PWD=;'
                                  'Trusted_Connection=Yes;'
                                  'Port=1433;' % self.db_name)


def en(db_name):
    """Returns a sqlalchemy engine"""
    return sqlalchemy.create_engine('mssql://', creator=Creator(db_name))
This works:
import sqlalchemy
sqlalchemy.create_engine("DRIVER={FreeTDS};Server=my.db.server;Database=mydb;UID=myuser;PWD=mypwd;TDS_Version=8.0;Port=1433;").connect()
In that format, SQLAlchemy just ignores the connection string and passes it straight on to pyodbc.
Update:
Sorry, I forgot that the uri has to be url-encoded, therefore, the following works:
import sqlalchemy
sqlalchemy.create_engine("DRIVER%3D%7BFreeTDS%7D%3BServer%3Dmy.db.server%3BDatabase%3Dmydb%3BUID%3Dmyuser%3BPWD%3Dmypwd%3BTDS_Version%3D8.0%3BPort%3D1433%3B").connect()
Internally "my.db.server:1433" is passed as part of a connection string like SERVER=my.db.server:1433;.
Unfortunately unixODBC/FreeTDS won't accept a port in the SERVER bit. Instead it wants SERVER=my.db.server;PORT=1433;
To use the sqlalchemy syntax for a connection string, you must specify the port as a parameter.
sqlalchemy.create_engine("mssql://myuser:mypwd#my.db.server:1433/mydb?driver=FreeTDS& odbc_options='TDS_Version=8.0'").connect()
becomes:
sqlalchemy.create_engine("mssql://myuser:mypwd#my.db.server/mydb?driver=FreeTDS&port=1433& odbc_options='TDS_Version=8.0'").connect()
To pass various parameters to your connect function, it sounds like a format string would do what you want:
def connect(server, dbname, user, passwd):
    # note: 'pass' is a reserved word in Python, so the parameter is named 'passwd'
    return pyodbc.connect('DRIVER={FreeTDS};Server=%s;Database=%s;UID=%s;PWD=%s;TDS_Version=8.0;Port=1433;'
                          % (server, dbname, user, passwd))
And you would then call it with something like:
connect('myserver', 'mydatabase', 'myuser', 'mypass')
More info on format strings is here: http://docs.python.org/library/string.html#formatstrings
Related
I'm using the psycopg2 module to connect to a PostgreSQL database with Python, and I'm able to execute all queries using the connection method below. Now I want to specify a different schema than public to execute my SQL statements. Is there any way to specify the schema name in the connection method?
conn = psycopg2.connect(host="localhost",
                        port="5432",
                        user="postgres",
                        password="password",
                        database="database",
                        )
I tried to specify the schema directly inside the method:
schema="schema2"
But I am getting the following programming error.
ProgrammingError: invalid dsn: invalid connection option "schema"
When we were working with psycopg2's ThreadedConnectionPool to create a connection pool, this is how we did it:
from psycopg2.pool import ThreadedConnectionPool

db_conn = ThreadedConnectionPool(
    minconn=1, maxconn=5,
    user="postgres", password="password", database="dbname", host="localhost", port=5432,
    options="-c search_path=dbo,public"
)
Notice the options key among the parameters; that's how we did it.
When you execute a query using a cursor from that connection, it will search across the schemas listed in options (i.e. dbo, public) in sequence from left to right.
You may try something like this:
psycopg2.connect(host="localhost",
port="5432",
user="postgres",
password="password",
database="database",
options="-c search_path=dbo,public")
Hope this might help you.
If you are using the string form you need to URL escape the options argument:
postgresql://localhost/airflow?options=-csearch_path%3Ddbo,public
(%3D = URL encoding of =)
This helps if you are using SQLAlchemy for example.
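For instance, a minimal SQLAlchemy sketch (the host, credentials, and the airflow database name are placeholders taken from the URL above); the same option can also be passed unescaped via connect_args:

from sqlalchemy import create_engine

# options value URL-escaped inside the URL itself (%3D is the encoding of '=')
engine = create_engine(
    "postgresql+psycopg2://postgres:password@localhost/airflow"
    "?options=-csearch_path%3Ddbo,public"
)

# equivalent, without manual escaping, using connect_args
engine = create_engine(
    "postgresql+psycopg2://postgres:password@localhost/airflow",
    connect_args={"options": "-csearch_path=dbo,public"},
)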
I'm trying to unit test a class that connects to a database. To avoid hardcoding a database, I'd like to assert that the mysql.connector.connect method is called with adequate values.
from mysql.connector import connect
from mysql.connector import Error

from discovery.database import Database


class MariaDatabase(Database):

    def connect(self, username, password):
        """
        Don't forget to close the connection !
        :return: connection to the database
        """
        try:
            return connect(host=str(self.target),
                           database=self.db_name,
                           user=username,
                           password=password)
I've read documentation around mocks and similar problems (notably this one: Python Unit Test: How to unit test the module which contains database operations?), which I thought would solve my problem, but mysql.connector.connect keeps being called instead of the mock.
I don't know what I could do to unit test this class.
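One way to approach this is to patch connect where MariaDatabase imports it and then assert on the call arguments. Below is a minimal sketch, assuming the class lives in a hypothetical module discovery.maria_database and that the inherited constructor accepts target and db_name (both are assumptions, since the real module path and base-class signature aren't shown):

from unittest import TestCase, main
from unittest.mock import patch

from discovery.maria_database import MariaDatabase  # hypothetical module path


class TestMariaDatabase(TestCase):

    # patch the name used inside the module under test, not mysql.connector.connect
    @patch("discovery.maria_database.connect")
    def test_connect_passes_credentials(self, mock_connect):
        db = MariaDatabase(target="db.example.com", db_name="mydb")  # hypothetical constructor arguments
        db.connect("myuser", "secret")
        mock_connect.assert_called_once_with(host="db.example.com",
                                             database="mydb",
                                             user="myuser",
                                             password="secret")


if __name__ == "__main__":
    main()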
sqlalchemy, a db connection module for Python, uses SQL Authentication (database-defined user accounts) by default. If you want to use your Windows (domain or local) credentials to authenticate to the SQL Server, the connection string must be changed.
By default, as defined by sqlalchemy, the connection string to connect to the SQL Server is as follows:
sqlalchemy.create_engine('mssql://*username*:*password*@*server_name*/*database_name*')
This, if used with your Windows credentials, would throw an error similar to this:
sqlalchemy.exc.DBAPIError: (Error) ('28000', "[28000] [Microsoft][ODBC SQL Server Driver][SQL Server]Login failed for user '***S\\username'. (18456) (SQLDriverConnect); [28000] [Microsoft][ODBC SQL Server Driver][SQL Server]Login failed for user '***S\\username'. (18456)") None None
In this error message, the code 18456 identifies the error message thrown by the SQL Server itself. This error signifies that the credentials are incorrect.
In order to use Windows Authentication with sqlalchemy and mssql, the following connection string is required:
ODBC Driver:
engine = sqlalchemy.create_engine('mssql://*server_name*/*database_name*?trusted_connection=yes')
SQL Express Instance:
engine = sqlalchemy.create_engine('mssql://*server_name*\\SQLEXPRESS/*database_name*?trusted_connection=yes')
If you're using a trusted connection/AD and not using username/password, or otherwise see the following:
SAWarning: No driver name specified; this is expected by PyODBC when using DSN-less connections
"No driver name specified; "
Then this method should work:
from sqlalchemy import create_engine
server = <your_server_name>
database = <your_database_name>
engine = create_engine('mssql+pyodbc://' + server + '/' + database + '?trusted_connection=yes&driver=ODBC+Driver+13+for+SQL+Server')
A more recent answer, if you want to connect to the MSSQL DB as a different user than the one you're logged in with on Windows. It works as well if you are connecting from a Linux machine with FreeTDS installed.
The following worked for me from both Windows 10 and Ubuntu 18.04 using Python 3.6 & 3.7:
import getpass
from sqlalchemy import create_engine
password = getpass.getpass()
eng_str = fr'mssql+pymssql://{domain}\{username}:{password}@{hostip}/{db}'
engine = create_engine(eng_str)
What changed was to add the Windows domain before \username.
You'll need to install the pymssql package.
Create Your SqlAlchemy Connection URL From Your pyodbc Connection String OR Your Known Connection Parameters
I found all the other answers to be educational, and I found the SqlAlchemy Docs on connection strings helpful too, but I kept failing to connect to MS SQL Server Express 19 where I was using no username or password and trusted_connection='yes' (just doing development at this point).
Then I found THIS method in the SqlAlchemy Docs on Connection URLs built from a pyodbc connection string (or just a connection string), which is also built from known connection parameters (i.e. this can simply be thought of as a connection string that is not necessarily used in pyodbc). Since I knew my pyodbc connection string was working, this seemed like it would work for me, and it did!
This method takes the guesswork out of creating the correct format for what you feed to the SqlAlchemy create_engine method. If you know your connection parameters, you put those into a simple string per the documentation exemplified by the code below, and the create method in the URL class of the sqlalchemy.engine module does the correct formatting for you.
The example code below runs as is and assumes a database named master and an existing table named table_one with the schema shown below. Also, I am using pandas to import my table data. Otherwise, we'd want to use a context manager to manage connecting to the database and then closing the connection like HERE in the SqlAlchemy docs.
import pandas as pd
import sqlalchemy
from sqlalchemy.engine import URL

# table_one dictionary:
table_one = {'name': 'table_one',
             'columns': ['ident int IDENTITY(1,1) PRIMARY KEY',
                         'value_1 int NOT NULL',
                         'value_2 int NOT NULL']}

# pyodbc stuff for MS SQL Server Express
driver = '{SQL Server}'
server = r'localhost\SQLEXPRESS'
database = 'master'
trusted_connection = 'yes'

# pyodbc connection string
connection_string = f'DRIVER={driver};SERVER={server};'
connection_string += f'DATABASE={database};'
connection_string += f'TRUSTED_CONNECTION={trusted_connection}'

# create sqlalchemy engine connection URL
connection_url = URL.create(
    "mssql+pyodbc", query={"odbc_connect": connection_string})

""" more code not shown that uses pyodbc without sqlalchemy """

engine = sqlalchemy.create_engine(connection_url)

d = {'value_1': [1, 2], 'value_2': [3, 4]}
df = pd.DataFrame(data=d)
df.to_sql('table_one', engine, if_exists="append", index=False)
Update
Let's say you've installed SQL Server Express on your Linux machine. You can use the following commands to make sure you're using the correct strings for the following:
For the driver: odbcinst -q -d
For the server: sqlcmd -S localhost -U <username> -P <password> -Q 'select @@SERVERNAME'
pyodbc
I think that you need to put "+pyodbc" after mssql.
try this:
from sqlalchemy import create_engine
engine = create_engine("mssql+pyodbc://user:password#host:port/databasename?driver=ODBC+Driver+17+for+SQL+Server")
cnxn = engine.connect()
It works for me. Good luck!
If you are attempting to connect:
DSN-less
Windows Authentication for a server not locally hosted.
Without using ODBC connections.
Try the following:
import sqlalchemy
engine = sqlalchemy.create_engine('mssql+pyodbc://' + server + '/' + database + '?trusted_connection=yes&driver=SQL+Server')
This avoids using ODBC connections and thus avoids pyodbc interface errors from DBAPI2 vs DBAPI3 conflicts.
I would recommend using the URL creation tool instead of creating the url from scratch.
connection_url = sqlalchemy.engine.URL.create("mssql+pyodbc", database=databasename, host=servername, query={'driver': 'SQL Server'})
engine = sqlalchemy.create_engine(connection_url)
See this link for creating a connection string with SQL Server Authentication (non-domain, uses username and password)
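For illustration, here is a hedged sketch of that same URL.create approach with SQL Server Authentication (the server, database, credentials, and driver name below are all placeholders and depend on your environment):

import sqlalchemy

connection_url = sqlalchemy.engine.URL.create(
    "mssql+pyodbc",
    username="myuser",    # placeholder
    password="mypwd",     # placeholder
    host="my.db.server",  # placeholder
    port=1433,
    database="mydb",      # placeholder
    query={"driver": "ODBC Driver 17 for SQL Server"},
)
engine = sqlalchemy.create_engine(connection_url)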
I'm guessing this is a pretty basic question, but I can't figure out why:
import psycopg2
psycopg2.connect("postgresql://postgres:postgres#localhost/postgres")
Is giving the following error:
psycopg2.OperationalError: missing "=" after
"postgresql://postgres:postgres@localhost/postgres" in connection info string
Any idea? According to the docs about connection strings I believe it should work, however it only works like this:
psycopg2.connect("host=localhost user=postgres password=postgres dbname=postgres")
I'm using the latest psycopg2 version with Python 2.7.3 on Ubuntu 12.04.
I would use the urlparse module to parse the url and then use the result in the connection method. This way it's possible to overcome the psycopg2 problem.
from urlparse import urlparse  # for python 3+ use: from urllib.parse import urlparse

result = urlparse("postgresql://postgres:postgres@localhost/postgres")
username = result.username
password = result.password
database = result.path[1:]
hostname = result.hostname
port = result.port
connection = psycopg2.connect(
    database=database,
    user=username,
    password=password,
    host=hostname,
    port=port
)
The connection string passed to psycopg2.connect is not parsed by psycopg2: it is passed verbatim to libpq. Support for connection URIs was added in PostgreSQL 9.2.
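As a quick sanity check, a minimal sketch (assuming a psycopg2 build recent enough to expose the attribute) is to inspect the libpq version psycopg2 was compiled against; URI support requires libpq 9.2 or newer:

import psycopg2

# e.g. 90204 means libpq 9.2.4; values below 90200 predate connection-URI support
print(psycopg2.__libpq_version__)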
To update on this, Psycopg3 does actually include a way to parse a database connection URI.
Example:
import psycopg # must be psycopg 3
pg_uri = "postgres://jeff:hunter2#example.com/db"
conn_dict = psycopg.conninfo.conninfo_to_dict(pg_uri)
with psycopg.connect(**conn_dict) as conn:
    ...
Another option is using SQLAlchemy for this. It's not just an ORM; it consists of two distinct components, Core and ORM, and it can be used completely without the ORM layer.
SQLAlchemy provides such functionality out of the box via the create_engine function. Moreover, through the URI you can specify the DBAPI driver and various PostgreSQL settings.
Some examples:
# default
engine = create_engine("postgresql://user:pass@localhost/mydatabase")

# psycopg2
engine = create_engine("postgresql+psycopg2://user:pass@localhost/mydatabase")

# pg8000
engine = create_engine("postgresql+pg8000://user:pass@localhost/mydatabase")

# psycopg3 (available only in SQLAlchemy 2.0, which is currently in beta)
engine = create_engine("postgresql+psycopg://user:pass@localhost/test")
And here is a fully working example:
import sqlalchemy as sa

# set connection URI here ↓
engine = sa.create_engine("postgresql://user:password@db_host/db_name")

ddl_script = sa.DDL("""
    CREATE TABLE IF NOT EXISTS demo_table (
        id serial PRIMARY KEY,
        data TEXT NOT NULL
    );
""")

with engine.begin() as conn:
    # do DDL and insert data in a transaction
    conn.execute(ddl_script)
    conn.exec_driver_sql("INSERT INTO demo_table (data) VALUES (%s)",
                         [("test1",), ("test2",)])
    conn.execute(sa.text("INSERT INTO demo_table (data) VALUES (:data)"),
                 [{"data": "test3"}, {"data": "test4"}])

with engine.connect() as conn:
    cur = conn.exec_driver_sql("SELECT * FROM demo_table LIMIT 2")
    for name in cur.fetchall():
        print(name)

# you also can obtain raw DBAPI connection
rconn = engine.raw_connection()
SQLAlchemy provides many other benefits:
You can easily switch DBAPI implementations just by changing URI (psycopg2, psycopg2cffi, etc), or maybe even databases.
It implements connection pooling out of the box (both psycopg2 and psycopg3 have connection pooling, but their APIs are different).
asyncio support via create_async_engine (psycopg3 also supports asyncio).
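For example, a minimal asyncio sketch (requires SQLAlchemy 1.4+ and an async-capable driver such as asyncpg; the connection details are placeholders):

import asyncio

import sqlalchemy as sa
from sqlalchemy.ext.asyncio import create_async_engine


async def main():
    # placeholder credentials; the asyncpg driver must be installed
    engine = create_async_engine("postgresql+asyncpg://user:password@db_host/db_name")
    async with engine.connect() as conn:
        result = await conn.execute(sa.text("SELECT 1"))
        print(result.scalar())
    await engine.dispose()


asyncio.run(main())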
I have a seemingly straightforward situation, but can't find a straightforward solution.
I'm using sqlalchemy to query postgres. If a client timeout occurs, I'd like to stop/cancel the long running postgres queries from another thread. The thread has access to the Session or Connection object.
At this point I've tried:
session.bind.raw_connection().close()
and
session.connection().close()
and
session.close
and
session.transaction.close()
But no matter what I try, the postgres query still continues until its end. I know this from watching pg in top. Shouldn't this be fairly easy to do? Am I missing something? Is this impossible without getting the pid and sending a stop signal directly?
This seems to work well, so far:
def test_close_connection(self):
    import threading
    from psycopg2.extensions import QueryCanceledError
    from sqlalchemy.exc import DBAPIError

    session = Session()
    conn = session.connection()
    sql = self.get_raw_sql_for_long_query()
    seconds = 5
    # cancel() on the underlying DBAPI connection aborts the running query
    t = threading.Timer(seconds, conn.connection.cancel)
    t.start()
    try:
        conn.execute(sql)
    except DBAPIError as e:
        if type(e.orig) == QueryCanceledError:
            print('Long running query was cancelled.')
    t.cancel()
source
For those MySQL folks that may have ended up here, a modified version of this answer that kills the query from a second connection can work. Essentially the following, assuming pymysql under the hood:
thread_id = conn1.connection.thread_id()
t = threading.Timer(seconds, lambda: conn2.execute("kill {}".format(thread_id)))
The original connection will raise pymysql.err.OperationalError. See this other answer for a neat way to create a long running query for testing.
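Put together, a rough sketch of that two-connection pattern might look like the following (the engine URL, the 5-second delay, and the SELECT SLEEP(60) stand-in query are all placeholders; it assumes pymysql as the driver, as noted above):

import threading

from sqlalchemy import create_engine, text

engine = create_engine("mysql+pymysql://user:password@db_host/db_name")  # placeholder URL

conn1 = engine.connect()  # runs the long query
conn2 = engine.connect()  # used only to kill it

# pymysql exposes the server-side thread id of the first connection
thread_id = conn1.connection.thread_id()

# after 5 seconds, issue KILL from the second connection
t = threading.Timer(5, lambda: conn2.execute(text(f"KILL {thread_id}")))
t.start()

try:
    conn1.execute(text("SELECT SLEEP(60)"))  # stand-in for a long-running query
except Exception as exc:  # pymysql raises an OperationalError when the query is killed
    print("query was killed:", exc)
finally:
    t.cancel()
    conn1.close()
    conn2.close()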
I found that on MySQL you can specify query optimizer hints.
One such hint is MAX_EXECUTION_TIME, which specifies how long a query may execute before termination.
You can add this in your app.py
from sqlalchemy import event
from sqlalchemy.sql.selectable import Select


@event.listens_for(engine, 'before_execute', retval=True)
def intercept(conn, clauseelement, multiparams, params):
    # check if it's a select statement
    if isinstance(clauseelement, Select):
        # 'froms' represents the list of tables that the statement is querying
        table = clauseelement.froms[0]
        # update the timeout here in ms (1s = 1000ms)
        timeout_ms = 4000
        # prepend the MAX_EXECUTION_TIME optimizer hint to the statement
        clauseelement = clauseelement.prefix_with(f"/*+ MAX_EXECUTION_TIME({timeout_ms}) */", dialect="mysql")
    return clauseelement, multiparams, params
See also: SQLAlchemy query API not working correctly with hints, and the MySQL reference.