pyodbc with MultiSubnetFailover - python

Recently, one of our servers was migrated from a pylon server to a 3-node cluster. The connection string below is what I used previously via Python and pyodbc, and I never had any issues.
server = 'test_server'
database = 'test_db'
cnxn = 'DRIVER={SQL Server};SERVER='+server+';DATABASE='+database+';Trusted_Connection=yes'
With the new server I started receiving timeout errors, so I thought I had to add MultiSubnetFailover to the connection string, such as the following:
server = 'test_server'
database = 'test_db'
cnxn = 'DRIVER={SQL Server};SERVER='+server+';DATABASE='+database+';Trusted_Connection=yes;MultiSubnetFailover=True'
However, I am still receiving a timeout error, as well as the additional error seen below:
[Microsoft][ODBC SQL Server Driver]Login timeout expired (0) (SQLDriverConnect); [HYT00] [Microsoft][ODBC SQL Server Driver]Invalid connection string attribute (0)
Does pyodbc support MultiSubnetFailover? I couldn't find documentation one way or another.
If so, how do I implement it? On the other hand, if it does not, how would I go about connecting?
Lastly, should I use the IP address instead?

The ancient SQL Server ODBC driver that ships with Windows doesn't support MultiSubnetFailover. I suggest you move to a modern driver, or have your DBA set RegisterAllProvidersIP to zero to support down-level clients.
In the interim, you could specify the current listener IP address or the host name of the current primary node. However, that will fail if the primary fails over to a secondary node on a different subnet.
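For reference, here is a minimal sketch of the same connection built against a modern driver, assuming Microsoft's "ODBC Driver 17 for SQL Server" is installed (the server and database names are the question's placeholders). Note that the ODBC keyword value is Yes rather than True:

```python
# Connection string for a modern ODBC driver that understands
# MultiSubnetFailover. Server and database names are placeholders.
server = 'test_server'
database = 'test_db'
conn_str = (
    'DRIVER={ODBC Driver 17 for SQL Server};'
    f'SERVER={server};'
    f'DATABASE={database};'
    'Trusted_Connection=yes;'
    'MultiSubnetFailover=Yes;'
)
# cnxn = pyodbc.connect(conn_str, timeout=10)  # uncomment against a live server
```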


Oracle 19c connection with Python: listener refused connection

I'm using the oracledb library, since cx_Oracle no longer works for me. I'm calling oracledb.connect(), and it always gives an error. Here is my code:
connection = oracledb.connect(
    user='myusername',
    password='mypassword',
    dsn='xx.xx.xxx.xxx:portnumber/dsnname')
print("Successfully connected to Oracle Database")
oracledb.exceptions.OperationalError: DPY-6000: cannot connect to database. Listener refused connection. (Similar to ORA-12660)
and if I set the parameters like this:
connection = oracledb.connect(
    user='myusername',
    password='mypassword',
    dsn='xx.xx.xxx.xxx:portnumber:dsnname')
print("Successfully connected to Oracle Database")
it returns the error:
oracledb.exceptions.DatabaseError: DPY-4027: no configuration directory to search for tnsnames.ora
The database administrator confirmed the values are correct, and we are using the thin client, which is the default in the code parameters, so I don't know what is causing the problem.
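As a side note on the second error (DPY-4027): python-oracledb's Easy Connect syntax expects a slash before the service name. With a colon there instead, the driver treats the string as a net service name and goes looking for a tnsnames.ora file, which produces exactly that error. A minimal sketch (host, port, and service name here are hypothetical placeholders):

```python
# Easy Connect DSN format: host:port/service_name
# (a slash, not a colon, separates the port from the service name)
host, port, service_name = "db.example.com", 1521, "dsnname"  # hypothetical values
dsn = f"{host}:{port}/{service_name}"
# connection = oracledb.connect(user="myusername", password="mypassword", dsn=dsn)
```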
The error (ORA-12660) indicates that you have encryption or checksumming parameters set on the database. These are set up in the server-side sqlnet.ora and look something like this:
SQLNET.ENCRYPTION_SERVER=REQUIRED
SQLNET.CRYPTO_CHECKSUM_SERVER=REQUIRED
SQLNET.ENCRYPTION_TYPES_SERVER=(AES256,AES192,AES128)
SQLNET.CRYPTO_CHECKSUM_TYPES_SERVER=(SHA1)
SQLNET.ENCRYPTION_CLIENT=REQUIRED
SQLNET.CRYPTO_CHECKSUM_CLIENT=REQUIRED
SQLNET.ENCRYPTION_TYPES_CLIENT=(AES256,AES192,AES128)
SQLNET.CRYPTO_CHECKSUM_TYPES_CLIENT=(SHA1)
This has been noted in the documentation. Your only option is either to disable the server requirement for native network encryption (NNE) or to enable thick mode (which works the same way as cx_Oracle).
You can follow along with this enhancement request to see when NNE support is added to thin mode.

Databricks JDBC connection works but query fails

I am trying to build a connection to a SQL Server database through Azure Databricks. While a number of questions attempt to address similar issues, I have yet to find a solution for mine.
I am able to connect, or so it seems, because the Spark DataFrame object reads the exact schema from my database table; yet attempting to look at the data (display(df) or df.show()) throws a connection error.
Here is how I'm connecting:
jdbcUrl = "jdbc:sqlserver://{}:{};database={}".format(jdbcHostname, jdbcPort, jdbcDatabase)
connectionProperties = {
    'user': jdbcUsername,
    'password': jdbcPassword,
    'driver': 'com.microsoft.sqlserver.jdbc.SQLServerDriver'
}
pushdown_query = "(select * from persons where personid = 3040) Person"
df = spark.read.jdbc(url=jdbcUrl, table=pushdown_query, properties=connectionProperties)
display(df)
I can see the df object, and it correctly identifies all 47 fields in the table (see the photo below), but display(df) throws the following error:
SQLServerException: The TCP/IP connection to the host hornets-sql.westus.cloudapp.azure.com, port 1433 has failed. Error: "connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".
By default, SQL Server has a firewall enabled and doesn't allow access from arbitrary IPs. In your case the driver node may be allowed through the firewall (which is why you get the table schema), but the reading itself happens on the worker nodes, which may not be allowed through the firewall, so the read fails.
You may try to solve it by:
adding the public IPs of all worker nodes to the SQL Server firewall,
or configuring a private link/endpoint for Azure SQL into the VNet of your Databricks workspace,
or using service endpoints for Azure services.
The last two items are well covered in the Databricks blog post.

Extract data from client database

I have to send multiple rows of data through Python from a DB located on the client to the main DB on the server. What's the best solution for this? I currently have my web server up and functioning, and I can fill my DB locally, but I don't really know how to do it remotely. I'm using Python on my hardware. Here is what I have so far on the client:
import mysql.connector

cnx = mysql.connector.connect(user='user', password='pass', host='url?', database='db')  # I'm able to connect with this
cursor = cnx.cursor()
query = ("INSERT INTO IGNORE table " "(id,date,son) " "VALUES( (%s,%s,%s)")
for row in data:  # I've already extracted data from the other DB
    cursor.execute(query, (row[0], row[1], row[2]))
which yields an error:
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'IGNORE table(id,date,son) VALUES (number, '2017-11-09 14:33:15', 18.987)' at line 1
One possibility: make sure your mysqld service is binding to your external IP address (set in your MySQL config file on the server). By default I believe it binds to 127.0.0.1 (localhost). If it is binding only to localhost, your DB will never respond to external requests.
I fixed the issue with the MySQL port in the following way: I opened the port internally from my VM instead of from my Virtual Network (I'm using the Google Cloud Platform).
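Separate from the networking points above, the syntax error quoted in the question comes from the INSERT statement itself: in MySQL the keyword order is INSERT IGNORE INTO, and the VALUES clause has a stray opening parenthesis. A corrected sketch (the table name my_table is a placeholder; column names are taken from the question):

```python
# Corrected statement: IGNORE precedes INTO, and VALUES takes a single
# parenthesized tuple of placeholders.
query = "INSERT IGNORE INTO my_table (id, date, son) VALUES (%s, %s, %s)"
# for row in data:
#     cursor.execute(query, (row[0], row[1], row[2]))
# cnx.commit()
```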

Where does pyodbc get its user and pwd from when none are provided in the connection string

I inherited a project and am having what seems to be a permissions issue when trying to interact with the database. Basically, we have a two-step process: detach and then delete.
Does anyone know where the user would come from if the connection string only has the driver, server, and database name?
EDIT
I am on Windows Server 2008 standard
EDIT
"DRIVER={%s};SERVER=%s;DATABASE=%s;" Where driver is "SQL Server"
I just did a few tests and the {SQL Server} ODBC driver apparently defaults to using Windows Authentication if the Trusted_connection and UID options are both omitted from the connection string. So, your Python script must be connecting to the SQL Server instance using the Windows credentials of the user running the script.
(On the other hand, the {SQL Server Native Client 10.0} driver seems to default to SQL Authentication unless Trusted_connection=yes is included in the connection string.)
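Given those differing defaults, it's safer not to rely on driver behavior and to state the authentication mode explicitly. A minimal sketch (the server and database names here are placeholders):

```python
# Spell out Windows Authentication so the behavior doesn't depend on which
# driver's implicit default applies. Server and database names are placeholders.
conn_str = (
    'DRIVER={SQL Server};'
    'SERVER=myServer;'
    'DATABASE=myDatabase;'
    'Trusted_Connection=yes;'
)
# cnxn = pyodbc.connect(conn_str)
```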
Since you're on Windows, a few things you should know:
Using Driver={SQL Server} only enables features and data types supported by SQL Server 2000. For features up through 2005, use {SQL Native Client}, and for features up through 2008, use {SQL Server Native Client 10.0}.
To view your ODBC connections, go to Start, search for "ODBC", and bring up Data Sources (ODBC). This lists User, System, and File DSNs in a GUI. You should find the DSN with the username and password filled in there.

Client-side pyodbc error: "Server does not exist or access denied."

I have a python application designed to pull data from a remote database server using pyodbc, then organize and display the data in a spreadsheet. I've had it working fine for several months now, with multiple coworkers in my department using it through a shared network folder.
My connection:
pyodbc.connect('''DRIVER={SQL Server};
                  SERVER=<myServer_name>;
                  DATABASE=<myDB_name>;
                  UID=personsUser;
                  PWD=personsPassword''')
A different employee within our same network recently tried to use the program and got this error:
pyodbc.Error: ('08001','[08001][Microsoft][ODBC SQL Server Driver]
[DBNETLIB]SQL Server does not exist or access denied. (17) (SQLDriverConnect)')
It looked like a simple permissions issue to me, so to confirm, I replaced the user ID and password with my own, hardcoded in, but it gave the same error. Furthermore, the same employee can log in and execute queries through SQL Server Management Studio without issue.
Since everyone else in our department can still use the application fine, I know it must be a client-side issue, but I just can't pinpoint the problem. Any input would be greatly appreciated, Thanks!
Updates:
Per flipperPA's answer below, I updated my connection string to include the port:
con = pyodbc.connect('''DRIVER={SQL Server};
SERVER=<myServer_name>;
PORT=1433;
DATABASE=<myDB_name>;
UID=personsUser;
PWD=personsPassword;''')
Unfortunately we still got the same error.
He is running 32-bit Windows 7 on an HP machine, the same setup as the rest of the group, so it shouldn't be an OS-level issue.
He does operate SSMS on the same machine, but I ran through the telnet check just to be sure; no issue there.
I've taught myself the pyodbc API and basic SQL, but I'm still relatively new to the underlying concepts of databases and remote connections. Could you explain the TDS driver a little more?
When including SERVER, I've found you often need to include the PORT as well; this is the most likely problem:
pyodbc.connect('''DRIVER={SQL Server};
                  SERVER=<myServer_name>;
                  PORT=1433;
                  DATABASE=<myDB_name>;
                  UID=personsUser;
                  PWD=personsPassword''')
I connect mostly from Linux, however. Could it be that the other person is connecting from Mac OS X or Linux? If so, they'll need to use the FreeTDS driver (MS provides one as well, but it is flaky at best). If you continue to have problems, make sure you can connect from the machine you're having issues with (unless it's the same machine they can connect with SSMS from):
telnet <myServer_name> 1433
If it connects, you're good, if it hangs on connecting, you're most likely looking at a firewall issue. Good luck!
After talking with a knowledgeable friend I was finally able to figure out my issue!
For some reason, the user's system was configured to connect using named pipes, but the server I was connecting to only had TCP/IP protocol enabled. The solution was to force the application to use TCP/IP by adding "tcp:" to the front of the server name.
The fixed connection string:
pyodbc.connect('''DRIVER={SQL Server};
SERVER=tcp:<myServer_name>;
PORT=1433;
DATABASE=<myDB_name>;
UID=personsUser;
PWD=personsPassword
''')
If it still doesn't work for any of you, you can try referring to the LocalDB instance (if that's the case) by its pipe address.
If the instance name is LocalDBA, in cmd type:
SqlLocalDB info LocalDBA
Get the instance pipe name and then put it in the server name:
conn_str = (
    r'DRIVER={SQL Server};'
    r'SERVER=np:\\.\pipe\LOCALDB#ECE5B7EE\tsql\query;'
    r'PORT=1433;'
    r'DATABASE=VT-DE;'
    r'trusted_connection=yes;'
)
