pyodbc + PostgreSQL ODBC: connection string works, DSN doesn't - python

I am trying to connect to a Postgres database built for a Django app. I can connect fine on Windows, but when we moved it over to a Linux server for production it stopped working. I tracked it down to pyodbc not working, so in a separate script I have been trying to get a connection working, with no luck. I'm pretty sure the Linux server is running Red Hat (yum is the installer, but I can double-check if it matters).
Here are some of the things I have tried:
installed unixODBC-devel
added a DSN to the per-user data source file /home/localUsername/.odbc.ini as follows:
[DSNName]
Description=Postgres Connection to Database
Driver=/usr/pgsql-10/lib/psqlodbc.so
Server=servername
Database=dbname
PWD=pass
UID=username
Running odbcinst -q -d returns:
[PostgreSQL]
Python connection attempts I have tried (using the interpreter for now):
con = odbc.connect("DSN=DSNName")
con = odbc.connect("Driver={PostgreSQL};Uid=username;Pwd=pass;Server=servername;Port=5432")
con = odbc.connect("Driver={PostgreSQL Unicode(x64)};Uid=username;Pwd=pass;Server=servername;Port=5432")
I get one of three errors depending on which driver I try:
For the Driver using Unicode(x32) I get:
pyodbc.Error ('01000', "[01000] [unixODBC][Driver Manager]can't open lib 'PostgreSQL Unicode(x32)' : file not found ...
I figure that means this driver is not installed, which is fine.
For the DSN approach I get:
pyodbc.OperationalError: ('08001', '[08001] FATAL: role:"localUsername" does not exists\n (101) (SQLDriverConnect)')
This second error makes me think (maybe incorrectly) that it is trying to use my Linux username (localUsername) to authenticate to Postgres, when I want to use a special admin username that was set up on the host for now.
For the third option (PostgreSQL):
pyodbc.OperationalError 08001 FATAL: database "dbname" does not exist
I don't understand why that would be. My first thought was that Linux wants to use a different port for the connection, but locally on Windows port 5432 worked fine. So I'm at a loss as to how to get it to find the database, assuming the rest is working okay.
If you need additional details let me know and I'll try to add them.
Edit:
Have python (and Django) on one server. DB is on another.
Tried running psql -h OSServername -U 'username' and got the same role-error/DB-not-found errors. I feel like I must need something after OSServername, like 'OSServername/pgAdminServer', but that didn't work,
where the db 'username' is found by right-clicking one of the DB server names inside pgAdmin and selecting Properties. Are the server names inside pgAdmin different, and do I need to somehow use the pgAdmin server name as part of the connection string?
As the comments suggest, starting with the psql -h command seems like a good place to start, since it removes the Python complexity. Once I can get that command working, I might be able to fix the rest. What do I type when my Linux server name (host name) is 'LinuxName', the pgAdmin server is 'pgAdminServer', the actual database is named 'dbName', and the pgAdmin username is 'username'? 'dbName' has an owner 'owner', which is different from the pg server username as well as from the Linux username I am signed in as. I also validated that 'pgAdminServer' shows port 5432, so that shouldn't be the issue.
Edit 2:
I got pyodbc.connect('Driver={PostgreSQL};Server=servNm;Uid=uid;Pwd=pwd;Database=db') to work.
Now I just need the last step for the DSN approach. Your dump_dsn worked to find a typo in my DSN file (.odbc.ini in my local home directory), so that helped. It's still not finding the DB.
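The dump_dsn helper referred to above is not reproduced in this thread; a rough equivalent (a hypothetical sketch, assuming the DSN lives in ~/.odbc.ini) just prints each line of the file in repr() form so that typos and odd invisible characters stand out:

```python
import os

def dump_dsn(path):
    """Print each line of an odbc.ini-style file in repr() form and return the lines."""
    with open(path, "rb") as f:
        lines = f.read().splitlines()
    for lineno, line in enumerate(lines, start=1):
        print(f"{lineno:3}: {line!r}")
    return lines

# Usage (assumes a per-user DSN file):
# dump_dsn(os.path.expanduser("~/.odbc.ini"))
```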
The file /etc/odbcinst.ini lists the following drivers; I have tried all three in my DSN file:
/usr/pgsql-10/lib/psqlodbc.so
/usr/pgsql-10/lib/psqlodbca.so
/usr/pgsql-10/lib/psqlodbcw.so
Here is the info again from my .odbc.ini file in /home/user/.odbc.ini:
The variables servNm, uid, db, and pwd match exactly the values in my now-working pyodbc.connect() string.
[DSNName]
Description=Postgres Connection to Database
Driver=/usr/pgsql-10/lib/psqlodbc.so
Server=servNm
CommLog=0
Debug=0
Fetch=100
UniqueIndex=1
UseDeclareFetch=0
Database=db
UID=uid
Username=uid
PWD=pwd
ReadOnly=0

Deleting and re-creating the ~/.odbc.ini file appears to have resolved the issue. This makes us suspect that there were one or more unusual characters in the previous version of that file that were causing strange behaviour.
One possible source of such troublesome (and sometimes even invisible!) characters is when copying text from the web. If we copy the following from our web browser …
DRIVER=PostgreSQL Unicode
… and paste it into a blank document in our text editor with the default encoding UTF-8 then everything looks normal. However, if we save that file (as UTF-8) and open it in a hex editor we can see that the space is not a normal space (U+0020) …
… it is a NO-BREAK SPACE (a.k.a. "NBSP", U+00A0, \xc2\xa0 in UTF-8 encoding) so the chances are very good that we would get an error when trying to use that DSN because b'PostgreSQL\xc2\xa0Unicode' is not the same as b'PostgreSQL Unicode'.
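A quick way to hunt for such characters without a hex editor is a short Python sketch (find_suspicious_bytes is a hypothetical helper, not part of any library) that flags every non-ASCII byte in the file:

```python
def find_suspicious_bytes(data):
    """Return (line_number, column, byte) for every non-ASCII byte in the data.

    A NO-BREAK SPACE pasted from the web shows up as the two UTF-8 bytes
    0xC2 0xA0, so it produces two adjacent hits.
    """
    hits = []
    for lineno, line in enumerate(data.splitlines(), start=1):
        for col, byte in enumerate(line, start=1):
            if byte > 0x7F:
                hits.append((lineno, col, bytes([byte])))
    return hits

# Usage (assumes a per-user DSN file):
# import os
# with open(os.path.expanduser("~/.odbc.ini"), "rb") as f:
#     print(find_suspicious_bytes(f.read()))
```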

I had the exact same problem. I looked for several solutions, but none worked.
The problem was solved more easily than I thought:
1 - Remove all packages for the PostgreSQL ODBC driver:
$ sudo apt-get remove odbc-postgresql
2 - Install these two libs, in the order below:
$ sudo apt-get install libcppdb-postgresql0 odbc-postgresql
Enjoy!
Doing so worked perfectly here.

Related

How to add SUPER privilege to MySQL in PythonAnywhere?

I am using PythonAnywhere to host my web application for testing purposes. My frontend and Python script are working fine. Now I want to connect it to a MySQL database. I have uploaded my .sql file to the mysite folder and am trying to restore it using this syntax:
mysql -u username -h username.mysql.pythonanywhere-services.com 'username$scm' < ab.sql
as told in Backing up (and restoring) MySQL databases (where username = the created username), but it's throwing this error:
ERROR 1419 (HY000) at line 88: You do not have the SUPER privilege and binary logging is enabled (you *might* want to use the less safe log_bin_trust_function_creators variable)
I've tried to fix this error by following How to grant super privilege to the user? but it's still throwing an error:
ERROR 1044 (42000): Access denied for user 'username'@'%' to database 'username$scm'
Please help me out.
Take a look at the answer on PythonAnywhere forums:
You would not be able to get super privileges on MySQL (only
Postgres). Could you disable binary logging before doing the restore?
You could either try to edit the ab.sql file to take out that line, or
turn it off from wherever you were creating the sqldump originally,
and do the sqldump again.
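For the "edit the ab.sql file" route: the statements that commonly trip ERROR 1419 are the ones carrying DEFINER clauses, so one workaround is to strip those clauses from the dump before restoring. A minimal sketch (strip_definers and the output filename are made up for illustration):

```python
import re

# DEFINER clauses look like DEFINER=`user`@`host`; dropping them lets MySQL
# default to the invoking user, which avoids the SUPER requirement.
DEFINER_RE = re.compile(r"DEFINER=`[^`]*`@`[^`]*`\s*")

def strip_definers(sql):
    """Remove DEFINER=`user`@`host` clauses from a mysqldump file's contents."""
    return DEFINER_RE.sub("", sql)

# Usage:
# with open("ab.sql") as src, open("ab_nodefiner.sql", "w") as dst:
#     dst.write(strip_definers(src.read()))
```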

mssql python Connection

I am using PyCharm and trying to link to an MS SQL server. When I link to a server that requires SQL authentication, the connection is created successfully. However, when I try to link to a server that requires Windows Authentication, even though I use my Windows login username and password, I cannot connect successfully. May I know the proper way to set this up for Windows Authentication?
I am using the below code:
import pymssql
conn = pymssql.connect(host="10.xx.xx.xx", user="CORPORATE\\mywindowsloginname", password="mypassword", database="BIC_reference")
cur = conn.cursor()
cur.execute('SELECT top 10 * FROM dbo.hi_invoiceline')
print(cur.fetchall())
In order to use Windows Authentication, you have to add the property trusted_connection='yes' to your connection string. In that case you can omit user and password:
conn=pymssql.connect(host="10.xx.xx.xx",database="BIC_reference",trusted_connection='yes')
When using Windows Authentication, you should not specify any user credentials. The following should work assuming your Windows account has the relevant permissions:
conn=pymssql.connect(host="10.xx.xx.xx",database="BIC_reference")
I have tested this using pymssql 2.1.3. With this version there was no need to specify trusted_connection='yes' (see apomene's answer); however, you may want to try that as well in case the above snippet doesn't work.

Oracle incorrectly looking in TNS Names: TNS:could not resolve the connect identifier specified

I'm trying to make a connection to an Oracle database with cx_Oracle but am getting the following error message:
ORA-12154: TNS:could not resolve the connect identifier specified
I'm using a connection string such as this one:
'xxxx/pw#lonod-com:1221/LNOUND_USER.uk.something.com'
The connection string is definitely correct as it is working from a different computer on the same network. I can also connect to the database when using Oracle SQL Developer, it's simply not working from Python.
I suspect that for some reason it keeps looking for a TNS Name entry, which I am not using. Is there a flag somewhere that could cause cx_Oracle to keep looking for a TNS name entry or what else could be causing this problem?
I have seen this occur if you have a sqlnet.ora configuration file that does not include the EZCONNECT option in the names.directory_path configuration variable. Below are a few ways to check what you are using. You can also test this connection string with SQL*Plus -- if it works with SQL*Plus it will work with cx_Oracle as well.
1) If you have the environment variable TNS_ADMIN set, its value indicates where Oracle searches for configuration files. If not and you have a full Oracle client installed it will look inside $ORACLE_HOME/network/admin
2) If you have a full Oracle client installed you can also use the tnsping utility to determine what Oracle is using and from what configuration files it is reading.
3) If you have a sqlnet.ora file in the location Oracle is searching for configuration files, then look for the names.directory_path= line in the file. If it is found, it needs to look something like this:
names.directory_path = (TNSNAMES, EZCONNECT)
Hope that helps!
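As a sanity check on the string format itself: EZCONNECT identifiers take the form host:port/service_name, which cx_Oracle accepts directly as the dsn argument. A small sketch (build_ez_connect is a hypothetical helper; the host and service values are the placeholders from the question):

```python
def build_ez_connect(host, port, service):
    """Build an EZCONNECT descriptor of the form host:port/service_name."""
    return f"{host}:{port}/{service}"

# Usage (requires cx_Oracle and a reachable database):
# import cx_Oracle
# dsn = build_ez_connect("lonod-com", 1221, "LNOUND_USER.uk.something.com")
# conn = cx_Oracle.connect("xxxx", "pw", dsn)
```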

Client-side pyodbc error: "Server does not exist or access denied."

I have a python application designed to pull data from a remote database server using pyodbc, then organize and display the data in a spreadsheet. I've had it working fine for several months now, with multiple coworkers in my department using it through a shared network folder.
My connection:
pyodbc.connect('''DRIVER={SQL Server};
SERVER=<myServer_name>;
DATABASE=<myDB_name>;
UID=personsUser;
PWD=personsPassword''')
A different employee within our same network recently tried to use the program and got this error:
pyodbc.Error: ('08001','[08001][Microsoft][ODBC SQL Server Driver]
[DBNETLIB]SQL Server does not exist or access denied. (17) (SQLDriverConnect)')
It looked like a simple permissions issue to me, so to confirm I replaced the userID and password with my own hardcoded in, but it gave the same error. Furthermore the same employee can log in and execute queries through SQL Server Management Studio without issue.
Since everyone else in our department can still use the application fine, I know it must be a client-side issue, but I just can't pinpoint the problem. Any input would be greatly appreciated, Thanks!
Updates:
Per flipperPA's answer below, I updated my connection string to include the port:
con = pyodbc.connect('''DRIVER={SQL Server};
SERVER=<myServer_name>;
PORT=1433;
DATABASE=<myDB_name>;
UID=personsUser;
PWD=personsPassword;''')
Unfortunately we still got the same error.
He is running 32-bit Windows 7 on an HP machine, the same setup as the rest of the group, so it shouldn't be an OS-level issue.
He does operate SSMS on the same machine, but I ran through the telnet check just to be sure - no issue there.
I've taught myself the pyodbc API and basic SQL, but I'm still relatively new to the underlying concepts of databases and remote connections. Could you explain the TDS driver a little more?
When including SERVER, I've found you often need to include the PORT as well; this is the most likely problem:
pyodbc.connect('''DRIVER={SQL Server};
SERVER=<myServer_name>;
PORT=1433;
DATABASE=<myDB_name>;
UID=personsUser;
PWD=personsPassword''')
I connect mostly from Linux, however. Could it be that the other person is connecting from Mac OS X or Linux? If so, they'll need to use the FreeTDS driver (MS provides one as well, but it is flaky at best). If you continue to have problems, make sure you can connect from the machine you're having issues with (unless it's the same machine they can connect to with SSMS):
telnet <myServer_name> 1433
If it connects, you're good; if it hangs on connecting, you're most likely looking at a firewall issue. Good luck!
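The telnet check above can also be done from Python's standard library, which is handy on machines where telnet isn't installed (port_is_open is a hypothetical helper, defaulting to SQL Server's usual port 1433):

```python
import socket

def port_is_open(host, port=1433, timeout=3.0):
    """Roughly what the telnet test does: True if a TCP connection succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unresolvable, etc.
        return False

# Usage:
# port_is_open("<myServer_name>")  # False usually means a firewall or wrong port
```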
After talking with a knowledgeable friend I was finally able to figure out my issue!
For some reason, the user's system was configured to connect using named pipes, but the server I was connecting to only had TCP/IP protocol enabled. The solution was to force the application to use TCP/IP by adding "tcp:" to the front of the server name.
The fixed connection string:
pyodbc.connect('''DRIVER={SQL Server};
SERVER=tcp:<myServer_name>;
PORT=1433;
DATABASE=<myDB_name>;
UID=personsUser;
PWD=personsPassword
''')
If it still doesn't work for any of you, you can try referring to the LocalDB (if that's the case) by its pipe address.
If the LocalDB instance name is LocalDBA, in cmd type:
SqlLocalDB info LocalDBA
Get the instance pipe name and then put it in the server name:
conn_str = (
r'DRIVER={SQL Server};'
r'SERVER=np:\\.\pipe\LOCALDB#ECE5B7EE\tsql\query;'
r'PORT=1433;'
r'DATABASE=VT-DE;'
r'trusted_connection=yes;'
)

Connect to a Sybase database in Python without a DSN?

I'm trying to connect to a Sybase database in Python (using the python-sybase DBAPI and sqlalchemy module), and I'm currently receiving the following error:
ct_connect(): directory service layer: internal directory control layer error: There was an error encountered while binding to the directory service
Here's the code:
import sqlalchemy
connect_url = sqlalchemy.engine.url.URL(drivername='pysybase', username='read_only', password='*****', host='hostname', port=9000, database='tablename', query=None)
db = sqlalchemy.create_engine(connect_url)
connection = db.connect()
I've also tried to connect without sqlalchemy - ie, just importing the Python Sybase module directly and attempting to connect, but I still get the same error.
I've done quite a bit of googling and searching here on SO and at the doc sites for each of the packages I'm using. One common suggestion was to verify the DSN settings, as that's what's causing ct_connect() to trip up, but I am able to connect to and view the database in my locally-installed copy of DBArtisan just fine, and I believe that uses the same DSN.
Perhaps I should attempt to connect in a way without a DSN? Or is there something else I'm missing here?
Any ideas or feedback are much appreciated, thank you!
I figured out the issue for anyone else who might be having a similar problem.
Apparently, even though I had valid entries for the hostname in my sql.ini file and DSN table, Sybase was not reading them correctly - I had to open DSEdit (one of the tools that comes with Sybase) and re-enter the server/hostname info.
