Ubuntu server, postgresql, python, "Peer auth failed" - python

Hi, I'm getting the following error, and the other solutions I've seen for this don't seem to be working.
Within the Ubuntu server terminal (a VirtualBox VM):
Error FATAL: Peer authentication failed for user "a4apps"
My Ubuntu server OS user name is the same.
I have restarted PostgreSQL.
I have tried changing my pg_hba.conf file by:
changing the IPv4 host method from md5 to "trust"
and by adding a line under it "host all all myubuntuserverip/32 trust"
I am trying to access it via a python script.
I am using psycopg2:
import psycopg2
con = psycopg2.connect(database='fieldtest2', user='a4apps')
I created the user: sudo -u postgres createuser a4apps
superuser no, create databases yes, create other users no.
Created database: sudo -u postgres createdb fieldtest2 -O a4apps
I was following this tutorial: here
I'm running out of ideas. Any guidance would be appreciated.
Thanks
Mike

This specific error message:
Peer authentication failed for user "a4apps"
means that pg_hba.conf selected the peer authentication method for this connection, and that the connection attempt was not made by the OS user a4apps, which is what that auth method requires.
The default Ubuntu pg_hba.conf has these lines:
# Database administrative login by Unix domain socket
local   all             postgres                        peer
# TYPE  DATABASE        USER            ADDRESS         METHOD
# "local" is for Unix domain socket connections only
local   all             all                             peer
To allow local passwordless connections for any user except postgres, you may replace peer with trust in the last line (and reload PostgreSQL so the change takes effect).
The IPv4-related changes you tried in pg_hba.conf had no effect on your script because it doesn't connect through TCP/IP. If the connection string mentioned a hostname, it would then use TCP/IP and trigger the corresponding rules in pg_hba.conf.
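From the script's side, that leaves two options; here is a minimal sketch assuming the database and user names from the question (the password is a placeholder):
import psycopg2

# Option 1: stay on the Unix socket. With peer in pg_hba.conf this only works
# if the script runs as the matching OS user, e.g.: sudo -u a4apps python3 script.py
con = psycopg2.connect(database='fieldtest2', user='a4apps')

# Option 2: name a host so the connection goes over TCP/IP and the "host ..."
# rules in pg_hba.conf apply instead (a password is only needed with md5).
con = psycopg2.connect(database='fieldtest2', user='a4apps',
                       host='localhost', password='your_password')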

Related

Pandas read from remote Postgresql with SSH tunnel and sqlalchemy

I can read from my local psql instance like this:
engine = create_engine('postgresql://postgres:postgres@localhost/db_name')
df = pd.read_sql("select * from table_name;", engine)
I have a remote PostgreSQL server which I successfully accessed with SSH tunneling, both in PgAdmin4 and PyCharm. I use a public key file to log in to the remote server. Now, my question is how do I access that database with pandas. I tried:
engine = create_engine('postgresql://username:password@localhost/db_name')
Here, username and password are those of the remote database. I get sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) FATAL: password authentication failed for user. However, with the same username and password I can access the table in PgAdmin.
From what I read, because of SSH tunneling I have to use localhost and not the remote server address, right? In pgAdmin I can see that the server is running. So, my question is: how do I read the table from the remote PostgreSQL database with SSH tunneling? In examples I have seen people using a different port (other than 5432), but for me the setup only works if I use port 5432. I have disconnected all other servers to avoid a port conflict, but I get the same error.
The tunnel created by pgAdmin4 is intended for its own use. It does not listen on 5432; it picks some arbitrary high-numbered port and doesn't advertise which port that is. While you can discover what port it is listening on using system tools (like netstat) and then connect to it, you would probably be better served by finding some other way to set up your tunnel. There are Python libraries that can help with that.
As for why you can connect to 5432 at all, clearly there is something listening there which is either PostgreSQL or pretending to be PostgreSQL, but it doesn't seem to be the one you intend. You can use netstat -ao to find the PID for it and then look up the process based on that.
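For example, the sshtunnel package can open a tunnel you control. This is a sketch only; the SSH host, key path, credentials and database name are all placeholders:
import pandas as pd
from sqlalchemy import create_engine
from sshtunnel import SSHTunnelForwarder   # pip install sshtunnel

tunnel = SSHTunnelForwarder(
    ('remote.server.example', 22),            # SSH endpoint
    ssh_username='ssh_user',
    ssh_pkey='/path/to/private_key',
    remote_bind_address=('127.0.0.1', 5432),  # Postgres as seen from the remote host
)
tunnel.start()

# Connect through the local end of the tunnel; note the @ separator in the URL.
engine = create_engine(
    'postgresql://username:password@localhost:%d/db_name' % tunnel.local_bind_port)
df = pd.read_sql('select * from table_name;', engine)

tunnel.stop()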

MySQL Refusing Remote Connections

Using:
MySQL/MariaDB Ver 14.14 Distrib 5.7.23
Ubuntu 16.04.5 LTS (GNU/Linux 2.6.32-042stab131.1 x86_64)
Everything works perfectly from the local machine (a remote dedicated server), such as phpMyAdmin, SQLAlchemy, etc., but it will not accept any remote connections. I can telnet to the MySQL port, it gives me a login prompt, and I have already set the remote user up with:
GRANT ALL PRIVILEGES ON *.* TO 'admin'@'%' IDENTIFIED BY 'p4$$w0rd' WITH GRANT OPTION;
FLUSH PRIVILEGES;
And yet I still get the following error when I try to log in remotely via Python/SQLAlchemy: "Access denied for user 'admin'@'myrdns.myisp.net'"
Is there some additional security feature that I need to tweak that I am unaware of to enable remote access to one server from another?
You need to change/add grants for your desired username in order to access your database remotely, i.e.
GRANT ALL PRIVILEGES ON *.* TO 'user'@'%' IDENTIFIED BY 'mypassword' WITH GRANT OPTION;
You'll need to issue the following command as well, before being able to connect.
FLUSH PRIVILEGES;
Since you (probably) did that already, you might wish to check your config file to see whether a bind-address is set. If you edit the config file, you'll need to restart the server.
Sometimes it's due to some firewall blocking rules as well, but you'll need to confirm and resolve it on your network/machine if that's the case.
Be warned, however, that it is advisable to set up an SSL certificate if you allow remote access to your database.
Also be aware that you can avoid a "direct" remote login altogether: connect to your remote machine/server via an SSH client, then connect to your MySQL server via the localhost/127.0.0.1 address.
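Once the grant and bind-address are sorted out, a remote connection from Python/SQLAlchemy is just a normal URL pointing at the server. A sketch, assuming the PyMySQL driver and placeholder hostname and credentials:
from sqlalchemy import create_engine, text

# pip install pymysql; 'your.server.example' and the credentials are placeholders.
engine = create_engine('mysql+pymysql://admin:mypassword@your.server.example:3306/mydb')

with engine.connect() as conn:
    # Shows which user@host MySQL actually matched you against.
    print(conn.execute(text('SELECT CURRENT_USER()')).scalar())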

Access denied while trying to establish SSL connection to Mysql server using python-mysql-connector

I have a python app that needs to establish an ssl encrypted remote connection to a Mysql database. I am using the python-mysql-connector library.
Here is an example of the code:
import mysql.connector

config = {
    'host': 'XX.XX.XX.XX',
    'user': 'user',
    'password': 'password',
    'ssl_ca': '/etc/app/ssl/ca.pem',
    'database': 'database'
}
conn = mysql.connector.connect(**config)
SSL is enabled on the server.
The user is set up to require SSL (not X509).
I have FLUSH PRIVILEGES;
The file ca.pem was generated on the server and copied to the client machine and has the proper read permissions for the app.
Whenever I execute this code, I receive an error:
ERROR: MYSQL ERROR: 1045 (28000): Access denied for user
'user'@'XX.XX.XX.XX' (using password: YES)
This error only appears when I am using the Python library to connect AND the user is set to REQUIRE SSL. If I strip back the REQUIRE SSL requirement, I can connect, albeit with no SSL.
I can run successfully on the client machine (as well as any other machine):
mysql -u 'user' -p -h XX.XX.XX.XX
This provides a secure connection.
Can anybody help me figure out how and why the mysql client has no problem establishing a secure connection while python-mysql-connector will not?
The problem was a permissions problem. While the file ca.pem had the appropriate permissions, its directory (app/) did not have read access, so the file could not be read.
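A quick way to spot this kind of problem before connecting is to check, as the user the app runs as, that the CA file is readable and its directory is traversable. A small sketch using the path from the question:
import os

ca_path = '/etc/app/ssl/ca.pem'

# The file itself must be readable, and every directory on the way to it
# must be searchable (execute bit) by the user running the app.
print('file readable:  ', os.access(ca_path, os.R_OK))
print('dir traversable:', os.access(os.path.dirname(ca_path), os.X_OK))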

Can MySQLdb on client machine connect to a database on a server machine?

I am not a network/web/internet programmer so please excuse my noobness in this area. I have gotten a website using a free hosting service. They include one MySQL database. Here are the details for the database:
port = 3306
host = "fdb4.biz.nf"
database = "1284899_6067"
user = "1284899_6067"
password = "somepass9351"
I am using MySQLdb module (installed on my CLIENT machine - not server) to connect to this database:
db = MySQLdb.connect(host=host, user=user, passwd=password, db=database,port=port)
But I get the following error:
OperationalError: (2003, "Can't connect to MySQL server on 'fdb4.biz.nf' (10060)")
What I have already tried
tried two different databases from different hosts
tried changing the port
tried searching SO for similar answers but all others connect to 'local host'
What I think:
could this be caused by my firewall? I am using my school's internet. I don't think this could be it, because I am on the CLIENT side, so if anything it would be the SERVER's firewall.
Two questions
Can MySQLdb be used to connect to a db on a SERVER when it is imported on a CLIENT?
If yes, what am I doing wrong?
Thank you so much for any help, its greatly appreciated! Been stuck the whole day on this.
For security reasons, MySQL by default only listens for connections from localhost. Error code 10060 is basically that: you are not allowed to connect remotely.
Solution: find my.ini (or my.cnf on Linux) and look for the line:
bind-address = 127.0.0.1
This line says: allow only local connections. You should comment out this line, or set it to your server's public IP address, and then restart MySQL.
Yes, MySQLdb can connect to remote hosts.
And your usage of the connect method is correct.
You should first check if you can connect to the remote mysql server from your mysql client.
In terminal you can type mysql -h hostname -u username -p databasename
This should prompt you for the password. Enter the password. Can you connect?
If you can't connect, then you have an access problem, and it's not a Python/MySQLdb problem.
Either the server is not reachable because it is behind a firewall, in which case your client machine's IP needs to be whitelisted. Check your firewall settings.
Or, the MySQL server running on the remote machine is configured to accept only local connections. I think this is the default, but I'm not sure. You should SSH into the remote host where the database server is running, locate the my.cnf file on the server and check the settings. Depending on your MySQL version, the configuration will look slightly different.
Or, the user that you're trying to connect as is not associated with the IP that you're trying to connect from. MySQL users have two parts, like this: 'username'@'host'. To enable a user to connect from all IPs, the user needs to look like this: 'user'@'%'.
I hope I've given you enough to try to debug this issue.
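To separate the reachability question from the credentials question, you can first test whether the port answers at all, and only then attempt the real login. A sketch using the connection details from the question:
import socket
import MySQLdb

host, port = 'fdb4.biz.nf', 3306

# Step 1: can we even open a TCP connection? Error 2003/10060 usually means no,
# which points at a firewall or a bind-address restriction, not at MySQLdb.
try:
    socket.create_connection((host, port), timeout=5).close()
    print('TCP connection OK')
except OSError as exc:
    print('Cannot reach %s:%d -> %s' % (host, port, exc))

# Step 2: only meaningful once step 1 succeeds.
db = MySQLdb.connect(host=host, port=port, user='1284899_6067',
                     passwd='somepass9351', db='1284899_6067')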

Psycopg2 reporting pg_hba.conf error

I've run into a weird situation while trying to use PostgreSQL and Psycopg2. For some reason, every time I attempt to connect to the Postgres database via Python, I get the following error:
psycopg2.OperationalError: FATAL: no pg_hba.conf entry for host "127.0.0.1", user "steve", database "steve", SSL on
FATAL: no pg_hba.conf entry for host "127.0.0.1", user "steve", database "steve", SSL off
Naturally, I checked pg_hba.conf to see what the issue was, but everything appeared to be configured correctly as far as I can see:
pg_hba.conf:
# TYPE  DATABASE        USER            CIDR-ADDRESS            METHOD
# "local" is for Unix domain socket connections only
local   all             all                                     md5
# IPv4 local connections:
host    all             all             127.0.0.1/32            md5
# IPv6 local connections:
host    all             all             ::1/128                 md5
In addition, I've found that I can connect to the database via psql as I would expect:
$ psql -U steve -h 127.0.0.1
...
steve=>
Anyone have any ideas as to what could be going on here? Thanks in advance!
Typical explanations include:
You are connecting to the wrong server.
Is the DB server running on the same host as Python?
You got the wrong port.
Check the server log if you see a connection attempt. You have to log connections for that, of course. See the config parameter log_connections.
You did not reload (SIGHUP) the server after changing pg_hba.conf - or reloaded the wrong cluster (if you have multiple DB clusters).
Use pg_ctl, or pg_ctlcluster on Debian and derivatives, for that.
Or, on modern Linux installations with systemd (incl. Debian & friends), typically:
sudo systemctl reload postgresql
Or, if there are multiple installations, check with:
sudo systemctl status postgres*
And then reload the one you want with something like:
sudo systemctl reload postgresql@14-main
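One way to check that Python is reaching the server you think it is: connect with an explicit host and port, exactly as the application does, and ask the server to identify itself. A sketch (the user and database are the ones from the question, the password is a placeholder); if this fails while psql with the same parameters works, the two are most likely talking to different servers or ports:
import psycopg2

conn = psycopg2.connect(host='127.0.0.1', port=5432,
                        dbname='steve', user='steve', password='your_password')
cur = conn.cursor()
# Which server did we actually land on?
cur.execute('SELECT version(), inet_server_addr(), inet_server_port();')
print(cur.fetchone())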
I recently ran into this same issue and found the solution to this problem.
System:
I have an application server (with these packages installed: python, django, psycopg2, and the postgres client 9.6.1 (postgresql-9.6.1.tar.gz)) with, for instance, the IP address 10.0.0.1 (a private address).
And an AWS Postgres RDS server "aws_rds_host_name", or any database IP address.
Error:
django.db.utils.OperationalError: FATAL: no pg_hba.conf entry for host "10.0.0.1", user "your_user", database "your_db", SSL off
Solution:
While installing the postgres client 9.6.1 source package on the application server 10.0.0.1, we have to pass the argument "--with-openssl". I suggest removing the existing postgres client and installing it with the steps below.
Download the postgres client source package 9.6.1 (postgresql-9.6.1.tar.gz)
Untar the package postgresql-9.6.1.tar.gz.
./configure --prefix="your_preferred_postgres_path_if_needed" --with-openssl (this '--with-openssl' argument is important to get rid of that error)
make
make install
After successful installation, that error didn't occur when we ran the django project with psycopg2.
I hope this solution helps someone.
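If you want to confirm from Python that the rebuilt client library really has SSL support, you can request SSL explicitly. A sketch with placeholder connection details (conn.info.ssl_in_use needs psycopg2 2.8 or later):
import psycopg2

# sslmode='require' fails up front if the underlying libpq was built
# without OpenSSL, which is exactly what the --with-openssl rebuild fixes.
conn = psycopg2.connect(host='aws_rds_host_name', dbname='your_db',
                        user='your_user', password='your_password',
                        sslmode='require')
print('SSL in use:', conn.info.ssl_in_use)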
