Error while using Python to connect to an Amazon RDS PostgreSQL database

I'm trying to connect to an Amazon RDS PostgreSQL database with this Python code:

import psycopg2

engine = psycopg2.connect(
    database="vietop2database",
    user="postgres",
    password="07041999",
    host="vietop2.cf4afg8yq42c.us-east-1.rds.amazonaws.com",
    port="5433",
)
cursor = engine.cursor()
print("opened database successfully")
I encountered an error:
could not connect to server: Connection timed out
Is the server running on host "vietop2.cf4afg8yq42c.us-east-1.rds.amazonaws.com" (54.161.159.194) and accepting
TCP/IP connections on port 5433?
I consulted the troubleshooting guide on Amazon and made sure the DB instance's Public accessibility setting is Yes, which allows external connections. I also changed the port to 5433 and set the VPC security group to the default one. Yet I still fail to connect to the database. What might be the reason? Please help me. Thank you very much.
Below is the database connectivity and configuration information.

I found the answer: I needed to add a new inbound rule allowing all IPv4 traffic.
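For reference, that same inbound rule can be added programmatically. Below is a hedged sketch using boto3's `authorize_security_group_ingress`; the security group ID and client IP are made-up placeholders, and restricting the rule to your own IP (a /32 CIDR) is safer than opening 0.0.0.0/0.

```python
# Placeholder values -- substitute your own security group ID and client IP.
ingress_rule = {
    "IpProtocol": "tcp",
    "FromPort": 5433,  # match the port the DB instance actually listens on
    "ToPort": 5433,
    "IpRanges": [{"CidrIp": "203.0.113.7/32"}],  # your client IP, not 0.0.0.0/0
}

# import boto3
# ec2 = boto3.client("ec2", region_name="us-east-1")
# ec2.authorize_security_group_ingress(
#     GroupId="sg-0123456789abcdef0",  # the DB instance's VPC security group
#     IpPermissions=[ingress_rule],
# )
print(ingress_rule["IpRanges"][0]["CidrIp"])
```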

Related

Connection timed out when I try to connect to a PostgreSQL server on DigitalOcean from an Azure Function

I have a Postgres database running on a DigitalOcean server. The database is protected by a firewall and an SSL root certificate. I added the outbound addresses provided by the Azure Function App to the database firewall, and I am passing the certificate through the connection string.
pg_conn = psycopg2.connect(
    host=os.environ.get("PG_HOST"),
    database=os.environ.get("PG_DB"),
    user=os.environ.get("PG_USER"),
    password=os.environ.get("PG_PASSWORD"),
    port=os.environ.get("PG_PORT"),
    sslmode="require",
    sslrootcert=r"my-proyect/certificate.crt",
)
But when I deploy my function to the cloud, the connection times out:
Connection timed out. Is the server running on that host and accepting TCP/IP connections?
As far as I know, a connection time-out error is typically caused by connectivity or networking issues, for example a firewall not allowing access to the port the application uses.
A tool for troubleshooting this sort of issue is PortQry:
portqry -n [hostname] -e [port number]
You can also add applications to the Trusted Sources list of your PostgreSQL database.
There is also documentation with complete information about the connection time-out error.
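PortQry is a Windows-only tool; a quick cross-platform reachability check can be done from Python's standard library. This is a sketch with no external services assumed: the local listener below merely stands in for the remote database server.

```python
import socket

def can_reach(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, unreachable, DNS failure...
        return False

# Demo against a local listener that stands in for the database server:
listener = socket.socket()
listener.bind(("127.0.0.1", 0))      # let the OS pick a free port
listener.listen(1)
_, port = listener.getsockname()

print(can_reach("127.0.0.1", port))  # True: something is accepting connections
listener.close()
print(can_reach("127.0.0.1", port))  # False: nothing is listening any more
```

If `can_reach` returns False for your database host and port, the problem is network-level (firewall, security group, routing), not your driver code.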

SSH tunnel forwarding and MySQL connection on a Google Cloud Function

I am trying to connect to a MySQL server through an SSH tunnel in one of my Google Cloud Functions. This works fine in my home environment, so I assume it is some port issue on the Cloud Function.
Edit: For clarification, the MySQL server sits on a Namecheap shared-hosting web server, not Google Cloud SQL.
Every time I run this, it times out with an "unknown error". The tunnel appears to be successful, but I am unable to get the MySQL connection to work.
import base64
import sshtunnel
import mysql.connector


def testing(event, context):
    """Testing function"""
    with sshtunnel.SSHTunnelForwarder(
        ("server address", port),
        ssh_username="user",
        ssh_password="password",
        remote_bind_address=("127.0.0.1", 3306),
    ) as server:
        print(server.local_bind_port)
        with mysql.connector.connect(
            user="user",
            password="password",
            host="localhost",
            database="database",
            port=server.local_bind_port,
        ) as connection:
            print(connection)
There are too many steps to list, but I'm wondering if the "connector" setup makes a difference even for SSH. Maybe you have to create a connector as shown here (notice how the instructions on the "Private IP" tab differ from those for your local computer), then configure Cloud Functions to use that connector. Make sure you also use the right port.
A Serverless VPC Access connector handles communication to your VPC network. To connect directly with a private IP, you need to:
1. Make sure that the Cloud SQL instance created above has a private IP address. If you need to add one, see the Configuring private IP page for instructions.
2. Create a Serverless VPC Access connector in the same VPC network as your Cloud SQL instance. Unless you're using Shared VPC, a connector must be in the same project and region as the resource that uses it, but the connector can send traffic to resources in different regions. Serverless VPC Access supports communication to VPC networks connected via Cloud VPN and VPC Network Peering; it does not support legacy networks.
3. Configure Cloud Functions to use the connector.
4. Connect using your instance's private IP and port 3306.
Keep in mind that this "unknown" error could also very well be due to the Cloud SQL Admin API not being enabled. In fact, make sure you follow that entire page, as this is a broad question.
Let us know what worked for this type of error.

Python using mysql.connector.connect yields an InterfaceError

When I execute the code:

import mysql.connector

mydb = mysql.connector.connect(
    host="localhost",
    user="user",
    passwd="password",
)
I get:
mysql.connector.errors.InterfaceError: 2003: Can't connect to MySQL server on 'localhost:3306' (10061 No connection could be made because the target machine actively refused it)
I've tried an example with "pymysql" and the error stays.
I looked around the internet, and a lot of people say it could be a firewall inbound problem, yet there is no firewall blocking 3306. The security group on Amazon RDS allows all connections, and I connected the RDS instance to my local MySQL Workbench (so I can make tables and such from there). Interestingly, when I run the code and the error occurs, additional client connections pop up. Has anyone else dealt with this? Thank you very much; I'm trying to learn this part of AWS well.
I figured it out. My host name should NOT be "localhost"; it should be my endpoint on Amazon RDS. It seems clear now, considering that when I connected my RDS database to MySQL Workbench I put in "databasename.xxxxxxx.us-east-1.rds.amazonaws.com" as the host name. Now I can connect and read/write to the database.
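A small guard catches that localhost mistake before any driver is involved. This is just a sketch; the endpoint string is the placeholder from the answer above, and the actual connect call is left commented because it needs a reachable server.

```python
# Placeholder endpoint -- substitute your own RDS endpoint hostname.
RDS_ENDPOINT = "databasename.xxxxxxx.us-east-1.rds.amazonaws.com"

def checked_host(host):
    """Fail fast if the host is still 'localhost' instead of the RDS endpoint."""
    if host in ("localhost", "127.0.0.1"):
        raise ValueError("use the RDS endpoint hostname, not localhost")
    return host

print(checked_host(RDS_ENDPOINT))

# import mysql.connector
# mydb = mysql.connector.connect(host=checked_host(RDS_ENDPOINT),
#                                user="user", passwd="password")
```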

OpenShift MySQL connection issues in Python

I previously wrote my app using local development servers, and now that I have moved it onto an OpenShift small gear almost everything works except for MySQL connections.
In my code I have the line:
self.db = MySQLdb.connect(host, username, password, dbname)
When I review the openshift error log, the following error is reported:
_mysql_exceptions.OperationalError: (2002, "Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)")
I think that python is trying to connect using a UNIX socket as opposed to an INET one, but I'm not sure how to change this behavior. Any help is much appreciated.
Not specific to MySQLdb: if you use localhost as the hostname, a MySQL client using the MySQL C libraries will try to connect using a UNIX socket (or a named pipe on Windows). There are two ways around this, but you'll need to grant extra permissions to make it work for both:
Use IP address 127.0.0.1
Use the IP address 127.0.0.1 instead of the localhost hostname. This will make the MySQL client connect using TCP/IP.
Use option files
The other way is to force the protocol using option files. For example, in your ~/.my.cnf (or any file you want), add the following:
[python]
protocol=tcp
Now use the connection arguments to read the option file and group:
import MySQLdb

cnx = MySQLdb.connect(host='localhost', user='scott', passwd='tiger',
                      read_default_file='~/.my.cnf',
                      read_default_group='python')
The group name does not need to be python, but it is good not to use mysql or client as it might interfere with other MySQL tools (unless you want that of course).
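As a sanity check that the option file and group are what the client will read, such a simple file can be parsed with Python's standard library. A minimal sketch (real MySQL option files have extra syntax that configparser does not fully understand, but this two-line file parses cleanly; the temp path is just for the demo):

```python
import configparser
import os
import tempfile

# Write the option file from the answer above and confirm that the [python]
# group carries protocol=tcp, the setting that forces a TCP/IP connection.
path = os.path.join(tempfile.mkdtemp(), "my.cnf")
with open(path, "w") as f:
    f.write("[python]\nprotocol=tcp\n")

parser = configparser.ConfigParser()
parser.read(path)
print(parser["python"]["protocol"])  # tcp
```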
For setting up permissions, you'll need to use the IP address of localhost, something like:
mysql> GRANT SELECT ON yourdb.* TO 'scott'@'127.0.0.1' IDENTIFIED BY ...;
(Side note: MySQL database drivers such as MySQL Connector/Python do not consider localhost to be special; they connect through TCP/IP right away, and you have to explicitly pass unix_socket to use the socket.)
As I later discovered, while the database server runs on localhost, it listens on a very specific localhost bind address. In my case it was an address I would never have thought to try if I hadn't noticed how phpMyAdmin was connecting.

MySQLdb security when connecting to a remote server?

db = MySQLdb.connect(host="host",
                     user="user",
                     passwd="pass",
                     db="dbname")
q = db.cursor()
So, that's my code block. I was just wondering: how easy would this be to reverse engineer, and does MySQLdb send authentication over cleartext?
I am creating a program that connects to a MySQL server over the internet. Would someone be able to get my credentials or my server login details?
The MySQL server could be configured to use SSL to secure the connection. See here for an example of using MySQLdb with an SSL connection and here for some info on configuring the server.
In your example above the username, password and all other data would be sent in cleartext.
Here are two related questions Python MySQLDB SSL Connection , CA SSL parameter for Python MySQLdb not working, but key does?
If you have access to configure the MySQL server, we can help set up SSL.
MySQL supports encrypted connections. The MySQL server you are connecting to must be configured to use SSL and the client must add an SSL parameter when connecting.
Using SSL connections
shell> mysql --ssl-ca=ca-cert.pem ...
You can test whether the server you are connecting to supports SSL by adding --ssl-ca=ca-cert.pem.
ca-cert.pem: Use this as the argument to --ssl-ca on the server and client sides. (The CA certificate, if used, must be the same on both sides.)
MySQL SSL Example describes the whole process of setting up MySQL for SSL and connecting with it.
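On the Python side, the client-side equivalent of the shell example above is MySQLdb's `ssl` argument, which takes a mapping with a "ca" entry. A hedged sketch: the certificate path is a placeholder, and the connect call is left commented because it needs a reachable, SSL-enabled server.

```python
# The "ca" entry should point at the same CA certificate the server uses
# (the ca-cert.pem mentioned above); this path is a placeholder.
ssl_args = {"ca": "/path/to/ca-cert.pem"}

# import MySQLdb
# db = MySQLdb.connect(host="host", user="user", passwd="pass",
#                      db="dbname", ssl=ssl_args)
print(ssl_args["ca"])
```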
Passwords shouldn't be hardcoded in the code. Python has the convention of a config.py module, where you can keep such values separate from the rest of your code.
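A minimal sketch of that config.py convention follows; the file name matches the convention, but the values are illustrative, and the module is created on the fly here only so the import can be demonstrated in one self-contained snippet.

```python
import importlib.util
import os
import tempfile

# In practice config.py lives next to your code (and stays out of version
# control); here we write it on the fly just to show the import working.
cfg_src = 'DB_HOST = "db.example.com"\nDB_USER = "scott"\nDB_PASSWORD = "tiger"\n'
path = os.path.join(tempfile.mkdtemp(), "config.py")
with open(path, "w") as f:
    f.write(cfg_src)

spec = importlib.util.spec_from_file_location("config", path)
config = importlib.util.module_from_spec(spec)
spec.loader.exec_module(config)

print(config.DB_HOST)  # db.example.com
# The main module would then call, e.g.:
# MySQLdb.connect(host=config.DB_HOST, user=config.DB_USER,
#                 passwd=config.DB_PASSWORD)
```

In a real project you would simply write `import config` and add config.py to .gitignore.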
Please have a look here:
http://dev.mysql.com/doc/refman/5.5/en/connector-python-coding.html
The question regarding SSL to prevent disclosure has been answered above.
Fabio
#fcerullo
