Python Tunnel to MS SQL Server

I am trying to access a database in SQL Server, but to reach the SQL Server I first have to log in and connect to a remote company server.
Many existing answers suggest SSH tunnelling using SSHTunnelForwarder(localhost, sshUser, sshPwd, remote_bind_address) - however, I always get an error.
I have several print statements, which give the following information:
Could not receive SSH config file
0 keys loaded from agent
0 keys loaded
It tries to connect to the gateway (the IP of the server) as my user
It tries to log in with my password
The connection fails
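For reference, the tunnelling approach the question describes looks roughly like the sketch below. This is not the asker's code: the package names (sshtunnel, pyodbc), driver name, hostnames, and credentials are all assumptions.

```python
# Sketch of SSH-tunnelled access to SQL Server; every host and
# credential below is a placeholder, not a value from the question.

def odbc_conn_str(local_port, database, uid, pwd):
    # SQL Server ODBC syntax takes "host,port" in the SERVER field;
    # we point it at the local end of the tunnel.
    return (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER=127.0.0.1,{local_port};"
        f"DATABASE={database};UID={uid};PWD={pwd}"
    )

if __name__ == "__main__":
    from sshtunnel import SSHTunnelForwarder
    import pyodbc

    # Forward a local port through the company gateway to SQL Server (1433)
    with SSHTunnelForwarder(
        ("gateway.example.com", 22),
        ssh_username="ssh_user",
        ssh_password="ssh_pwd",
        remote_bind_address=("sqlserver.internal", 1433),
    ) as tunnel:
        conn = pyodbc.connect(
            odbc_conn_str(tunnel.local_bind_port, "mydb", "dbuser", "dbpwd")
        )
        print(conn.cursor().execute("SELECT @@VERSION").fetchone())
```

If the tunnel itself fails, as the log output above suggests, it can help to test the SSH leg alone first (e.g. ssh ssh_user@gateway from a terminal) before adding the database layer.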

Related

Connection Timing Out When Accessing Gcloud MySQL from Python

I have a Python application in which I'm trying to access a MySQL database on Google's cloud service.
I've been following this set-up guide for connecting via an external application (Python), and I am using the pymysql package. I'm attempting to connect via the proxy and have already authenticated my connection via gcloud auth login from the console.
As of now, I CAN access the database via the console, but I need to be able to make queries from my Python script to build it out. When I try running it as is, I get the following error:
OperationalError: (2003, "Can't connect to MySQL server on '34.86.47.192' (timed out)")
Here's the function I'm using, with security sensitive info starred out:
def uploadData():
    # cd to the directory with the MySQL exe
    os.chdir('C:\\Program Files\\MySQL\\MySQL Server 8.0\\bin')
    # Invoke the proxy
    subprocess.call('start cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather', shell=True)
    # Create connection
    # I have also tried host = '127.0.0.1' for localhost here
    conn = pymysql.connect(host='34.86.47.192',
                           user='root',
                           password='*******',
                           db='gribdata')
    try:
        c = conn.cursor()
        # Use the right database
        db_query = 'use gribdata'
        c.execute(db_query)
        query = 'SELECT * FROM clients'
        c.execute(query)
        result = c.fetchall()
        print(result)
    except pymysql.Error as e:
        print(e)
    finally:
        conn.close()
Yeah, this one's pretty limited in documentation, but what you want to do is run it from its hosted IP and configure access to your external IP address on your server. So you want to use that IP (34.xxx.xxx.xxx) rather than the loopback 127.0.0.1 localhost IP.
To get it to work, go to your connections tab and add a new connection within Gcloud. Make sure the public address box is checked and the IP is correct, and save once done.
There are some excellent details here from some Gcloud engineers. It looks like some of the source documentation is outdated and this is the way to connect now.
First of all, confirm that the Cloud SQL proxy is indeed installed in the directory where you expect it to be. The Cloud SQL proxy is not part of MySQL Server, so you should not find it in C:\Program Files\MySQL\MySQL Server 8.0\bin, at least by default. Instead, the Cloud SQL proxy is a tool provided by Google; it is just an .exe file that can be stored in any directory you wish. For instructions on how to download the proxy, you can check the docs.
The Cloud SQL proxy creates a secure link between the Cloud SQL instance and your machine: what it does is forward a local port on your machine to the Cloud SQL instance. Thus, the host IP that you should use when using the proxy is 127.0.0.1:
conn = pymysql.connect(host='127.0.0.1',
                       user='root',
                       password='*******',
                       db='gribdata')
When starting the Cloud SQL proxy with a TCP socket, you should add the port to which you want to forward Cloud SQL's traffic at the end of the start command: =tcp:3306
subprocess.call('start cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather=tcp:3306', shell=True)
Have you tried to connect to Cloud SQL from the console? Once connected, you should get a message in the console displaying "Listening on 127.0.0.1:3306". Your connection command should be
"cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather=tcp:3306"
Try connecting the Cloud SQL proxy from the console, then create the connection with pymysql, using host "127.0.0.1".
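Putting the answers above together, the whole flow might look like the sketch below. The proxy executable name and instance string are taken from the question; the fixed sleep is a crude placeholder for properly waiting on the proxy's "Listening on 127.0.0.1:3306" message.

```python
import subprocess
import time

def proxy_command(exe, instance, port=3306):
    # Cloud SQL proxy (v1) TCP syntax: -instances=PROJECT:REGION:INSTANCE=tcp:PORT
    return [exe, f"-instances={instance}=tcp:{port}"]

if __name__ == "__main__":
    import pymysql

    proc = subprocess.Popen(
        proxy_command("cloud_sql_proxy_x64.exe",
                      "trans-cosine-289719:us-east4:compuweather")
    )
    time.sleep(3)  # crude; wait for "Listening on 127.0.0.1:3306" in real code

    # Connect through the proxy's local end, not the instance's public IP
    conn = pymysql.connect(host="127.0.0.1", port=3306,
                           user="root", password="*******", db="gribdata")
    try:
        with conn.cursor() as c:
            c.execute("SELECT * FROM clients")
            print(c.fetchall())
    finally:
        conn.close()
        proc.terminate()
```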

Connect to remote Postgres server via SQLAlchemy

I am trying to send some commands to a remote Postgres server using SQLAlchemy, but each time I receive an error.
Please note that I can connect to the remote Postgres server using my SSH username and password to log in. For that I have used my local terminal, PuTTY, and WinSCP, so the problem appears to be in the Python code I have written:
# create postgres engine to connect to the database
engine = create_engine('postgres://server_username:server_password@server_name:port/database')
with engine.connect() as conn:
    ex = conn.execute("SELECT version();")
    conn.close()  # not needed but kept just in case
    print(ex)
Running the code above yields the following error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) SSL SYSCALL error: Connection reset by peer (0x00002746/10054)
expected authentication request from server, but received S
I have also tried adding the SSL verification parameter as follows:
create_engine('postgres://server_username:server_password@server_name:port/database?sslmode=verify-full')
which returned the error
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) root certificate file "C:\Users\aris.pavlides\AppData\Roaming/postgresql/root.crt" does not exist
Either provide the file or change sslmode to disable server certificate verification.
at which point I had nothing to lose, so I disabled certificate verification altogether:
create_engine('postgres://server_username:server_password@server_name:port/database?sslmode=disable')
which returned the initial error message.
Do you have any ideas on how I can modify the code to make it work?
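One thing worth trying, since SSH login to the server already works for the asker: tunnel the Postgres port over SSH and point SQLAlchemy at the local end. This is a sketch under that assumption; the sshtunnel package, hostnames, and ports are not from the question. Note also that SQLAlchemy 1.4+ only accepts the postgresql:// scheme, not postgres://.

```python
def pg_url(user, pwd, host, port, db):
    # SQLAlchemy 1.4+ dropped the old "postgres://" alias
    return f"postgresql://{user}:{pwd}@{host}:{port}/{db}"

if __name__ == "__main__":
    from sshtunnel import SSHTunnelForwarder
    from sqlalchemy import create_engine, text

    with SSHTunnelForwarder(
        ("server_name", 22),
        ssh_username="ssh_user",
        ssh_password="ssh_pwd",
        remote_bind_address=("127.0.0.1", 5432),  # Postgres as seen from the server
    ) as tunnel:
        engine = create_engine(
            pg_url("db_user", "db_pwd", "127.0.0.1",
                   tunnel.local_bind_port, "database")
        )
        with engine.connect() as conn:
            print(conn.execute(text("SELECT version();")).scalar())
```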

Mysql.connector to access remote database in local network Python 3

I used the mysql.connector Python library to make changes to the databases on my local MySQL server using:
from __future__ import print_function
import mysql.connector as kk

cnx = kk.connect(user='root', password='password123',
                 host='localhost',
                 database='db')
cursor = cnx.cursor(buffered=True)
sql = "DELETE FROM examples WHERE id = 4"
cursor.execute(sql)
number_of_rows = cursor.rowcount  # execute() itself returns None
cnx.commit()
cnx.close()
This works fine, but when I try the same code with a change only to the 'host' parameter, to something like
host='xxx.xxx.xxx.xxx'
(where the IP is that of a server connected to my local network), it won't update that particular database on that server.
The error thrown is something like:
mysql.connector.errors.DatabaseError: 2003 (HY000): Can't connect to MySQL server on 'xx.xxx.x.xx' (10060)
Why wouldn't this work?
First, check that your local IP can access your remote server (check whether there is an IP restriction on your server). After that, check whether your MySQL database uses the default port; if not, you must specify the port in your code.
Check if the database user you are using to connect to the database on the remote host has the correct access and privileges.
You can test this from the command line using:
mysql -u root -ppassword123 -h xxx.xxx.xxx.xxx db
If this does not work, then debug as follows:
ping xxx.xxx.xxx.xxx. If the host is reachable, move on to the next step; if not, then this IP is blocked, unavailable, or incorrect. Double-check the IP and check that both machines are on the same network.
Check if mysqld is running on the host (e.g. service mysqld status). If it is, move on to the next step; if not, start mysqld. If it does not want to start, install it, start the service, and set up your database.
Telnet the specific port to see if it is blocked: telnet xxx.xxx.xxx.xxx 3306. If this works, move on to the next step. If it does not, check your iptables and check whether the port is open on the remote host.
Add a user to the MySQL server on the host: https://dev.mysql.com/doc/refman/8.0/en/adding-users.html
Restart mysqld and try the command above again.
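The telnet step above can also be scripted. A small stdlib-only reachability check, equivalent to telnet-ing the port:

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds (like telnet)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check whether MySQL's default port answers on the remote host
# print(port_open("xxx.xxx.xxx.xxx", 3306))
```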

Neo4J with Py2neo: Unauthorized error HTTPS

I have Neo4J running on a Docker container in which I have mapped the internal container ports 7473 and 7687 to their respective host ports 7473 and 7687, 7474 is exposed but not mapped.
The relevant part of the Neo4J server configuration regarding networking:
# Bolt connector
dbms.connector.bolt.enabled=true
#dbms.connector.bolt.tls_level=OPTIONAL
dbms.connector.bolt.listen_address=0.0.0.0:7687
# HTTP Connector. There must be exactly one HTTP connector.
dbms.connector.http.enabled=true
dbms.connector.http.listen_address=0.0.0.0:7474
# HTTPS Connector. There can be zero or one HTTPS connectors.
dbms.connector.https.enabled=true
dbms.connector.https.listen_address=0.0.0.0:7473
I was able to login to Neo4J's webclient through the browser and change the default password.
Regarding the Python code here's the line where I create the client.
self.client = py2neo.Graph(host=ip_address,
                           username=username,
                           password=password,
                           secure=use_secure,
                           bolt=use_bolt)
As soon as I execute a query like this one:
node = Node("FooBar", foo="bar")
self.client.create(node)
I get the following Unauthorized exception.
py2neo.database.status.Unauthorized: https://localhost:7473/db/data/
Any idea on why this may be happening?
The solution was to call a separate authentication method provided by the library like this:
auth_port = str(self._PORT_HTTPS if use_secure else self._PORT_HTTP)
py2neo.authenticate(":".join([ip_address, auth_port]), username, password)
It took me a while to get to this because at first, I thought the authentication was done automatically in the constructor and then I wasn't able to make the authentication method run because I was using the bolt port.
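A sketch of the fix in context, assuming py2neo v3 (where authenticate() is a module-level call; later py2neo versions pass credentials directly to Graph instead). The port defaults mirror the _PORT_HTTPS/_PORT_HTTP attributes referenced in the answer:

```python
def auth_target(ip_address, use_secure, https_port=7473, http_port=7474):
    # py2neo.authenticate() wants "host:port" against the HTTP(S) port,
    # not the bolt port (7687) -- using the bolt port is what failed above
    return f"{ip_address}:{https_port if use_secure else http_port}"

if __name__ == "__main__":
    import py2neo

    use_secure, use_bolt = True, True
    py2neo.authenticate(auth_target("localhost", use_secure), "neo4j", "password")
    client = py2neo.Graph(host="localhost", username="neo4j",
                          password="password", secure=use_secure, bolt=use_bolt)
    client.create(py2neo.Node("FooBar", foo="bar"))
```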

How to connect to my remote SQL server

I have a Linux Ubuntu server that I rent from DigitalOcean for storing streaming real-time data in MySQL, fed by Python code.
The problem is that I am coding not in the Linux server environment but in Python on my local computer (personal Windows 10, not a server). So in the code, I need to connect to my Linux server in order to feed/get the data to/from the MySQL server on it.
I know I need to use the MySQLdb library in Python to do this. I tried to connect to my Linux server using MySQLdb, but it could not connect to the server.
A different question:
For granting access from the other IP addresses from which I connect to the MySQL server, should I do it whenever my IP address changes? For example, when I work at home I need to grant my home internet IP, and when I work in other places do I need to grant those IP addresses too?
Anyway, I tried granting access to the IP address from which I am connecting to the internet, but even after the grant I cannot access the MySQL server.
What should I do?
Here is the code I used:
import MySQLdb
conn = MySQLdb.connect("000.000.000.0", "root", "password", "name of database")
"000.000.000.0" is the server that I rent; it is a Linux server.
"root" is my username on the server.
"password" is the password for the server as well as for MySQL.
Then comes the name of the database I want to connect to.
c = conn.cursor()
c.execute("SELECT * FROM practice")
Here I just want to see what's in the "practice" table that I made in the database.
rows = c.fetchall()
for eachRow in rows:
print eachRow
I don;t know what went wrong.
For granting I used the following statement:
GRANT ALL ON nwn.* TO 'new_username'@'00.000.000.00' IDENTIFIED BY 'new_password';
The IP address 00.000.000.000 is the Starbucks IP address where I am working on this project right now. I got a message like this:
Query OK, 0 rows affected, 1 warning (0.03 sec)
After that, I tried to connect from Python on my local Windows machine. It didn't work, and I got this Python error:
OperationalError: (2003, "Can't connect to MySQL server on '000.000.000.0' (10061)")
What's wrong with this?
A similar question has already been asked here.
Can you connect from your local OS to your remote database?
mysql -u XXXX -h {IP} -p
Have you granted your client in the database? Digital Ocean HowTo for this
Create a user for the remote access:
GRANT SELECT,DELETE,INSERT,UPDATE ON tablename.* TO 'user'@'your_local_ip';
FLUSH PRIVILEGES;
It's good practice not to grant all privileges to remote users (scripts).
Your local IP should be a static IP; otherwise you always have to change the IP, or use a dynDNS-like service.
(EDIT)
I just set up an environment:
If you connect to an address that does not exist, or the mysql/mariadb instance is not running, you get error code 2003:
(2003, "Can't connect to MySQL server on 'my_hostname' (61)")
If your local host is not granted, you get error 1130:
(1130, "Host '213.213.213.213' is not allowed to connect to this MariaDB server")
To check whether you can access your remote SQL server, try
mysql -h 123.123.123.123 -u username -p
on your local Windows machine, where 123.123.123.123 is the IP address of your DigitalOcean server.
To get your public IP address (outside of Starbucks) you can visit a site like this.
If you REALLY used 0.0.0.0 as the IP:
0.0.0.0 as an IP address in a service configuration is used to bind the service for external access. 0.0.0.0 is like a wildcard matching every IP, but it cannot be used to access the server/service remotely.
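The two error codes called out in the edit can also be told apart programmatically. A sketch of that idea; the cause strings are paraphrases, not official MySQL messages:

```python
def diagnose(errno):
    """Map common MySQL client error codes to a likely cause."""
    causes = {
        2003: "server unreachable: wrong IP/port, mysqld not running, or firewall",
        1130: "host not granted: add a GRANT for your client's public IP",
        1045: "access denied: bad credentials for this user/host pair",
    }
    return causes.get(errno, "unknown; check the server logs")

if __name__ == "__main__":
    import MySQLdb

    try:
        conn = MySQLdb.connect("000.000.000.0", "root", "password", "nwn")
    except MySQLdb.OperationalError as e:
        # MySQLdb puts the numeric error code first in e.args
        print(diagnose(e.args[0]))
```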
