Neo4J with Py2neo: Unauthorized error HTTPS - python

I have Neo4J running in a Docker container, in which I have mapped the internal container ports 7473 and 7687 to their respective host ports 7473 and 7687; 7474 is exposed but not mapped.
Here is the Neo4J server's network-related configuration:
# Bolt connector
dbms.connector.bolt.enabled=true
#dbms.connector.bolt.tls_level=OPTIONAL
dbms.connector.bolt.listen_address=0.0.0.0:7687
# HTTP Connector. There must be exactly one HTTP connector.
dbms.connector.http.enabled=true
dbms.connector.http.listen_address=0.0.0.0:7474
# HTTPS Connector. There can be zero or one HTTPS connectors.
dbms.connector.https.enabled=true
dbms.connector.https.listen_address=0.0.0.0:7473
I was able to log in to Neo4J's web client through the browser and change the default password.
Here is the Python code where I create the client:
self.client = py2neo.Graph(host=ip_address,
                           username=username,
                           password=password,
                           secure=use_secure,
                           bolt=use_bolt)
As soon as I execute a query like this one:
node = Node("FooBar", foo="bar")
self.client.create(node)
I get the following Unauthorized exception:
py2neo.database.status.Unauthorized: https://localhost:7473/db/data/
Any idea on why this may be happening?

The solution was to call a separate authentication method provided by the library like this:
auth_port = str(self._PORT_HTTPS if use_secure else self._PORT_HTTP)
py2neo.authenticate(":".join([ip_address, auth_port]), username, password)
It took me a while to get here because, at first, I thought authentication was done automatically in the constructor; after that, I couldn't get the authentication method to work because I was pointing it at the Bolt port instead of the HTTP/HTTPS port.
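Putting this together, a minimal sketch of the working flow (assuming py2neo v3, where py2neo.authenticate is available, and the default Neo4j ports; all values below are placeholders):
import py2neo
from py2neo import Node

# Placeholder values for illustration
ip_address = "localhost"
username = "neo4j"
password = "my-new-password"
use_secure = True

# Authenticate against the HTTP(S) port (7474/7473), never the Bolt port (7687)
auth_port = "7473" if use_secure else "7474"
py2neo.authenticate(":".join([ip_address, auth_port]), username, password)

client = py2neo.Graph(host=ip_address, secure=use_secure, bolt=True)
client.create(Node("FooBar", foo="bar"))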

Related

Connection Timing Out When Accessing Gcloud MySQL from Python

I have a python application where I'm trying to access a MySQL database on Google's cloud service.
I've been following this setup guide for connecting via an external application (Python) and I am using the pymysql package. I'm attempting to connect via the proxy and have already authenticated my connection via gcloud auth login from the console.
As of now, I CAN access the database via the console, but I need to be able to make queries from my Python script to build it out. When I try running it as is, I get the following error:
OperationalError: (2003, "Can't connect to MySQL server on '34.86.47.192' (timed out)")
Here's the function I'm using, with security sensitive info starred out:
import os
import subprocess

import pymysql
from pymysql import Error

def uploadData():
    # cd to the directory with the MySQL exe
    os.chdir('C:\\Program Files\\MySQL\\MySQL Server 8.0\\bin')
    # Invoke the proxy
    subprocess.call('start cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather', shell=True)
    # Create connection
    # I have also tried host = '127.0.0.1' for localhost here
    conn = pymysql.connect(host='34.86.47.192',
                           user='root',
                           password='*******',
                           db='gribdata')
    try:
        c = conn.cursor()
        # Use the right database
        db_query = 'use gribdata'
        c.execute(db_query)
        query = 'SELECT * FROM clients'
        c.execute(query)
        result = c.fetchall()
        print(result)
    except Error as e:
        print(e)
    finally:
        conn.close()
Yeah, this one's pretty limited in documentation, but what you want to do is run it from its hosted IP and configure access for your external IP address on the server. So you want to use that IP (34.xxx.xxx.xxx) rather than the 127.0.0.1 loopback/localhost IP.
To get it to work, you want to go to your connections tab within Gcloud and add a new connection. Make sure the public address box is checked, the IP is correct, and you save once done.
There are some excellent details here from some Gcloud engineers. It looks like some of the source documentation is outdated and this is the way to connect now.
First of all, confirm that the Cloud SQL proxy is indeed installed in the directory where you expect it to be. The Cloud SQL proxy is not part of MySQL Server, so you should not find it in C:\\Program Files\\MySQL\\MySQL Server 8.0\\bin, at least not by default. Instead, the Cloud SQL proxy is a tool provided by Google: it is just an .exe file that can be stored in any directory you wish. For instructions on how to download the proxy, you can check the docs.
The Cloud SQL proxy creates a secure link between the Cloud SQL instance and your machine: what it does is forward a local port on your machine to the Cloud SQL instance. Thus, the host IP that you should use when going through the proxy is 127.0.0.1:
conn = pymysql.connect(host='127.0.0.1',
                       user='root',
                       password='*******',
                       db='gribdata')
When starting the Cloud SQL proxy with a TCP socket, you should append the local port to which you want Cloud SQL's traffic forwarded to the end of the instance name in the start command: =tcp:3306
subprocess.call('start cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather=tcp:3306', shell=True)
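For reference, here is how the two pieces fit together as one sketch (my own assembly, not from the original answer); the short sleep is a crude way to give the proxy time to start listening on 127.0.0.1:3306 before pymysql tries to connect:
import subprocess
import time

import pymysql

# Start the proxy, forwarding the instance to 127.0.0.1:3306
subprocess.call(
    'start cloud_sql_proxy_x64.exe '
    '-instances=trans-cosine-289719:us-east4:compuweather=tcp:3306',
    shell=True)
time.sleep(5)  # crude wait; the proxy needs a moment before it starts listening

# Connect through the proxy on the loopback address
conn = pymysql.connect(host='127.0.0.1',
                       port=3306,
                       user='root',
                       password='*******',
                       db='gribdata')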
Have you tried to connect to Cloud SQL from the console? Once connected, you should get a message in the console displaying "Listening on 127.0.0.1:3306". Your connection command should be
"cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather=tcp:3306"
Try to start the Cloud SQL proxy from the console and then create the connection with pymysql, using "127.0.0.1" as the host.

redisai Client password/auth process

I am trying to connect to a redisai server through the redisai-py Client. The server is password protected, and the Client is passed host, port, and password as arguments. However, the client times out on a tensorset/tensorget even though it returns a connection object.
import redisai
r = redisai.Client(host='<host>', port=<port>, password='<password>')
In redis-cli, you would:
redis-cli
auth <password>
...
which works just fine. There doesn't seem to be a way to perform this action through a redisai-py Client despite it extending the StrictRedis class. Since the Client won't connect without authentication, I cannot access the data.
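Since redisai.Client extends the redis-py client, one might try replicating redis-cli's auth step manually with execute_command (a hypothetical sketch, not from the original post; as the answer below shows, the real problem here turned out to be network-level):
import redisai

# Placeholders: substitute the real host, port and password
r = redisai.Client(host='<host>', port=6379)
r.execute_command('AUTH', '<password>')  # equivalent of "auth <password>" in redis-cli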
The solution to accessing the redisai database involved creating inbound port rules scoped directly to the VNet the Azure VM nodes were located on.
When connecting with the redisai Client, the private IP address is used and the port argument is left out:
import redisai
r = redisai.Client(host=<Private IP>)
r.ping()
# PONG
(Screenshots of the primary node and worker node inbound port rules are omitted here.)
However, this does not solve the issue of the client hanging during authentication when the redisai database is exposed but requires a password.

Unable to make TLS TCP connection to remote MySQL server with PyMySQL, other tools work

I am setting up a new server that I want to communicate with a central MySQL database, using a TLS connection for security.
Following steps like this, I have been able to set up TLS on my MySQL server, and I have made several users that are able to log in from any host (%) and require SSL connections:
+------------+-----------+----------+
| user | host | ssl_type |
+------------+-----------+----------+
| testuser | % | ANY |
+------------+-----------+----------+
I can confirm this from any host by connecting with tools like HeidiSQL or the MySQL CLI tool.
Ex: mysql -u testuser -p -h mysql_server_IP will initiate a TLS connection, as confirmed by \s.
This rules out the majority of issues I have seen on this and other forums, which are caused by the host being set to localhost.
When accessing local databases, the following works fine. When connecting to non-TLS remote database servers, it also works fine.
import pymysql.cursors
connection = pymysql.connect(host=host,
                             user=user,
                             password=password,
                             db=db,
                             cursorclass=pymysql.cursors.DictCursor)
When attempting to access my server that requires TLS, I receive the following error:
pymysql.err.OperationalError: (1045, "Access denied for user 'testuser'@'desktop.example.com' (using password: YES)")
The other findings I have suggest that this error is caused by invalid username/password combinations, or by the connection being prohibited for that host. However, I know I can make connections from this host, as demonstrated by the CLI.
When connecting to a server that only has require_secure_transport = ON in my.cnf, PyMySQL gives a more obvious error about being unable to start a TLS connection: pymysql.err.InternalError: (3159, 'Connections using insecure transport are prohibited while --require_secure_transport=ON.'). But if the MySQL user itself requires SSL, you get the more generic permission-denied error from the question above.
On the GitHub issue tracker there is mention of supplying the CA .pem file, and the docs mention the ssl argument, which allows you to pass in paths for the various cert files.
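For completeness, a sketch of that documented approach when you do have the CA file (the path here is hypothetical; the ssl dict accepts keys such as ca, cert and key):
connection = pymysql.connect(host=host,
                             user=user,
                             password=password,
                             db=db,
                             cursorclass=pymysql.cursors.DictCursor,
                             ssl={'ca': '/path/to/ca.pem'})  # hypothetical CA path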
If you don't have access to those files and want to trust the self-signed cert implicitly, there is a workaround: by passing in a valid dictionary without any of the valid keys, you can effectively blanket-trust self-signed certs. Example:
connection = pymysql.connect(host=host,
                             user=user,
                             password=password,
                             db=db,
                             cursorclass=pymysql.cursors.DictCursor,
                             ssl={"fake_flag_to_enable_tls": True})

RetriesExhaustedError on connecting to HPE iLO 5 through Python iLO REST client

The following is a Python-based RESTful library client (recommended by HP: https://developer.hpe.com/platform/ilo-restful-api/home) that uses the Redfish REST API (https://github.com/HewlettPackard/python-ilorest-library) to connect to a remote HPE iLO 5 server on ProLiant DL360 Gen10 hardware.
#!/usr/bin/python
import redfish

iLO_host = "https://xx.xx.xx.xx"
username = "admin"
password = "xxxxxx"
# Create a REST object
REST_OBJ = redfish.redfish_client(base_url=iLO_host, username=username, password=password, default_prefix='/redfish/v1')
# Log in to the server and create a session
REST_OBJ.login(auth="session")
# HTTP GET request
response = REST_OBJ.get("/redfish/v1/systems/1", None)
print(response)
REST_OBJ.logout()
I am getting a RetriesExhaustedError when creating the REST object. However, I can successfully SSH to the server from the VM (RHEL 7.4) where I am running this script. The authentication details are given correctly. I verified that the web server is enabled (both port 443 and 80) in the iLO Security - Access settings. Also, in my VM the firewalld service has been stopped and iptables is flushed. But still the connection could not be established. What other possibilities can I try?
I found the root cause: the issue was SSL certificate verification being done by the Python code.
This can be turned off by setting the environment variable PYTHONHTTPSVERIFY=0 before running the code, which solved the problem.
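Typically the variable is set in the shell before launching the script (e.g. PYTHONHTTPSVERIFY=0 python script.py). An in-script sketch, under the assumption that the variable is read when Python's SSL machinery is first imported, so it must be set before any HTTPS-related imports:
import os

# Assumption: this must run before redfish (and thus ssl/urllib) is imported
os.environ["PYTHONHTTPSVERIFY"] = "0"

import redfish  # imported only after the variable is set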
This is a very old topic, but perhaps this will help others who hit a similar issue when accessing the iLO in any way, not just over Python:
You most likely need to update the firmware on your server so that its TLS support is updated. You will most likely need to use an old browser to do this, as modern versions of Mozilla/Chrome will not work with old TLS. I have had luck with Konqueror.

Authentication Type error in Paramiko using SFTP

I am using Paramiko to establish an SFTP connection with a public/private key exchange. The key is an SSH2 RSA key. When I try to connect, I receive the error BadAuthenticationType: Bad authentication type (allowed_types=['']). Does anyone have an idea what might be causing this?
import paramiko

key = paramiko.RSAKey.from_private_key_file(key, password=passphrase)
transport = paramiko.Transport((host, port))
transport.start_client()
transport.auth_publickey(username, key)
sftp = paramiko.SFTPClient.from_transport(transport)
According to the Paramiko documentation, this error means the server you're trying to connect to isn't configured properly: it doesn't allow public-key authentication for the user you're connecting as. Here is a link to the portion of the documentation that I referenced; hopefully it will be of use: http://www.lag.net/paramiko/docs/paramiko.Transport-class.html#auth_publickey
I recommend that you check your server config and make sure everything is set up properly.
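As a diagnostic sketch (my own addition, not from the original answer), you can probe which authentication methods the server actually accepts: an auth_none attempt usually fails with BadAuthenticationType, whose allowed_types attribute lists them.
import paramiko

# host, port and username as in the question above
transport = paramiko.Transport((host, port))
transport.start_client()
try:
    # Most servers reject "none" authentication; the exception tells us what they do accept
    transport.auth_none(username)
except paramiko.BadAuthenticationType as e:
    print("Server allows:", e.allowed_types)  # e.g. ['publickey', 'password']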
