Backup SQL objects using mssql-scripter - python

I'm making backups of SQL objects with mssql-scripter, and it works like a charm on my local server (there I don't have to provide any user, just point at the server and db). The problem is I have to back up SQL objects from a client's database which uses the Azure Active Directory authentication method.
I'm providing the server, the database and my credentials (my e-mail as the user):
mssql-scripter -S server -d database -U user -P password -f destination --file-per-object
But I'm getting error:
Failed to connect to server 'server_name'. ---> System.Data.SqlClient.SqlException: Cannot open server "my_mail_domain" requested by the login. The login failed.
Where am I making a mistake? I've read that to back up SQL objects using mssql-scripter, authentication has to be set to 'Windows Authentication'. Is this true, or can I back up providing Azure credentials?

This has been documented as an issue already. Azure Active Directory authentication is only supported through a custom connection-string parameter:
mssql-scripter `
--connection-string "Server=server;Database=database;User Id=user;Password=password;Authentication=Active Directory Password" `
-f destination `
--file-per-object
Assuming you actually meant that you have an on-premises SQL Server but the domain is Azure AD, then you just need to use Integrated Security, i.e. normal Windows Authentication:
mssql-scripter `
--connection-string "Server=server;Database=database;Integrated Security=true" `
-f destination `
--file-per-object
Note that usernames and passwords are not used here, the currently logged in user is used.
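If you end up driving mssql-scripter from Python, the same connection string can be passed programmatically. A minimal sketch, assuming mssql-scripter is on PATH; the server, database, credential and destination values below are placeholders:

```python
import subprocess

def scripter_args(server, database, user, password, destination):
    # Azure AD Password auth has to go through --connection-string,
    # not the -S/-U/-P flags.
    conn_str = (
        f"Server={server};Database={database};"
        f"User Id={user};Password={password};"
        "Authentication=Active Directory Password"
    )
    return ["mssql-scripter",
            "--connection-string", conn_str,
            "-f", destination,
            "--file-per-object"]

# subprocess.run(scripter_args("myserver.database.windows.net", "mydb",
#                              "user@domain.com", "secret", r"C:\backup"),
#                check=True)
```

Passing the command as a list (rather than one shell string) avoids quoting problems with the semicolons in the connection string.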

Related

confusion on full process of setting up heroku postgresdb for python app

So I have followed this guide https://devcenter.heroku.com/articles/connecting-heroku-postgres#connecting-in-python and provisioned a database for my Heroku app. In my server code, I added the database URL and SSL require. However, how does my server actually get permission to write to the database?
Locally, with psycopg2 you would have:
conn = psycopg2.connect(database="dbName", user="postgres", password="password", host="127.0.0.1", port="5432")
and that's how your server authenticates itself and gets permission to write to the Postgres db. With Heroku, however, none of that extra information is provided except for the db URL link. What is the remaining process I must follow to allow my server deployed on Heroku to have permission to write to its associated provisioned database?
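No answer was posted here, but the key fact is that Heroku injects everything psycopg2 needs (user, password, host, port, database name) into the single DATABASE_URL config var, and psycopg2.connect() accepts that URL directly, so no extra permission setup is required. A sketch with a made-up URL (on a real dyno the variable is already set; you can inspect it with heroku config:get DATABASE_URL):

```python
import os
from urllib.parse import urlparse

# Hypothetical value for illustration; Heroku sets this for you on the dyno.
os.environ["DATABASE_URL"] = (
    "postgres://user:secret@ec2-1-2-3-4.compute-1.amazonaws.com:5432/dbname"
)

url = urlparse(os.environ["DATABASE_URL"])
# Every field you would pass locally is embedded in the URL:
params = {
    "database": url.path.lstrip("/"),
    "user": url.username,
    "password": url.password,
    "host": url.hostname,
    "port": url.port,
}
# Simplest form -- psycopg2 understands the URL as-is:
#   conn = psycopg2.connect(os.environ["DATABASE_URL"], sslmode="require")
print(params)
```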

Python Tunnel to MS SQL Server

I am trying to access a database in SQL Server, but to reach the SQL Server I first have to log in and connect to a remote company server.
Many answers suggest SSH tunnelling using SSHTunnelForwarder(localhost, sshUser, sshPwd, remote_bind_address), but I always get an error.
I have several print statements which give the following information:
Could not receive SSH config file
0 Keys loaded from agent
0 keys loaded
Tries to connect to the gateway which is the IP of the server as my user
Tries to log in with my password
Fails connection
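For reference, a common cause of these failures is passing the wrong arguments to SSHTunnelForwarder and then pointing the DB client at the remote server instead of the tunnel's local end. A sketch of the usual pattern, assuming pip install sshtunnel pymssql and placeholder hostnames/credentials (the third-party imports are deferred into the function so the helper stays importable without them):

```python
def local_server_address(local_bind_port):
    # The DB client must connect to the tunnel's local end,
    # never to the remote SQL Server directly.
    return "127.0.0.1", local_bind_port

def query_through_tunnel(ssh_host, ssh_user, ssh_password,
                         sql_host, sql_user, sql_password, database):
    from sshtunnel import SSHTunnelForwarder  # third-party
    import pymssql                            # third-party

    # remote_bind_address is the SQL Server as seen *from the SSH host*;
    # 1433 is the default MSSQL port.
    with SSHTunnelForwarder(
        (ssh_host, 22),
        ssh_username=ssh_user,
        ssh_password=ssh_password,
        remote_bind_address=(sql_host, 1433),
    ) as tunnel:
        host, port = local_server_address(tunnel.local_bind_port)
        conn = pymssql.connect(server=host, port=port, user=sql_user,
                               password=sql_password, database=database)
        try:
            cur = conn.cursor()
            cur.execute("SELECT @@VERSION")
            return cur.fetchone()[0]
        finally:
            conn.close()
```

The "0 keys loaded" messages in the log above usually just mean no SSH agent/keys were found, which is fine when a password is supplied explicitly.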

Connection Timing Out When Accessing Gcloud MySQL from Python

I have a python application where I'm trying to access a MySQL database on Google's cloud service.
I've been following this setup guide for connecting via an external application (Python) and I am using the pymysql package. I'm attempting to connect via the proxy and have already authenticated my connection via gcloud auth login from the console.
As of now, I CAN access the database via the console, but I need to be able to make queries from my Python script to build it out. When I try running it as is, I get the following error:
OperationalError: (2003, "Can't connect to MySQL server on '34.86.47.192' (timed out)")
Here's the function I'm using, with security sensitive info starred out:
import os
import subprocess
import pymysql

def uploadData():
    # cd to the directory with the MySQL exe
    os.chdir('C:\\Program Files\\MySQL\\MySQL Server 8.0\\bin')
    # Invoke the proxy
    subprocess.call('start cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather', shell=True)
    # Create connection
    # I have also tried host = '127.0.0.1' for localhost here
    conn = pymysql.connect(host='34.86.47.192',
                           user='root',
                           password='*******',
                           db='gribdata')
    try:
        c = conn.cursor()
        # Use the right database
        db_query = 'use gribdata'
        c.execute(db_query)
        query = 'SELECT * FROM clients'
        c.execute(query)
        result = c.fetchall()
        print(result)
    except pymysql.Error as e:
        print(e)
    finally:
        conn.close()
Yeah, this one's pretty limited in documentation, but what you want to do is run it from its hosted IP and configure access for your external IP address on your server. So you want to use that IP (34.xxx.xxx.xxx) rather than the 127.0.0.1 loopback/localhost IP.
To get it to work, go to your Connections tab within Gcloud and add a new connection. Make sure the public address box is checked, the IP is correct, and you save once done.
There are some excellent details here from some Gcloud engineers. It looks like some of the source documentation is outdated and this is the way to connect now.
First of all, confirm that the Cloud SQL Proxy is indeed installed in the directory where you are expecting it to be. The Cloud SQL Proxy is not part of MySQL Server, hence you should not find it in C:\\Program Files\\MySQL\\MySQL Server 8.0\\bin, at least not by default. Instead, the Cloud SQL Proxy is a tool provided by Google; it is just an .exe file that can be stored in any directory you wish. For instructions on how to download the Proxy, you can check the docs.
The Cloud SQL Proxy creates a secure link between the Cloud SQL instance and your machine: what it does is forward a local port on your machine to the Cloud SQL instance. Thus, the host IP that you should use when going through the proxy is 127.0.0.1:
conn = pymysql.connect(host='127.0.0.1',
                       user='root',
                       password='*******',
                       db='gribdata')
When starting the Cloud SQL Proxy with a TCP socket, you should add the port to which you want to forward Cloud SQL's traffic at the end of the start command: =tcp:3306
subprocess.call('start cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather=tcp:3306', shell=True)
Have you tried connecting to Cloud SQL from the console? Once connected, you should get a message in the console displaying "Listening on 127.0.0.1:3306". Your connection command should be
"cloud_sql_proxy_x64.exe -instances=trans-cosine-289719:us-east4:compuweather=tcp:3306"
Try connecting the Cloud SQL Proxy from the console first, then create the connection with pymysql using "127.0.0.1".
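Combining both answers, a corrected version of the original function might look like the sketch below. The instance name and credentials are the asker's placeholders, the proxy directory is hypothetical (wherever the .exe was actually downloaded), and pymysql is imported lazily so the helper stays importable:

```python
import subprocess

PROXY_DIR = r'C:\tools\cloud_sql_proxy'   # hypothetical; wherever the .exe lives
INSTANCE = 'trans-cosine-289719:us-east4:compuweather'

def proxy_command(instance, port=3306):
    # '=tcp:3306' forwards local port 3306 to the Cloud SQL instance.
    return f'start cloud_sql_proxy_x64.exe -instances={instance}=tcp:{port}'

def upload_data():
    import pymysql  # third-party
    subprocess.call(proxy_command(INSTANCE), shell=True, cwd=PROXY_DIR)
    # Connect to the proxy's local end, not the instance's public IP.
    conn = pymysql.connect(host='127.0.0.1', port=3306,
                           user='root', password='*******', db='gribdata')
    try:
        with conn.cursor() as c:
            c.execute('SELECT * FROM clients')
            print(c.fetchall())
    finally:
        conn.close()
```

Using cwd= instead of os.chdir keeps the directory change local to the subprocess call.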

Unable to make TLS TCP connection to remote MySQL server with PyMySQL, other tools work

I am setting up a new server that I want to communicate with a central MySQL database, using a TLS connection for security.
Following steps like this, I have been able to set up TLS on my MySQL server, and I have made several users that are able to log in from any host (%) and require SSL connections:
+------------+-----------+----------+
| user | host | ssl_type |
+------------+-----------+----------+
| testuser | % | ANY |
+------------+-----------+----------+
I can confirm this from any host by connecting with tools like HeidiSQL or the MySQL CLI tool.
Ex: mysql -u testuser -p -h mysql_server_IP initiates a TLS connection, as confirmed by \s.
This rules out the majority of issues I have seen on this and other forums, which are caused by the host being set to localhost.
When accessing local databases, the following works fine. When connecting to non-TLS remote database servers, it also works fine.
import pymysql.cursors
connection = pymysql.connect(host=host,
                             user=user,
                             password=password,
                             db=db,
                             cursorclass=pymysql.cursors.DictCursor)
When attempting to access my server with require-tls, I receive the following error:
pymysql.err.OperationalError: (1045, "Access denied for user 'testuser'@'desktop.example.com' (using password: YES)")
The other findings I have suggest that error is caused by invalid username/password combinations, or by the connection being prohibited for that host. However, I know I can make connections from this host, as demonstrated by the CLI.
When connecting to a server that only has require_secure_transport = ON in my.cnf, PyMySQL gives a more obvious error about being unable to start a TLS connection: pymysql.err.InternalError: (3159, 'Connections using insecure transport are prohibited while --require_secure_transport=ON.'). But if the MySQL user itself requires SSL, you get the more generic permission-denied error from the question above.
On the GitHub issue tracker there is mention of supplying the CA .pem file. If you don't have access to these files and want to trust the self-signed cert implicitly, the docs mention the ssl argument, which allows you to pass in paths for various cert files.
However, by passing in a dictionary without any of the valid keys, you can effectively blanket-trust self-signed certs. Example:
connection = pymysql.connect(host=host,
                             user=user,
                             password=password,
                             db=db,
                             cursorclass=pymysql.cursors.DictCursor,
                             ssl={"fake_flag_to_enable_tls": True})
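If you do have the CA file, the cleaner route is to pass its path in the same ssl dict (PyMySQL maps a 'ca' key to MySQL's ssl-ca option and then verifies the server certificate). A small helper sketching both variants; pymysql itself is only needed at connect time, and the CA path below is a placeholder:

```python
def tls_connect_kwargs(host, user, password, db, ca_path=None):
    # With a CA file, TLS is enabled *and* the server cert is verified;
    # without one, the non-empty dict merely switches TLS on, unverified.
    ssl_opts = {"ca": ca_path} if ca_path else {"fake_flag_to_enable_tls": True}
    return dict(host=host, user=user, password=password, db=db, ssl=ssl_opts)

# connection = pymysql.connect(cursorclass=pymysql.cursors.DictCursor,
#                              **tls_connect_kwargs(host, user, password, db,
#                                                   ca_path="/etc/ssl/mysql-ca.pem"))
```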

Neo4J with Py2neo: Unauthorized error HTTPS

I have Neo4J running on a Docker container in which I have mapped the internal container ports 7473 and 7687 to their respective host ports 7473 and 7687, 7474 is exposed but not mapped.
The Neo4J server configuration regarding network.
# Bolt connector
dbms.connector.bolt.enabled=true
#dbms.connector.bolt.tls_level=OPTIONAL
dbms.connector.bolt.listen_address=0.0.0.0:7687
# HTTP Connector. There must be exactly one HTTP connector.
dbms.connector.http.enabled=true
dbms.connector.http.listen_address=0.0.0.0:7474
# HTTPS Connector. There can be zero or one HTTPS connectors.
dbms.connector.https.enabled=true
dbms.connector.https.listen_address=0.0.0.0:7473
I was able to login to Neo4J's webclient through the browser and change the default password.
Regarding the Python code, here's the line where I create the client:
self.client = py2neo.Graph(host=ip_address,
                           username=username,
                           password=password,
                           secure=use_secure,
                           bolt=use_bolt)
As soon as I execute a query like this one.
node = Node("FooBar", foo="bar")
self.client.create(node)
I get the following Unauthorized exception.
py2neo.database.status.Unauthorized: https://localhost:7473/db/data/
Any idea on why this may be happening?
The solution was to call a separate authentication method provided by the library like this:
auth_port = str(self._PORT_HTTPS if use_secure else self._PORT_HTTP)
py2neo.authenticate(":".join([ip_address, auth_port]), username, password)
It took me a while to get to this because, at first, I thought the authentication was done automatically in the constructor, and then I wasn't able to make the authentication method run because I was using the bolt port.
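Putting the answer together with the original constructor: a sketch against the py2neo v3 API (py2neo.authenticate was removed in later major versions), with the port logic factored out; the key point is that authenticate() wants the HTTP/HTTPS port, not the bolt port 7687:

```python
def auth_address(ip_address, use_secure, https_port=7473, http_port=7474):
    # py2neo.authenticate() expects "host:port" of the HTTP(S) connector;
    # passing the bolt port (7687) here was the original mistake.
    port = https_port if use_secure else http_port
    return ":".join([ip_address, str(port)])

def make_client(ip_address, username, password, use_secure=True, use_bolt=True):
    import py2neo  # py2neo v3 API
    # Register credentials *before* constructing Graph; the constructor
    # does not authenticate by itself in this version.
    py2neo.authenticate(auth_address(ip_address, use_secure),
                        username, password)
    return py2neo.Graph(host=ip_address,
                        username=username,
                        password=password,
                        secure=use_secure,
                        bolt=use_bolt)
```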
