Application name in cx_Oracle - python

I am connecting to an Oracle Database server using a SessionPool from cx_Oracle. When I look at the description of the opened session in Oracle SQL Developer, I see that its name is "python.exe". How can I set the application/module name in cx_Oracle?

You may be able to physically rename python.exe, but there is no programmatic way to change what Oracle Database shows as the executable.
You can, however, set the driver name by calling cx_Oracle.init_oracle_client. This changes the CLIENT_DRIVER column of V$SESSION_CONNECT_INFO.
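For example, a minimal sketch (the driver name string below is just a placeholder):
import cx_Oracle

# Must be called once, before the first connection or session pool is created.
# The value appears in the CLIENT_DRIVER column of V$SESSION_CONNECT_INFO.
cx_Oracle.init_oracle_client(driver_name="MyApp : 1.0")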
Other settable attributes, including 'module' (which in Oracle terminology is not the program name), are shown in the documentation under Oracle Database End-to-End Tracing.
# Set the tracing metadata on an existing connection
connection.client_identifier = "pythonuser"
connection.action = "Query Session tracing parameters"
connection.module = "End-to-end Demo"

cursor = connection.cursor()
for row in cursor.execute("""
        SELECT username, client_identifier, module, action
        FROM V$SESSION
        WHERE username = 'SYSTEM'"""):
    print(row)

Related

SQLAlchemy: Get the database name from connection

I wonder if it is possible to get the database name after a connection. I know that it is possible with the engine created by the create_engine function (here) and I would like to have the same possibility after a connection.
from sqlalchemy import create_engine, inspect
engine = create_engine('mysql+mysqldb://login:pass@localhost/MyDatabase')
print(engine.url.database)  # print the database name with an engine
con = engine.connect()
I looked at the inspector tool, but there is no way to retrieve the database name like:
db_name = inspect(con).get_database_name()
Maybe it is not possible. Any idea?
Thanks a lot!
For MySQL, executing select DATABASE() as name_of_current_database should be sufficient. For SQL Server, it would be select DB_NAME() as name_of_current_database. I do not know of any inherently portable way of doing this that will work irrespective of the backend.
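A minimal sketch of the MySQL case, reusing the engine URL from the question (swap in SELECT DB_NAME() for SQL Server):
from sqlalchemy import create_engine, text

engine = create_engine('mysql+mysqldb://login:pass@localhost/MyDatabase')
with engine.connect() as con:
    # MySQL-specific; not portable across backends
    db_name = con.execute(text("SELECT DATABASE()")).scalar()
    print(db_name)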

Apache Airflow - Connection issue to MS SQL Server using pymssql + SQLAlchemy

I am facing a problem connecting to an Azure MS SQL Server 2014 database from Apache Airflow 1.10.1 using pymssql.
I want to use the MsSqlHook class provided by Airflow, for the convenience of creating my connection in the Airflow UI, and then create a context manager for my connection using SQLAlchemy:
from contextlib import contextmanager

from airflow.hooks.mssql_hook import MsSqlHook
from sqlalchemy.orm import sessionmaker

@contextmanager
def mssql_session(dt_conn_id):
    sqla_engine = MsSqlHook(mssql_conn_id=dt_conn_id).get_sqlalchemy_engine()
    session = sessionmaker(bind=sqla_engine)()
    try:
        yield session
    except Exception:
        session.rollback()
        raise
    else:
        session.commit()
    finally:
        session.close()
But when I do that, I get this error when I run a query:
sqlalchemy.exc.InterfaceError: (pyodbc.InterfaceError) ('IM002',
'[IM002] [unixODBC][Driver Manager]Data source name not found, and no
default driver specified (0) (SQLDriverConnect)') (Background on this
error at: http://sqlalche.me/e/rvf5)
It seems to come from pyodbc, whereas I want to use pymssql (and in MsSqlHook, the get_conn method uses pymssql!).
I searched the Airflow source code for the cause.
I noticed that the get_uri method of the DbApiHook class (from which MsSqlHook inherits) builds the connection string passed to SQLAlchemy like this:
'{conn.conn_type}://{login}{host}/{conn.schema}'
But conn.conn_type is simply equal to 'mssql' whereas we need to specify the DBAPI as described here:
https://docs.sqlalchemy.org/en/latest/core/engines.html#microsoft-sql-server
(for example: 'mssql+pymssql://scott:tiger@hostname:port/dbname')
So, by default, I think it uses pyodbc.
But how can I properly set the conn_type of the connection to 'mssql+pymssql' instead of 'mssql'?
In the Airflow UI, you can simply select SQL Server in a dropdown list, but you cannot set an arbitrary value.
To work around the issue, I override the get_uri method from DbApiHook in a new class inherited from MsSqlHook, in which I build my own connection string, but it's not clean at all...
Thanks for any help
You're right. There's no easy, straightforward way to get Airflow to do what you want. Personally, I would build the SQLAlchemy engine inside your context manager, something like create_engine(hook.get_uri().replace("://", "+pymssql://")), then I would put that code somewhere reusable.
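A minimal sketch of that idea; the conn_id 'my_mssql' is just a placeholder:
from airflow.hooks.mssql_hook import MsSqlHook
from sqlalchemy import create_engine

hook = MsSqlHook(mssql_conn_id='my_mssql')
# Rewrite "mssql://..." to "mssql+pymssql://..." so SQLAlchemy picks the pymssql DBAPI
engine = create_engine(hook.get_uri().replace("://", "+pymssql://"))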
You can create a connection by passing it as an environment variable to Airflow. See the docs. The value of the variable is the database URL in the format SQLAlchemy accepts.
The name of the env var follows the pattern AIRFLOW_CONN_ followed by the connection ID in upper case. For example, AIRFLOW_CONN_MY_MSSQL corresponds to a conn_id of 'my_mssql'.
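For instance, with placeholder credentials, the variable could be set from Python before Airflow reads it (exporting it in the shell that starts Airflow works the same way):
import os

# conn_id "my_mssql" maps to the env var AIRFLOW_CONN_MY_MSSQL; the URI names
# the pymssql DBAPI explicitly, so SQLAlchemy will not fall back to pyodbc.
os.environ["AIRFLOW_CONN_MY_MSSQL"] = "mssql+pymssql://user:password@hostname:1433/dbname"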

testing postgres db python

I don't understand how to test my repositories.
I want to be sure that I really saved an object with all of its parameters into the database, and that when I execute my SQL statement I really receive what I am supposed to.
But I cannot put "CREATE TABLE test_table" in the setUp method of a unittest test case, because it would be created multiple times (tests of the same test case are run in parallel). So as soon as I create two methods in the same class which need to work on the same table, it won't work (name clash of tables).
Likewise, I cannot put "CREATE TABLE test_table" in setUpModule because, although the table is then created only once, the tests run in parallel and nothing prevents the same object from being inserted multiple times into my table, which breaks the uniqueness constraint of some field.
Likewise, I cannot "CREATE SCHEMA some_random_schema_name" in every method, because I need to globally "SET search_path TO ..." for a given database, so every method run in parallel would be affected.
The only way I see is to "CREATE DATABASE" for each test, with a unique name, and establish an individual connection to each database. This looks extremely wasteful. Is there a better way?
Also, I cannot use SQLite in memory because I need to test PostgreSQL.
The best solution for this is to use the testing.postgresql module. This fires up a db in user-space, then deletes it again at the end of the run. You can put the following in a unittest suite - either in setUp, setUpClass or setUpModule - depending on what persistence you want:
import testing.postgresql

def setUp(self):
    self.postgresql = testing.postgresql.Postgresql(port=7654)
    # Get the url to connect to with psycopg2 or equivalent
    print(self.postgresql.url())

def tearDown(self):
    self.postgresql.stop()
If you want the database to persist between/after tests, you can run it with the base_dir option to set a directory, which will prevent its removal after shutdown:
name = "testdb"
port = "5678"
path = "/tmp/my_test_db"
testing.postgresql.Postgresql(name=name, port=port, base_dir=path)
Outside of testing it can also be used as a context manager, where it will automatically clean up and shut down when the with block is exited:
with testing.postgresql.Postgresql(port=7654) as psql:
    # do something here
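For instance, a self-contained sketch that connects with psycopg2 inside the with block:
import psycopg2
import testing.postgresql

with testing.postgresql.Postgresql() as psql:
    # dsn() returns the connection parameters of the throwaway instance
    conn = psycopg2.connect(**psql.dsn())
    cur = conn.cursor()
    cur.execute("SELECT version()")
    print(cur.fetchone())
    conn.close()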

Python MySQLdb script works correctly on OS X, permission denied error when copied to a linux server

I wrote a quick Python script to automate setting up WordPress blogs for friends. I've tested the script with no problems on my OS X computer, but when the script is copied to a Linux server and run, it is able to create the database; however, when it tries to create the user and grant privileges I receive:
"_mysql_exceptions.OperationalError: (1044, "Access denied for user
'root'@'XXX.XXX.XXX.XXX' to database 'livetest'")"
The script is:
import MySQLdb

db = MySQLdb.connect(host="ip",          # your host
                     user="root",        # username
                     passwd="password")  # password

# Create a Cursor object to execute queries.
cur = db.cursor()

# Create the database, then grant privileges on it.
cur.execute("CREATE DATABASE IF NOT EXISTS " + blogshortstr)
cur.execute("GRANT ALL PRIVILEGES ON " + blogshortstr + ".* TO " + mysqluser + '@' + 'ip' + ' IDENTIFIED BY ' + mysqlpassword)
cur.execute("FLUSH PRIVILEGES")
db.commit()
db.close()
I eventually figured out the problem by accident, thanks to another post I found because of a typo.
The problem turned out to be twofold.
1) When I enabled remote login, it added a second root account. While the root@localhost user had the grant option, my root@ip.addr account didn't. The root@ip.addr account could still create and delete tables; adding the grant option made it work.
2) At least with this code and the Python module I was using (MySQLdb), nowhere in the module's documentation is it stated that the username must be wrapped in backticks (the key next to 1 on a QWERTY keyboard) rather than regular quotes when doing GRANT ALL PRIVILEGES. I don't know why, or whether this is normal for MySQL connectors in any language.
Maybe it's obvious and I'm an idiot (this is the first thing I ever worked on in Python and I've almost never used MySQL), but maybe it'll help someone.
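For reference, a sketch of the corrected GRANT call described in point 2; the variable names come from the question, and the host 'ip' is a placeholder:
# Backticks wrap the account name parts; the password stays an ordinary
# quoted string literal.
cur.execute(
    "GRANT ALL PRIVILEGES ON `" + blogshortstr + "`.* "
    "TO `" + mysqluser + "`@`ip` IDENTIFIED BY '" + mysqlpassword + "'"
)
cur.execute("FLUSH PRIVILEGES")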

Calling python script in web2py framework over webserver

I have NGINX, uWSGI and web2py installed on the server. The web2py application performs only one function: it accesses the database and prints the rows in a table.
def fetch():
    import psycopg2
    conn = psycopg2.connect(database="postgres",
                            user="postgres",
                            password="qwerty",
                            host="127.0.0.1")
    cur = conn.cursor()
    cur.execute("SELECT id, name from TEST")
    rows = cur.fetchall()
    conn.close()
    return rows
When the function is called locally, the table contents are returned.
But when I try to call the function from a remote machine, I get an internal error 500.
One more interesting thing, is when function looks like this:
def hello():
return 'hello'
The string 'hello' is returned. Adding an import directive to it immediately causes the error page to be generated.
Can anyone please suggest the proper application syntax/logic?
My guess is that your MySQL service doesn't allow remote access. Could you check your MySQL configuration?
vim /etc/mysql/my.cnf
Comment out the following lines.
#bind-address = 127.0.0.1
#skip-networking
If there is no skip-networking line in your configuration file, just add it commented out.
And then restart the mysql service.
service mysql restart
Forgive the stupid question, but have you checked whether the module is available on your server?
When you say that the error appears in your hello function as soon as you try to import, is it the same import psycopg2 directive?
Try this:
Assuming that fetch() is defined in controllers/default.py:
Open the folder views/default and create a new file called fetch.html.
Paste this inside:
{{extend 'layout.html'}}
{{=rows}}
fetch.html is a view, or a template if you prefer.
Modify fetch() to return a dictionary with rows for the view to print
return dict(rows=rows)
This is very basic though; you can find more information about the basic steps in the book -> http://www.web2py.com/books/default/chapter/29/03/overview#Postbacks
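Putting the steps together, a minimal sketch of the modified controller (connection parameters copied from the question):
# controllers/default.py
def fetch():
    import psycopg2
    conn = psycopg2.connect(database="postgres",
                            user="postgres",
                            password="qwerty",
                            host="127.0.0.1")
    cur = conn.cursor()
    cur.execute("SELECT id, name from TEST")
    rows = cur.fetchall()
    conn.close()
    # Returning a dict lets views/default/fetch.html render {{=rows}}
    return dict(rows=rows)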
