pyodbc vs ADO via COM for Python scripts

For many years, my company has used the win32com module and ADO to connect to databases via ODBC in Python scripts. I do not like ADO because it is ancient, because COM is inherently slow, and because it tends to throw one particular exception for which I have never found a workaround. We use ODBC because we cannot assume that our customers have any particular database system (although most of them use PostgreSQL). We have a class that wraps ADO and provides access to most (maybe all) of the functionality in ADO. I am at a point where I could recommend a complete changeover to pyodbc. Before I do that, I'm curious: are there advantages to ADO via win32com? Does it have more capability than pyodbc?

are there advantages to ADO via win32com? Does it have more capability than pyodbc?
Practically speaking, and specifically with regard to ODBC, not really. ADODB would have the advantage with a database that offers an OLEDB provider but no ODBC driver, but that would be a rare occurrence. (The only such database I can recall is "SQL Server Compact Edition", which was discontinued long ago.)
As mentioned in the comments to the question, pyodbc would have the advantage of avoiding extra layers of middleware when communicating with the database, i.e.,
your Python app ↔ pyodbc ↔ ODBC Driver Manager ↔ ODBC Driver ↔ database
vs.
your Python app ↔ win32com ↔ ADODB ↔ OLEDB provider for ODBC ↔ ODBC Driver Manager ↔ ODBC Driver ↔ database
As also mentioned, win32com/ADODB is a Windows-only technology, whereas a pyodbc solution could also be deployed on Linux or Mac if the appropriate ODBC drivers were available for those platforms.
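To make the comparison concrete, here is a rough sketch of the same query issued both ways. The DSN name, credentials, and table are placeholders, not anything from the question:

# ADO via win32com (Windows only)
import win32com.client
conn = win32com.client.Dispatch("ADODB.Connection")
conn.Open("Provider=MSDASQL;DSN=mydsn;UID=myuser;PWD=mypassword")
rs, _ = conn.Execute("SELECT id, name FROM customers")
while not rs.EOF:
    print(rs.Fields("id").Value, rs.Fields("name").Value)
    rs.MoveNext()
conn.Close()

# pyodbc (Windows, Linux, or Mac)
import pyodbc
cnxn = pyodbc.connect("DSN=mydsn;UID=myuser;PWD=mypassword")
for row in cnxn.execute("SELECT id, name FROM customers"):
    print(row.id, row.name)
cnxn.close()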

Related

Oracle 10g express edition vs Mysql in python

Can't I use Oracle 10g Express Edition for SQL with Python, since I already have it installed on my PC? Or do I need MySQL to use SQL with Python? I am learning DBMS this semester, so I have Oracle 10g, and the Python course has a database part too, but basically we are using SQL. So why install MySQL?
Python is a programming language that can communicate with any database for which a driver exists. For Oracle that is cx_Oracle; MySQL has its own Python connector; PostgreSQL uses the psycopg2 module.
So basically you can probably use whatever DBMS you are comfortable with. Unless your teacher has a specific flavour they want you to use. However, I would suggest you use a more recent version of Oracle XE than 10g. That's more than a decade old. I think you will have fewer driver compatibility issues if you use a modern version of whatever database you choose. For Oracle that's XE 18c.
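Whichever driver you choose, the code follows the same DB-API 2.0 pattern. A minimal sketch with cx_Oracle; the credentials and service name are placeholders (XEPDB1 is the usual default pluggable database service for 18c XE, but check your installation):

import cx_Oracle  # or: import mysql.connector / import psycopg2

# Placeholder credentials and connect string.
conn = cx_Oracle.connect("myuser", "mypassword", "localhost/XEPDB1")
cur = conn.cursor()
cur.execute("SELECT sysdate FROM dual")
print(cur.fetchone())
cur.close()
conn.close()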

access netezza database with python without driver from IBM

Is there a way to query a Netezza database without explicitly installing its driver? I am using a 64-bit Ubuntu OS; our IT support says the driver they have only works on Red Hat systems.
If you can get your hands on the JDBC driver, you could use the Python jaydebeapi module with that driver to connect to the server. Note that there are a couple of quirks involved, notably around boolean datatypes.
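A rough sketch of that approach; the driver class name, JDBC URL format, port, and jar path below are assumptions, so check them against the documentation that ships with the Netezza JDBC driver you obtain:

import jaydebeapi

conn = jaydebeapi.connect(
    "org.netezza.Driver",               # assumed driver class name
    "jdbc:netezza://nzhost:5480/MYDB",  # assumed URL format and port
    ["myuser", "mypassword"],           # placeholder credentials
    "/path/to/nzjdbc.jar",              # path to the driver jar
)
curs = conn.cursor()
curs.execute("SELECT 1")
print(curs.fetchall())
curs.close()
conn.close()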
You can use pyodbc.
pyodbc is an open source Python module that makes accessing ODBC databases simple. It implements the DB API 2.0 specification but is packed with even more Pythonic convenience.
On Ubuntu systems, all you need to do is run
sudo apt install unixodbc-dev
before attempting
pip install pyodbc
See Installing pyodbc for more details.
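Once pyodbc is installed, connecting is just a matter of pointing it at a configured DSN. A minimal sketch, assuming a DSN for the Netezza ODBC driver has been defined in odbc.ini; the DSN name and credentials are placeholders:

import pyodbc

# "NZSQL", "myuser" and "mypassword" are placeholders for your own DSN and credentials.
cnxn = pyodbc.connect("DSN=NZSQL;UID=myuser;PWD=mypassword")
cursor = cnxn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())
cnxn.close()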

Accessing Postgres on Linux from Python using turbodbc

I'm trying to read large amounts of data from Postgres on Linux via Python. SQLAlchemy is unacceptably slow. turbodbc (https://github.com/blue-yonder/turbodbc) bills itself as fast, but it seems to require an ODBC data source, which as far as I know is a Windows thing, not Linux. (The Postgres FTP site has only .dlls for ODBC.) Yet it claims Linux/Postgres compatibility.
How do I access Postgres on Linux via turbodbc or any other ODBC?
turbodbc works with PostgreSQL and Linux. This requires the packages unixodbc and odbc-postgresql to be installed. Then you need to set up a data source according to PostgreSQL's specifications.
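A minimal sketch of what that looks like once the DSN is in place; the DSN name, driver name, and credentials below are placeholders and must match your own odbc.ini entry:

import turbodbc

# Assumes an odbc.ini entry along these lines (names are placeholders):
#   [pg_dsn]
#   Driver     = PostgreSQL Unicode
#   Servername = localhost
#   Database   = mydb
#   Port       = 5432
connection = turbodbc.connect(dsn="pg_dsn", uid="myuser", pwd="mypassword")
cursor = connection.cursor()
cursor.execute("SELECT version()")
print(cursor.fetchone())
connection.close()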
The one issue here is that it won't be blazingly fast. Turbodbc is just an efficient way to communicate with the ODBC driver, basically exploiting bulk operations. However, the ODBC driver freely available for PostgreSQL itself is pretty slow. There is not much turbodbc can do about this.
I'd recommend psycopg2 or asyncpg (the latter requires Python 3.5, but is indeed very fast).

How to config Django using pymysql as driver?

I'm new to Django. I wasted a whole afternoon trying to configure the MySQL engine. I am very confused about the database engine and the database driver. Is the engine also the driver? All the tutorials say that the ENGINE should be 'django.db.backends.mysql', but how does the ENGINE decide which driver is used to connect to MySQL?
Every tutorial says 'django.db.backends.mysql', but sadly I can't install MySQLdb or mysqlclient; PyMySQL and the official MySQL connector 2.1.3 are installed instead. How can I set the driver to PyMySQL or the MySQL connector?
Many thanks!
OS: OS X El Capitan
Python: 3.5
Django: 1.9
This question is not yet solved:
Is the ENGINE also the DRIVER?
You can import pymysql so it presents as MySQLdb. You'll need to do this before any Django code is run, so put this in your manage.py file:
import pymysql
pymysql.install_as_MySQLdb()
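With that shim in place, the ENGINE setting stays the standard MySQL backend; it is the shim that routes it to PyMySQL. A minimal sketch of the corresponding settings.py entry, where the database name, user, password, and host are placeholders:

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "mypassword",
        "HOST": "127.0.0.1",
        "PORT": "3306",
    }
}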
The short answer is no they are not the same.
The engine, in a Django context, refers to the RDBMS technology; the driver is the library that facilitates communication with a running instance of that technology. Telling Django which engine to use tells it how to translate ORM operations for that backend: the developer sees no change in the ORM code, but Django knows how to convert those operations into a language the database understands. The driver then takes those operations (e.g. selects, updates, deletes) and sends them to the running instance to carry them out.

Is it possible to use django-pyodbc with iSeries Access ODBC Driver?

I am trying to get Django-pyodbc to work with DB2 on IBM i using the standard IBM i Access ODBC driver.
I know there is a Django DB implementation supported by IBM, but that requires the DB2 Connect product, which is (for us) prohibitively expensive, whereas the included Access ODBC driver comes at no charge with the OS.
I have seen a question asked regarding django-pyodbc with iSeries ODBC, suggesting that it is possible, but I have found no way to get it to work:
https://stackoverflow.com/questions/25066866/django-inspectdb-on-db2-database
My first question therefore is; has anybody succeeded in getting this setup to work?
And if yes, can you share information on how you did it?
Thanks,
Richard
