What is the standard Python module for SQL queries?

I am wondering what is the standard Python module for SQL queries?
I am writing queries for an Oracle database in particular. I am looking to write quick, easy, and direct queries, in the context of both scripts and small programs.

Each SQL database has its own module that implements DB-API 2.0. For Oracle, that module is cx_Oracle.

DB-API is the standard API for accessing SQL databases from Python.
The following page provides further information: http://wiki.python.org/moin/DatabaseProgramming/
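Because every DB-API 2.0 module exposes the same connect/cursor/execute shape, the code looks nearly identical regardless of the database. Here is a minimal sketch using the standard library's sqlite3 module (itself a DB-API 2.0 implementation, so it runs anywhere), with a comment showing roughly what the cx_Oracle equivalent would look like; the Oracle credentials shown are hypothetical.

```python
import sqlite3

# Any DB-API 2.0 driver follows this same pattern. With cx_Oracle, only
# the connect() call changes, e.g. (hypothetical credentials/DSN):
#   conn = cx_Oracle.connect("user", "password", "host/servicename")
conn = sqlite3.connect(":memory:")

cur = conn.cursor()
cur.execute("CREATE TABLE employees (id INTEGER, name TEXT)")
cur.execute("INSERT INTO employees VALUES (?, ?)", (1, "Alice"))
conn.commit()

cur.execute("SELECT name FROM employees WHERE id = ?", (1,))
name = cur.fetchone()[0]
print(name)  # Alice

cur.close()
conn.close()
```

One caveat: the parameter placeholder style varies between drivers (sqlite3 uses `?`, cx_Oracle uses named binds like `:id`); each module declares its style in its `paramstyle` attribute.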


Implementing Python DB-API

I'm trying to implement the Python DB-API for a small "database" that we built internally. This database does not expose an ODBC interface (or JDBC, for that matter). My goal is to create a SQLAlchemy dialect for it so that I can use it with an application like Superset, for example. I have created JDBC drivers in the past, and that requires a full Java implementation of the methods from the interfaces. In the case of Python's DB-API, I couldn't find any example. Even the one I looked at, psycopg2 (https://github.com/psycopg/psycopg2), is largely written in C, and I'm not an expert in C.
Is there any way to implement the DB-API in pure Python? Are there any examples available? (Sorry if my understanding of the DB-API is not correct.)
You can find plenty of DB-API drivers written in pure Python. The specific libraries you'll need depend on how your database communicates and packs/unpacks data.
If your database is listening on a port, you'll probably be using the socket module.
If you're doing any kind of batch inserting or unpacking, you'll want to check out the struct module as well.
If you don't need to support Python 2, Python 3.3+ gives you memoryview().cast(), which may also come in handy for unpacking data.
Python 3.8 ships with the multiprocessing.shared_memory module, which can help you out once you start optimizing.
If your database runs on a specific platform, ctypes comes in handy for OS-specific tweaks (like manually implementing shared memory if you can't use Python 3.8).
Pandas used to support DB-API connections directly. It now officially supports only the SQLite DB-API, but you can piggyback on that support, which lets you test your driver against a known tool.
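To make the shape of the task concrete, here is a hedged, minimal sketch of the surface PEP 249 (DB-API 2.0) asks a driver to expose: a module-level `connect()`, a `Connection` with `cursor()`/`commit()`/`close()`, and a `Cursor` with `execute()`/`fetchone()`/`fetchall()` and a `description`. The "wire protocol" here is faked with in-memory data; a real driver would do the socket/struct work described above inside `connect()` and `execute()`.

```python
# Minimal sketch of a PEP 249 (DB-API 2.0) driver in pure Python.
# Module-level attributes the spec requires:
apilevel = "2.0"
threadsafety = 1
paramstyle = "qmark"

class Error(Exception):
    """Base exception, per PEP 249."""

class Cursor:
    def __init__(self, connection):
        self._connection = connection
        self._rows = []
        self.description = None
        self.rowcount = -1

    def execute(self, operation, parameters=()):
        # A real driver would serialize the query, send it over the wire,
        # and parse the server's response. Here we echo stored rows.
        self._rows = list(self._connection._data)
        # description: one 7-item sequence per result column.
        self.description = [("value", None, None, None, None, None, None)]
        self.rowcount = len(self._rows)
        return self

    def fetchone(self):
        return self._rows.pop(0) if self._rows else None

    def fetchall(self):
        rows, self._rows = self._rows, []
        return rows

    def close(self):
        self._rows = []

class Connection:
    def __init__(self, data):
        self._data = data

    def cursor(self):
        return Cursor(self)

    def commit(self):
        pass  # no transactional backend in this sketch

    def close(self):
        pass

def connect(*args, **kwargs):
    # A real connect() would open the network connection; this one
    # hands back canned rows so the example is self-contained.
    return Connection(data=[(1,), (2,), (3,)])
```

Usage is the standard DB-API dance: `conn = connect()`, `cur = conn.cursor()`, `cur.execute("SELECT value FROM t")`, `cur.fetchall()`. A SQLAlchemy dialect is then written against exactly this surface.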

Is there a CherryPy equivalent to Django Model.objects.filter()?

I know that in Django I can fetch objects from the DB with something like ModelName.objects.filter().
Is there an analogous pattern in CherryPy?
Yes, but not natively. There are a couple of Python ORMs that appear to work great with CherryPy, with syntax similar to Django's. SQLAlchemy is an extremely popular, very well supported ORM; it has a huge, active community and is probably the de facto Python ORM. There is a tool posted on the CherryPy site that helps with integration.
From wikipedia:
Object-relational mappers:
SQLAlchemy — a database backend and ORM for Python applications. TurboGears 2.x uses CherryPy as server and SQLAlchemy as its default ORM.[13]
SQLObject — a popular ORM for providing an object interface to your database. Supports a number of common database backends: included in the distribution are MySQL, PostgreSQL, SQLite, Sybase SQL Server, MaxDB, Microsoft SQL Server and Firebird. TurboGears 1.x uses CherryPy as server and SQLObject as ORM.[14]
Storm — the ORM from Canonical Ltd. (makers of Ubuntu)
Dejavu[15] — a public domain, thread-safe ORM for Python applications
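For a feel of how close the SQLAlchemy syntax is to Django's `Model.objects.filter()`, here is a hedged sketch (assuming SQLAlchemy 1.4+, with a hypothetical `User` model and an in-memory SQLite database):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class User(Base):  # hypothetical model for illustration
    __tablename__ = "users"
    id = Column(Integer, primary_key=True)
    name = Column(String)

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add_all([User(name="alice"), User(name="bob")])
session.commit()

# Django: User.objects.filter(name="alice")
# SQLAlchemy equivalent:
names = [u.name for u in session.query(User).filter_by(name="alice")]
print(names)  # ['alice']
```

Unlike Django, the session object is explicit rather than attached to the model class, but the filtering style is recognizably similar.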

Pure python SQL solution that works with PostgreSQL and MySQL?

I am looking for a pure-python SQL library that would give access to both MySQL and PostgreSQL.
The only requirement is to run on Python 2.5+ and be pure-python, so it can be included with the script and still run on most platforms (no-install).
In fact I am looking for a simple solution that would allow me to write SQL and export the results as CSV files.
Two-part answer:
A) This is absolutely possible.
B) Depending on your exact concerns, a pure-Python approach may or may not be a good fit for your problem.
Explained:
The SQLAlchemy library comes with two components: the more popular ORM, and the Core that it sits on top of. Either one will let you write your SQL commands in the SQLAlchemy format (which is just Python); SQLAlchemy will then compile the statements for MySQL or PostgreSQL and connect to the appropriate database.
SQLAlchemy is a great library, and I recommend it for just about everything. While you do have to write your statements in its format, it's easy to pick up, and you can switch to virtually any underlying database library you want, at any time. It's a solid platform for any database project, whether or not you need to support multiple backends.
SQLAlchemy talks to the database via standard DB-API drivers, and it does support multiple pure-Python options, notably the pymysql and pypostgresql drivers (http://docs.sqlalchemy.org/en/latest/core/engines.html#supported-dbapis).
As for writing CSV, the standard library's csv module has you covered.
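The whole task (run SQL, export the results as CSV) is only a few lines with any DB-API connection. A sketch, using the stdlib sqlite3 module as a stand-in for a pymysql or pypostgresql connection (the table and filename here are invented for illustration):

```python
import csv
import sqlite3

# sqlite3 stands in for any DB-API connection (pymysql, pypostgresql, ...).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, total INTEGER)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("north", 120), ("south", 340)])

cur.execute("SELECT region, total FROM sales ORDER BY total")
with open("sales.csv", "w", newline="") as f:
    writer = csv.writer(f)
    # cursor.description gives one 7-tuple per column; item [0] is the name.
    writer.writerow([col[0] for col in cur.description])
    writer.writerows(cur.fetchall())
```

This produces a `sales.csv` with a header row followed by one row per result, and the same code works unchanged against whichever backend the connection points at.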
So what's the caveat?
The following may or may not apply to your situation:
Most higher-level DB modules in the Python universe still recommend mysql-python and psycopg2, both of which are not pure Python and compile/link/configure against the installed database client. This largely seems to stem from a mix of API/integration concerns and the speed of C extensions compared with the various pure-Python packages when run under CPython.
There are pure-Python drivers like the ones I recommended, but most reviewers describe them as largely experimental. The pymysql authors claim stability and production readiness; some developers who blog have challenged that. As for what "stable" or "experimental" means, that varies between projects: some have a changing API, others are incomplete, and some are buggy.
You'd need to ensure that you can find pure-Python drivers for each system that support the exact operations you need. This could be simple, or it could be messy. Whether you use SQLAlchemy or something else, you'll face this concern when selecting a DB-API driver.
The PyPy project (a Python interpreter written in Python) has a wiki listing the compatibility of various packages: https://bitbucket.org/pypy/compatibility/wiki/Home - I would defer to them for specific driver suggestions. If PyPy is your intended platform, SQLAlchemy runs perfectly on it as well.
Are you looking for an ORM, or a single library that would allow you to write SQL statements directly and convert where there are differences?
I'm not sure whether psycopg2 is pure-python or not, but it certainly has bindings and works on all platforms. You'd still have to install at least psycopg2 to communicate with the PostgreSQL database as Python (as far as I know) doesn't ship with it natively.
From there, any additional ORM library you want would also need to be installed, but most are pure Python on top of whatever backend they use.
Storm, Django, and SQLAlchemy all provide abstraction layers on top of their database layer. Based on your description, Django is probably too large a framework for your needs (it was for mine), but it is a popular one; SQLAlchemy is a tried-and-true system, though a bit clunky, particularly if you have to deal with inheritance (in my opinion). I have heard that Storm is good, though I haven't tested it much, so I can't fully say.
If you are looking to mix and match (some tables in MySQL and some in PostgreSQL) rather than use a single database that could be either MySQL or PostgreSQL, I've been working on an ORM called ORB that focuses more on object-oriented design and allows for multiple databases and relationships between databases. Right now it only supports PostgreSQL and MongoDB, just because I haven't needed MySQL, but I'd be up for writing that backend. The code can be found at http://docs.projexsoftware.com/api/orb
Use SQLAlchemy. It will work with most database types, and it certainly works with PostgreSQL and MySQL.

In python, is there a simple way to connect to a mysql database that doesn't require root access?

I'm writing a script to parse some text files, and insert the data that they contain into a mysql database. I don't have root access on the server that this script will run on. I've been looking at mysql-python, but it requires a bunch of dependencies that I don't have available. Is there a simpler way to do this?
I would recommend MySQL Connector/Python, a MySQL DB-API adapter that does not use the C client library but rather reimplements the MySQL protocol completely in pure Python (compatible with Python 2.5 to 2.7, as well as 3.1).
To install C-coded extensions to Python you generally need root access (the server you're using might have things arranged differently, but that's not very likely). With a pure-Python solution, though, you can simply upload the modules in question (e.g., those from the Connector I recommend) just as you upload the modules you write yourself. Provided you do have a valid user ID and password for that MySQL database, this might solve the problem for you.

Python & SQL Server

What is the best way to access SQL Server from Python - is it the DB-API?
Also, could someone provide sample code using the DB-API showing how to connect to SQL Server from Python and execute a query?
See pyodbc - a Python Database API Specification v2.0 implementation. It works great.
Additionally, you can use it with other databases. If using it with SQL Server 2008, make sure you use the Native Client driver if you need to support the new DATE data types.
See pymssql. It is a DB-API module for MS SQL Server and would be the most Pythonic way. The documentation includes examples.
If on Windows, you could also use OLE DB through COM, which will not require anything else to be installed on the client.
Also, if you use IronPython, you can use the .NET APIs.
Also could someone provide a such code using the DB-API
how to connect to sql server from python and excute query ?
This hello-world snippet pretty much shows you the common way to connect to SQL Server from Python with a DB-API 2.0 database interface module.
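Since the snippet itself isn't reproduced here, a hedged stand-in: the connect/cursor/execute pattern below is the same whether the connection comes from pypyodbc, pyodbc, or pymssql; the SQL Server connection string shown in the comment is hypothetical. The runnable part uses the stdlib sqlite3 module so the example is self-contained.

```python
import sqlite3

def run_query(conn, sql, params=()):
    """Run one query on any DB-API 2.0 connection and return all rows."""
    cur = conn.cursor()
    cur.execute(sql, params)
    rows = cur.fetchall()
    cur.close()
    return rows

# Against SQL Server, the connection line would look something like this
# (hypothetical server/credentials; pypyodbc and pyodbc share this API):
#   conn = pypyodbc.connect(
#       "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;UID=user;PWD=secret")
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE greetings (message TEXT)")
conn.execute("INSERT INTO greetings VALUES ('hello world')")

rows = run_query(conn, "SELECT message FROM greetings")
print(rows)  # [('hello world',)]
```

Because `run_query` only relies on the DB-API surface, the same helper works unchanged once you swap the sqlite3 connection for a SQL Server one.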
Disclaimer: I'm the developer of pypyodbc
ODBC + FreeTDS + a Python wrapper library for ODBC.
