Searching a MySQL database on ClearDB using a Python script - python

We have built a series of forms using PHP that populate a MySQL database, and after learning more Python we want to begin transitioning our whole web app over to a Python back end instead of PHP. Can anyone offer a quick intro to searching a MySQL DB using Python?

The easiest way is to abstract the database using an ORM. I found that the Python package SQLAlchemy works fantastically.
An ORM lets you treat database objects as normal Python objects. Most queries can be abstracted, and in 99% of cases you won't need to write SQL by hand. It also makes transitions between database technologies very simple.
Go check it out on:
http://www.sqlalchemy.org/
A search query would be something like:
session.query(User).filter_by(first_name='bob')
Now you will have all users with the first name 'bob'.
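Fleshed out a little, here is a minimal sketch assuming SQLAlchemy 1.4+, a MySQL driver such as mysqlclient, and an illustrative User model (the credentials and table name are placeholders, not your schema):
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class User(Base):
    __tablename__ = "users"              # illustrative table name
    id = Column(Integer, primary_key=True)
    first_name = Column(String(50))

engine = create_engine("mysql://user:password@host/dbname")   # placeholder ClearDB URL
Session = sessionmaker(bind=engine)
session = Session()

# Returns a list of User objects whose first_name is 'bob'
bobs = session.query(User).filter_by(first_name="bob").all()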

You search it the same way you would in PHP, because the queries you are using will not change.
The only difference is the driver you have to use.
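For example, a minimal sketch using the MySQLdb (mysqlclient) driver, with placeholder ClearDB credentials and an illustrative users table:
import MySQLdb

conn = MySQLdb.connect(host="your-cleardb-host", user="user", passwd="password", db="dbname")
cur = conn.cursor()
# Same SQL you would run from PHP; only the driver call changes.
cur.execute("SELECT * FROM users WHERE first_name = %s", ("bob",))
for row in cur.fetchall():
    print(row)
conn.close()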

Related

Using SQLAlchemy without Predefined Tables for SQL Generation

I want to write my SQL queries in Python so that I can make them parameterizable and composable.
I was hoping to do something like the following via SQLAlchemy, but the expression API doesn't seem to support it without automapping my database.
select('*').from_('table')
How can I accomplish that with SQLAlchemy or should I look to use a different tool?
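For reference, SQLAlchemy's Core expression language can do something close to this without automapping, via its lightweight table() and column() constructs; a rough sketch, assuming SQLAlchemy 1.4+ (table and column names are illustrative):
from sqlalchemy import select
from sqlalchemy.sql import table, column

# No metadata reflection or automapping: just describe the pieces you need.
t = table("table", column("id"), column("name"))

stmt = select(t).where(t.c.name == "bob")
print(stmt)   # renders the SQL string; parameter values stay bound separately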

Cleanest way to make an ORM for neo4j + sql in python flask? One model over 2 databases

How can I create one model that talks to two databases in Flask, where one is, say, sqlite, and the other is specifically neo4j?
I'd like to have login and password stuff in a traditional db, and keep other graphy information in neo4j. I'm told neo4j is bad for things that need large graph traversals. Perhaps I'm wrong in needing this, but I have an instance where I'd like to say something like...
"return a dict(person.x,person.y,person.z) from all nodes where type==person", and then feed that into the view of my index page.
I've seen related questions about ORMs with neo4j:
ORM with Graph-Databases like Neo4j in Python
...and this about multiple DBs in Flask:
http://packages.python.org/Flask-SQLAlchemy/binds.html
Specifically, I see this taking the form of my create statement writing to sqlite db connection and then writing a key from there to additional relational information in neo4j.
I've recently released an OGM (Object-Graph Mapping) module for py2neo (http://book.py2neo.org/en/latest/ogm.html). This might help with what you're trying to do.
Otherwise, you could also look at neomodel (https://github.com/robinedwards/neomodel). It's written for Django but should be usable in Flask too.
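A rough neomodel sketch of the "return a dict from all person nodes" idea above, assuming a local Neo4j instance (the Person model, its properties, and the connection URL are illustrative):
from neomodel import StructuredNode, StringProperty, config

config.DATABASE_URL = "bolt://neo4j:password@localhost:7687"   # placeholder credentials

class Person(StructuredNode):
    name = StringProperty(index=True)
    email = StringProperty()

# Build the dicts to feed into the index page view.
people = [{"name": p.name, "email": p.email} for p in Person.nodes.all()]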
I don't know about mixed backend models, but I think depending on your user count, you can use neo4j for your users, too. If you put the user nodes into an index, you can get all users without searching the graph.
If you find that this actually is a bottleneck, migrating it to a split storage should not be too difficult.
It is not that hard to adapt neo4j-driver and py2neo to work with e.g. Flask-Login.
I have used py2neo for that and it worked well, but I have now migrated to neo4j-driver.
The downside is that I did not manage to get it working alongside e.g. SQLAlchemy.
Using a double-backend solution is not a problem: in an earlier project I used SQLAlchemy with SQLite3 and PostgreSQL, Neo4j and Redis together.
With that setup I found no issues other than some design issues.
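A minimal sketch of the Flask-Login adaptation mentioned above, using neo4j-driver; the User class, node label, and Cypher query are illustrative assumptions, not a fixed API:
from flask_login import UserMixin
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))  # placeholders

class User(UserMixin):
    def __init__(self, user_id, name):
        self.id = user_id
        self.name = name

def load_user(user_id):
    # Register this with login_manager.user_loader in the Flask app.
    with driver.session() as session:
        record = session.run(
            "MATCH (u:User {id: $id}) RETURN u.id AS id, u.name AS name",
            id=user_id,
        ).single()
    return User(record["id"], record["name"]) if record else None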

Medium to large portable database options for a Python program

I am creating a simple Python program that needs to search a somewhat large database (~40 tables, 6 million or so rows altogether).
Currently, I use MySQLdb to query my local MySQL database; then I have some other Python functions that work with the data and return some statistics and other stuff. I would like to share this with others who do not want to construct their own database. At this point the database is used for queries only.
How best can I share the database and Python program as a "package"? Do I have to give up on the SQL method and switch to some sort of text-file database, or is there an easier way... SQLite maybe?
If the answer is SQLite, how do I go about exporting my current MySQL database to an SQLite database? Are there any gotchas I should know about?
Currently I use simple SELECT queries with a few WHERE clauses to locate the data I need. I am afraid that if I switched to a text-based database I would end up having to write a large amount of code to make these queries.
Thank you in advance for any suggestions.
EDIT
So I wrote my little Python program with an SQLite3 database and it works perfectly.
I ended up using a shell script called mysql2sqlite.sh, found here, to convert my MySQL database to SQLite. It worked flawlessly.
I only had to change 2 lines of Python code. Awesome.
My little program runs on OS X, Windows and Linux (Ubuntu and Red Hat) without any changes or hassle. Thanks for the advice!
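A sketch of what that two-line change typically looks like (the file, table and column names here are made up):
import sqlite3                         # previously: import MySQLdb

conn = sqlite3.connect("mydata.db")    # previously: MySQLdb.connect(host=..., user=..., ...)
cur = conn.cursor()
# Note the parameter placeholder: sqlite3 uses "?" where MySQLdb uses "%s".
cur.execute("SELECT name, value FROM readings WHERE station = ?", ("A1",))
rows = cur.fetchall()
conn.close()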
Converting your database could be as easy as an SQL dump and then an import, depending on the complexity of your DB. See this post for strategies and alternatives.

SQLAlchemy or psycopg2?

I am writing a quick and dirty script which requires interaction with a database (PG).
The script is a pragmatic, tactical solution to an existing problem. However, I envisage that the script will evolve over time into a more "refined" system. Given that it is currently being put together very quickly (i.e. I don't have the time to pore over huge reams of documentation), I am tempted to go the quick and dirty route, using psycopg.
The advantages of psycopg2 (as I currently understand it) are:
It is written in C, so presumably faster than SQLAlchemy (written in Python)?
No abstraction layer over the DBAPI, since it works with one DB and one DB only (implication -> fast).
(For now) I don't need an ORM, so I can directly execute my SQL statements without having to learn a new ORM syntax (i.e. it is lightweight).
Disadvantages:
I KNOW that I will want an ORM further down the line
psycopg2 seems ("dated"?) - I don't know how long it will remain around.
Are my perceptions of SQLAlchemy (slow/interpreted, bloated, steep learning curve) true? Is there any way I can use SQLAlchemy in the "rough and ready" way I want to use psycopg - namely:
execute SQL statements directly without having to mess about with the ORM layer, etc.
Are any examples of doing this available?
SQLAlchemy is an ORM, psycopg2 is a database driver. These are completely different things: SQLAlchemy generates SQL statements and psycopg2 sends SQL statements to the database. SQLAlchemy depends on psycopg2 or other database drivers to communicate with the database!
As a rather complex software layer SQLAlchemy does add some overhead, but it is also a huge boost to development speed, at least once you have learned the library. SQLAlchemy is an excellent library and will teach you the whole ORM concept, but if you don't want to generate SQL statements to begin with then you don't want SQLAlchemy.
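That said, SQLAlchemy can also be used in exactly the "rough and ready" way described, without touching the ORM; a sketch assuming SQLAlchemy 1.4+, with placeholder connection details and an illustrative table:
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://user:password@localhost/mydb")

with engine.connect() as conn:
    # Plain SQL with named bind parameters; no models, no ORM layer.
    result = conn.execute(
        text("SELECT id, name FROM accounts WHERE active = :active"),
        {"active": True},
    )
    for row in result:
        print(row.id, row.name)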
To talk to a database you need a driver. If you are using a client like SQL*Plus for Oracle or the mysql CLI for MySQL, it runs the query directly, and that client ships with the database server package.
To communicate from outside, from a language like Java, C, Python or C#, you need a driver for that database. psycopg2 is the driver for running queries against PostgreSQL from Python.
SQLAlchemy is an ORM, which is not the same thing as a database driver. It gives you flexibility so you can write your code without database-specific idioms; an ORM provides database independence for the programmer. If you write object.save in an ORM, it checks which database is associated with that object and generates the insert query for that backend database.
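For comparison, a quick-and-dirty sketch going through psycopg2 directly (credentials and table name are placeholders):
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="mydb", user="user", password="password")
cur = conn.cursor()
# psycopg2 uses %s placeholders and a parameter tuple.
cur.execute("SELECT id, name FROM accounts WHERE active = %s", (True,))
for row in cur.fetchall():
    print(row)
cur.close()
conn.close()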

Setting up Pyramid to use MySQL raw instead of SQLAlchemy

We're trying to set up a Pyramid project that will use raw MySQL instead of SQLAlchemy.
My experience with Pyramid/Python is limited, so I was hoping to find a guide online. Unfortunately, I haven't been able to find anything to push us in the right direction. Most search results were for people trying to use raw SQL/MySQL commands with SQLAlchemy (many were re-posted links).
Anyone have a useful tutorial on this?
Pyramid at its base does not assume that you will use any one specific library to help you with your persistence. In order to make things easier for people who DO wish to use libraries such as SQLAlchemy, the Pyramid library contains scaffolding, which is essentially some auto-generated code for a basic site, with some additions to set up items like SQLAlchemy or a specific routing strategy. The Pyramid documentation should be able to lead you through creating a new project using the "pyramid_starter" scaffolding, which sets up the basic site without SQLAlchemy.
This will give you the basics you need to set up your views, but next you will need to add code to allow you to connect to a database. Luckily, since your site is just Python code, learning how to use MySQL in Pyramid is simply learning how to use MySQL in Python, and then doing the exact same steps within your Pyramid project.
Keep in mind that even if you'd rather use raw SQL queries, you might still find some usefulness in SQLAlchemy. At its base level, SQLAlchemy simply wraps around the DBAPI calls and adds in useful features like connection pooling. The ORM functionality is actually a large addition to the tight lower-level SQLAlchemy toolset.
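To make that concrete, here is a rough sketch of a Pyramid view talking to MySQL directly through the MySQLdb (mysqlclient) driver; the route name, credentials and table are illustrative assumptions, not part of any scaffold:
import MySQLdb
from pyramid.view import view_config

def get_connection():
    return MySQLdb.connect(host="localhost", user="user", passwd="password", db="mydb")

@view_config(route_name="users", renderer="json")
def users_view(request):
    conn = get_connection()
    try:
        cur = conn.cursor()
        cur.execute("SELECT id, name FROM users")
        return [{"id": r[0], "name": r[1]} for r in cur.fetchall()]
    finally:
        conn.close()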
SQLAlchemy does not make any assumption that you will be using its ORM. If you wish to use plain SQL, you can do so with nothing more than what SQLAlchemy already provides. For instance, if you followed the recipe in the cookbook, you would have access to the SQLAlchemy session object as request.db, and your handler would look something like this:
def someHandler(request):
    rows = request.db.execute("SELECT * FROM foo").fetchall()
The Quick Tutorial shows a Pyramid application that uses SQL but not SQLAlchemy. It uses SQLite, but should be reasonably easy to adapt for MySQL.
