How to use MySQL in gevent-based programs in Python?

I have found that ultramysql meets my requirements, but it has no documentation and no Windows binary package.
My program is heavy on internet downloads and MySQL inserts, so I use gevent to handle the many download tasks. After I download and parse the web pages, I need to insert the data into MySQL.
Does monkey.patch_all() make MySQL operations async?
Can anyone show me the correct way to go?

Postgres may be better suited due to its asynchronous capabilities

I think one solution is to use pymysql. Since pymysql uses the Python socket module, it should work with gevent after monkey patching.
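A minimal sketch of that idea, assuming a hypothetical pages table and made-up connection details (the parsing step is only a placeholder):

    import gevent
    from gevent import monkey
    monkey.patch_all()  # patch sockets first, so blocking I/O yields to other greenlets

    import pymysql                        # pure-Python driver, so the patched sockets apply to it
    from urllib.request import urlopen    # also cooperative once sockets are patched

    def fetch_and_store(url):
        body = urlopen(url).read()
        title = body.decode("utf-8", "replace")[:100]  # stand-in for real parsing

        # One connection per greenlet; a pymysql connection must not be shared
        # between greenlets that run concurrently.
        conn = pymysql.connect(host="localhost", user="app",
                               password="secret", database="pages")
        try:
            with conn.cursor() as cur:
                cur.execute("INSERT INTO pages (url, title) VALUES (%s, %s)",
                            (url, title))
            conn.commit()
        finally:
            conn.close()

    urls = ["http://example.com/a", "http://example.com/b"]
    gevent.joinall([gevent.spawn(fetch_and_store, u) for u in urls])

Because pymysql is pure Python, the patched socket module is what it ends up using, so each insert yields to other greenlets while waiting on the server.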

Related

Database migrations in Flask without SQLAlchemy

Is there any library or tool to perform database migrations in python without SQLAlchemy? I am using a PostgreSQL database and the queries are raw SQL. I use psycopg2 for the database connection.
In short, "yes."
There are numerous possibilities. The one you choose will depend on doing some homework to determine which one best suits your needs.
https://github.com/amacneil/dbmate
https://pypi.org/project/yoyo-migrations/
https://pypi.org/project/alembic/
When asking a question that can be answered with a Google search, it helps to be more specific about your situation, so that answerers understand why the first page of search results didn't solve it for you.
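Since the question mentions raw SQL over psycopg2, yoyo-migrations (linked above) may be the closest fit: migrations are plain SQL files applied programmatically or from its CLI. A minimal sketch, assuming a placeholder connection URL and a ./migrations directory of SQL files:

    from yoyo import get_backend, read_migrations

    # Placeholder connection URL and migrations directory.
    backend = get_backend("postgresql://user:secret@localhost/mydb")
    migrations = read_migrations("./migrations")

    # Apply any migrations that have not been applied yet.
    with backend.lock():
        backend.apply_migrations(backend.to_apply(migrations))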
If you can bring yourself to do your db migrations outside of your application code, I'd strongly recommend considering FlyWay. It has been around forever, it's supported by Redgate, and the community version is free and open source. It has built-in support for all the major db platforms and installs like any other utility you'd use. For a Python app you'd do all of your db migrations via shell scripts or process calls instead of directly in the code.
My team settled on this tech because:
- Yoyo is pretty immature and buggy for us
- We don't use SQLAlchemy's ORM framework
- Alembic's syntax and framework seems pretty heavyweight for us
- Under FlyWay's model we just write SQL scripts and check them into a directory; it's really lightweight
- It's been proven to work nicely in clustered environments
Give it a look.
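To make the "process calls" approach concrete, here is a rough sketch of shelling out to the Flyway CLI from Python before the app starts; the JDBC URL, credentials, and directory are placeholders, and the exact flags should be checked against your Flyway version:

    import subprocess

    # Migration scripts live in ./sql as plain SQL files following Flyway's
    # V<version>__<description>.sql naming convention, e.g. V1__create_users.sql.
    subprocess.check_call([
        "flyway",
        "-url=jdbc:mysql://localhost:3306/appdb",   # placeholder connection URL
        "-user=app",
        "-password=secret",
        "-locations=filesystem:./sql",
        "migrate",
    ])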

using mysql instead of sqlite3

I'm working on a Python application that requires database connections. I developed my application with sqlite3, but it started showing the error "the database is locked", so I decided to use a MySQL database instead, and it works well with no errors.
The only problem is that I would need to ask every user of my application to install a MySQL server on their PC (AppServ, for example).
So, can I make MySQL work like sqlite3, as part of the Python library, so that I can produce a Python script that can be converted into an exe file by pyInstaller.exe, with no need for users to install a MySQL server?
Update:
After reviewing the code I found a connection that was not closed correctly; it now works fine with sqlite3. Thank you, everybody.
It depends (and there are more "depends" in the rest of this answer).
If you need to share the data between the users of your application, you need a MySQL database server set up somewhere that your application can access, and performance can really depend on the network and on how heavily the application uses the database. The application itself only needs to know how to "speak" with the database server, via a Python MySQL driver like MySQLdb or pymysql.
If you don't need to share the data between users, then sqlite may be an option. Or maybe not; it depends on what you want to store, what for, and what you need to do with the data.
So, more questions than answers; this was probably more suitable as a comment. At least, think about what I've said.
Also see:
https://stackoverflow.com/questions/1009438/which-database-should-i-use-for-my-desktop-application
Python Desktop Application Database
Python Framework for Desktop Database Application
Hope that helps.
If your application is a stand-alone system such that each user maintains their own private database then you have no alternative to install MySQL on each system that is running the application. You cannot bundle MySQL into your application such that it does not require a separate installation.
There is an embedded version of MySQL that you can build into your application (thanks, Carsten, in the comments, for pointing this out). More information is here: http://mysql-python.blogspot.com/. It may take some effort to get this working (on Windows you apparently need to build it from source code) and will take some more work to get it packaged up when you generate your executable, but this might be a MySQL solution for you.
I've just finished updating a web application using SQLite which had begun reporting Database is locked errors as the usage scaled up. By rewriting the database code with care I was able to produce a system that can handle moderate to heavy usage (in the context of a 15 person company) reliably still using SQLite -- you have to be careful to keep your connections around for the minimum time necessary and always call .close() on them. If your application is really single-user you should have no problem supporting it using SQLite -- and that's doubly true if it's single-threaded.
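A minimal sketch of the connection hygiene described above, with a made-up database path and table: open the connection just before the write, let a with-block handle the transaction, and guarantee that .close() runs.

    import sqlite3
    from contextlib import closing

    def record_visit(db_path, url):
        # Open the connection only for as long as the write takes; the timeout
        # gives other connections a chance to finish before "database is locked".
        with closing(sqlite3.connect(db_path, timeout=10)) as conn:
            # The connection context manager commits on success and rolls back
            # on error; closing() guarantees conn.close() is always called.
            with conn:
                conn.execute("INSERT INTO visits (url) VALUES (?)", (url,))

    record_visit("app.db", "http://example.com")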

Pure python SQL solution that works with PostgreSQL and MySQL?

I am looking for a pure-python SQL library that would give access to both MySQL and PostgreSQL.
The only requirement is to run on Python 2.5+ and be pure-python, so it can be included with the script and still run on most platforms (no-install).
In fact I am looking for a simple solution that would allow me to write SQL and export the results as CSV files.
Two part answer:
A) This is absolutely possible.
B) Depending on your exact concerns, a pure-python may or may not be a good approach to your problem.
Explained:
The SqlAlchemy library comes with two components: the more popular "ORM", and the "Core" which it sits on top of. Either one will let you write your SQL commands in the SqlAlchemy format (which is just Python); SqlAlchemy will then compile the statements to MySQL or PostgreSQL and connect to the appropriate database.
SqlAlchemy is a great library, and I recommend it for just about everything. While you do have to write your statements in their format, it's easy to pick up -- and you can switch to virtually any underlying database library you want... at any time. It's the perfect platform to use in any database project, whether or not you need to support multiple backends.
SqlAlchemy talks to the database via the standard DBAPI drivers, and does support multiple options for pure python, notably the pymysql and pypostgresql drivers ( http://docs.sqlalchemy.org/en/latest/core/engines.html#supported-dbapis )
As for writing csv, the standard library has you covered.
import csv
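Putting the two together, a rough sketch of running a raw SQL statement through SqlAlchemy Core and writing the rows out as CSV; the connection URL, query, and file name are placeholders, and the details of the Core API vary a little between SqlAlchemy versions:

    import csv
    from sqlalchemy import create_engine, text

    # pymysql is a pure-Python MySQL driver; see the supported-DBAPIs link
    # above for the PostgreSQL equivalents.
    engine = create_engine("mysql+pymysql://user:secret@localhost/mydb")

    with engine.connect() as conn:
        result = conn.execute(text("SELECT id, name FROM customers"))
        with open("customers.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(result.keys())  # header row from the result's column names
            writer.writerows(result)        # each row is a sequence of column values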
So the caveat?
The following may or may not apply to your situation:
Most higher level db modules in the python universe are still recommending mysql-python and psycopg - both of which are not pure-python and compile/link/configure against the installed database. This largely seems to be from a mix of API / integration concerns and the speed of the various pure-python packages compared to c-extensions when run under CPython.
There are pure-python drivers like the ones I recommended, but most reviewers state that they're largely experimental. The pymysql authors claim stability and production readiness, though some developers have challenged that in blog posts. As for what "stable" or "experimental" means, that varies between projects. Some have a changing API, others are incomplete, some are buggy.
You'd need to ensure that you can find pure-python drivers for each system that support the exact operations you need. This could be simple, or this could be messy. Whether you use SqlAlchemy or something else, you'll still face this concern when selecting a DBAPI driver.
The PyPy project (the pure-python Python interpreter) has a wiki listing the compatibility of various packages: https://bitbucket.org/pypy/compatibility/wiki/Home -- I would defer to them for specific driver suggestions. If PyPy is your intended platform, SqlAlchemy runs perfectly on it as well.
Are you looking for an ORM, or a single library that would allow you to write SQL statements directly and convert where there are differences?
psycopg2 is not pure Python (it's a C extension binding to the PostgreSQL client library), but it works on all major platforms. You'd still have to install at least psycopg2 to communicate with the PostgreSQL database, as Python (as far as I know) doesn't ship with it natively.
From there, any additional ORM library you want would also need to be installed, but most are pure-python on-top of whatever backend they use.
Storm, Django, and SQLAlchemy all have abstraction layers on top of their database layer. Based on your description, Django is probably too large a framework for your needs (it was for mine) but is a popular one; SQLAlchemy is a tried and true system, though a bit clunky, particularly if you have to deal with inheritance (in my opinion). I have heard that Storm is good, though I haven't tested it much, so I can't fully say.
If you are looking to mix and match (some tables in MySQL and some tables in PostgreSQL) vs. a single database that could be either MySQL or PostgreSQL, I've been working on an ORM called ORB that focuses more on object-oriented design and allows for multiple databases and relationships between databases. Right now it only supports PostgreSQL and Mongo, just because I haven't needed MySQL, but I'd be up for writing that backend. The code for that can be found at http://docs.projexsoftware.com/api/orb
Use SQLAlchemy. It will work with most database types, and certainly works with PostgreSQL and MySQL.

Non relational database for twisted

I'm looking for any key-value database implementation that works with Twisted in asynchronous mode.
One idea I have is using the Twisted Memcache API with MemcacheDB.
Is there some other solution?
One possible solution is Redis (REmote DIctionary Server).
Redis is a very fast, powerful and stable key-value store which is used in many projects. Stack Overflow also uses Redis ;).
I've recently started using Redis in my current project for building user ratings. My personal opinion: Redis is very simple, very fast and stable. It also has a nice command-line client, which I like.
On the website I use a synchronous Redis package. The server uses Twisted and requires an asynchronous approach. Fortunately, there is a third-party module, txredis, which makes it easy to interact with a Redis database from Twisted. I didn't have any problems with it. However, txredis doesn't have a connection pool, but it's not a problem to implement one manually if needed.
I have used Apache Cassandra from Twisted, via Telephus, in production for years.
Adding one more point to @dr.'s answer marked as accepted: use the Python package txredisapi, which implements the Redis protocol for Twisted, with connection pool support and much more.
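For what it's worth, a minimal deferred-based sketch along the lines of txredisapi's documented usage; the key names are arbitrary, and the exact API should be checked against the txredisapi docs:

    import txredisapi
    from twisted.internet import defer, reactor

    @defer.inlineCallbacks
    def main():
        # Connection() fires with a connected client; ConnectionPool() is the
        # pooled variant mentioned above.
        rc = yield txredisapi.Connection()
        yield rc.set("rating:alice", 42)
        value = yield rc.get("rating:alice")
        print("rating:alice ->", value)
        yield rc.disconnect()

    if __name__ == "__main__":
        main().addCallback(lambda _: reactor.stop())
        reactor.run()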

In python, is there a simple way to connect to a mysql database that doesn't require root access?

I'm writing a script to parse some text files, and insert the data that they contain into a mysql database. I don't have root access on the server that this script will run on. I've been looking at mysql-python, but it requires a bunch of dependencies that I don't have available. Is there a simpler way to do this?
I would recommend MySQL Connector/Python, a MySQL DB-API adapter that does not use the C client library but rather reimplements the MySQL protocol completely in pure Python (compatible with Python 2.5 to 2.7, as well as 3.1).
To install C-coded extensions to Python you generally need root access (though the server you're using might have arranged things differently, that's not all that likely). But with a pure Python solution you can simply upload the modules in question (e.g. those from the Connector I recommend) just as you're uploading those you write yourself, which (if you of course do have a valid userid and password for that MySQL database!-) might solve it for you.
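A minimal sketch of what using it looks like once the pure-Python package is on your import path; the host, credentials, and table are placeholders for your setup:

    import mysql.connector  # MySQL Connector/Python, pure-Python DB-API driver

    conn = mysql.connector.connect(host="dbhost.example.com", user="parser",
                                   password="secret", database="textdata")
    try:
        cur = conn.cursor()
        cur.execute("INSERT INTO lines (filename, content) VALUES (%s, %s)",
                    ("notes.txt", "first parsed line"))
        conn.commit()
        cur.close()
    finally:
        conn.close()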
