Kind of blocks in Python

In Ruby I can say:
def get_connection
  db = connect_to_db()
  yield
  db.close()
end
and then call it:
get_connection do
  # action1.....
  # action2.....
  # action3.....
end
In Python I have to say:
def get_connection(code_block):
    db = connect_to_db()
    code_block()
    db.close()

def method1():
    # action1.....
    # action2.....
    # action3.....

get_connection(method1)
It's not convenient since I have to create an extra method1. Notice that method1 can be big. Is there any way to emulate Ruby's anonymous blocks in Python?

Yes. Use the 'with' statement:
Using classes
class get_connection(object):
    def __enter__(self):
        self.connect_to_db()
        return self  # so that "as db" is bound to this object
    def __exit__(self, *args, **kwargs):
        self.close()
    def some_db_method(self, ...):
        ...
And use it like this:
with get_connection() as db:
    db.some_db_method(...)
This does the following:
self.connect_to_db()
db.some_db_method(...)
self.close()
Have a look here: http://docs.python.org/release/2.5/whatsnew/pep-343.html . You can use the arguments taken by __exit__ to handle exceptions within the with statement, etc.
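For example, here is a minimal sketch of an __exit__ that inspects those arguments (the rollback method is hypothetical and not part of the example above):
class get_connection(object):
    def __enter__(self):
        self.connect_to_db()
        return self
    def __exit__(self, exc_type, exc_value, traceback):
        if exc_type is not None:
            self.rollback()  # hypothetical: undo pending work if the block raised
        self.close()
        return False         # False re-raises the exception; True would suppress it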
Using functions
from contextlib import contextmanager

@contextmanager
def db_connection():
    db = connect_to_db()
    yield db
    db.close()
and use this:
with db_connection() as db:
    db.some_db_method()
(Perhaps this is closer to your Ruby equivalent. Also, see here for more details: http://preshing.com/20110920/the-python-with-statement-by-example)
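One caveat with the generator version as written: if the code inside the with block raises, db.close() is skipped, because the generator never resumes past the yield. A common refinement (a sketch, still assuming the same hypothetical connect_to_db) is to wrap the yield in try/finally:
from contextlib import contextmanager

@contextmanager
def db_connection():
    db = connect_to_db()
    try:
        yield db
    finally:
        db.close()  # runs even if the with block raises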
Hope this helps

Related

python contextlib closing prevents enter/exit calls

Most likely I don't understand this fully but I have a class declared like
class db:
    def __init__(self, name):
        pass
        #self._conn = get_conn(name)
    def __enter__(self):
        print('enter')
        #self._conn.begin_transaction()
    def __exit__(self, a, b, c):
        print('exit')
        #self._conn.commit()
    def close(self):
        print('close')
        #self._conn.close()
When I use it like:
with db('bname') as db:
    print('do something')
I get the expected output like:
enter
do something
exit
but when I use contextlib.closing, those functions don't get called at all:
from contextlib import closing
with closing(db('bname')) as db:
    print('do something')
I only get:
do something
close
My understanding was that contextlib.closing can be used with context manager classes to always call close, but what am I missing?
The closing class implements its own version of __exit__. This calls close().
Since you're passing an instance of closing to the with block, the __exit__ method of the closing instance will get called and not yours.
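Roughly speaking, closing behaves like this sketch (simplified from the standard library implementation):
class closing(object):
    def __init__(self, thing):
        self.thing = thing
    def __enter__(self):
        return self.thing   # "as db" is bound to the wrapped object
    def __exit__(self, *exc_info):
        self.thing.close()  # only close() is called, never the wrapped object's __enter__/__exit__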
Subclass the closing class for your __exit__ to get called. Here's an example:
from contextlib import closing

class Db(closing):
    def __init__(self):
        super().__init__(self)
        print('initializing')
    def close(self):
        print('closing')
    def __enter__(self):
        print('entering')
    def __exit__(self, *exc_info):
        print('exiting')
        return super().__exit__(*exc_info)

with Db() as db:
    print('running')
Output
initializing
entering
running
exiting
closing
It is the with statement that executes the __enter__ and __exit__ methods. In your second example, you are applying it to the closing wrapper, not to your db class; that's why your methods don't get triggered.
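If the goal is simply "always call close()", a simpler alternative (a sketch based on the db class from the question) is to drop closing and call close() from your own __exit__:
class db:
    def __init__(self, name):
        pass  # self._conn = get_conn(name)
    def __enter__(self):
        print('enter')
        return self
    def __exit__(self, exc_type, exc_value, traceback):
        print('exit')
        self.close()  # close unconditionally, even if the block raised
    def close(self):
        print('close')

with db('bname') as handle:
    print('do something')
# prints: enter, do something, exit, close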

python3 singleton pattern with "with" statement

I want to write a class using the singleton pattern to provide persistent data storage via pickle and a dict:
@singleton
class Pdb:
    def __init__(self):
        self.cache = None
        self.dirty = False
        try:
            with open("data.pck", "rb") as fp:
                self.cache = pickle.load(fp)
        except FileNotFoundError:
            pass
        except pickle.PickleError:
            pass
        if self.cache is None:
            self.cache = {}
    def flush(self):
        if self.dirty:
            try:
                with open("data.pck", "wb") as fp:
                    pickle.dump(self.cache, fp, protocol=4)
            except pickle.PickleError:
                pass
            else:
                self.dirty = False
    def __del__(self):  # PROBLEM HERE
        self.flush()
When I was using Python 2, I could do this by overriding __del__, but that no longer appears to be reliable in Python 3. How can I do it?
If I do it with a "with" statement, I will need to pass the instance to every function I call:
def func1(db):
    db.set(...)
    func3(db, x1, x2, ...)

with Pdb() as db:
    func1(db)
    func2(db)
That is cumbersome. Is there a Pythonic way to do a global-scope "with" statement?
If I do it with a "with" statement, I will need to pass the instance to every function I call:
No, you don't. Just use your singleton:
# global
db = Pdb()

# any other context
with db:
    ...
All that is required is that the expression produces a context manager. Referencing a singleton object with __enter__ and __exit__ methods would satisfy that requirement. You can even ignore the __enter__ return value, as I did above. The global will still be available to all your functions, the only thing that changes is that __enter__ and __exit__ will be called at the appropriate locations.
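Concretely, a sketch of what Pdb would need (with __init__ and flush simplified to prints to keep the example short):
class Pdb:
    def __init__(self):
        self.cache = {}  # simplified; load from data.pck as in the question
        self.dirty = False
    def flush(self):
        if self.dirty:
            print('flushing', self.cache)  # stand-in for the pickle.dump in the question
            self.dirty = False
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_value, traceback):
        self.flush()  # persist the cache when the with block ends

db = Pdb()  # module-level singleton

def func1():
    db.cache['key'] = 'value'  # any function can reach the global directly
    db.dirty = True

with db:
    func1()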
Note that even in Python 2 you should not rely on __del__ being called. And in the CPython implementation, barring circular references, the rules for when __del__ is called have not changed between Python 2 and 3.

designing python api. reliable cleanup vs easy-to-use in interpreter

I'm working on the design of a Python-based API. At the moment I've run into an issue with two divergent requirements. On the one hand, I would like to provide a reliable way to clean up API-related resources. As far as I know, the best way to do that is with context managers, like:
# lib
class Client(object):
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_val, tb):
        do_cleanup()
    def method1(self):
        pass
    def method2(self):
        pass

# api user
import mylib

with mylib.Client() as client:
    client.method1()
    client.method2()
On the other hand, I would like to provide a way to use my lib seamlessly in the interactive interpreter.
But using a compound construct like with or try-except-finally in the interpreter makes interactive use awkward, because the with block is treated as a single statement. It would be preferable to use a single statement per API call, like:
# interpreter session
>>> import mylib
>>> client = mylib.Client()
<client object at ...>
>>> client.method1()
True
>>> client.method2()
100
So, what options do I have here? I could certainly provide different usage semantics for scripts and for the interpreter, but I would like to treat that as a last resort.
The typical way to do this is to provide a method that does the cleanup manually, and have __exit__ call that method. For your example, if do_cleanup were implemented as a method, it could simply be called from the interpreter when you are done with the client.
class Client(object):
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_val, tb):
        self.do_cleanup()
    def do_cleanup(self):
        pass
    def method1(self):
        pass
    def method2(self):
        pass
Then, in the interpreter:
>>> import mylib
>>> client = mylib.Client()
<client object at ...>
>>> client.method1()
True
>>> client.method2()
100
>>> client.do_cleanup()
I would recommend renaming do_cleanup to close or similar, so that the similarity to file objects is more obvious.
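A nice side effect of the close name: once the method is called close, contextlib.closing gives script users the same cleanup guarantee even without the dunder methods, for example (mylib is the hypothetical package from the question):
from contextlib import closing
import mylib

with closing(mylib.Client()) as client:
    client.method1()
# client.close() has been called here, even if method1() raised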
You could create your 'base' logic, with open and close functions, in an implementation module, and extend that class with a context manager:
# in private module mymodule.impl
class Base(object):
    def open(self, ...):
        ...
    def close(self):
        ...
This file would normally not be imported directly by client code.
Instead, client code imports the public package (mymodule), which looks like this:
# in public package mymodule
import mymodule.impl

class Base(mymodule.impl.Base):
    def __enter__(self):
        return self
    def __exit__(self, ...):
        self.close()
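With that split, both usage styles work; a usage sketch (what open and close actually do lives in the implementation module):
# in a script: cleanup is guaranteed via the context manager
import mymodule

with mymodule.Base() as obj:
    obj.open(...)
    # ... use obj ...
# obj.close() has run here via __exit__

# in an interactive session: one plain statement per step
#   >>> import mymodule
#   >>> obj = mymodule.Base()
#   >>> obj.open(...)
#   >>> obj.close()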

Close an sqlite3 database on exit, no matter what

I'm currently writing a script that uses sqlite3. I recently ran into a problem with the database being in use by another program due to my code exiting early with an error.
With similar problems, one usually uses:
conn = sqlite3.connect(...)
try:
    # Do stuff
finally:
    conn.close()
But this won't work in my case. In a nutshell, this is my code:
import sqlite3

class Thingamadoodle:
    def __init__(self, ...):
        self.conn = sqlite3.connect(...)
        ...
    # Methods and stuff
    def __del__(self):
        self.conn.close()

poop = Thingamadoodle(...)
poop.do_stuff(...)
poop.throw_irritating_exception_that_you_cant_track_down(irritatingness=11)
After the program exits without closing the connection, I get errors when I try to modify the database.
Is there a way to safely close the connection, even on an unclean exit?
To be honest, I don't understand the question much, but why not just wrap poop.do_stuff() in a try/finally block, so the connection is closed no matter what?
try:
    poop.do_stuff()
finally:
    poop.__del__()  # better: expose an explicit close() method and call that instead
Or to be a bit cleaner, use a context manager:
class Thingamadoodle:
    def __init__(self, ...):
        ...
    # Methods and stuff
    def __enter__(self):
        self.conn = sqlite3.connect(...)
        return self
    def __exit__(self, errorType, errorValue, errorTrace):
        self.conn.close()
And just execute it as:
with Thingamadoodle(args) as poop:
    # do things
When the code in the block finishes, or when an exception happens inside it, __exit__ will be executed and the connection is safely closed.
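If you'd rather not write __enter__ and __exit__ by hand, contextlib.closing gives the same guarantee around a plain connection; a sketch (the file name is just an example):
import sqlite3
from contextlib import closing

with closing(sqlite3.connect('example.db')) as conn:
    conn.execute('CREATE TABLE IF NOT EXISTS t (x)')
    conn.commit()
# conn.close() has been called here, even if an exception was raised inside the block
Note that using the connection itself as a context manager (with conn:) only wraps a transaction (commit or rollback); it does not close the connection.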
Hope this helps!

Writing python decorators with "with"

I'm trying to write a decorator that can be used with the with keyword.
# regular code ...
with my_exception_handler():
    # dangerous code ...
# regular code ...
And my_exception_handler would receive a function and wrap it in a huge try-except.
I want to make it a decorator/wrapper because it's a lot of code that I don't want to copy-paste. I can't figure out where to start. I wrote a regular decorator and it works on functions, but not on intermediate chunks of code.
The thing that you use with with is a context manager, not a decorator; those are two completely different things.
See http://docs.python.org/release/2.5/whatsnew/pep-343.html for how with and context managers work.
See https://wiki.python.org/moin/PythonDecorators for decorators.
EDIT: see kindall's post for a good example on how to write a simple context manager without having to use a full-fledged class; I didn't have time to amend my answer with such an example :)
You need to write a context manager, not a decorator. You can very easily do what you want to do using the contextlib.contextmanager decorator.
from contextlib import contextmanager

@contextmanager
def my_exception_handler():
    try:
        yield  # execute the code in the "with" block
    except Exception as e:
        # your exception-handling code goes here
        print(e)

# try it out
with my_exception_handler():
    raise ValueError("this error has value")
After learning about context managers, I took an extended-traceback function and turned it into both a decorator and a context manager with the few snippets below:
def traceback_decorator(function):
    def wrap(*args, **kwargs):
        try:
            return function(*args, **kwargs)
        except:
            print_exc_plus()
    return wrap

def traceback_wrapper(function=None, *args, **kwargs):
    context = _TracebackContext()
    if function is None:
        return context
    with context:
        function(*args, **kwargs)

class _TracebackContext(object):
    def __enter__(self):
        pass
    def __exit__(self, exc_type, exc_value, traceback):
        if exc_type:
            print_exc_plus()
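For reference, a usage sketch; it assumes print_exc_plus (the extended-traceback helper these snippets are built around) is defined in the same module:
@traceback_decorator
def risky(x):
    return 1 / x  # ZeroDivisionError when x == 0

risky(0)  # the decorator catches the error and prints the extended traceback

with traceback_wrapper():  # called without a function, this returns the context manager
    1 / 0  # printed by __exit__, then re-raised, since __exit__ returns a falsy value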
