I want to write a class with the singleton pattern to provide persistent data storage using pickle and a dict:

import pickle

# singleton
class Pdb:
    def __init__(self):
        self.cache = None
        self.dirty = False
        try:
            with open("data.pck", "rb") as fp:
                self.cache = pickle.load(fp)
        except FileNotFoundError:
            pass
        except pickle.PickleError:
            pass
        if self.cache is None:
            self.cache = {}

    def flush(self):
        if self.dirty:
            try:
                with open("data.pck", "wb") as fp:
                    pickle.dump(self.cache, fp, protocol=4)
            except pickle.PickleError:
                pass
            else:
                self.dirty = False

    def __del__(self):  # PROBLEM HERE
        self.flush()
When I was using Python 2, I could do this by overriding __del__, but that no longer appears to be correct in Python 3. How can I do it?
If I do it with a "with" statement, I will need to pass the instance to every function I call:

def func1(db):
    db.set(...)
    func3(db, x1, x2, ...)

with Pdb() as db:
    func1(db)
    func2(db)

That is cumbersome. Is there a Pythonic way to do a global-scope "with" statement?
If I do it with a "with" statement, I will need to pass the instance to every function I call:
No, you don't. Just use your singleton:
# global
db = Pdb()

# any other context
with db:
    ...
All that is required is that the expression produces a context manager. Referencing a singleton object with __enter__ and __exit__ methods would satisfy that requirement. You can even ignore the __enter__ return value, as I did above. The global will still be available to all your functions, the only thing that changes is that __enter__ and __exit__ will be called at the appropriate locations.
Note that even in Python 2, you should not rely on __del__ being called. And in the CPython implementation, outside circular references, the rules for when __del__ is called have not changed between Python 2 and 3.
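To make that concrete, here is a minimal sketch (a simplified Pdb without the pickle I/O from the question) of a singleton whose __exit__ delegates to flush, so a single global with block covers every function that uses the global:

```python
# Minimal sketch: a singleton-style store whose __exit__ flushes changes.
# The pickle/file handling from the question is elided; a plain dict stands in.
class Pdb:
    _instance = None

    def __new__(cls):
        # classic singleton: always hand back the same instance
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.cache = {}
            cls._instance.dirty = False
        return cls._instance

    def set(self, key, value):
        self.cache[key] = value
        self.dirty = True

    def flush(self):
        # in the real class this would pickle.dump the cache
        self.dirty = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.flush()
        return False  # do not suppress exceptions

# global
db = Pdb()

def func1():
    db.set("answer", 42)  # uses the module-level singleton directly

with db:
    func1()

print(db.cache, db.dirty)
```

The functions never receive the instance as a parameter; only the with block's entry and exit points change.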
Is there a way to pass arguments using context manager? Here is what I'm trying to do:
async with self.request.app['expiration_lock']('param'):
But I am getting an error:
TypeError: 'OrderStatusLock' object is not callable
The OrderStatusLock class:
class OrderStatusLock(_ContextManagerMixin, Lock):
    def __init__(self, *args, loop=None):
        print(args)
        self._waiters = None
        self._locked = False
        if loop is None:
            self._loop = events.get_event_loop()
        else:
            self._loop = loop

    async def acquire(self, *args):
        print('acq', args)
        if (not self._locked and (self._waiters is None or
                all(w.cancelled() for w in self._waiters))):
            self._locked = True
            return True
        if self._waiters is None:
            self._waiters = collections.deque()
        fut = self._loop.create_future()
        self._waiters.append(fut)
        try:
            try:
                await fut
            finally:
                self._waiters.remove(fut)
        except CancelledError:
            if not self._locked:
                self._wake_up_first()
            raise
        self._locked = True
        return True
And if it is possible, what issues could I face using this? Thank you very much.
There's a lot going on in your question, and I don't know where your _ContextManagerMixin class comes from. I also don't know much about async.
However, here's a simple (non-async) demonstration of a pattern where an argument can be passed to a context manager that alters how the __enter__ method of the context manager operates.
Remember: a context manager is, at its heart, just a class that implements an __enter__ method and an __exit__ method. The __enter__ method is called at the start of the with block, and the __exit__ method is called at the end of the with block.
The __call__ method added here in my example class is called immediately before the __enter__ method and, unlike the __enter__ method, can be called with arguments. The __exit__ method takes care to clean up the changes made to the class's internal state by the __call__ method and the __enter__ method.
from threading import Lock

class LockWrapper:
    def __init__(self):
        self.lock = Lock()
        self.food = ''

    def __enter__(self):
        self.lock.acquire()
        print(f'{self.food} for breakfast, please!')
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.lock.release()
        self.food = ''
        return True

    def __call__(self, spam_preferred: bool):
        if spam_preferred:
            self.food = 'Spam'
        else:
            self.food = 'Eggs'
        return self

breakfast_context_manager = LockWrapper()

with breakfast_context_manager(spam_preferred=True):  # prints 'Spam for breakfast, please!'
    pass

with breakfast_context_manager(spam_preferred=False):  # prints 'Eggs for breakfast, please!'
    pass
N.B. My example above will suppress all exceptions that are raised in the body of the with statement. If you don't want to suppress any exceptions, or if you only want to suppress certain kinds of exceptions, you'll need to alter the implementation of the __exit__ method.
Note also that the return values of these functions are quite important. __call__ has to return self if you want the class to be able to then call __enter__. __enter__ has to return self if you want to be able to access the context manager's internal state in the body of the with statement. __exit__ should return True if you want exceptions to be suppressed, and False if you want an encountered exception to propagate outside of the with statement.
You can find a good tutorial on context managers here. The tutorial also contains some info on how you might adapt the above pattern for async code using the __aenter__ and __aexit__ methods.
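As a sketch of that async adaptation (not your OrderStatusLock; a fresh illustrative class wrapping asyncio.Lock), the same call-before-enter pattern carries over with __aenter__ and __aexit__:

```python
import asyncio

class AsyncLockWrapper:
    """Same __call__-before-enter pattern, adapted for `async with`."""
    def __init__(self):
        self.lock = asyncio.Lock()
        self.param = None

    def __call__(self, param):
        # store the argument, then return self so `async with` sees
        # an object with __aenter__/__aexit__
        self.param = param
        return self

    async def __aenter__(self):
        await self.lock.acquire()
        return self

    async def __aexit__(self, exc_type, exc_val, exc_tb):
        self.lock.release()
        self.param = None
        return False  # don't suppress exceptions

async def main():
    wrapper = AsyncLockWrapper()
    async with wrapper('param'):
        seen = wrapper.param  # the argument is visible inside the block
    return seen

result = asyncio.run(main())
print(result)
```

This mirrors the syntax from the question, app['expiration_lock']('param'), because looking up the singleton and then calling it returns the same object, now parameterized.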
I encountered code that looks similar to this:

from contextlib import contextmanager, ContextDecorator

class makepara(ContextDecorator):
    def __enter__(self):
        print("<p>")
        return self

    def __exit__(self, *args):
        print("</p>")
        return False

@makepara()
def emit_data():
    print(" here is HTML code")

emit_data()
I found a related answer, but when I change the above code to
from contextlib import contextmanager, ContextDecorator

class makepara(ContextDecorator):
    def __enter__(self):
        print("<p>")

    def __exit__(self, *args):
        print("</p>")

@makepara()
def emit_data():
    print(" here is HTML code")

emit_data()
there is no change in the output, which makes me wonder what return self actually does and how it should be used.
You choose to return self (or some other object, but usually the context manager instance itself) so that a name can be bound with this syntax:

with makepara() as var:
    ...
The object returned by __enter__ will be bound to the name var within the context (and will, in fact, remain bound to var after exiting context).
If you don't need any value bound after entering the context, you can omit an explicit return (the implicit return of None is used in that case regardless), but there is no harm and no disadvantage in returning self anyway.
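A quick demonstration of the difference: the as target receives whatever __enter__ returns, so omitting the return only matters when you actually use as:

```python
class WithReturn:
    def __enter__(self):
        return self  # the `as` target gets the manager itself

    def __exit__(self, *args):
        return False

class WithoutReturn:
    def __enter__(self):
        pass  # implicit None: the `as` target gets None

    def __exit__(self, *args):
        return False

with WithReturn() as a:
    pass

with WithoutReturn() as b:
    pass

print(type(a).__name__, b)  # the first name is bound, the second is None
```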
return self is not only useful in with statements; it is also useful in many other situations.
For example, when you open a file using:

with open("file") as f:
    ...

the open function actually returns an object that implements __enter__, and in its __enter__ it uses return self so that the instance is bound to the variable f, letting you call f.read() or other methods afterwards.
In other situations, for example, if you want chained calls (maybe data = a.connect().get("key").to_dict()), you need to add return self to connect and get.
But after all, return self is nothing more than returning an ordinary value.
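For instance, a toy chaining sketch (all names hypothetical) only works because each intermediate method ends with return self:

```python
class Client:
    """Toy illustration: chaining works because each
    intermediate method returns self."""
    def __init__(self):
        self._data = {}

    def connect(self):
        self._data = {"key": "value"}
        return self  # enables .get(...) right after

    def get(self, key):
        self._key = key
        return self  # enables .to_dict() right after

    def to_dict(self):
        return {self._key: self._data[self._key]}

data = Client().connect().get("key").to_dict()
print(data)
```

Drop either return self and the chain breaks with an AttributeError on None.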
In Python, is there a way to make a function/class that behaves like both a function and a context manager?
Note: I need the function/class to return an object that doesn't have an __exit__ method, and I can't change that object (that's why I am wrapping it).
So just making a class with __enter__ and __exit__ won't work, because I also need it to behave like a function.
I have tried the contextmanager decorator:

@contextmanager
def my_context_man(my_str):
    my_str = 'begging ' + my_str
    yield my_str + ' after func'
    print('end')

And it worked perfectly within the context manager, but not as a function:

a = 'middle'
old_a = my_context_man(a)
print('old_a', old_a)

with my_context_man(a) as new_a:
    print('new_a', new_a)
the output:
old_a <contextlib._GeneratorContextManager object at 0x0000000004F832E8>
new_a begging middle after func
end
while the desired output will be:
old_a begging middle after func
new_a begging middle after func
end
edit:
The specific problem I am having is with the psycopg2 module.
I want to use a different context manager, and it returns a connection object.

def connect(dsn=None, connection_factory=None, cursor_factory=None, **kwargs):
    # *the connection code*
    return conn

I am trying to change it so that people will be able to use it with my context manager, but in a way that will not break existing code.
I cannot change the conn object.
Your __new__ method isn't returning an instance of my_context_man, but an instance of str, and str doesn't have an __enter__ method. It's the return value of __enter__ that gets bound to the name after as in the with statement. You want
class my_context_man:
    def __init__(self, my_str):
        self.msg = my_str
        print("beginning " + my_str)

    def __enter__(self):
        return self.msg

    def __exit__(self, exc_type, exc_val, exc_tb):
        print('end')
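If the wrapped value really must be usable both directly and in a with statement, one possible sketch (an assumption on my part, not part of the answer above) is to subclass str and add the context-manager methods, so the instance *is* the transformed string; the 'begging' spelling just mirrors the question's output:

```python
class MyStr(str):
    """A str subclass that is also a context manager (illustrative name)."""
    def __new__(cls, my_str):
        # the instance itself is the transformed string
        return super().__new__(cls, 'begging ' + my_str + ' after func')

    def __enter__(self):
        return self  # bound to the `as` target

    def __exit__(self, exc_type, exc_val, exc_tb):
        print('end')
        return False

a = 'middle'
old_a = MyStr(a)
print('old_a', old_a)   # usable as a plain function call

with MyStr(a) as new_a:
    print('new_a', new_a)  # and as a context manager
```

This trick only works because str can be subclassed; it would not help with an uncooperative type like psycopg2's connection, where a wrapper class as shown in the answer is the safer route.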
I wrote a class with a "revert" context manager. The intention is to change class members inside the with block and, if any exception occurs, to "revert" all changes:

from contextlib import contextmanager
from copy import deepcopy

class A():
    def __init__(self):
        self.kuku = 'old_value'

    @contextmanager
    def revertible_transaction(self):
        old_self = deepcopy(self)
        try:
            yield
        except Exception as e:
            self = old_self
            raise e

    def change_stuff(self):
        with self.revertible_transaction():
            self.kuku = 'new_value'
            raise Exception
I want self.kuku to still be 'old_value' after I run change_stuff(), but it's 'new_value' instead.
Any ideas why this is not working and how to do this properly?
You can't usefully assign to self; you're just changing which object a local name refers to without modifying the original object. You need to explicitly save the state of the object to restore if necessary.
@contextmanager
def revertible_transaction(self):
    old_kuku = self.kuku
    try:
        yield
    except Exception:
        self.kuku = old_kuku
        raise
self in revertible_transaction is just a local variable. It is a different reference from the self in change_stuff, and rebinding it won't change anything else. Even if you passed along the reference (say, by yield-ing it and assigning with with ... as ...), you'd still not replace whatever reference the caller of the change_stuff() method holds for the instance.
So you can't replace self like this. You'll need to replace the instance attributes; you can reach these through the self.__dict__ dictionary, usually:
@contextmanager
def revertible_transaction(self):
    old_state = deepcopy(self.__dict__)
    try:
        yield
    except Exception as e:
        self.__dict__ = old_state
        raise e
This won't work for classes that use __slots__, but the code could trivially be extended to also copy all names listed in the __slots__ class attributes across the MRO.
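A hedged sketch of that extension: collect the names from every __slots__ in the MRO alongside __dict__, snapshot them, and restore with setattr on failure:

```python
from contextlib import contextmanager
from copy import deepcopy

def _state(obj):
    """Snapshot __dict__ plus any __slots__ names found in the MRO."""
    state = dict(getattr(obj, '__dict__', {}))
    for cls in type(obj).__mro__:
        for name in getattr(cls, '__slots__', ()):
            if hasattr(obj, name):
                state[name] = getattr(obj, name)
    return state

class Slotted:
    __slots__ = ('kuku',)

    def __init__(self):
        self.kuku = 'old_value'

    @contextmanager
    def revertible_transaction(self):
        old_state = deepcopy(_state(self))
        try:
            yield
        except Exception:
            # put every snapshotted attribute back before re-raising
            for name, value in old_state.items():
                setattr(self, name, value)
            raise

obj = Slotted()
try:
    with obj.revertible_transaction():
        obj.kuku = 'new_value'
        raise RuntimeError
except RuntimeError:
    pass
print(obj.kuku)
```

Note this assumes each __slots__ entry is an iterable of names (a bare string __slots__ would need an extra check), and that every attribute value is deep-copyable.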
Suppose I want to extend the built-in file abstraction with extra operations at open and close time. In Python 2.7 this works:
class ExtFile(file):
    def __init__(self, *args):
        file.__init__(self, *args)
        # extra stuff here

    def close(self):
        file.close(self)
        # extra stuff here
Now I'm looking at updating the program to Python 3, in which open is a factory function that might return an instance of any of several different classes from the io module depending on how it's called. I could in principle subclass all of them, but that's tedious, and I'd have to reimplement the dispatching that open does. (In Python 3 the distinction between binary and text files matters rather more than it does in 2.x, and I need both.) These objects are going to be passed to library code that might do just about anything with them, so the idiom of making a "file-like" duck-typed class that wraps the return value of open and forwards necessary methods will be most verbose.
Can anyone suggest a 3.x approach that involves as little additional boilerplate as possible beyond the 2.x code shown?
You could just use a context manager instead. For example this one:
class SpecialFileOpener:
    def __init__(self, fileName, someOtherParameter):
        self.f = open(fileName)
        # do more stuff
        print(someOtherParameter)

    def __enter__(self):
        return self.f

    def __exit__(self, exc_type, exc_value, traceback):
        self.f.close()
        # do more stuff
        print('Everything is over.')
Then you can use it like this:
>>> with SpecialFileOpener('C:\\test.txt', 'Hello world!') as f:
...     print(f.read())
Hello world!
foo bar
Everything is over.
Using a context block with with is preferred for file objects (and other resources) anyway.
tl;dr Use a context manager. See the bottom of this answer for important cautions about them.
Files got more complicated in Python 3. While there are some techniques that can be used with normal user classes, those techniques don't work with built-in classes. One way is to mix in a desired class before instantiating it, but this requires knowing what the mix-in class should be first:
class MyFileType(???):
    def __init__(self, ...):
        # stuff here

    def close(self):
        # more stuff here
Because there are so many types, and more could possibly be added in the future (unlikely, but possible), and we don't know for sure which will be returned until after the call to open, this method doesn't work.
Another method is to splice the returned file's class into our custom type's __bases__, then set the returned instance's __class__ attribute to our custom type:
class MyFileType:
    def close(self):
        # stuff here

some_file = open(path_to_file, '...')  # ... = desired options
MyFileType.__bases__ = (some_file.__class__,) + MyFileType.__bases__
but this yields
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: __bases__ assignment: '_io.TextIOWrapper' deallocator differs from 'object'
Yet another method that could work with pure user classes is to create the custom file type on the fly, directly from the returned instance's class, and then update the returned instance's class:
some_file = open(path_to_file, '...')  # ... = desired options

class MyFile(some_file.__class__):
    def close(self):
        super().close()
        print("that's all, folks!")

some_file.__class__ = MyFile
but again:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: __class__ assignment: only for heap types
So, it looks like the best method that will work at all in Python 3, and luckily will also work in Python 2 (useful if you want the same code base to work on both versions) is to have a custom context manager:
class Open(object):
    def __init__(self, *args, **kwds):
        # do custom stuff here
        self.args = args
        self.kwds = kwds

    def __enter__(self):
        # or do custom stuff here :)
        self.file_obj = open(*self.args, **self.kwds)
        # return actual file object so we don't have to worry
        # about proxying
        return self.file_obj

    def __exit__(self, *args):
        # and still more custom stuff here
        self.file_obj.close()
        # or here
and to use it:
with Open('some_file') as data:
    # custom stuff just happened
    for line in data:
        print(line)
# data is now closed, and more custom stuff
# just happened
An important point to keep in mind: any unhandled exception in __init__ or __enter__ will prevent __exit__ from running, so in those two locations you still need to use the try/except and/or try/finally idioms to make sure you don't leak resources.
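For instance (a sketch with a hypothetical post-open setup step), __enter__ can guard its extra work so the file handle is closed if that work raises:

```python
import os
import tempfile

class Open:
    def __init__(self, *args, **kwds):
        self.args = args
        self.kwds = kwds

    def __enter__(self):
        self.file_obj = open(*self.args, **self.kwds)
        try:
            # hypothetical extra setup that might fail
            self.file_obj.seek(0)
        except Exception:
            # don't leak the handle: __exit__ won't run if __enter__ raises
            self.file_obj.close()
            raise
        return self.file_obj

    def __exit__(self, *args):
        self.file_obj.close()

# quick round trip through a temp file to exercise the manager
fd, path = tempfile.mkstemp()
os.close(fd)
with Open(path, 'w') as f:
    f.write('hello')
with Open(path) as f:
    content = f.read()
os.remove(path)
print(content)
```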
I had a similar problem, and a requirement of supporting both Python 2.x and 3.x. What I did was similar to the following (current full version):
class _file_obj(object):
    """Check if `f` is a file name and open the file in `mode`.
    A context manager."""
    def __init__(self, f, mode):
        if isinstance(f, str):
            self.file = open(f, mode)
        else:
            self.file = f
        self.close_file = (self.file is not f)

    def __enter__(self):
        return self

    def __exit__(self, *args, **kwargs):
        if not self.close_file:
            return  # do nothing
        # clean up
        exit = getattr(self.file, '__exit__', None)
        if exit is not None:
            return exit(*args, **kwargs)
        else:
            exit = getattr(self.file, 'close', None)
            if exit is not None:
                exit()

    def __getattr__(self, attr):
        return getattr(self.file, attr)

    def __iter__(self):
        return iter(self.file)
It passes all calls through to the underlying file object and can be initialized from an open file or from a filename. It also works as a context manager. Inspired by this answer.
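A usage sketch (trimming the class slightly for brevity): passing an already-open file-like object means it is not closed on exit, while passing a name would open, and later close, the file:

```python
import io

class _file_obj(object):
    """Open `f` if it is a file name, else wrap the given file object."""
    def __init__(self, f, mode='r'):
        if isinstance(f, str):
            self.file = open(f, mode)
        else:
            self.file = f
        # only close files we opened ourselves
        self.close_file = (self.file is not f)

    def __enter__(self):
        return self

    def __exit__(self, *args, **kwargs):
        if not self.close_file:
            return
        self.file.close()

    def __getattr__(self, attr):
        # forward everything else (read, write, ...) to the wrapped file
        return getattr(self.file, attr)

    def __iter__(self):
        return iter(self.file)

already_open = io.StringIO('line1\nline2\n')
with _file_obj(already_open) as f:
    lines = list(f)  # iteration is forwarded to the wrapped object

print(lines, already_open.closed)  # caller-supplied object stays open
```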