Decorating a DBUS method - python

I'm trying to combine DBUS' asynchronous method calls with Twisted's Deferreds, but I'm encountering trouble in tweaking the usual DBUS service method decorator to do this.
To use the DBUS async callbacks approach, you'd do:
class Service(dbus.service.Object):
    @dbus.service.method(INTERFACE, async_callbacks=('callback', 'errback'))
    def Resources(self, callback, errback):
        callback({'Magic': 42})
There are a few places where I simply wrap those two callbacks in a Deferred, so I thought I'd write a decorator to do that for me:
def twisted_dbus(*args, **kargs):
    def decorator(real_func):
        @dbus.service.method(*args, async_callbacks=('callback', 'errback'), **kargs)
        def wrapped_func(callback, errback, *inner_args, **inner_kargs):
            d = defer.Deferred()
            d.addCallbacks(callback, errback)
            return real_func(d, *inner_args, **inner_kargs)
        return wrapped_func
    return decorator
class Service(dbus.service.Object):
    @twisted_dbus(INTERFACE)
    def Resources(self, deferred):
        deferred.callback({'Magic': 42})
This, however, doesn't work, since the method is bound and self occupies the first positional argument, resulting in this traceback:
$ python service.py
Traceback (most recent call last):
  File "service.py", line 25, in <module>
    class StatusCache(dbus.service.Object):
  File "service.py", line 32, in StatusCache
    @twisted_dbus(INTERFACE)
  File "service.py", line 15, in decorator
    @dbus.service.method(*args, async_callbacks=('callback', 'errback'), **kargs)
  File "/usr/lib/pymodules/python2.6/dbus/decorators.py", line 165, in decorator
    args.remove(async_callbacks[0])
ValueError: list.remove(x): x not in list
I could add an extra argument to the inner function there, like so:
def twisted_dbus(*args, **kargs):
    def decorator(real_func):
        @dbus.service.method(*args, async_callbacks=('callback', 'errback'), **kargs)
        def wrapped_func(possibly_self, callback, errback, *inner_args, **inner_kargs):
            d = defer.Deferred()
            d.addCallbacks(callback, errback)
            return real_func(possibly_self, d, *inner_args, **inner_kargs)
        return wrapped_func
    return decorator
But that seems... well, dumb. Especially if, for some reason, I want to export a non-bound method.
So is it possible to make this decorator work?

Why is it dumb? You're already assuming you know that the first positional argument (after self) is a Deferred. Why is it more dumb to assume that you know that the real first positional argument is self?
If you also want to support free functions, then write another decorator and use that when you know there is no self argument coming.
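For the free-function case, a minimal sketch of what that second decorator might look like, mirroring the question's code (dbus, defer and INTERFACE are assumed to be imported/defined as above; an illustration, not tested against dbus-python):

def twisted_dbus_function(*args, **kargs):
    def decorator(real_func):
        @dbus.service.method(*args, async_callbacks=('callback', 'errback'), **kargs)
        def wrapped_func(callback, errback, *inner_args, **inner_kargs):
            # No self to forward here: real_func is a plain function
            d = defer.Deferred()
            d.addCallbacks(callback, errback)
            return real_func(d, *inner_args, **inner_kargs)
        return wrapped_func
    return decorator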

Use class function as __init__ parameter default value [duplicate]

The Python code below fails for some reason.
class NetVend:
    def blankCallback(data):
        pass

    def sendCommand(command, callback=NetVend.blankCallback):
        return NetVend.sendSignedCommand(command, NetVend.signCommand(command), callback)

    def sendSignedCommand(command, signature, callback):
        pass
I get the following error:
Traceback (most recent call last):
  File "module.py", line 1, in <module>
    class NetVend:
  File "module.py", line 5, in NetVend
    def sendCommand(command, callback=NetVend.blankCallback):
NameError: name 'NetVend' is not defined
You cannot refer to a class name while still defining it.
The class body is executed as a local namespace; you can refer to functions and attributes as local names instead.
Moreover, default values to function keyword parameters are bound at definition time, not when the method is called. Use None as a sentinel instead.
Instead of:
def sendCommand(command, callback=NetVend.blankCallback):
    return NetVend.sendSignedCommand(command, NetVend.signCommand(command), callback)
use:
def sendCommand(command, callback=None):
    if callback is None:
        callback = NetVend.blankCallback
    return NetVend.sendSignedCommand(command, NetVend.signCommand(command), callback)
You probably wanted to use the class as a factory for instances instead of as a namespace for what are essentially functions. Even if you only used one instance (a singleton) there are benefits in actually creating an instance first.
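As a rough sketch of the instance-based version that suggestion points at (signCommand is only referenced in the question, so its body here is a placeholder):

class NetVend(object):
    def blankCallback(self, data):
        pass

    def signCommand(self, command):
        # placeholder: whatever signing the real code does
        pass

    def sendSignedCommand(self, command, signature, callback):
        pass

    def sendCommand(self, command, callback=None):
        if callback is None:
            callback = self.blankCallback
        return self.sendSignedCommand(command, self.signCommand(command), callback)

vend = NetVend()
vend.sendCommand('example-command')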
Well, I wouldn't say the first, but the second option is certainly true :-)
The trouble is that the default argument is evaluated when the def statement runs (at definition time), but at that point NetVend does not exist in that scope, because (obviously) the class itself has not yet been fully evaluated.
The way round it is to set the default to None, and check within the method:
def sendCommand(command, callback=None):
    if callback is None:
        callback = NetVend.blankCallback
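To see that a default is evaluated only once, when the def statement runs, here is a quick illustration (not from the original answer):

import time

def stamp(when=time.time()):   # the default is computed once, at definition time
    return when

first = stamp()
time.sleep(1)
second = stamp()
assert first == second         # both calls see the same default value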

__call__ or __init__ called here? Don't understand which and why

Edit: this is unfortunately not answered in What is the difference between __init__ and __call__ in Python?
class OAuth2Bearer(requests.auth.AuthBase):
    def __init__(self, api_key, access_token):
        self._api_key = api_key
        self._access_token = access_token

    def __call__(self, r):
        r.headers['Api-Key'] = self._api_key
        r.headers['Authorization'] = "Bearer {}".format(self._access_token)
        return r

#############

class AllegroAuthHandler(object):
    def apply_auth(self):
        return OAuth2Bearer(self._api_key, self.access_token)  # what will happen here?
I read about __init__ and __call__, but I still don't understand what is going on in this code.
I don't understand:
1.) Which method will be called, __init__ or __call__
2.) If __init__, then __init__ doesn't return anything
3.) If __call__, then __call__ can't be called with two parameters
I think __init__ should be called, because we have X(), not x(), as in the example below from this answer:
x = X() # __init__ (constructor)
x() # __call__
I believe this is what you're looking for.
The behaviour of calling an object in Python is governed by its type's __call__, so this:
OAuth2Bearer(args)
is actually this:
type(OAuth2Bearer).__call__(OAuth2Bearer, args)
What is the type of OAuth2Bearer, also called its "metaclass"? If not type, the default, then a subclass of type (this is strictly enforced by Python). From the link above:
If we ignore error checking for a minute, then for regular class instantiation this is roughly equivalent to:
def __call__(obj_type, *args, **kwargs):
    obj = obj_type.__new__(obj_type, *args, **kwargs)
    if obj is not None and isinstance(obj, obj_type):
        obj.__init__(*args, **kwargs)
    return obj
So the result of the call is the result of object.__new__ after it has been passed to object.__init__. object.__new__ basically just allocates space for a new object and is the only way of doing so AFAIK. To call OAuth2Bearer.__call__, you would have to call the instance:
OAuth2Bearer(init_args)(call_args)
I'd say it's neither here.
The part of code that's causing confusion is
OAuth2Bearer(self._api_key, self.access_token)
You need to know one thing: While OAuth2Bearer is the name of a class, it's also an object of class type (a built-in class). So when you write the above line, what's actually called is
type.__call__()
This can be easily verified if you try this code:
print(repr(OAuth2Bearer.__call__))
It will print something like this:
<method-wrapper '__call__' of type object at 0x12345678>
What type.__call__ does and returns is well covered in other questions: it calls OAuth2Bearer.__new__() to create an object, then initializes that object with obj.__init__(), and returns that object.
You can think of OAuth2Bearer(self._api_key, self.access_token) as doing roughly this (pseudo-code for illustration purposes):

OAuth2Bearer(self._api_key, self.access_token):
    obj = OAuth2Bearer.__new__(OAuth2Bearer, self._api_key, self.access_token)
    obj.__init__(self._api_key, self.access_token)
    return obj
__init__() is called when you call the class itself.
__call__() is called when you call an instance of the class.
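A tiny self-contained illustration of the difference (a toy class, not from the question):

class Demo:
    def __init__(self, value):
        print("__init__ ran with", value)
        self.value = value

    def __call__(self, extra):
        print("__call__ ran with", extra)
        return self.value + extra

d = Demo(1)        # calling the class -> __init__ runs, d is a Demo instance
result = d(2)      # calling the instance -> __call__ runs, result == 3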

super() in a decorated subclass in Python 2

I'm trying to use super in a subclass which is wrapped in another class using a class decorator:
def class_decorator(cls):
    class WrapperClass(object):
        def make_instance(self):
            return cls()
    return WrapperClass

class MyClass(object):
    def say(self, x):
        print(x)

@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super(MySubclass, self).say(x.upper())
However, the call to super fails:
>>> MySubclass().make_instance().say('hello')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 4, in say
TypeError: super(type, obj): obj must be an instance or subtype of type
The problem is that, when say is called, MySubclass doesn't refer to the original class anymore, but to the return value of the decorator.
One possible solution would be to store the value of MySubclass before decorating it:
class MySubclass(MyClass):
    def say(self, x):
        super(_MySubclass, self).say(x.upper())

_MySubclass = MySubclass
MySubclass = class_decorator(MySubclass)
This works, but isn't intuitive and would need to be repeated for each decorated subclass. I'm looking for a way that doesn't need additional boilerplate for each decorated subclass -- adding more code in one place (say, the decorator) would be OK.
Update: In Python 3 this isn't a problem, since you can use __class__ (or the super variant without arguments), so the following works:
@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super().say(x.upper())
Unfortunately, I'm stuck with Python 2.7 for this project.
The problem is that your decorator returns a different class than Python (or anyone who uses your code) expects. super not working is just one of the many unfortunate consequences:
>>> isinstance(MySubclass().make_instance(), MySubclass)
False
>>> issubclass(MySubclass, MyClass)
False
>>> pickle.dumps(MySubclass().make_instance())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
_pickle.PicklingError: Can't pickle <class '__main__.MySubclass'>: it's not the same object as __main__.MySubclass
This is why a class decorator should modify the class instead of returning a different one. The correct implementation would look like this:
def class_decorator(wrapped_cls):
    @classmethod
    def make_instance(cls):
        return cls()
    wrapped_cls.make_instance = make_instance
    return wrapped_cls
Now super and everything else will work as expected:
>>> MySubclass().make_instance().say('hello')
HELLO
The problem occurs because, at the time MySubclass.say() is called, the global name MySubclass no longer refers to what's defined in your code as class MySubclass. It refers to WrapperClass, which isn't related to the original MySubclass in any way.
If you are using Python 3, you can get around this by NOT passing any arguments to super, like this:
super().say(x.upper())
I don't really know why you use this specific construct, but it does look strange that a subclass of MyClass which defines say() in its source code ends up as something that does not have that method, which is the case in your code.
Note you could change the class WrapperClass line to make it read
class WrapperClass(cls):
This will make your wrapper a subclass of the one you just decorated. It doesn't help with your super(MySubclass, self) call, since you still need to remove the args (which only works on Python 3), but at least an instance created as x = MySubclass() would have a say method, as one would expect at first glance.
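For illustration, that variant might look like this (not from the original answer; it relies on the zero-argument super(), so Python 3 only):

def class_decorator(cls):
    class WrapperClass(cls):            # subclass the decorated class
        def make_instance(self):
            return cls()
    return WrapperClass

class MyClass(object):
    def say(self, x):
        print(x)

@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super().say(x.upper())          # zero-argument form works even after rebinding

MySubclass().say('hello')                   # prints HELLO
MySubclass().make_instance().say('hello')   # also prints HELLO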
EDIT: I've come up with a way around this, but it really looks odd and has the disadvantage of making the 'wrapped' class know that it is being wrapped (and it becomes reliant on that, making it unusable if you remove the decorator):
def class_decorator(cls):
    class WrapperClass(object):
        def make_instance(self):
            i = cls()
            i._wrapped = cls
            return i
    return WrapperClass

class MyClass(object):
    def say(self, x):
        print(x)

@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super(self._wrapped, self).say(x.upper())

# make_instance returns an instance of the original, non-decorated class
i = MySubclass().make_instance()
i.say('hello')
In essence, _wrapped saves a class reference as it was at declaration time, consistent with using the regular super(this_class_name, self) builtin call.

Preventing function (or decorator) from being nested

I've got some code in a decorator that I only want run once. Many other functions (utility and otherwise) will be called later down the line, and I want to ensure that other functions that may have this decorator aren't accidentally used way down in the nest of function calls.
I also want to be able to check, at any point, whether or not the current code has been wrapped in the decorator or not.
I've written this, but I just wanted to see if anyone else can think of a better/more elegant solution than checking for the (hopefully!) unique function name in the stack.
import inspect

def my_special_wrapper(fn):
    def my_special_wrapper(*args, **kwargs):
        """ Do some magic, only once! """
        # Check we've not done this before
        for frame in inspect.stack()[1:]:  # get stack, ignoring current!
            if frame[3] == 'my_special_wrapper':
                raise StandardError('Special wrapper cannot be nested')

        # Do magic then call fn
        # ...
        fn(*args, **kwargs)
    return my_special_wrapper

def within_special_wrapper():
    """ Helper to check that the function has been specially wrapped """
    for frame in inspect.stack():
        if frame[3] == 'my_special_wrapper':
            return True
    return False

@my_special_wrapper
def foo():
    print within_special_wrapper()
    bar()
    print 'Success!'

@my_special_wrapper
def bar():
    pass

foo()
Here is an example of using a global for this task - in what I believe is a relatively safe way:
from contextlib import contextmanager
from functools import wraps

_within_special_context = False

@contextmanager
def flag():
    global _within_special_context
    _within_special_context = True
    try:
        yield
    finally:
        _within_special_context = False

# I'd argue this would be best replaced by just checking the variable, but
# included for completeness.
def within_special_wrapper():
    return _within_special_context

def my_special_wrapper(f):
    @wraps(f)
    def internal(*args, **kwargs):
        if not _within_special_context:
            with flag():
                ...
                f(*args, **kwargs)
        else:
            raise Exception("No nested calls!")
    return internal

@my_special_wrapper
def foo():
    print(within_special_wrapper())
    bar()
    print('Success!')

@my_special_wrapper
def bar():
    pass

foo()
Which results in:
True
Traceback (most recent call last):
  File "/Users/gareth/Development/so/test.py", line 39, in <module>
    foo()
  File "/Users/gareth/Development/so/test.py", line 24, in internal
    f(*args, **kwargs)
  File "/Users/gareth/Development/so/test.py", line 32, in foo
    bar()
  File "/Users/gareth/Development/so/test.py", line 26, in internal
    raise Exception("No nested calls!")
Exception: No nested calls!
Using a context manager ensures that the variable is unset. You could just use try/finally, but if you want to modify the behaviour for different situations, the context manager can be made to be flexible and reusable.
The obvious solution is to have special_wrapper set a global flag, and just skip its magic if the flag is set.
This is about the only good use of a global variable: to let a single piece of code store information that is only used within that code, but which needs to outlive any single execution of it.
It doesn't need to be set in global scope. The function could set the flag on itself, for example, or on any object or class, as long as nothing else will touch it.
As noted by Lattyware in comments, you'll want to use either a try/except, or perhaps even better, a context manager to ensure the variable is unset.
Update: If you need the wrapped code to be able to check if it is wrapped, then provide a function which returns the value of the flag. You might want to wrap it all up with a class for neatness.
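A rough sketch of the flag-on-the-function idea combined with such a checker function (names are illustrative, not from the question):

from functools import wraps

def my_special_wrapper(fn):
    @wraps(fn)
    def wrapper(*args, **kwargs):
        if my_special_wrapper.active:
            raise RuntimeError('Special wrapper cannot be nested')
        my_special_wrapper.active = True
        try:
            # do the one-time magic here, then call fn
            return fn(*args, **kwargs)
        finally:
            my_special_wrapper.active = False
    return wrapper

my_special_wrapper.active = False  # flag stored on the decorator itself

def within_special_wrapper():
    return my_special_wrapper.active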
Update 2: I see you're doing this for transaction management. There are probably already libraries which do this. I strongly recommend that you at least look at their code.
While my solution technically works, it requires a manual reset of the decorator. You could, however, modify things so that the outermost function is instead a class (with the instances being the wrappers of the decorated functions passed to it in __init__) and have reset() called in __exit__(); that would let you use the with statement to make the decorator usable only once within a context. Also note that it requires Python 3 because of the nonlocal keyword, but it can easily be adapted to 2.7 with a dict in place of the flag variable.
def once_usable(decorator):
    "Apply this decorator function to the decorator you want to be usable only once until it is reset."
    def outer_wrapper():
        flag = False
        def inner_wrapper(*args, **kwargs):
            nonlocal flag
            if not flag:
                flag = True
                return decorator(*args, **kwargs)
            else:
                print("Decorator currently unusable.")  # raising an Error also works
        def decorator_reset():
            nonlocal flag
            flag = False
        return (inner_wrapper, decorator_reset)
    return outer_wrapper()
Testing:
>>> def a(aa):
...     return aa*2

>>> def b(bb):
...     def wrapper(*args, **kwargs):
...         print("Decorated.")
...         return bb(*args, **kwargs)
...     return wrapper

>>> dec, reset = once_usable(b)
>>> aa = dec(a)
>>> aa(22)
Decorated.
44
>>> aaa = dec(a)
Decorator currently unusable.
>>> reset()
>>> aaa = dec(a)
>>> aaa(11)
Decorated.
22

subclassing file objects (to extend open and close operations) in python 3

Suppose I want to extend the built-in file abstraction with extra operations at open and close time. In Python 2.7 this works:
class ExtFile(file):
    def __init__(self, *args):
        file.__init__(self, *args)
        # extra stuff here

    def close(self):
        file.close(self)
        # extra stuff here
Now I'm looking at updating the program to Python 3, in which open is a factory function that might return an instance of any of several different classes from the io module, depending on how it's called. I could in principle subclass all of them, but that's tedious, and I'd have to reimplement the dispatching that open does. (In Python 3 the distinction between binary and text files matters rather more than it does in 2.x, and I need both.) These objects are going to be passed to library code that might do just about anything with them, so the idiom of making a "file-like" duck-typed class that wraps the return value of open and forwards the necessary methods would be very verbose.
Can anyone suggest a 3.x approach that involves as little additional boilerplate as possible beyond the 2.x code shown?
You could just use a context manager instead. For example this one:
class SpecialFileOpener:
    def __init__(self, fileName, someOtherParameter):
        self.f = open(fileName)
        # do more stuff
        print(someOtherParameter)

    def __enter__(self):
        return self.f

    def __exit__(self, exc_type, exc_value, traceback):
        self.f.close()
        # do more stuff
        print('Everything is over.')
Then you can use it like this:
>>> with SpecialFileOpener('C:\\test.txt', 'Hello world!') as f:
...     print(f.read())

Hello world!
foo bar
Everything is over.
Using a context block with with is preferred for file objects (and other resources) anyway.
tl;dr Use a context manager. See the bottom of this answer for important cautions about them.
Files got more complicated in Python 3. While there are some methods that can be used on normal user classes, those methods don't work with built-in classes. One way is to mix in a desired class before instantiating it, but this requires knowing what the mix-in class should be first:
class MyFileType(???):
    def __init__(...):
        # stuff here
    def close(self):
        # more stuff here
Because there are so many types, and more could possibly be added in the future (unlikely, but possible), and we don't know for sure which will be returned until after the call to open, this method doesn't work.
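For instance, in CPython 3 the concrete class you get back from open depends on the mode and buffering arguments (interactive session for illustration; the file name is hypothetical):

>>> type(open('notes.txt', 'r'))
<class '_io.TextIOWrapper'>
>>> type(open('notes.txt', 'rb'))
<class '_io.BufferedReader'>
>>> type(open('notes.txt', 'rb', buffering=0))
<class '_io.FileIO'>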
Another method is to change our custom type to have the returned file's class in its __bases__, and to modify the returned instance's __class__ attribute to our custom type:
class MyFileType:
    def close(self):
        # stuff here

some_file = open(path_to_file, '...')  # ... = desired options
MyFileType.__bases__ = (some_file.__class__,) + MyFileType.__bases__
but this yields
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: __bases__ assignment: '_io.TextIOWrapper' deallocator differs from 'object'
Yet another method that could work with pure user classes is to create the custom file type on the fly, directly from the returned instance's class, and then update the returned instance's class:
some_file = open(path_to_file, '...')  # ... = desired options

class MyFile(some_file.__class__):
    def close(self):
        super().close()
        print("that's all, folks!")

some_file.__class__ = MyFile
but again:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: __class__ assignment: only for heap types
So, it looks like the best method that will work at all in Python 3, and luckily will also work in Python 2 (useful if you want the same code base to work on both versions) is to have a custom context manager:
class Open(object):
    def __init__(self, *args, **kwds):
        # do custom stuff here
        self.args = args
        self.kwds = kwds

    def __enter__(self):
        # or do custom stuff here :)
        self.file_obj = open(*self.args, **self.kwds)
        # return actual file object so we don't have to worry
        # about proxying
        return self.file_obj

    def __exit__(self, *args):
        # and still more custom stuff here
        self.file_obj.close()
        # or here
and to use it:
with Open('some_file') as data:
    # custom stuff just happened
    for line in data:
        print(line)
# data is now closed, and more custom stuff
# just happened
An important point to keep in mind: any unhandled exception in __init__ or __enter__ will prevent __exit__ from running, so in those two locations you still need to use the try/except and/or try/finally idioms to make sure you don't leak resources.
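A hedged sketch of guarding __enter__ along those lines, so the handle isn't leaked if the custom setup fails (do_custom_setup is a hypothetical placeholder):

class Open(object):
    def __init__(self, *args, **kwds):
        self.args = args
        self.kwds = kwds

    def __enter__(self):
        self.file_obj = open(*self.args, **self.kwds)
        try:
            do_custom_setup(self.file_obj)  # hypothetical extra work
        except Exception:
            self.file_obj.close()           # don't leak the handle
            raise
        return self.file_obj

    def __exit__(self, *args):
        self.file_obj.close()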
I had a similar problem, and a requirement of supporting both Python 2.x and 3.x. What I did was similar to the following (current full version):
class _file_obj(object):
    """Check if `f` is a file name and open the file in `mode`.
    A context manager."""
    def __init__(self, f, mode):
        if isinstance(f, str):
            self.file = open(f, mode)
        else:
            self.file = f
        self.close_file = (self.file is not f)

    def __enter__(self):
        return self

    def __exit__(self, *args, **kwargs):
        if (not self.close_file):
            return  # do nothing
        # clean up
        exit = getattr(self.file, '__exit__', None)
        if exit is not None:
            return exit(*args, **kwargs)
        else:
            exit = getattr(self.file, 'close', None)
            if exit is not None:
                exit()

    def __getattr__(self, attr):
        return getattr(self.file, attr)

    def __iter__(self):
        return iter(self.file)
It passes all calls through to the underlying file object and can be initialized from an open file or from a filename. It also works as a context manager. Inspired by this answer.
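Usage might look like this (the file name is hypothetical); the wrapper closes the file only when it opened it itself:

# Given a file name: _file_obj opens it and closes it on exit
with _file_obj('results.txt', 'r') as f:
    print(f.read())

# Given an already-open file: _file_obj leaves closing to the caller
existing = open('results.txt', 'r')
with _file_obj(existing, 'r') as f:
    print(f.read())
existing.close()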
