Use class function as __init__ parameter default value [duplicate] - python

The Python code below fails for some reason.
class NetVend:
    def blankCallback(data):
        pass

    def sendCommand(command, callback=NetVend.blankCallback):
        return NetVend.sendSignedCommand(command, NetVend.signCommand(command), callback)

    def sendSignedCommand(command, signature, callback):
        pass
I get the following error:
Traceback (most recent call last):
  File "module.py", line 1, in <module>
    class NetVend:
  File "module.py", line 5, in NetVend
    def sendCommand(command, callback=NetVend.blankCallback):
NameError: name 'NetVend' is not defined

You cannot refer to a class name while it is still being defined.
The class body is executed in its own namespace, so inside the body you can refer to already-defined functions and attributes as plain local names instead.
Moreover, default values for keyword parameters are bound at function definition time, not when the method is called. Use None as a sentinel instead.
Instead of:
def sendCommand(command, callback=NetVend.blankCallback):
    return NetVend.sendSignedCommand(command, NetVend.signCommand(command), callback)
use:
def sendCommand(command, callback=None):
    if callback is None:
        callback = NetVend.blankCallback
    return NetVend.sendSignedCommand(command, NetVend.signCommand(command), callback)
You probably wanted to use the class as a factory for instances instead of as a namespace for what are essentially functions. Even if you only used one instance (a singleton) there are benefits in actually creating an instance first.
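The definition-time binding of defaults can be seen in isolation with a mutable default (a minimal sketch, unrelated to NetVend itself):

```python
def append_to(x, items=[]):  # the default list is created once, when the def statement runs
    items.append(x)
    return items

print(append_to(1))  # [1]
print(append_to(2))  # [1, 2] -- the same list object is reused across calls
```

This is exactly why the None-sentinel pattern is the idiomatic workaround.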

Well, I wouldn't say the first, but the second option is certainly true :-)
The trouble is that the default argument is evaluated at definition time, when the def statement runs, but at that point NetVend does not exist in that scope, because (obviously) the class itself has not yet been fully evaluated.
The way round it is to set the default to None, and check within the method:
def sendCommand(command, callback=None):
    if callback is None:
        callback = NetVend.blankCallback

Related

super() in a decorated subclass in Python 2

I'm trying to use super in a subclass which is wrapped in another class using a class decorator:
def class_decorator(cls):
    class WrapperClass(object):
        def make_instance(self):
            return cls()
    return WrapperClass

class MyClass(object):
    def say(self, x):
        print(x)

@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super(MySubclass, self).say(x.upper())
However, the call to super fails:
>>> MySubclass().make_instance().say('hello')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 4, in say
TypeError: super(type, obj): obj must be an instance or subtype of type
The problem is that, when say is called, MySubclass doesn't refer to the original class anymore, but to the return value of the decorator.
One possible solution would be to store the value of MySubclass before decorating it:
class MySubclass(MyClass):
    def say(self, x):
        super(_MySubclass, self).say(x.upper())
_MySubclass = MySubclass
MySubclass = class_decorator(MySubclass)
This works, but isn't intuitive and would need to be repeated for each decorated subclass. I'm looking for a way that doesn't need additional boilerplate for each decorated subclass -- adding more code in one place (say, the decorator) would be OK.
Update: In Python 3 this isn't a problem, since you can use __class__ (or the super variant without arguments), so the following works:
@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super().say(x.upper())
Unfortunately, I'm stuck with Python 2.7 for this project.
The problem is that your decorator returns a different class than python (or anyone who uses your code) expects. super not working is just one of the many unfortunate consequences:
>>> isinstance(MySubclass().make_instance(), MySubclass)
False
>>> issubclass(MySubclass, MyClass)
False
>>> pickle.dumps(MySubclass().make_instance())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
_pickle.PicklingError: Can't pickle <class '__main__.MySubclass'>: it's not the same object as __main__.MySubclass
This is why a class decorator should modify the class instead of returning a different one. The correct implementation would look like this:
def class_decorator(wrapped_cls):
    @classmethod
    def make_instance(cls):
        return cls()
    wrapped_cls.make_instance = make_instance
    return wrapped_cls
Now super and everything else will work as expected:
>>> MySubclass().make_instance().say('hello')
HELLO
The problem occurs because, by the time MySubclass.say() is called, the global name MySubclass no longer refers to what your code defines as class MySubclass. It refers to WrapperClass, which isn't in any way related to the original MySubclass.
If you are using Python3, you can get around this by NOT passing any arguments to 'super', like this:
super().say(x.upper())
I don't really know why you use this specific construct, but it does look strange that a subclass of MyClass that defines say() in its source ends up, after decoration, as something that has no say method at all - which is the case in your code.
Note you could change the class WrapperClass line to read:
class WrapperClass(cls):
This makes your wrapper a subclass of the class you just decorated. It doesn't help with your super(MySubclass, self) call - you still need to drop the arguments, which only works on Python 3 - but at least an instance created as x = MySubclass() would have a say method, as one would expect at first glance.
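To illustrate that suggestion, a minimal sketch (simplified to a single class, with say returning rather than printing):

```python
def class_decorator(cls):
    class WrapperClass(cls):  # subclass the decorated class instead of object
        def make_instance(self):
            return cls()
    return WrapperClass

@class_decorator
class MyClass(object):
    def say(self, x):
        return x

obj = MyClass()
print(isinstance(obj, MyClass))  # True: MyClass now names WrapperClass, a subclass of the original
print(obj.say('hi'))             # inherited methods are available on the wrapper directly
```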
EDIT: I've come up with a way around this, but it really looks odd and has the disadvantage of making the 'wrapped' class know that it is being wrapped (and it becomes reliant on that, making it unusable if you remove the decorator):
def class_decorator(cls):
    class WrapperClass(object):
        def make_instance(self):
            i = cls()
            i._wrapped = cls
            return i
    return WrapperClass

class MyClass(object):
    def say(self, x):
        print(x)

@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super(self._wrapped, self).say(x.upper())
# make_instance returns an instance of the original, non-decorated class
i = MySubclass().make_instance()
i.say('hello')
In essence, _wrapped saves a class reference as it was at declaration time, consistent with using the regular super(this_class_name, self) builtin call.

Usage of the original class method after being decorated requires the object instance as parameter

I have the following code:
A decorator:
from functools import wraps
from time import time

def pyDecorator(func):
    print func
    @wraps(func)
    def wrapped(*args, **kwargs):
        print args
        print kwargs
        tBegin = time()
        result = func(*args, **kwargs)
        tEnd = time()
        if result:
            # UI update
            print("\nTBegin '{}'({} s)".format(func.__name__, tBegin))
            # UI and report update
            print("TEnd '{}' ({} s) ({} s) Result:{}".format(func.__name__, tEnd, tEnd - tBegin, result))
        return result
    # workaround to keep a reference to the original function
    wrapped._original = func
    return wrapped
And a decorated class method:
class Dummy(object):
    @pyDecorator
    def ClassMethod(self):
        print "Original class code executed"
        return True
If I call the original function in the following way, I get the error "TypeError: ClassMethod() takes exactly 1 argument (0 given)":
ClassInstance.ClassMethod._original()
So I am forced to use the following call:
ClassInstance.ClassMethod._original(ClassInstance)
Is it possible to do this as in the first way ? I do not understand why I should put the class instance as a parameter when it is already provided.
ClassInstance.ClassMethod._original is a function not bound to any class instance.
Note that the transformation from function to method happens when a function object is accessed via a class instance, e.g. using a dotted reference. Here, however, _original is bound only to another function object (wrapper, which is elevated to a bound method at lookup time), not to a class instance. The implicit self parameter is therefore not passed; you have to pass it explicitly.

ClassInstance.ClassMethod._original
     ^             ^          ^
     |             |          |- plain function attribute stored on the wrapper
     |             |- bound method (created at lookup time)
     |- instance
I do not understand why I should put the class instance as a parameter
when it is already provided
No, it's not already provided.
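If you do need a bound version of _original, the descriptor protocol can create one by hand. A sketch in Python 3 syntax, with a stripped-down decorator standing in for pyDecorator:

```python
import functools

def decorated(func):
    @functools.wraps(func)
    def wrapped(*args, **kwargs):
        return func(*args, **kwargs)
    wrapped._original = func  # plain function attribute, never bound to an instance
    return wrapped

class Dummy(object):
    @decorated
    def class_method(self):
        return True

instance = Dummy()
# __get__ performs the function-to-method transformation manually:
bound = instance.class_method._original.__get__(instance, Dummy)
print(bound())  # True -- self is now supplied implicitly
```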

Using a method from the parent class

I want to call a method from the parent class in a child class.
I use XX.__init__() in my child class and call the press function from the parent class. But it fails when I run the following code:
Func.py
import time
import win32api
import win32con

class PC:
    def __init__(self):
        PCKeyDis = {}
        self.PCKeyDis = PCKeyDis

    def Press(self, key):
        KeyDis = self.PCKeyDis
        if len(key) == 1 and key.islower():
            key = key.upper()
        win32api.keybd_event(KeyDis[key], 0, 0, 0)
        time.sleep(0.1)
        win32api.keybd_event(KeyDis[key], 0, win32con.KEYEVENTF_KEYUP, 0)

class PCFunc(PC):
    def __init__(self):
        pass

    def Sentence(self, string):
        PC.__init__()
        strlist = list(string)
        for i in xrange(len(strlist)):
            if strlist[i] == ' ':
                strlist[i] = 'Space'
            PC.Press(strlist[i])  # use this function
action.py
import Func
import win32gui
PC = Func.PC()
PCFunc = Func.PCFunc ()
win32gui.SetForegroundWindow(win32gui.FindWindow(winclass,winnm))
PCFunc.Sentence(path)
I get:
unbound method Sentence() must be called with PCFunc instance as first argument (got str instance instead)
If you want to call the constructor of the base class, then you do it on instantiation in the __init__() method, not in the Sentence() method:
def __init__(self):
    # name the base explicitly; super(self.__class__, self) recurses if the class
    # is subclassed again (and in Python 2, super also requires new-style classes)
    super(PCFunc, self).__init__()
Since Sentence() is an instance method, you need to call it via an instance of the class (like the error tells you):
pc_func = PCFunc()
pc_func.Sentence(var)
Here you are calling the method with an undefined variable:
PCFunc.Sentence(path)
Instead you need to give a string as parameter, so either write Sentence('path'), or define the variable first:
path = 'my path'
pc_func.Sentence(path)
Do not use the same name as the class name for an instance of the class:
PCFunc = Func.PCFunc ()
Otherwise the variable name storing the instance overwrites the class name.
Apart from that, it is unclear what your code is actually supposed to do. Have a look at the Python code conventions (PEP 8) for a first step to making your code more readable. Then do some research about classes and inheritance.
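Putting those fixes together, a corrected sketch of the inheritance pattern (the win32 key handling is stubbed out with a list here, so the example runs anywhere):

```python
class PC(object):
    def __init__(self):
        self.pressed = []  # stub standing in for the win32 key-dispatch table

    def Press(self, key):
        self.pressed.append(key)  # real code would call win32api.keybd_event here

class PCFunc(PC):
    def __init__(self):
        PC.__init__(self)  # the base __init__ needs the instance passed explicitly

    def Sentence(self, string):
        for ch in string:
            self.Press('Space' if ch == ' ' else ch)

pc_func = PCFunc()       # create an instance first...
pc_func.Sentence('a b')  # ...then call the method on it
print(pc_func.pressed)   # ['a', 'Space', 'b']
```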
The code you posted does not produce the error you posted. Here is an example that will produce that error:
class Dog:
    def do_stuff(self, string):
        print string

d = Dog()
d.do_stuff('hello')
Dog.do_stuff(d, 'goodbye')
Dog.do_stuff('goodbye')
--output:--
hello
goodbye
Traceback (most recent call last):
  File "1.py", line 9, in <module>
    Dog.do_stuff('goodbye')
TypeError: unbound method do_stuff() must be called with Dog instance as first argument (got str instance instead)
An __init__() function can also produce that error:
class Dog:
    def __init__(self):
        pass
    def do_stuff(self, string):
        print(string)

Dog.__init__()
--output:--
Traceback (most recent call last):
  File "1.py", line 7, in <module>
    Dog.__init__()
TypeError: unbound method __init__() must be called with Dog instance as first argument (got nothing instead)
In the line:
d.do_stuff('hello')
the fragment d.do_stuff causes python to create and return a bound method object--which is then immediately executed by the function execution operator () in the fragment ('hello'). The bound method is bound to the instance d, hence the reason it is called a bound method. A bound method automatically passes the instance it contains to the method when the method is executed.
On the other hand, when you write:
Dog.do_stuff(....)
the fragment Dog.do_stuff causes python to create and return an unbound method. An unbound method does not contain an instance, so when an unbound method is executed by the function execution operator (), you must manually pass an instance. (In python3, things changed and you can pass anything as the first argument--an instance of the class isn't required.)
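In Python 3 terms (where the unbound-method object is gone but the binding rule is unchanged), the difference can be sketched like this:

```python
class Dog(object):
    def do_stuff(self, string):
        return string

d = Dog()
bound = d.do_stuff            # dotted access on an instance creates a bound method
print(bound.__self__ is d)    # True: the instance is stored on the method object
print(Dog.do_stuff(d, 'hi'))  # access via the class yields no instance, so pass one explicitly
```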

How to detect an instance data attribute change when debugging?

I am trying to debug a multi-threaded program that uses a third-party package.
At some point, one of the attributes of an object (that is not created directly by me) is changed and I can't figure out what changed it. I could not find anything in my code that changes it.
Since this is a third-party package, I prefer not to change its code directly, but rather patch it from the outside as necessary.
My plan was to somehow tap into or wrap the code that sets the attribute and set a breakpoint or print the stack trace from there.
I tried monkey-patching the __setattr__ method of the instance, but it was not triggered.
I also tried to patch the class itself:
import types

def patch_class(target):
    def method(self, name, value):
        print(name, value)
        print("called from", target)
        setattr(self, name, value)  # break or print trace here
    target.__setattr__ = types.MethodType(method, target)

patch_class(WebSocket)
but then all of the attributes are set on the class itself, as the method is bound to it.
Wrapping the class with a proxy does not really help either, since I am not instantiating it myself, but rather get the instance at some point after its creation.
If it matters, the said class is ws4py's WebSocket that is created by another third-party package, but I consider this an exercise in general debugging techniques.
Is there a more "pythonic" way of tapping into the mutation of an existing instance? (hack-ish ways will be appreciated as well)
I ended up creating a __setattr__ for the class.
def setter_fun(self, name, value):
    print('setting', name, value)
    self.__dict__[name] = value
    if name == 'problematic_prop' and value == 'problematicValue':  # == rather than 'is': string identity is unreliable
        traceback.print_stack()

# and set the class setter magic method
instance.__class__.__setattr__ = setter_fun

It is also possible to delegate to the base implementation instead of touching the __dict__ magic property (a plain setattr here would recurse back into the patched __setattr__):
object.__setattr__(self, name, value)
Now, when something sets the instance's problematic_prop to problematicValue, the stack trace will be printed:
>>> class A(object):
...     def __init__(self):
...         self.foo = 1
...     def set_problematic(self):
...         self.problematic_prop = 'problematicValue'
>>> a = A()
>>> a.__class__.__setattr__ = setter_fun
>>> a.foo = 2
setting foo 2
>>> print(a.foo)
2
>>> a.set_problematic()
setting problematic_prop problematicValue
Traceback (most recent call last):
File "<input>", line 1, in <module>
File "<input>", line 6, in set_problematic
File "<input>", line 5, in setter_fun
NameError: name 'traceback' is not defined
(The session above forgot to import traceback first; with import traceback in place, the stack trace is printed as intended.)
My failed attempts included either trying to attach the __setattr__ to the instance instead of the class, or trying to attach a bound method:
class MyClass(object):
    def setter_fun(self, name, value):
        print('setting', name, value)
        self.__dict__[name] = value
        if name == 'problematic_prop' and value == 'problematicValue':
            traceback.print_stack()

    def set_my_function(self):
        # won't work: the function is bound to the current instance (self)
        some.instance.__class__.__setattr__ = self.setter_fun
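A variant that avoids both failed attempts - it patches the class rather than the instance, and chains to the original __setattr__ so normal assignment keeps working. watch_attribute is a helper name introduced here for illustration:

```python
import traceback

def watch_attribute(cls, attr_name, bad_value):
    original = cls.__setattr__  # usually object.__setattr__ for plain classes

    def setter(self, name, value):
        if name == attr_name and value == bad_value:  # compare with ==, not 'is'
            traceback.print_stack()                   # show who performed the assignment
        original(self, name, value)                   # delegate so the attribute is really set

    cls.__setattr__ = setter

class A(object):
    pass

watch_attribute(A, 'prop', 'bad')
a = A()
a.prop = 'bad'  # triggers the stack trace on stderr, then stores the value
print(a.prop)   # bad
```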

Decorating a DBUS method

I'm trying to combine DBUS' asynchronous method calls with Twisted's Deferreds, but I'm encountering trouble in tweaking the usual DBUS service method decorator to do this.
To use the DBUS async callbacks approach, you'd do:
class Service(dbus.service.Object):
    @dbus.service.method(INTERFACE, async_callbacks=('callback', 'errback'))
    def Resources(self, callback, errback):
        callback({'Magic' : 42})
There's a few places where I simply wrap those two methods in a Deferred, so I thought I'd create a decorator to do that for me:
def twisted_dbus(*args, **kargs):
    def decorator(real_func):
        @dbus.service.method(*args, async_callbacks=('callback', 'errback'), **kargs)
        def wrapped_func(callback, errback, *inner_args, **inner_kargs):
            d = defer.Deferred()
            d.addCallbacks(callback, errback)
            return real_func(d, *inner_args, **inner_kargs)
        return wrapped_func
    return decorator

class Service(dbus.service.Object):
    @twisted_dbus(INTERFACE)
    def Resources(self, deferred):
        deferred.callback({'Magic' : 42})
This, however, doesn't work since the method is bound and takes the first argument, resulting in this traceback:
$ python service.py
Traceback (most recent call last):
  File "service.py", line 25, in <module>
    class StatusCache(dbus.service.Object):
  File "service.py", line 32, in StatusCache
    @twisted_dbus(INTERFACE)
  File "service.py", line 15, in decorator
    @dbus.service.method(*args, async_callbacks=('callback', 'errback'), **kargs)
  File "/usr/lib/pymodules/python2.6/dbus/decorators.py", line 165, in decorator
    args.remove(async_callbacks[0])
ValueError: list.remove(x): x not in list
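The underlying issue can be reproduced without dbus or Twisted: the instance reaches a plain wrapper as an ordinary first positional argument. A minimal sketch:

```python
def show_args(func):
    def wrapped(*args, **kwargs):
        wrapped.last_arg_count = len(args)  # record how many positional args arrived
        return func(*args, **kwargs)
    return wrapped

class C(object):
    @show_args
    def m(self, x):
        return x

result = C().m(1)
print(C.m.last_arg_count)  # 2 -- the instance (self) was passed positionally, just like x
```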
I could add an extra argument to the inner function there, like so:
def twisted_dbus(*args, **kargs):
    def decorator(real_func):
        @dbus.service.method(*args, async_callbacks=('callback', 'errback'), **kargs)
        def wrapped_func(possibly_self, callback, errback, *inner_args, **inner_kargs):
            d = defer.Deferred()
            d.addCallbacks(callback, errback)
            return real_func(possibly_self, d, *inner_args, **inner_kargs)
        return wrapped_func
    return decorator
But that seems... well, dumb. Especially if, for some reason, I want to export a non-bound method.
So is it possible to make this decorator work?
Why is it dumb? You're already assuming you know that the first positional argument (after self) is a Deferred. Why is it more dumb to assume that the real first positional argument is self?
If you also want to support free functions, then write another decorator and use that when you know there is no self argument coming.
