This Python 2 example:
class LoggingDict(dict):
    # Simple example of extending a builtin class
    def __setitem__(self, key, value):
        logging.info('Setting %r to %r' % (key, value))
        super(LoggingDict, self).__setitem__(key, value)
and this Python 3 example:
class LoggingDict(dict):
    # Simple example of extending a builtin class
    def __setitem__(self, key, value):
        logging.info('Setting %r to %r' % (key, value))
        super().__setitem__(key, value)
illustrate the fact that Python 2's super requires explicit class and self arguments, while Python 3's doesn't. Why is that? It seems like an irritating limitation.
The link in AKS' comment provides the answer here:
Let's say in the Python 2 example I thought, "I don't like that explicit class reference. What if I change the name of the class, or move this code, and forget to update it?" Let's say I thought, a-ha, I'll replace the explicit class name with self.__class__ and wrote:
class LoggingDict(dict):
    # Simple example of extending a builtin class
    def __setitem__(self, key, value):
        logging.info('Setting %r to %r' % (key, value))
        super(self.__class__, self).__setitem__(key, value)
Now I create a subclass of LoggingDict called SpecialisedLoggingDict (which doesn't override __setitem__), instantiate it, and call __setitem__ on it.
Now self refers to an instance of SpecialisedLoggingDict, so self.__class__ is SpecialisedLoggingDict, super(SpecialisedLoggingDict, self) resolves __setitem__ to LoggingDict's method, and we go straight back into LoggingDict.__setitem__, entering infinite recursion.
The essential point is that in Python 2 a method doesn't really know which class it was defined in; it only knows the class of the instance it's being called on. Python 3 does compile-time "magic", adding a __class__ cell to functions so that super() can be used without the explicit class reference.
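A minimal sketch of that "magic" (the class names here are purely illustrative): merely referencing __class__ inside a method body makes the compiler create a closure cell pointing at the defining class, and zero-argument super() reads that same cell.

def demo():
    pass

class Base:
    def where(self):
        # __class__ is filled in at class-creation time and always refers
        # to Base, the class this method is defined in, not to type(self)
        return __class__

class Derived(Base):
    pass

print(Derived().where())   # <class '__main__.Base'>, even on a Derived instance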
I just realised that a custom __setattr__ doesn't intercept assignments made on the class itself. So this implementation,
class Integer:
    me_is_int = 0
    def __setattr__(self, name, value):
        if not isinstance(value, int):
            raise TypeError
doesn't raise on this:
Integer.me_is_int = "lol"
So, switching to a metaclass:
class IntegerMeta(type):
    def __setattr__(cls, name, value):
        if not isinstance(value, int):
            raise TypeError

class Integer(metaclass=IntegerMeta):
    me_is_int = 0
this works, but this:
Integer().me_is_int = "lol"
again doesn't raise. So do I need to copy the __setattr__ method into Integer to make it work on instances? Is it not possible for Integer to use IntegerMeta's __setattr__ for instances?
You are right in your reasoning: having a custom __setattr__ special method in the metaclass will affect any value setting on the class, and having it on the class will affect all instances of the class.
With that in mind, if you don't want to duplicate code, one option is to arrange for the metaclass itself to inject the logic into each class it creates.
The way you've written it, even as an example, is dangerous, as it will affect every attribute set on the class or its instances - but if you restrict it to a known set of attributes you want to guard, it works.
attributes_to_guard = {"me_is_int",}

class Meta(type):
    def __init__(cls, name, bases, ns, **kw):
        # This line itself would not work if the __setattr__ did not check
        # for a restricted set of attributes to guard:
        cls.__setattr__ = cls.__class__.__setattr__
        # Also, note that this overrides any manually customized
        # __setattr__ on the classes. The mechanism to call those,
        # and still add the guarding logic in the metaclass, would be
        # more complicated, but it can be done.
        super().__init__(name, bases, ns, **kw)

    def __setattr__(self, name, value):
        if name in attributes_to_guard and not isinstance(value, int):
            raise TypeError()

class Integer(metaclass=Meta):
    me_is_int = 0
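As written above, the guarding __setattr__ only validates; it never actually stores the value. A rough way to complete it - my own sketch, not part of the original answer - is to dispatch to type.__setattr__ or object.__setattr__ depending on whether self is a class or an ordinary instance:

attributes_to_guard = {"me_is_int"}

class Meta(type):
    def __init__(cls, name, bases, ns, **kw):
        # Inject the metaclass's __setattr__ into the class so instances
        # are guarded too; this assignment itself passes through
        # Meta.__setattr__, which is why the guard must be limited to a
        # known set of attribute names.
        cls.__setattr__ = cls.__class__.__setattr__
        super().__init__(name, bases, ns, **kw)

    def __setattr__(self, name, value):
        if name in attributes_to_guard and not isinstance(value, int):
            raise TypeError("%s must be an int" % name)
        # Delegate so the value is actually stored (assumption: plain
        # storage is wanted for everything that passes the guard).
        if isinstance(self, type):
            type.__setattr__(self, name, value)
        else:
            object.__setattr__(self, name, value)

class Integer(metaclass=Meta):
    me_is_int = 0

Integer.me_is_int = 1          # ok, goes through the metaclass's __setattr__
Integer().me_is_int = 2        # ok, goes through the injected instance __setattr__
# Integer.me_is_int = "lol"    # would raise TypeError
# Integer().me_is_int = "lol"  # would raise TypeError as well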
I have subclassed dict and need to detect all modifications to it.
(I know I cannot detect an in-place modification of a stored value. That's OK.)
My code:
def __setitem__(self, key, value):
    super().__setitem__(key, value)
    self.modified = True

def __delitem__(self, key):
    super().__delitem__(key)
    self.modified = True
The problem is that it works only for straightforward assignment or deletion. It does not detect changes made by pop(), popitem(), clear() and update().
Why are __setitem__ and __delitem__ bypassed when items are added or deleted? Do I have to redefine all those methods (pop, etc.) as well?
For this usage, you should not subclass dict, but instead use the abstract base classes from the collections module of the Python standard library (collections.abc in current versions).
You should subclass the MutableMapping abstract base class and override the following methods: __getitem__, __setitem__, __delitem__, __iter__ and __len__,
all of them working on an inner dict. The abstract base class ensures that all other methods will be implemented in terms of those.
import collections.abc

class MyDict(collections.abc.MutableMapping):
    def __init__(self):
        self.d = {}
        # other initializations ...

    def __setitem__(self, key, value):
        self.d[key] = value
        self.modified = True

    ...
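Put together, a minimal runnable sketch of that approach might look like this (the class name and the modified flag simply mirror the snippets above; the rest of the methods are the ones the abstract base class requires):

import collections.abc

class MyDict(collections.abc.MutableMapping):
    def __init__(self, *args, **kwargs):
        self.d = dict(*args, **kwargs)
        self.modified = False

    def __setitem__(self, key, value):
        self.d[key] = value
        self.modified = True

    def __delitem__(self, key):
        del self.d[key]
        self.modified = True

    def __getitem__(self, key):
        return self.d[key]

    def __iter__(self):
        return iter(self.d)

    def __len__(self):
        return len(self.d)

m = MyDict()
m.update(a=1, b=2)   # update() is a mixin method that calls __setitem__
m.pop("a")           # pop() calls __getitem__ and __delitem__
print(m.modified)    # True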
pop, popitem, clear and update are not implemented through __setitem__ and __delitem__, so you must redefine them as well.
I suggest looking at the OrderedDict implementation.
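If you do stay with a dict subclass, a sketch of what redefining those methods could look like (the method list and the modified flag come from the question; whether pop() with a default should count as a modification is an assumption here):

class ModifiedDict(dict):
    def _flag(self):
        self.modified = True

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        self._flag()

    def __delitem__(self, key):
        super().__delitem__(key)
        self._flag()

    def pop(self, *args):
        result = super().pop(*args)
        self._flag()
        return result

    def popitem(self):
        result = super().popitem()
        self._flag()
        return result

    def clear(self):
        super().clear()
        self._flag()

    def update(self, *args, **kwargs):
        super().update(*args, **kwargs)
        self._flag()

    def setdefault(self, key, default=None):
        # setdefault() can also insert a key, so it is guarded here too
        result = super().setdefault(key, default)
        self._flag()
        return result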
I am quite an average programmer in Python and I have not done any very complex or major application with Python before... I was reading about new-style classes and came across something quite new to me, which I am trying to understand: the unification of data types and classes, as in:
class defaultdict(dict):
    def __init__(self, default=None):
        dict.__init__(self)
        self.default = default

    def __getitem__(self, key):
        try:
            return dict.__getitem__(self, key)
        except KeyError:
            return self.default
But what's really getting me confused is: why would they unify them? I really can't picture any reason that makes it of high importance. I'll be glad if anybody can throw some light on this, please. Thank you.
The primary reason was to allow for built-in types to be subclassed in the same way user-created classes could be. Prior to new-style classes, to create a dict-like class, you needed to subclass from a specially designed UserDict class, or produce a custom class that provided the full dict protocol. Now, you can just do class MySpecialDict(dict): and override the methods you want to modify.
For the full rundown, see PEP 252 - Making Types Look More Like Classes
For an example, here's a dict subclass that logs modifications to it:
def log(msg):
    ...

class LoggingDict(dict):
    def __setitem__(self, key, value):
        super(LoggingDict, self).__setitem__(key, value)
        log('Updated: {}={}'.format(key, value))
Any instance of LoggingDict can be used wherever a regular dict is expected:
def add_value_to_dict(d, key, value):
    d[key] = value

logging_dict = LoggingDict()
add_value_to_dict(logging_dict, 'testkey', 'testvalue')
If you used a function instead of LoggingDict:
def log_value(d, key, value):
    log('Updated: {}={}'.format(key, value))

mydict = dict()
How would you pass mydict to add_value_to_dict and have it log the addition without having to make add_value_to_dict know about log_value?
class Silly:
    @property
    def silly(self):
        "This is a silly property"
        print("You are getting silly")
        return self._silly

    @silly.setter
    def silly(self, value):
        print("You are making silly {}".format(value))
        self._silly = value

    @silly.deleter
    def silly(self):
        print("Whoah, you killed silly!")
        del self._silly
s = Silly()
s.silly = "funny"
value = s.silly
del s.silly
But it does not print "You are getting silly", "You are making silly funny", etc. as expected. I don't know why. Can you help me figure it out, folks?
Thanks in advance!
The property decorator only works with new-style classes (see also). Make Silly extend from object explicitly to make it a new-style class. (In Python 3, all classes are new-style classes)
class Silly(object):
    @property
    def silly(self):
        # ...
    # ...
As you most likely are aware, the correct method to add properties would be to use:
@property
def silly(self):
    return self._silly

@silly.setter
def silly(self, value):
    self._silly = value
But this requires new-style classes, that is, somewhere in the chain there must be a class ParentClass(object):. The similar option of using silly = property(get_silly, set_silly) has the same requirement.
However, there is another option, and that is to use a corresponding private variable, like self._silly, and override the __getattr__ and __setattr__ methods:
def __getattr__(self, name):
    """Called _after_ looking in the normal places for name."""
    if name == 'silly':
        return self._silly
    else:
        raise AttributeError(name)

def __setattr__(self, name, value):
    """Called _before_ looking in the normal places for name."""
    if name == 'silly':
        self.__dict__['_silly'] = value
    else:
        self.__dict__[name] = value
Notice how __getattr__ is called only after looking in the normal places for the attribute, whereas __setattr__ is called before. Therefore the former can and should raise an error for names it does not accept, and the latter should always set the attribute. Do not use self._silly = value within __setattr__, as that would cause infinite recursion.
Also note that since we're dealing with old-style classes here, you should use the __dict__ approach, and not the newer baseclass.__setattr__(self, attr, value); see the docs. A similar __delattr__() also exists, if you want it.
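If you do want deletion handled the same way, a matching sketch for the same class (my addition, following the same pattern as the two methods above) could be:

def __delattr__(self, name):
    """Called unconditionally, like __setattr__, so it must do the deletion itself."""
    if name == 'silly':
        del self.__dict__['_silly']
    else:
        del self.__dict__[name]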
Using this code, you can now do stuff like:
i_am = Silly()
i_am.silly = "No, I'm clever"
print i_am.silly
I'm defining several classes intended to be used for multiple inheritance, e.g.:
class A:
    def __init__(self, bacon=None, **kwargs):
        self.bacon = bacon
        if bacon is None:
            self.bacon = 100
        super().__init__(**kwargs)

class Bacon(A):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)

class Eggs(A):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)

class Spam(Eggs, Bacon):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)
However, I have multiple classes (e.g. possibly Bacon, A, and Spam, but not Eggs) that care about when their property bacon is changed. They don't need to modify the value, only to know what the new value is, like an event. Because of the Multiple Inheritance nature I have set up, this would mean having to notify the super class about the change (if it cares).
I know that it might be possible if I pass the class name to the method decorator, or if I use a class decorator. I don't want to have all the direct self-class referencing, having to create lots of decorators above each class, or forcing the methods to be the same name, as none of these sound very pythonic.
I was hoping to get syntax that looks something like this:
@on_change(bacon)
def on_bacon_change(self, bacon):
    # read from old/new bacon
    make_eggs(how_much=bacon)
I don't care about the previous value of bacon, so that bacon argument isn't necessary, if this is called after bacon is set.
Is it possible to check if a super class has a method with this decorator?
If this isn't feasible, are there alternatives to passing events like this up through the multiple-inheritance chain?
EDIT:
The actual calling of the function in Spam would be done in A, by using a @property and @bacon.setter, as that would be the upper-most class that initializes bacon. Once it knows what function to call on self, the problem only lies in propagating the call up the MI chain.
EDIT 2:
If I override the attribute with a @bacon.setter, would it be possible to determine whether the super() class has a setter for bacon?
What you're asking for would probably fit nicely into a more complete framework of signals and so on - it may even invite Aspect Oriented Programming.
Without going deep into that, however, a metaclass and a decorator can do just what you are asking for - I came up with these; I hope they work for you.
If you'd like to evolve this into something robust and usable, write me - if nothing like this exists out there, it would be worth keeping a utility package on PyPI for this.
def setattr_wrapper(cls):
    def watcher_setattr(self, attr, val):
        super(cls, self).__setattr__(attr, val)
        watched = cls.__dict__["_watched_attrs"]
        if attr in watched:
            for method in watched[attr]:
                getattr(self, method)(attr, val)
    return watcher_setattr

class AttrNotifier(type):
    def __new__(metacls, name, bases, dct):
        dct["_watched_attrs"] = {}
        for key, value in dct.items():
            if hasattr(value, "_watched_attrs"):
                for attr in getattr(value, "_watched_attrs"):
                    if attr not in dct["_watched_attrs"]:
                        dct["_watched_attrs"][attr] = set()
                    dct["_watched_attrs"][attr].add(key)
        cls = type.__new__(metacls, name, bases, dct)
        cls.__setattr__ = setattr_wrapper(cls)
        return cls

def on_change(*args):
    def decorator(meth):
        our_args = args
        # ensure that this decorator is stackable
        if hasattr(meth, "_watched_attrs"):
            our_args = getattr(meth, "_watched_attrs") + our_args
        setattr(meth, "_watched_attrs", our_args)
        return meth
    return decorator
# from here on, example of use:
class A(metaclass=AttrNotifier):
    @on_change("bacon")
    def bacon_changed(self, attr, val):
        print("%s changed in %s to %s" % (attr, self.__class__.__name__, val))

class Spam(A):
    @on_change("bacon", "pepper")
    def changed(self, attr, val):
        print("%s changed in %s to %s" % (attr, self.__class__.__name__, val))

a = A()
a.bacon = 5
b = Spam()
b.pepper = 10
b.bacon = 20
(tested in Python 3.2 and Python 2.6 - changing the declaration of the "A" class for Python 2 metaclass syntax)
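For reference, if I trace the example correctly, running it prints something along these lines - note that b.bacon = 20 is reported twice, once for the watcher registered on A and once for the one registered on Spam, which is exactly the "propagate up the MI chain" behaviour asked about:

bacon changed in A to 5
pepper changed in Spam to 10
bacon changed in Spam to 20
bacon changed in Spam to 20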
edit - some words on what is being done
Here is what happens:
The metaclass picks all methods marked with the "on_change" decorator and registers them in a dictionary on the class - this dictionary is named _watched_attrs and it can be accessed as a normal class attribute.
The other thing the metaclass does is to override the __setattr__ method of the class once it is created. This new __setattr__ just sets the attribute, and then checks the _watched_attrs dictionary to see whether any methods on that class are registered to be called when the attribute that was just set changes - if so, it calls them.
The extra indirection level around watcher_setattr (which is the function that becomes each class's __setattr__) is there so that you can register different attributes to be watched on each class in the inheritance chain - all the classes have independently accessible _watched_attrs dictionaries. If it were not for this, only the _watched_attrs of the most specialized class in the inheritance chain would be respected.
You are looking for Python properties:
http://docs.python.org/library/functions.html#property
Searching Google for "override superclass property setter" turned up this Stack Overflow question:
Overriding inherited properties’ getters and setters in Python