I have been writing Python code with classes that have a method called something like:
def set_log_paths(self):
The thing is, this method doesn't take an argument, it determines what some values should be based on other values of self. Is it inappropriate to use the word "set" in this case? I ask because it isn't a direct getter or a setter as one would use in a language with private members.
Is there a common conventional word to use in my method name?
If you don't pass any values, and instead compute the value at the moment the method is called, based on current values, it is reasonable for the verb describing the action to be "update" - therefore update_log_paths().
Just double check that you really need this design, and consider the chances of you or other users of your class forgetting to call these "update" methods.
Python's introspection easily allows adopting some elements from "reactive programming", which could be used to trigger these updater methods when the values they depend upon are changed.
A good choice for such an architecture would be a descriptor for your properties that, upon having __set__ called, checks a class-level registry to see whether events should be triggered, plus a decorator that lets you list the attributes that trigger a given method. A base class with a proper __init_subclass__ method could set everything up.
Let's suppose you have the "base properties" on your class as annotated attributes in the class body - the descriptor, decorator, and base-class code for this to work could look something like this:
from functools import wraps
from collections import ChainMap

class EventDescriptor:
    def __init__(self, name, default):
        self.name = name
        self.default = default

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return instance.__dict__.get(self.name, self.default)

    def __set__(self, instance, value):
        instance.__dict__[self.name] = value
        triggers = instance._post_change_registry.get(self.name, [])
        for trigger in triggers:
            getattr(instance, trigger)()

def triggered_by(*args):
    def decorator(func):
        func._triggered_by = args
        return func
    return decorator

class EventPropertyMixin:
    def __init_subclass__(cls, **kw):
        super().__init_subclass__(**kw)
        for property_name, type_ in cls.__annotations__.items():
            if not hasattr(cls, property_name):
                raise TypeError("Properties without default values not supported in this example code")
            # It would also be trivial to implement runtime type-checking at this point (and in the descriptor code)
            setattr(cls, property_name, EventDescriptor(property_name, getattr(cls, property_name)))
        # collect all registries in ancestor classes, preserving order:
        post_change_registry = ChainMap()
        for ancestor in cls.__mro__[:0:-1]:
            if hasattr(ancestor, "_post_change_registry"):
                post_change_registry = post_change_registry.new_child(ancestor._post_change_registry)
        post_change_registry = post_change_registry.new_child({})
        for method_name, method in cls.__dict__.items():
            if callable(method) and hasattr(method, "_triggered_by"):
                for property_name in method._triggered_by:
                    triggers = post_change_registry.setdefault(property_name, [])
                    if method_name not in triggers:
                        triggers.append(method_name)
        cls._post_change_registry = post_change_registry

class Test(EventPropertyMixin):
    path1: str = ""
    path2: str = ""

    @triggered_by("path1", "path2")
    def update_log_paths(self):
        self.log_paths = self.path1 + self.path2
And here it is working:
In [2]: t = Test()
In [3]: t.path1 = "/tmp"
In [4]: t.path2 = "/inner"
In [5]: t.log_paths
Out[5]: '/tmp/inner'
So, this is complicated code, but code that would usually live inside a framework or a base utility library - with these ~50 lines of code, you can have Python do the work for you and call the updating methods, so their names won't matter at all! :-)
(ok, this code is way overkill for the question asked - but I was in the mood to produce something like this before sleeping tonight - disclaimer: I have not tested the inheritance-related corner cases covered here)
Related
I found this post where a function is used to inherit from a class:
def get_my_code(base):
    class MyCode(base):
        def initialize(self):
            ...
    return MyCode

my_code = get_my_code(ParentA)
I would like to do something similar, but with a decorator, something like:
@decorator(base)
class MyClass(base):
    ...
Is this possible?
UPDATE
Say you have a class Analysis that is used throughout your code. Then you realize that you want to use a wrapper class Transient that is just a time loop on top of the analysis class. If in the code I replace the analysis class with Transient(Analysis), everything breaks, because an analysis class - with all its attributes - is expected. The problem is that I can't just define class Transient(Analysis) once and for all, because there are plenty of analysis classes. I thought the best way to do this would be some sort of dynamic inheritance. Right now I use aggregation to redirect the functionality to the analysis class inside Transient.
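For illustration, the aggregation workaround mentioned above might look like the sketch below (the Analysis API here is made up; only the delegation pattern matters):

class Analysis:
    def run(self):
        return "analysis result"

class Transient:
    """Wraps an analysis object and forwards whatever it doesn't define itself."""
    def __init__(self, analysis):
        self._analysis = analysis

    def run_time_loop(self, steps):
        # hypothetical time loop on top of the wrapped analysis
        return [self._analysis.run() for _ in range(steps)]

    def __getattr__(self, name):
        # only called when normal lookup fails: delegate to the analysis object
        return getattr(self._analysis, name)

t = Transient(Analysis())
t.run()             # delegated to Analysis.run
t.run_time_loop(3)  # Transient's own behavior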
A class decorator actually gets the class already built - and instantiated (as a class object). It can perform changes on its dict, and even wrap its methods with other decorators.
However, it means the class already has its bases set - and these can't ordinarily be changed. That implies you have to, in some way, rebuild the class inside the decorator code.
However, if the class's methods make use of parameterless super or the __class__ cell variable, those are already set in the member functions (which in Python 3 are the same as unbound methods), so you can't just create a new class and set those methods as members on the new one.
So, there might be a way, but it will be non-trivial. And as I pointed out in the comment above, I'd like to understand what you want to achieve with this, since one could just put the base class in the class declaration itself, instead of using it in the decorator configuration.
I've crafted a function that, as described above, creates a new class, "cloning" the original, and can rebuild all methods that use __class__ or super: it returns a new class which is functionally identical to the original one, but with the bases exchanged. If used in a decorator as requested (decorator code included), it will simply change the class's bases. It can't handle decorated methods (other than classmethod and staticmethod), and doesn't take care of naming details - such as qualnames or reprs for the methods.
from types import FunctionType

def change_bases(cls, bases, metaclass=type):
    class Changeling(*bases, metaclass=metaclass):
        def breeder(self):
            __class__  # noQA
    cell = Changeling.breeder.__closure__
    del Changeling.breeder
    Changeling.__name__ = cls.__name__
    for attr_name, attr_value in cls.__dict__.items():
        if isinstance(attr_value, (FunctionType, classmethod, staticmethod)):
            if isinstance(attr_value, staticmethod):
                func = getattr(cls, attr_name)
            elif isinstance(attr_value, classmethod):
                func = attr_value.__func__
            else:
                func = attr_value
            # TODO: check if func is wrapped in decorators and recreate the inner function.
            # Although reapplying arbitrary decorators is not actually possible -
            # it is possible to have a "prepare_for_changeling" innermost decorator
            # which could be made to point to the new function.
            if func.__closure__ and func.__closure__[0].cell_contents is cls:
                franken_func = FunctionType(
                    func.__code__,
                    func.__globals__,
                    func.__name__,
                    func.__defaults__,
                    cell
                )
                if isinstance(attr_value, staticmethod):
                    func = staticmethod(franken_func)
                elif isinstance(attr_value, classmethod):
                    func = classmethod(franken_func)
                else:
                    func = franken_func
                setattr(Changeling, attr_name, func)
                continue
        setattr(Changeling, attr_name, attr_value)
    return Changeling
def decorator(bases):
    if not isinstance(bases, tuple):
        bases = (bases,)
    def stage2(cls):
        return change_bases(cls, bases)
    return stage2
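A quick usage sketch (the Analysis and Dummy classes are hypothetical stand-ins; this just shows the base swap and the rebuilt super() working):

class Analysis:
    def run(self):
        return "analysis"

class Dummy:  # placeholder base that will be swapped out
    pass

@decorator(Analysis)
class Transient(Dummy):
    def run(self):
        # parameterless super() still works because change_bases rebuilt the __class__ cell
        return "time loop over " + super().run()

print(Transient.__bases__)  # (<class 'Analysis'>,)
print(Transient().run())    # time loop over analysis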
Out of curiosity, I'm interested in whether it's possible to write a metaclass that causes methods of parent classes to have preference over methods of subclasses. I'd like to play around with it for a while. It would not be possible to override methods anymore; the base class would have to call the sub method explicitly, for example using a reference to a base instance.
class Base(metaclass=InvertedInheritance):
    def apply(self, param):
        print('Validate parameter')
        result = self.subclass.apply(param)
        print('Validate result')
        return result

class Child(Base):
    def apply(self, param):
        print('Compute')
        result = 42 * param
        return result

child = Child()
child.apply(2)
With the output:
Validate parameter
Compute
Validate result
If you only care about making lookups on instances go in reverse order (not classes), you don't even need a metaclass. You can just override __getattribute__:
class ReverseLookup:
    def __getattribute__(self, attr):
        if attr.startswith('__'):
            return super().__getattribute__(attr)
        cls = self.__class__
        if attr in self.__dict__:
            return self.__dict__[attr]
        # Using [-3::-1] skips the topmost two base classes, which will be ReverseLookup and object
        for base in cls.__mro__[-3::-1]:
            if attr in base.__dict__:
                value = base.__dict__[attr]
                # handle descriptors
                if hasattr(value, '__get__'):
                    return value.__get__(self, cls)
                else:
                    return value
        raise AttributeError("Attribute {} not found".format(attr))
class Base(ReverseLookup):
    def apply(self, param):
        print('Validate parameter')
        result = self.__class__.apply(self, param)
        print('Validate result')
        return result

class Child(Base):
    def apply(self, param):
        print('Compute')
        result = 42 * param
        return result
>>> Child().apply(2)
Validate parameter
Compute
Validate result
84
This mechanism is relatively simple because lookups on the class aren't in reverse:
>>> Child.apply
<function Child.apply at 0x0000000002E06048>
This makes it easy to get a "normal" lookup just by doing it on a class instead of an instance. However, it could cause confusion in other cases, e.g. if a base class method tries to access a different method on the subclass, but that method doesn't actually exist on that subclass; in this case lookup will proceed in the normal direction and possibly find the method on a higher class. In other words, when doing this you have to be sure that you don't look any methods up on a class unless you're sure they're defined on that specific class.
There may well be other corner cases where this approach doesn't work. In particular you can see that I jury-rigged descriptor handling; I wouldn't be surprised if it does something weird for descriptors with a __set__, or for more complicated descriptors that make more intense use of the class/object parameters passed to __get__. Also, this implementation falls back on the default behavior for any attributes beginning with two underscores; changing this would require careful thought about how it's going to work with magic methods like __init__ and __repr__.
I feel like I have a pretty good grasp on using decorators when dealing with regular functions, but between using methods of base classes for decorators in derived classes, and passing parameters to said decorators, I cannot figure out what to do next.
Here is a snippet of code.
class ValidatedObject:
    ...
    def apply_validation(self, field_name, code):
        def wrap(self, f):
            self._validations.append(Validation(field_name, code, f))
            return f
        return wrap

class test(ValidatedObject):
    ....
    @apply_validation("_name", "oh no!")
    def name_validation(self, name):
        return name == "jacob"
If I try this as is, I get an "apply_validation is not found" error.
If I try it with @self.apply_validation, I get a "self isn't found" error.
I've also been messing around with making apply_validation a class method without success.
Would someone please explain what I'm doing wrong, and the best way to fix this? Thank you.
The issue you're having is that apply_validation is a method, which means you need to call it on an instance of ValidatedObject. Unfortunately, at the time it is being called (during the definition of the test class), there is no appropriate instance available. You need a different approach.
The most obvious one is to use a metaclass that searches through its instance dictionaries (which are really class dictionaries) and sets up the _validations variable based on what it finds. You can still use a decorator, but it probably should be a global function, or perhaps a static method, and it will need to work differently. Here's some code, that uses a metaclass and a decorator that adds function attributes:
class ValidatedMeta(type):
    def __new__(meta, name, bases, dct):
        validations = [Validation(f._validation_field_name, f._validation_code, f)
                       for f in dct.values() if hasattr(f, "_validation_field_name")]
        dct["_validations"] = validations
        return super(ValidatedMeta, meta).__new__(meta, name, bases, dct)

def apply_validation(field_name, code):
    def decorator(f):
        f._validation_field_name = field_name
        f._validation_code = code
        return f
    return decorator
class ValidatedObject(metaclass=ValidatedMeta):
    pass

class test(ValidatedObject):
    @apply_validation("_name", "oh no!")
    def name_validation(self, name):
        return name == "jacob"
After this code runs, test._validations will be [Validation("_name", "oh no!", test.name_validation)]. Note that the method that is passed to Validation is unbound, so you'll need to pass it a self argument yourself when you call it (or perhaps drop the self argument and change the decorator created in apply_validation to return staticmethod(f)).
This code may not do what you want if you have validation methods defined at several levels of an inheritance hierarchy. The metaclass as written above only checks the immediate class's dict for methods with the appropriate attributes. If you need it include inherited methods in _validations too, you may need to modify the logic in ValidatedMeta.__new__. Probably the easiest way to go is to look for _validations attributes in the bases and concatenate the lists together.
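For instance, a minimal sketch of that inheritance-aware variant (same Validation class and apply_validation decorator as above; only __new__ changes):

class ValidatedMeta(type):
    def __new__(meta, name, bases, dct):
        validations = [Validation(f._validation_field_name, f._validation_code, f)
                       for f in dct.values() if hasattr(f, "_validation_field_name")]
        # pull in the _validations lists already built for the base classes
        for base in bases:
            validations.extend(getattr(base, "_validations", []))
        dct["_validations"] = validations
        return super(ValidatedMeta, meta).__new__(meta, name, bases, dct)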
Just an example of using a decorator on methods of a class:
from functools import wraps

def VALIDATE(dec):
    @wraps(dec)
    def _apply_validation(self, name):
        self.validate(name)
        return dec(self, name)
    return _apply_validation

class A:
    def validate(self, name):
        if name != "aamir":
            raise Exception('Invalid name "%s"' % name)

class B(A):
    @VALIDATE
    def name_validation(self, name):
        return name
b = B()
b.name_validation('jacob') # should raise exception
I'm defining several classes intended to be used for multiple inheritance, e.g.:
class A:
    def __init__(self, bacon=None, **kwargs):
        self.bacon = bacon
        if bacon is None:
            self.bacon = 100
        super().__init__(**kwargs)

class Bacon(A):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)

class Eggs(A):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)

class Spam(Eggs, Bacon):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)
However, I have multiple classes (e.g. possibly Bacon, A, and Spam, but not Eggs) that care about when their property bacon is changed. They don't need to modify the value, only to know what the new value is, like an event. Because of the multiple-inheritance setup I have, this would mean having to notify the superclasses about the change (if they care).
I know that it might be possible if I pass the class name to the method decorator, or if I use a class decorator. I don't want to have all the direct self-class referencing, having to create lots of decorators above each class, or forcing the methods to be the same name, as none of these sound very pythonic.
I was hoping to get syntax that looks something like this:
@on_change(bacon)
def on_bacon_change(self, bacon):
    # read from old/new bacon
    make_eggs(how_much=bacon)
I don't care about the previous value of bacon, so that bacon argument isn't necessary, if this is called after bacon is set.
Is it possible to check if a superclass has a method with this decorator? If this isn't feasible, are there alternatives to passing events like this up through the multiple-inheritance chain?
EDIT:
The actual calling of the function in Spam would be done in A, by using a @property and @bacon.setter, as that would be the upper-most class that initializes bacon. Once it knows what function to call on self, the problem only lies in propagating the call up the MI chain.
EDIT 2:
If I override the attribute with a @bacon.setter, would it be possible to determine whether the super() class has a setter for bacon?
What you are asking for would probably be nicely served by a more complete framework of signals and the like - maybe even an invitation to Aspect Oriented Programming.
Without going deep into that, however, a metaclass and a decorator can do just what you are asking for - I came up with these; I hope they work for you.
If you'd like to evolve this into something robust and usable, write me - if nothing like this exists out there, it would be worth keeping a utility package on PyPI for it.
def setattr_wrapper(cls):
    def watcher_setattr(self, attr, val):
        super(cls, self).__setattr__(attr, val)
        watched = cls.__dict__["_watched_attrs"]
        if attr in watched:
            for method in watched[attr]:
                getattr(self, method)(attr, val)
    return watcher_setattr

class AttrNotifier(type):
    def __new__(metacls, name, bases, dct):
        dct["_watched_attrs"] = {}
        for key, value in dct.items():
            if hasattr(value, "_watched_attrs"):
                for attr in getattr(value, "_watched_attrs"):
                    if attr not in dct["_watched_attrs"]:
                        dct["_watched_attrs"][attr] = set()
                    dct["_watched_attrs"][attr].add(key)
        cls = type.__new__(metacls, name, bases, dct)
        cls.__setattr__ = setattr_wrapper(cls)
        return cls

def on_change(*args):
    def decorator(meth):
        our_args = args
        # ensure that this decorator is stackable
        if hasattr(meth, "_watched_attrs"):
            our_args = getattr(meth, "_watched_attrs") + our_args
        setattr(meth, "_watched_attrs", our_args)
        return meth
    return decorator

# from here on, example of use:

class A(metaclass=AttrNotifier):
    @on_change("bacon")
    def bacon_changed(self, attr, val):
        print("%s changed in %s to %s" % (attr, self.__class__.__name__, val))

class Spam(A):
    @on_change("bacon", "pepper")
    def changed(self, attr, val):
        print("%s changed in %s to %s" % (attr, self.__class__.__name__, val))

a = A()
a.bacon = 5
b = Spam()
b.pepper = 10
b.bacon = 20
(tested in Python 3.2 and Python 2.6 - for Python 2, change the declaration of the "A" class to the Python 2 metaclass syntax)
edit - some words on what is being done
Here is what happens:
The metaclass picks up all methods marked with the on_change decorator and registers them in a dictionary on the class - this dictionary is named _watched_attrs and it can be accessed as a normal class attribute.
The other thing the metaclass does is to override the __setattr__ method for the class once it is created. This new __setattr__ just sets the attribute, and then checks the _watched_attrs dictionary to see whether any methods on that class are registered to be called when the attribute that was just set has changed - if so, it calls them.
The extra indirection level around watcher_setattr (which is the function that becomes each class's __setattr__) is there so that you can register different attributes to be watched on each class in the inheritance chain - all the classes have independently accessible _watched_attrs dictionaries. If it were not for this, only the _watched_attrs of the most specialized class in the inheritance chain would be respected.
You are looking for Python properties:
http://docs.python.org/library/functions.html#property
Searching Google for "override superclass property setter" turned up this StackOverflow question:
Overriding inherited properties’ getters and setters in Python
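As a minimal sketch of that idea (the on_bacon_change hook name is made up here - subclasses that care override it and call super() to propagate the event up the MI chain):

class A:
    def __init__(self):
        self._bacon = 100

    @property
    def bacon(self):
        return self._bacon

    @bacon.setter
    def bacon(self, value):
        self._bacon = value
        self.on_bacon_change(value)  # notify whoever cares

    def on_bacon_change(self, value):
        pass  # default: nobody cares

class Spam(A):
    def on_bacon_change(self, value):
        print("bacon is now", value)
        super().on_bacon_change(value)  # keep propagating upward

Spam().bacon = 5  # prints: bacon is now 5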
I would like to control which methods appear when a user uses tab-completion on a custom object in ipython - in particular, I want to hide functions that I have deprecated. I still want these methods to be callable, but I don't want users to see them and start using them if they are inspecting the object. Is this something that is possible?
Partial answer for you. I'll post the example code and then explain why it's only a partial answer.
Code:
class hidden(object):  # or whatever its parent class is
    def __init__(self):
        self.value = 4

    def show(self):
        return self.value

    def change(self, n):
        self.value = n

    def __getattr__(self, attrname):
        # put the dep'd method/attribute names here
        deprecateds = ['dep_show', 'dep_change']
        if attrname in deprecateds:
            print("These aren't the methods you're looking for.")
            def dep_change(n):
                self.value = n
            def dep_show():
                return self.value
            return eval(attrname)
        else:
            raise AttributeError(attrname)
So now the caveat: they're not methods (note the lack of self as the first variable). If you need your users (or your code) to be able to call im_class, im_func, or im_self on any of your deprecated methods, then this hack won't work. Also, I'm pretty sure there's going to be a performance hit, because you're defining each dep'd function inside __getattr__. This won't affect your other attribute lookups (had I put them in __getattribute__, that would be a different matter), but it will slow down access to those deprecated methods. This can be (largely, but not entirely) negated by putting each function definition inside its own if block, instead of doing a list-membership check, but, depending on how big your functions are, that could be really annoying to maintain.
UPDATE:
1) If you want to make the deprecated functions methods (and you do), just use
import types
return types.MethodType(eval(attrname), self)
instead of
return eval(attrname)
in the above snippet, and add self as the first argument to the function defs. It turns them into instancemethods (so you can use im_class, im_func, and im_self to your heart's content).
2) If the __getattr__ hook doesn't thrill you, there's another option (that I know of), albeit with its own caveats, and we'll get to those: put the deprecated function definitions inside __init__, and hide them with a custom __dir__. Here's what the above code would look like done this way:
class hidden(object):
    def __init__(self):
        self.value = 4
        from types import MethodType
        def dep_show(self):
            return self.value
        self.__setattr__('dep_show', MethodType(dep_show, self))
        def dep_change(self, n):
            self.value = n
        self.__setattr__('dep_change', MethodType(dep_change, self))

    def show(self):
        return self.value

    def change(self, n):
        self.value = n

    def __dir__(self):
        heritage = dir(super(self.__class__, self))  # inherited attributes
        hide = ['dep_show', 'dep_change']
        show = [k for k in list(self.__class__.__dict__) + list(self.__dict__)
                if k not in heritage + hide]
        return sorted(heritage + show)
The advantage here is that you're not defining the functions anew every lookup, which nets you speed. The disadvantage here is that because you're not defining functions anew each lookup, they have to 'persist' (if you will). So, while the custom __dir__ method hides your deprecateds from dir(hiddenObj) and, therefore, IPython's tab-completion, they still exist in the instance's __dict__ attribute, where users can discover them.
It seems there is a special magic method for introspection which is called by dir(): __dir__(). Isn't that what you are looking for?
The DeprecationWarning isn't emitted until the method is called, so you'd have to have a separate attribute on the class that stores the names of deprecated methods, then check that before suggesting a completion.
Alternatively, you could walk the AST for the method looking for DeprecationWarning, but that will fail if either the class is defined in C, or if the method may emit a DeprecationWarning based on the type or value of the arguments.
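A minimal sketch of the first idea - a decorator that tags deprecated methods with a marker attribute, plus a __dir__ that filters on that marker (all names here are illustrative):

import warnings
from functools import wraps

def deprecated(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        warnings.warn("%s is deprecated" % func.__name__,
                      DeprecationWarning, stacklevel=2)
        return func(*args, **kwargs)
    wrapper._deprecated = True  # the marker attribute the completer checks
    return wrapper

class MyObject:
    def current(self):
        return "supported"

    @deprecated
    def legacy(self):
        return "still callable, just hidden from dir()"

    def __dir__(self):
        # drop anything whose class attribute carries the _deprecated marker
        return [name for name in super().__dir__()
                if not getattr(getattr(type(self), name, None),
                               "_deprecated", False)]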
The completion mechanism in IPython is documented here:
http://ipython.scipy.org/doc/manual/html/api/generated/IPython.core.completer.html#ipcompleter
But a really interesting example for you is the traits completer, which does precisely what you want to do: it hides some methods (based on their names) from the autocompletion.
Here is the code:
http://projects.scipy.org/ipython/ipython/browser/ipython/trunk/IPython/Extensions/ipy_traits_completer.py