I just realised that __setattr__ doesn't work on the class itself. So this implementation,
class Integer:
    me_is_int = 0

    def __setattr__(self, name, value):
        if not isinstance(value, int):
            raise TypeError
doesn't raise on this:
Integer.me_is_int = "lol"
So, switching to a metaclass:
class IntegerMeta(type):
    def __setattr__(cls, name, value):
        if not isinstance(value, int):
            raise TypeError

class Integer(metaclass=IntegerMeta):
    me_is_int = 0
this works, but this:
Integer().me_is_int = "lol"
again doesn't raise. So do I need to copy the __setattr__ method into Integer to make it work on instances? Is it not possible for Integer to reuse IntegerMeta's __setattr__ for instances?
You are right in your reasoning: a custom __setattr__ on the metaclass affects any value setting on the class itself, while having it on the class affects all instances of the class.
With that in mind, one way to avoid duplicating code is to arrange for the metaclass itself to inject the logic into each class as it is created.
The way you've written it, even as an example, is dangerous: it will affect every attribute set on the class or its instances. But if you keep a set of the attributes you want to guard in that way, it works:
attributes_to_guard = {"me_is_int"}

class Meta(type):
    def __init__(cls, name, bases, ns, **kw):
        # This line itself would not work if __setattr__ did not check
        # for a restricted set of attributes to guard:
        cls.__setattr__ = cls.__class__.__setattr__
        # Also, note that this overrides any manually customized
        # __setattr__ on the classes. The mechanism to call those,
        # and still add the guarding logic in the metaclass, would be
        # more complicated, but it can be done.
        super().__init__(name, bases, ns, **kw)

    def __setattr__(self, name, value):
        if name in attributes_to_guard and not isinstance(value, int):
            raise TypeError()
        # Actually store the attribute: "self" is a class here when called
        # through the metaclass, and an instance when called through the
        # copied method, so dispatch to the proper built-in __setattr__.
        if isinstance(self, type):
            type.__setattr__(self, name, value)
        else:
            object.__setattr__(self, name, value)

class Integer(metaclass=Meta):
    me_is_int = 0
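Put together, a runnable sketch of the whole arrangement (the dispatch between `type.__setattr__` and `object.__setattr__` is needed so the same function can store attributes on both the class and its instances):

```python
attributes_to_guard = {"me_is_int"}

class Meta(type):
    def __init__(cls, name, bases, ns, **kw):
        # Inject the metaclass's __setattr__ into the class itself,
        # so instances are guarded by the same logic.
        cls.__setattr__ = cls.__class__.__setattr__
        super().__init__(name, bases, ns, **kw)

    def __setattr__(self, name, value):
        if name in attributes_to_guard and not isinstance(value, int):
            raise TypeError(f"{name} must be an int")
        # "self" is a class when called through the metaclass,
        # and an instance when called through the injected method:
        if isinstance(self, type):
            type.__setattr__(self, name, value)
        else:
            object.__setattr__(self, name, value)

class Integer(metaclass=Meta):
    me_is_int = 0
```

With this, both `Integer.me_is_int = "lol"` and `Integer().me_is_int = "lol"` raise TypeError, while integer assignments go through normally.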
I've got a simple metaclass that turns methods whose names start with "get_" into properties:
class PropertyConvertMetaclass(type):
    def __new__(mcs, future_class_name, future_class_parents, future_class_attr):
        new_attr = {}
        for name, val in future_class_attr.items():
            if not name.startswith('__'):
                if name.startswith('get_'):
                    new_attr[name[4:]] = property(val)
                else:
                    new_attr[name] = val
        return type.__new__(mcs, future_class_name, future_class_parents, new_attr)
Imagine I have TestClass:
class TestClass():
    def __init__(self, x: int):
        self._x = x

    def get_x(self):
        print("this is property")
        return self._x
I want it to work like this: I create some new class that kinda inherits from them both
class NewTestClass(TestClass, PropertyConvertMetaclass):
    pass
and could reuse both their methods like this:
obj = NewTestClass(8)
obj.get_x() # 8
obj.x # 8
As I take it, I should create a new class, let's name it PropertyConvert, and make NewTestClass inherit from it:
class PropertyConvert(metaclass=PropertyConvertMetaclass):
    pass

class NewTestClass(TestClass, PropertyConvert):
    pass
But it doesn't help: I still can't use the new property methods on NewTestClass. How can I make PropertyConvert inherit all the methods from its sibling base class, without doing anything inside NewTestClass and changing only PropertyConvertMetaclass or PropertyConvert? I'm new to metaclasses, so I'm sorry if this question seems silly.
When you define TestClass, its body is run in a namespace which becomes the class __dict__. The metaclass just informs the construction of that namespace via __new__ and __init__. In this case, you have set up the metaclass of TestClass to be type.
When you inherit from TestClass, e.g. with class NewTestClass(TestClass, PropertyConvert):, the version of PropertyConvertMetaclass you wrote operates on the __dict__ of NewTestClass only. TestClass has already been created at that point, with no properties, because its metaclass was type, and the child class is empty, so you see no properties.
There are a couple of possible solutions here. The simpler one, but out of reach because of your assignment, is to do class TestClass(metaclass=PropertyConvertMetaclass):. All children of TestClass will have PropertyConvertMetaclass and so all getters will be converted to properties.
The alternative is to look carefully at the arguments of PropertyConvertMetaclass.__new__. Under normal circumstances, you only operate on the future_class_attr argument. However, you have access to future_class_parents as well. If you want to upgrade the immediate bases of PropertyConvert's siblings, that's all you need:
class PropertyConvertMetaclass(type):
    def __new__(mcs, future_class_name, future_class_parents, future_class_attr):
        # The loop is the same for each base __dict__ as for future_class_attr,
        # so factor it out into a function
        def update(d):
            # iterate over a copy: we may add keys to future_class_attr
            # while it is the dict being looped over
            for name, value in list(d.items()):
                # Don't check for dunders: a dunder can't start with `get_`
                if name.startswith('get_') and callable(value):
                    prop = name[4:]
                    # Getter and setter can't be defined in separate classes
                    if 'set_' + prop in d and callable(d['set_' + prop]):
                        setter = d['set_' + prop]
                    else:
                        setter = None
                    if 'del_' + prop in d and callable(d['del_' + prop]):
                        deleter = d['del_' + prop]
                    else:
                        deleter = None
                    future_class_attr[prop] = property(value, setter, deleter)

        update(future_class_attr)
        for base in future_class_parents:
            # Won't work well with __slots__ or custom __getattr__
            update(base.__dict__)
        return super().__new__(mcs, future_class_name, future_class_parents, future_class_attr)
This is probably adequate for your assignment, but lacks a certain amount of finesse. Specifically, there are two deficiencies that I can see:
There is no lookup beyond the immediate base classes.
You can't define a getter in one class and a setter in another.
To address the first issue, you will have to traverse the MRO of the class. As @jsbueno suggests, this is easier to do on the fully constructed class using __init__ rather than the pre-class dictionary. I would solve the second issue by making a table of available getters and setters before making any properties. You could also make the properties respect MRO by doing this. The only complication with using __init__ is that you have to call setattr on the class rather than simply updating its future __dict__.
class PropertyConvertMetaclass(type):
    def __init__(cls, class_name, class_parents, class_attr):
        super().__init__(class_name, class_parents, class_attr)
        getters = set()
        setters = set()
        deleters = set()
        for base in cls.__mro__:
            for name, value in base.__dict__.items():
                if name.startswith('get_') and callable(value):
                    getters.add(name[4:])
                if name.startswith('set_') and callable(value):
                    setters.add(name[4:])
                if name.startswith('del_') and callable(value):
                    deleters.add(name[4:])
        for name in getters:
            # Bind `name` as a default argument: a plain closure over the
            # loop variable would make every property use the last name.
            # Normal attribute lookup on self already respects the MRO.
            def getter(self, *args, name=name, **kwargs):
                return getattr(self, 'get_' + name)(*args, **kwargs)
            if name in setters:
                def setter(self, value, name=name):
                    return getattr(self, 'set_' + name)(value)
            else:
                setter = None
            if name in deleters:
                def deleter(self, name=name):
                    return getattr(self, 'del_' + name)()
            else:
                deleter = None
            setattr(cls, name, property(getter, setter, deleter))
Anything that you do in the __init__ of a metaclass can just as easily be done with a class decorator. The main difference is that the metaclass will apply to all child classes, while a decorator only applies where it is used.
There is nothing "impossible" there.
It is a problem that, however unusual, can be solved with metaclasses.
Your approach is good - the problem you got is that when you look into future_class_attr (also known as the namespace in the class body), it only contains the methods and attributes for the class currently being defined. In your example, NewTestClass is empty, and so is future_class_attr.
The way to overcome that is to check all the base classes instead, looking for the methods that match the pattern you are looking for, and then creating the appropriate properties.
Doing this correctly before creating the target class would be tricky - for one, you would have to do the attribute search in the correct MRO (method resolution order) of all superclasses - and there can be a lot of corner cases. (But note it is not "impossible", nonetheless.)
But nothing prevents you from doing that after creating the new class. For that, you can just assign the return value of super().__new__(mcls, ...) to a variable (by the way, prefer super().__new__ over hardcoding type.__new__: this allows your metaclass to be collaborative and be combined with, say, abc.ABC or enum.Enum). That variable is then your completed class, and you can use dir on it to check all attribute and method names, already consolidating all superclasses - then just create your new properties and assign them to the newly created class with setattr(cls_variable, property_name, property_object).
Better yet, write the metaclass __init__ instead of its __new__ method: you receive the new class already created and can proceed to introspect it with dir and add the properties immediately. (Don't forget to call super().__init__(...) even though your class doesn't need it.)
Also, note that since Python 3.6, the same results can be achieved with no metaclass at all, if one just implements the needed logic in the __init_subclass__ method of a base class.
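A minimal sketch of that __init_subclass__ variant (an assumption of how one might write it, not code from the question):

```python
class PropertyConvert:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # dir() already consolidates names from the whole MRO
        for name in dir(cls):
            if name.startswith('get_') and callable(getattr(cls, name)):
                # bind `name` early to avoid the late-binding closure pitfall
                def getter(self, _name=name):
                    return getattr(self, _name)()
                setattr(cls, name[4:], property(getter))

class TestClass:
    def __init__(self, x):
        self._x = x

    def get_x(self):
        return self._x

class NewTestClass(TestClass, PropertyConvert):
    pass
```

No metaclass is involved: __init_subclass__ runs automatically on every subclass of PropertyConvert, so `NewTestClass(8).x` works out of the box.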
One of the solutions to my problem is parsing the parents' dicts in PropertyConvertMetaclass:
class PropertyConvertMetaclass(type):
    def __new__(mcs, future_class_name, future_class_parents, future_class_attr):
        new_attr = {}
        for parent in future_class_parents:
            for name, val in parent.__dict__.items():
                if not name.startswith('__'):
                    if name.startswith('get_'):
                        # .get() avoids a KeyError when there is no matching setter
                        new_attr[name[4:]] = property(val, parent.__dict__.get('set_' + name[4:]))
                    new_attr[name] = val
        for name, val in future_class_attr.items():
            if not name.startswith('__'):
                if name.startswith('get_'):
                    new_attr[name[4:]] = property(val, future_class_attr.get('set_' + name[4:]))
                new_attr[name] = val
        return type.__new__(mcs, future_class_name, future_class_parents, new_attr)
I have a class
class Animal:
    def __init__(self, name='', num_of_owners=0, sleep=0):
        self.name = name
        self.num_of_owners = int(float(num_of_owners))
        self.sleep = float(sleep)
let's say I'm reading all the properties from some file.
I'm using Properties for getters and setters.
@property
def name(self):
    return self._name

@name.setter
def name(self, value):
    self._name = value
Now, when reading from the file, I don't want to look up each specific property in the dictionary I got.
So I can run a for loop over the dictionary and type:
for name, value in animal_props.iteritems():
    setattr(animal, name, value)
but then all the properties are set as strings.
The thing is, I have about 8 properties: some floats, some ints, some strings.
Is there any way to run this for loop and have the specific setter called for each property, rather than making plain attribute assignments?
example:
class Animal:
    def __init__(self, name='', num_of_owners=0, sleep=0):
        self._name = name
        self._num_of_owners = int(float(num_of_owners))
        self._sleep = float(sleep)

    @property
    def name(self):
        return self._name

    @name.setter
    def name(self, value):
        self._name = value

    @property
    def num_of_owners(self):
        return self._num_of_owners

    @num_of_owners.setter
    def num_of_owners(self, value):
        self._num_of_owners = int(value)

    @property
    def sleep(self):
        return self._sleep

    @sleep.setter
    def sleep(self, value):
        self._sleep = float(value)
d = {'name': 'doggy', 'num_of_owners': '3', 'sleep': '5.643'}
dog = Animal()
for name, value in d.iteritems():
    setattr(dog, name, value)
print type(dog.sleep)
I need the type at the end to be float, since I will later use it as a float.
Creating separate ifs and dispatching to each setter would work, but is there any way to do it with just that one for loop?
You are using Python 2 with old-style classes. Properties only work with new-style classes:
class Animal(object):
...
If you're really using Python 2 (as your tag suggests), you need to change your class declaration so that you inherit from object. Doing so will make your class a "new style" class rather than an "old-style" class (which were the only kind of class back in the Python 2.1 days). If you don't know much about these two kinds of class, don't worry about learning about the old ones. Just create new-style classes always (they're the only kind in Python 3).
To make a new-style class, inherit from object:
class Animal(object):
#...
Note that if you're not doing type conversions or other kinds of validation in your property getter or setter methods (as with name in your example code), you might as well get rid of the whole property and just use a regular attribute instead. If you find you do need validation later on in your program's design, you can switch back to using a property at that point and the other parts of the code that read or write from the attribute won't need to change.
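To illustrate the point, here is a condensed Python 3 version of the example (inheriting from object is implicit there); once the class is new-style, the setattr loop routes each string through the converting setter:

```python
class Animal:
    def __init__(self, num_of_owners=0, sleep=0.0):
        self._num_of_owners = int(float(num_of_owners))
        self._sleep = float(sleep)

    @property
    def num_of_owners(self):
        return self._num_of_owners

    @num_of_owners.setter
    def num_of_owners(self, value):
        # the setter does the string-to-int conversion
        self._num_of_owners = int(float(value))

    @property
    def sleep(self):
        return self._sleep

    @sleep.setter
    def sleep(self, value):
        # the setter does the string-to-float conversion
        self._sleep = float(value)

dog = Animal()
for name, value in {'num_of_owners': '3', 'sleep': '5.643'}.items():
    setattr(dog, name, value)
```

After the loop, `dog.sleep` is a float and `dog.num_of_owners` an int, with no per-property ifs.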
Preamble: I have objects, some of which could be created by the default constructor and left without modification, so such objects can be considered "empty". Sometimes I need to verify whether some object is "empty" or not. It can be done in the following way (the magic methods are implemented in the base class Animal):
>>> a = Bird()
>>> b = Bird()
>>> a == b
True
>>> a == Bird()
True
So the question: is it possible (and if yes then how) to achieve such syntax:
>>> a == Bird.default
True
At least this one (but the previous is sweeter):
>>> a == a.default
True
But: with implementation of default in the base class Animal (to not repeat it in all derived classes):
class Animal(object):
    ... tech stuff ...
    - obj comparison
    - obj representation
    - etc

class Bird(Animal):
    ... all about birds ...

class Fish(Animal):
    ... all about fishes ...
Of course I don't need solutions to have Bird() calling in Animal class :)
I'd like to have a kind of templating implemented in the base class which will stamp out a default instance of each derived class and store its unique copy in a class or instance property. I think it could be achieved by playing with metaclasses or so, but I don't know how.
Class default instance could be considered as any object instantiated by __init__() of its class (without further object modification of course).
UPDATE
The system is flooded with objects, and I just want a way to separate freshly created (default) objects in circulation (which are useless to display, for example) from ones that have already been modified somehow. I do it by:
if a == Bird():
. . .
I don't want the creation of a new object for every comparison; intuitively, I'd like one instance stored as an etalon for instances of this class to compare with. The objects are JSON-like and contain only properties (besides the implicit __str__, __call__, __eq__ methods), so I'd like to keep this style of using built-in Python features and avoid explicitly defined methods like is_empty(). It's like typing an object's name in the interactive shell and having it print itself by calling __str__: implicit, but fun.
To achieve the first solution you should use a metaclass.
For example:
def add_default_meta(name, bases, attrs):
    cls = type(name, bases, attrs)
    cls.default = cls()
    return cls
And use it as (assuming Python 3; in Python 2, set the __metaclass__ attribute in the class body):
class Animal(object, metaclass=add_default_meta):
    # stuff

class NameClass(Animal, metaclass=add_default_meta):
    # stuff
Note that you have to repeat metaclass=... for every subclass of Animal.
If instead of a function you use a class and its __new__ method to implement the metaclass, it can be inherited, i.e.:
class AddDefaultMeta(type):
    def __new__(mcs, name, bases, attrs):
        cls = super(AddDefaultMeta, mcs).__new__(mcs, name, bases, attrs)
        cls.default = cls()
        return cls
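In Python 3 syntax, the inheritable version then only has to be named once, at the root of the hierarchy (a sketch):

```python
class AddDefaultMeta(type):
    def __new__(mcs, name, bases, attrs):
        cls = super().__new__(mcs, name, bases, attrs)
        # every class created by this metaclass gets its own default instance
        cls.default = cls()
        return cls

class Animal(metaclass=AddDefaultMeta):
    pass

# no metaclass= needed here: it is inherited from Animal
class Bird(Animal):
    pass
```

Each class in the hierarchy ends up with its own `default`, created at class-definition time.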
A different way to achieve the same effect is to use a class decorator:
def add_default(cls):
    cls.default = cls()
    return cls

@add_default
class Bird(Animal):
    # stuff
Again, you must use the decorator for every subclass.
If you want to achieve the second solution, i.e. to check a == a.default, then you can simply reimplement Animal.__new__:
class Animal(object):
    def __new__(cls, *args, **kwargs):
        if not (args or kwargs) and not hasattr(cls, 'default'):
            cls.default = object.__new__(cls)
            return cls.default
        else:
            return object.__new__(cls)
This will create the empty instance whenever the first instance of the class is created and it is stored in the default attribute.
This means that you can do both:
a == a.default
and
a == Bird.default
But accessing Bird.default gives AttributeError if you didn't create any Bird instance.
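A runnable sketch of that approach; the __eq__ here is an assumption, since the question's comparison method isn't shown, and `'default' not in cls.__dict__` replaces hasattr so that each subclass gets its own default rather than inheriting the base class's:

```python
class Animal:
    def __new__(cls, *args, **kwargs):
        # check cls.__dict__ directly: hasattr would also find a
        # default inherited from a parent class
        if not (args or kwargs) and 'default' not in cls.__dict__:
            cls.default = object.__new__(cls)
            return cls.default
        return object.__new__(cls)

    def __init__(self, value=None):
        self.value = value

    def __eq__(self, other):
        # hypothetical value-based comparison standing in for the
        # question's unshown __eq__
        return type(self) is type(other) and self.value == other.value

class Bird(Animal):
    pass
```

After the first `Bird()`, both `a == Bird.default` and `a == a.default` work as the question asks.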
Style note: Bird.Default looks very bad to me. Default is an instance of Bird, not a type, hence you should use lowercase_with_underscores according to PEP 8.
In fact the whole thing looks fishy for me. You could simply have an is_empty() method. It's pretty easy to implement:
class Animal(object):
    def __init__(self, *args, **kwargs):
        # might require a more complex condition
        self._is_empty = not (bool(args) or bool(kwargs))

    def is_empty(self):
        return self._is_empty
Then, when a subclass creates an empty instance that doesn't pass any arguments to the base class, the _is_empty attribute will be True and the inherited method will return True accordingly; in the other cases, some argument would be passed to the base class, which would set _is_empty to False.
You can play around with this in order to obtain a more robust condition that works better with your subclasses.
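A quick sketch of how that plays out with a subclass (the Bird constructor argument here is hypothetical):

```python
class Animal(object):
    def __init__(self, *args, **kwargs):
        # empty means: constructed with no arguments at all
        self._is_empty = not (args or kwargs)

    def is_empty(self):
        return self._is_empty

class Bird(Animal):
    def __init__(self, wingspan=None):
        # forward the argument only when it was actually given,
        # so a default Bird() still counts as empty
        if wingspan is None:
            super(Bird, self).__init__()
        else:
            super(Bird, self).__init__(wingspan)
        self.wingspan = wingspan
```

`Bird().is_empty()` is True, while `Bird(1.5).is_empty()` is False.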
Another possible metaclass:
class DefaultType(type):
    def __new__(cls, name, bases, attrs):
        new_cls = super(DefaultType, cls).__new__(cls, name, bases, attrs)
        new_cls.default = new_cls()
        return new_cls
You only need to set the __metaclass__ attribute for the Animal class, as all derived classes will inherit it:
class Animal(object):
    __metaclass__ = DefaultType
    # ...

class Bird(Animal):
    # ...
This allows you to use both:
a == Bird.default
and:
a == a.default
I'm defining several classes intended to be used for multiple inheritance, e.g.:
class A:
    def __init__(self, bacon=None, **kwargs):
        self.bacon = bacon
        if bacon is None:
            self.bacon = 100
        super().__init__(**kwargs)

class Bacon(A):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)

class Eggs(A):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)

class Spam(Eggs, Bacon):
    def __init__(self, **kwargs):
        """Optional: bacon"""
        super().__init__(**kwargs)
However, I have multiple classes (e.g. possibly Bacon, A, and Spam, but not Eggs) that care about when their property bacon is changed. They don't need to modify the value, only to know what the new value is, like an event. Because of the Multiple Inheritance nature I have set up, this would mean having to notify the super class about the change (if it cares).
I know that it might be possible if I pass the class name to the method decorator, or if I use a class decorator. I don't want to have all the direct self-class referencing, having to create lots of decorators above each class, or forcing the methods to be the same name, as none of these sound very pythonic.
I was hoping to get syntax that looks something like this:
@on_change(bacon)
def on_bacon_change(self, bacon):
    # read from old/new bacon
    make_eggs(how_much=bacon)
I don't care about the previous value of bacon, so that bacon argument isn't necessary, if this is called after bacon is set.
Is it possible to check whether a superclass has a method with this decorator?
If this isn't feasible, are there alternatives for passing events like this up through the multiple-inheritance chain?
EDIT:
The actual calling of the function in Spam would be done in A, by using a @property and @bacon.setter, as that would be the upper-most class that initializes bacon. Once it knows what function to call on self, the problem only lies in propagating the call up the MI chain.
EDIT 2:
If I override the attribute with a @bacon.setter, would it be possible to determine whether the super() class has a setter for bacon?
What you're asking for would probably be a nice fit for a more complete framework of signals and so on - it may even invite Aspect Oriented Programming.
Without going deep into it, however, a metaclass and a decorator can do just what you are asking for. I came up with these; I hope they work for you.
If you'd like to evolve this into something robust and usable, write me - if nothing like this exists out there, it would be worth keeping a utility package on PyPI for it.
def setattr_wrapper(cls):
    def watcher_setattr(self, attr, val):
        super(cls, self).__setattr__(attr, val)
        watched = cls.__dict__["_watched_attrs"]
        if attr in watched:
            for method in watched[attr]:
                getattr(self, method)(attr, val)
    return watcher_setattr

class AttrNotifier(type):
    def __new__(metacls, name, bases, dct):
        dct["_watched_attrs"] = {}
        for key, value in dct.items():
            if hasattr(value, "_watched_attrs"):
                for attr in getattr(value, "_watched_attrs"):
                    if attr not in dct["_watched_attrs"]:
                        dct["_watched_attrs"][attr] = set()
                    dct["_watched_attrs"][attr].add(key)
        cls = type.__new__(metacls, name, bases, dct)
        cls.__setattr__ = setattr_wrapper(cls)
        return cls

def on_change(*args):
    def decorator(meth):
        our_args = args
        # ensure that this decorator is stackable
        if hasattr(meth, "_watched_attrs"):
            our_args = getattr(meth, "_watched_attrs") + our_args
        setattr(meth, "_watched_attrs", our_args)
        return meth
    return decorator
# from here on, example of use:
class A(metaclass=AttrNotifier):
    @on_change("bacon")
    def bacon_changed(self, attr, val):
        print("%s changed in %s to %s" % (attr, self.__class__.__name__, val))

class Spam(A):
    @on_change("bacon", "pepper")
    def changed(self, attr, val):
        print("%s changed in %s to %s" % (attr, self.__class__.__name__, val))

a = A()
a.bacon = 5
b = Spam()
b.pepper = 10
b.bacon = 20
(Tested in Python 3.2 and Python 2.6 - for Python 2, change the declaration of the "A" class to the Python 2 metaclass syntax.)
edit - some words on what is being done
Here is what happens:
The metaclass picks all methods marked with the on_change decorator and registers them in a dictionary on the class - this dictionary is named _watched_attrs and it can be accessed as a normal class attribute.
The other thing the metaclass does is override the __setattr__ method for the class once it is created. This new __setattr__ just sets the attribute, and then checks the _watched_attrs dictionary to see whether any methods on that class are registered to be called when the attribute that was just set changes - and if so, it calls them.
The extra indirection level around watcher_setattr (which is the function that becomes each class's __setattr__) is there so that you can register different attributes to be watched on each class in the inheritance chain - all the classes have independently accessible _watched_attrs dictionaries. If it were not for this, only the _watched_attrs dictionary of the most specialized class in the inheritance chain would be respected.
You are looking for Python properties:
http://docs.python.org/library/functions.html#property
Searching Google for "override superclass property setter" turns up this Stack Overflow question:
Overriding inherited properties' getters and setters in Python
For putting methods of various classes into a global registry I'm using a decorator with a metaclass. The decorator tags, the metaclass puts the function in the registry:
class ExposedMethod(object):
    def __init__(self, decoratedFunction):
        self._decoratedFunction = decoratedFunction

    def __call__(__self, *__args, **__kw):
        return __self._decoratedFunction(*__args, **__kw)

class ExposedMethodDecoratorMetaclass(type):
    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.iteritems():
            if isinstance(obj, ExposedMethod):
                WorkerFunctionRegistry.addWorkerToWorkerFunction(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class MyClass(object):
    __metaclass__ = ExposedMethodDecoratorMetaclass

    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
I've now come to the point where two function registries are needed. The first thought was to subclass the metaclass and put the other registry in; for that, only the __new__ method would have to be rewritten.
Since rewriting means redundant code, this is not what I really want. So it would be nice if anyone could name a way to put an attribute inside the metaclass which can be read when __new__ is executed. With that, the right registry could be picked without having to rewrite __new__.
Your ExposedMethod instances do not behave as normal instance methods but rather like static methods -- the fact that you're giving one of them a self argument hints that you're not aware of that. You may need to add a __get__ method to the ExposedMethod class to make it a descriptor, just like function objects are -- see here for more on descriptors.
But there is a much simpler way, since functions can have attributes...:
def ExposedMethod(registry=None):
    def decorate(f):
        f.registry = registry
        return f
    return decorate
and in a class decorator (simpler than a metaclass! requires Python 2.6 or better -- in 2.5 or earlier you'll need to stick w/the metaclass or explicitly call this after the class statement, though the first part of the answer and the functionality of the code below are still perfectly fine):
def RegisterExposedMethods(cls):
    for name, f in vars(cls).iteritems():
        if not hasattr(f, 'registry'):
            continue
        registry = f.registry
        if registry is None:
            registry = cls.registry
        registry.register(name, cls.__name__)
    return cls
So you can do:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod(WorkerFunctionRegistry)
    def myCoolExposedMethod(self):
        pass
and the like. This is easily extended to allowing an exposed method to have several registries, get the default registry elsewhere than from the class (it could be in the class decorator, for example, if that works better for you) and avoids getting enmeshed with metaclasses without losing any functionality. Indeed that's exactly why class decorators were introduced in Python 2.6: they can take the place of 90% or so of practical uses of metaclasses and are much simpler than custom metaclasses.
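Since WorkerFunctionRegistry itself isn't shown in the question, here is a self-contained sketch of the whole arrangement with a minimal, hypothetical registry class, written in Python 3 syntax (`items()` instead of `iteritems()`):

```python
class Registry:
    """Minimal stand-in for the question's WorkerFunctionRegistry."""
    def __init__(self):
        self.entries = []

    def register(self, method_name, class_name):
        self.entries.append((method_name, class_name))

worker_registry = Registry()

def ExposedMethod(registry=None):
    def decorate(f):
        f.registry = registry  # tag the plain function; no wrapping needed
        return f
    return decorate

def RegisterExposedMethods(cls):
    for name, f in vars(cls).items():
        if not hasattr(f, 'registry'):
            continue
        # fall back to a class-level default registry when the tag is None
        registry = f.registry if f.registry is not None else cls.registry
        registry.register(name, cls.__name__)
    return cls

@RegisterExposedMethods
class MyClass:
    @ExposedMethod(worker_registry)
    def my_cool_exposed_method(self):
        return 'worked'
```

Because the function is only tagged, not wrapped, it still behaves as a normal instance method.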
You can use a class attribute to point to the registry you want to use in the specialized metaclasses, e.g.:
class ExposedMethodDecoratorMetaclassBase(type):
    registry = None

    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.items():
            if isinstance(obj, ExposedMethod):
                mcs.registry.register(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class WorkerExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = WorkerFunctionRegistry

class RetiredExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = RetiredFunctionRegistry
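A self-contained sketch of this in Python 3 `metaclass=` syntax; the ExposedMethod wrapper and the registry are minimal stand-ins, with a `__get__` added so the wrapped function still works as an instance method (per the descriptor remark above):

```python
class Registry:
    """Minimal stand-in for WorkerFunctionRegistry."""
    def __init__(self):
        self.entries = set()

    def register(self, method_name, class_name):
        self.entries.add((method_name, class_name))

worker_registry = Registry()

class ExposedMethod:
    def __init__(self, decorated_function):
        self._decorated_function = decorated_function

    def __get__(self, obj, objtype=None):
        # descriptor protocol: return a bound method, like functions do
        return self._decorated_function.__get__(obj, objtype)

class ExposedMethodDecoratorMetaclassBase(type):
    registry = None

    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.items():
            if isinstance(obj, ExposedMethod):
                # mcs is the concrete metaclass, so its registry attribute wins
                mcs.registry.register(obj_name, name)
        return super().__new__(mcs, name, bases, dct)

class WorkerExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = worker_registry

class Worker(metaclass=WorkerExposedMethodDecoratorMetaclass):
    @ExposedMethod
    def do_work(self):
        return 42
```

Each specialized metaclass only has to override the `registry` class attribute; `__new__` lives in the base and never needs to be repeated.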
Thank you both for your answers. Both helped a lot in finding a proper way to do what I wanted.
My final solution to the problem is the following:
def ExposedMethod(decoratedFunction):
    decoratedFunction.isExposed = True
    return decoratedFunction

class RegisterExposedMethods(object):
    def __init__(self, decoratedClass, registry):
        self._decoratedClass = decoratedClass
        for name, f in vars(self._decoratedClass).iteritems():
            if hasattr(f, "isExposed"):
                registry.addComponentClassToComponentFunction(name, self._decoratedClass.__name__)
        # cloak us as the original class
        self.__class__.__name__ = decoratedClass.__name__

    def __call__(self, *__args, **__kw):
        return self._decoratedClass(*__args, **__kw)

    def __getattr__(self, name):
        return getattr(self._decoratedClass, name)
On a Class I wish to expose methods from I do the following:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
The class decorator is now very easy to be subclassed. Here is an example:
class DiscoveryRegisterExposedMethods(RegisterExposedMethods):
    def __init__(self, decoratedClass):
        RegisterExposedMethods.__init__(self,
                                        decoratedClass,
                                        DiscoveryFunctionRegistry())
With that the comment of Alex
Your ExposedMethod instances do not behave as normal instance methods ...
is no longer true, since the method is simply tagged and not wrapped.
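Put together in Python 3 syntax (with a minimal hypothetical registry, since the real one isn't shown), the final solution runs like this:

```python
class Registry:
    """Hypothetical stand-in for the actual function registry."""
    def __init__(self):
        self.entries = []

    def addComponentClassToComponentFunction(self, name, class_name):
        self.entries.append((name, class_name))

discovery_registry = Registry()

def ExposedMethod(decorated_function):
    decorated_function.isExposed = True  # just tag, don't wrap
    return decorated_function

class RegisterExposedMethods:
    def __init__(self, decorated_class, registry=discovery_registry):
        self._decorated_class = decorated_class
        for name, f in vars(decorated_class).items():
            if hasattr(f, "isExposed"):
                registry.addComponentClassToComponentFunction(name, decorated_class.__name__)
        # cloak us as the original class
        self.__class__.__name__ = decorated_class.__name__

    def __call__(self, *args, **kw):
        return self._decorated_class(*args, **kw)

    def __getattr__(self, name):
        return getattr(self._decorated_class, name)

@RegisterExposedMethods
class MyClass:
    @ExposedMethod
    def my_cool_exposed_method(self):
        return "exposed"
```

The tagged method is registered at decoration time, and instantiating the cloaked class still produces ordinary instances with working methods.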