I'm implementing enumeration using a base class that defines a variety of methods. The actual enumerations are subclasses of that, with no additional methods or attributes. (Each subclass is populated with its own values using the constructor defined in the base class).
I use a registry (a class attribute that stores all the instances of that class). Ideally, I'd like to avoid defining it in each subclass. Unfortunately, if I define it in the base class, all the subclasses will end up sharing the same registry.
What's a good approach here?
Below is the implementation in case it helps (it's based on jchl's comment in Python enumeration class for ORM purposes).
class IterRegistry(type):
    def __iter__(cls):
        return iter(cls._registry.values())

class EnumType(metaclass=IterRegistry):
    _registry = {}
    _frozen = False

    def __init__(self, token):
        if hasattr(self, 'token'):
            return
        self.token = token
        self.id = len(type(self)._registry)
        type(self)._registry[token] = self

    def __new__(cls, token):
        if token in cls._registry:
            return cls._registry[token]
        else:
            if cls._frozen:
                raise TypeError('No more instances allowed')
            else:
                return object.__new__(cls)

    @classmethod
    def freeze(cls):
        cls._frozen = True

    def __repr__(self):
        return self.token

    @classmethod
    def instance(cls, token):
        return cls._registry[token]

class Enum1(EnumType): pass

Enum1('a')
Enum1('b')

for i in Enum1:
    print(i)

# not going to work properly because _registry is shared
class Enum2(EnumType): pass
Since you already have a metaclass, you might as well use it to add a separate _registry attribute to each subclass automatically.
class IterRegistry(type):
    def __new__(cls, name, bases, attr):
        attr['_registry'] = {}  # now every class has its own _registry
        return type.__new__(cls, name, bases, attr)
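Putting that together with the original IterRegistry might look roughly like this (a sketch, not tested against the full posted code):

class IterRegistry(type):
    def __new__(cls, name, bases, attr):
        attr['_registry'] = {}  # each class created by this metaclass gets its own dict
        return type.__new__(cls, name, bases, attr)

    def __iter__(cls):
        return iter(cls._registry.values())

class Enum1(EnumType): pass
class Enum2(EnumType): pass

Enum1('a')
Enum2('b')
print(list(Enum1))  # [a]
print(list(Enum2))  # [b] -- the registries are no longer shared

With this in place, the _registry = {} line in EnumType's body becomes redundant, since the metaclass supplies a fresh one for every class it creates.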
Marty Alchin has a very nice pattern for this: see his blog entry.
What if you share the same registry, but with sub-registries per class, i.e.
# inside EnumType.__init__, with cls = type(self)
if cls.__name__ not in self._registry:
    self._registry[cls.__name__] = {}
self._registry[cls.__name__][token] = self
You actually don't even need cls.__name__; you should be able to use cls itself as the key.
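For instance, the registration step could look like this (my sketch of that variant, again inside EnumType.__init__):

# the shared dict lives on EnumType, but each class gets its own
# sub-dict, keyed by the class object itself
cls = type(self)
EnumType._registry.setdefault(cls, {})[token] = self

# and IterRegistry.__iter__ would then look the class up in the shared dict:
# def __iter__(cls):
#     return iter(cls._registry.get(cls, {}).values())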
I've been battling with this for half an hour, so I have passed the "try it yourself for half an hour" rule and am asking for your help. I am trying to get the child class to go through the abstract class's abstract setter method, but it just won't work...
#!/usr/bin/env python3
from abc import ABC, abstractmethod
from typing import List


class Indicator(ABC):
    def __init__(self, **kwargs):
        super().__init__()
        pass

    @abstractmethod
    def calculate(self):
        """
        kwargs in children will most likely be date_from, date_to, index
        """
        raise NotImplementedError("The calculate method is not implemented!")

    @property
    @abstractmethod
    def db_ids(self):
        return self._db_ids

    @db_ids.setter
    @abstractmethod
    def db_ids(self, ids: List[int]):
        assert isinstance(ids, list)
        assert all(isinstance(id_, int) for id_ in ids)
        self._db_ids = ids

    @property
    @abstractmethod
    def name(self):
        return self._name

    @name.setter
    @abstractmethod
    def name(self, set_name: str):
        assert isinstance(set_name, str)
        self._name = set_name


# ---------------------------------------------------------------------------


class ValueHistorical(Indicator):
    def __init__(self, **kwargs):
        if kwargs:
            self.kwargs = kwargs
        super(ValueHistorical, self).__init__(**kwargs)
        self.db_ids = [119, 120, 121, 122]
        self.name = 'Value Based on Historical'

    @property
    def db_ids(self):
        return self._db_ids

    @property
    def name(self):
        return self._name

    def calculate(self):
        pass


ValueHistorical(**{'date_from': '2010-01-01', 'date_to': '2012-01-01'})
The arguments here don't matter. The error I get is AttributeError: can't set attribute.
What I want to achieve is that inside the ValueHistorical constructor, assigning db_ids and name goes through the parent abstract class's setters.
This actually has nothing to do with ABC; it's because you rebound the properties in your child class, but without setters. This:
class ValueHistorical(Indicator):
    @property
    def db_ids(self):
        return self._db_ids

    @property
    def name(self):
        return self._name
just replaces the parent's properties with new ones, defining them as read-only since you didn't provide a setter.
Remember that the decorator syntax is only syntactic sugar, so this:
@property
def getter(...): pass
is just a fancier way to write
def getter(...): pass
getter = property(getter)
Since the getter AND the setter are attributes of the property instance, when you redefine a property in a child class you cannot just redefine the getter; you must also redefine the setter.
A common pattern here is to have the getter and setter (if there is one) delegate to another method, so you don't have to reimplement the whole thing, i.e.:
class Base(object):
    @property
    def foo(self):
        return self._get_foo()

    @foo.setter
    def foo(self, value):
        self._set_foo(value)

    def _get_foo(self):
        # default implementation; subclasses may override
        return self._foo

    def _set_foo(self, value):
        # default implementation; subclasses may override
        self._foo = value
So a child class can override _get_foo and / or _set_foo without having to redefine the property.
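For example, a subclass could hook only the setter (Child and the validation below are my own illustration, not part of the original answer):

class Child(Base):
    def _set_foo(self, value):
        if value < 0:
            raise ValueError("foo must be non-negative")
        self._foo = value

c = Child()
c.foo = 3      # goes through Base.foo's setter, which calls Child._set_foo
print(c.foo)   # 3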
Also, be careful about what stacking property and abstractmethod on a function actually does. This:
@property
@abstractmethod
def db_ids(self):
    return self._db_ids
is the equivalent of
def db_ids(self):
    return self._db_ids

db_ids = property(abstractmethod(db_ids))
So what ABC sees here is the property. Since Python 3.3 a property does inherit the __isabstractmethod__ flag from its getter, so subclasses are forced to override the db_ids name; but ABC only checks that the attribute it finds on the subclass is no longer marked abstract, it never inspects the overriding property's getter and setter, so nothing forces the child class to also provide a setter. And if you put them the other way round, i.e.
db_ids = abstractmethod(property(db_ids))
then you don't define a property at all (actually, it will not work at all; you'll get an exception right from the start with "'property' object has no attribute '__isabstractmethod__'").
FWIW, the abstractmethod decorator is mainly meant for methods that are NOT defined (empty body), so that child classes must implement them. If you have a default implementation, don't mark it as abstract; otherwise, why provide a default implementation at all?
EDIT:
You mentioned in a comment (on a deleted answer) that:
I basically want ValueHistorical to go to the Abstract class's setter methods for db_ids and name when they are being assigned in the ValueHistorical constructor
Then the simplest solution is the one I explained above: define implementation methods for the getter and/or setter (you can make any of them or both abstract as you see fit) and use a concrete property to call those implementation methods.
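For the question's Indicator/ValueHistorical case, that could look roughly like this (a sketch; the _get_db_ids/_set_db_ids split and names are my own illustration, not the original code, and calculate is omitted for brevity):

from abc import ABC, abstractmethod
from typing import List

class Indicator(ABC):
    @property
    def db_ids(self):
        return self._get_db_ids()

    @db_ids.setter
    def db_ids(self, ids: List[int]):
        self._set_db_ids(ids)

    def _get_db_ids(self):
        return self._db_ids

    @abstractmethod
    def _set_db_ids(self, ids: List[int]):
        """Children must provide the setter logic."""

class ValueHistorical(Indicator):
    def __init__(self, **kwargs):
        super().__init__()
        self.db_ids = [119, 120, 121, 122]   # goes through the parent's property setter

    def _set_db_ids(self, ids: List[int]):
        if not all(isinstance(id_, int) for id_ in ids):
            raise TypeError("ids items should be ints")
        self._db_ids = ids

v = ValueHistorical()
print(v.db_ids)   # [119, 120, 121, 122]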
Oh and yes: assert is a developer tool; don't use it for typechecking in production code. If you really want to do typechecking (which sometimes makes sense but is more often than not a complete waste of time), use isinstance and raise a TypeError. As an example, your db_ids setter should look like this:
if not isinstance(ids, list):
    raise TypeError("ids should be a list")
if not all(isinstance(id_, int) for id_ in ids):
    raise TypeError("ids items should be ints")
Or even better:
# You don't care whether it really was a list, as long as you can
# build a list out of it, and you don't care whether it really
# contains ints as long as you can build ints out of them.
#
# No need for a typecheck here: if `ids` is not iterable, or what it
# yields cannot be used to build an int, this will raise, with more
# than enough information to debug the caller.
ids = [int(id) for id in ids]
I read in https://pymotw.com/2/abc/
To use the decorator syntax with read/write abstract properties, the methods to get and set the value should be named the same.
I don't think there's any way you can do this without requiring the setter, but IMO it's cleaner than reusing the superclass's setter logic through fset:
from abc import ABC, abstractmethod, abstractproperty
from typing import List


class Indicator(ABC):
    def __init__(self, **kwargs):
        super().__init__()

    @abstractproperty
    def db_ids(self):
        return self._db_ids

    @db_ids.setter
    @abstractmethod
    def db_ids(self, ids: List[int]):
        self._db_ids = ids


class ValueHistorical(Indicator):
    def __init__(self, **kwargs):
        if kwargs:
            self.kwargs = kwargs
        super(ValueHistorical, self).__init__(**kwargs)
        self.db_ids = [119, 120, 121, 122]

    @property
    def db_ids(self):
        return self._db_ids

    @db_ids.setter
    def db_ids(self, ids: List[int]):
        self._db_ids = ids
i = ValueHistorical(**{'date_from': '2010-01-01', 'date_to': '2012-01-01'})
print(i.db_ids)
I am attempting to wrap a class from a third-party package in such a way that my new class looks exactly like a subclass of the third-party class. The third-party class does not support inheritance, and it has nontrivial features, such as functions that have a __getitem__ method. I can wrap almost every attribute and method using a solution based on Wrapping a class whose methods return instances of that class and How can I intercept calls to python's "magic" methods in new style classes?. However, I still need to override the __init__ method of the third-party class. How can I do that? Note: I am using new-style classes.
Code so far:
import copy


class WrapperMetaclass(type):
    """
    Works with the `Wrapper` class to create proxies for the wrapped object's magic methods.
    """
    def __init__(cls, name, bases, dct):

        def make_proxy(name):
            def proxy(self, *args):
                return getattr(self._obj, name)
            return proxy

        type.__init__(cls, name, bases, dct)
        if cls.__wraps__:
            ignore = set("__%s__" % n for n in cls.__ignore__.split())
            for name in dir(cls.__wraps__):
                if name.startswith("__"):
                    if name not in ignore and name not in dct:
                        setattr(cls, name, property(make_proxy(name)))


class Wrapper(object):
    """
    Used to provide a (nearly) seamless inheritance-like interface for classes that do not support direct inheritance.
    """
    __metaclass__ = WrapperMetaclass
    __wraps__ = None
    # note that the __init__ method will be ignored by WrapperMetaclass
    __ignore__ = "class mro new init setattr getattr getattribute dict"

    def __init__(self, obj):
        if self.__wraps__ is None:
            raise TypeError("base class Wrapper may not be instantiated")
        elif isinstance(obj, self.__wraps__):
            self._obj = obj
        else:
            raise ValueError("wrapped object must be of %s" % self.__wraps__)

    def __getattr__(self, name):
        if name is '_obj':
            zot = 1
        orig_attr = self._obj.__getattribute__(name)
        if callable(orig_attr) and not hasattr(orig_attr, '__getitem__'):
            def hooked(*args, **kwargs):
                result = orig_attr(*args, **kwargs)
                if result is self._obj:
                    return self
                elif isinstance(result, self.__wraps__):
                    return self.__class__(result)
                else:
                    return result
            return hooked
        else:
            return orig_attr

    def __setattr__(self, attr, val):
        object.__setattr__(self, attr, val)
        if getattr(self._obj, attr, self._obj) is not self._obj:  # update _obj's member if it exists
            setattr(self._obj, attr, getattr(self, attr))


class ClassToWrap(object):
    def __init__(self, data):
        self.data = data

    def theirfun(self):
        new_obj = copy.deepcopy(self)
        new_obj.data += 1
        return new_obj

    def __str__(self):
        return str(self.data)


class Wrapped(Wrapper):
    __wraps__ = ClassToWrap

    def myfun(self):
        new_obj = copy.deepcopy(self)
        new_obj.data += 1
        return new_obj
# can't instantiate Wrapped directly! This is the problem!
obj = ClassToWrap(0)
wr0 = Wrapped(obj)
print wr0
>> 0
print wr0.theirfun()
>> 1
This works, but for truly seamless inheritance-like behavior, I need to instantiate Wrapped directly, e.g.
wr0 = Wrapped(0)
which currently throws
ValueError: wrapped object must be of <class '__main__.ClassToWrap'>
I attempted to override by defining a new proxy for __init__ in WrapperMetaclass, but rapidly ran into infinite recursions.
My codebase is complex with users at different skill levels, so I can't afford to use monkey-patching or solutions that modify the definition of the example classes ClassToWrap or Wrapped. I am really hoping for an extension to the code above that overrides Wrapped.__init__.
Please note that this question is not simply a duplicate of e.g. Can I exactly mimic inheritance behavior with delegation by composition in Python?. That post does not have any answer that is nearly as detailed as what I'm already providing here.
It sounds like you just want the Wrapper.__init__ method to work differently than it currently does. Rather than taking an already existing instance of the __wraps__ class, it should take the arguments that the other class expects in its constructor and build the instance for you. Try something like this:
def __init__(self, *args, **kwargs):
    if self.__wraps__ is None:
        raise TypeError("base class Wrapper may not be instantiated")
    else:
        self._obj = self.__wraps__(*args, **kwargs)
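With that change, the wrapper can be constructed the same way as the wrapped class itself (hypothetical quick check against the example classes from the question):

wr0 = Wrapped(0)   # no need to build a ClassToWrap first
print wr0          # 0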
If you want Wrapper to remain the same for some reason, you could put the logic in a new Wrapped.__init__ method instead:
def __init__(self, data):  # I'm explicitly naming the argument here, but you could use *args
    super(Wrapped, self).__init__(self.__wraps__(data))  # and **kwargs to make it extensible
I'm working on an application with classes and subclasses. For each class, both super and sub, there is a class variable called label. I would like the label variable for the super class to default to the class name. For example:
class Super():
    label = 'Super'

class Sub(Super):
    label = 'Sub'
Rather than manually type out the variable for each class, is it possible to derive the variable from the class name in the super class and have it automatically populated for the subclasses?
class Super():
    label = # Code to get class name

class Sub(Super):
    pass

# When inherited, Sub.label == 'Sub'.
The reason for this is that this will be the default behavior. I'm also hoping that if I can get the default behavior, I can override it later by specifying an alternate label.
class SecondSub(Super):
    label = 'Pie'  # Override the default of SecondSub.label == 'SecondSub'
I've tried using __name__, but that's not working and just gives me '__main__'.
I would like to use the class variable label in @classmethod methods. So I would like to be able to reference the value without having to actually create a Super() or Sub() object, like below:
class Super():
    label = # Magic

    @classmethod
    def do_something_with_label(cls):
        print(cls.label)
You can return self.__class__.__name__ from a label property:
class Super:
    @property
    def label(self):
        return self.__class__.__name__

class Sub(Super):
    pass

print(Sub().label)
alternatively you could set it in the __init__ method
def __init__(self):
    self.label = self.__class__.__name__
this will obviously only work on instantiated classes
To access the class name inside a classmethod, you can just use __name__ on cls:
class XYZ:
    @classmethod
    def my_label(cls):
        return cls.__name__

print(XYZ.my_label())
this solution might work too (snagged from https://stackoverflow.com/a/13624858/541038)
class classproperty(object):
    def __init__(self, fget):
        self.fget = fget

    def __get__(self, owner_self, owner_cls):
        return self.fget(owner_cls)


class Super(object):
    @classproperty
    def label(cls):
        return cls.__name__


class Sub(Super):
    pass


print(Sub.label)    # works on the class
print(Sub().label)  # also works on an instance


class Sub2(Sub):
    @classmethod
    def some_classmethod(cls):
        print(cls.label)


Sub2.some_classmethod()
You can use a descriptor:
class ClassNameDescriptor(object):
    def __get__(self, obj, type_):
        return type_.__name__


class Super(object):
    label = ClassNameDescriptor()


class Sub(Super):
    pass


class SecondSub(Super):
    label = 'Foo'
Demo:
>>> Super.label
'Super'
>>> Sub.label
'Sub'
>>> SecondSub.label
'Foo'
>>> Sub().label
'Sub'
>>> SecondSub().label
'Foo'
If class ThirdSub(SecondSub) should have ThirdSub.label == 'ThirdSub' instead of ThirdSub.label == 'Foo', you can do that with a bit more work. Assigning label at the class level will be inherited, unless you use a metaclass (which is a lot more hassle than it's worth for this), but we can have the label descriptor look for a _label attribute instead:
class ClassNameDescriptor(object):
    def __get__(self, obj, type_):
        try:
            return type_.__dict__['_label']
        except KeyError:
            return type_.__name__
Demo:
>>> class SecondSub(Super):
... _label = 'Foo'
...
>>> class ThirdSub(SecondSub):
... pass
...
>>> SecondSub.label
'Foo'
>>> ThirdSub.label
'ThirdSub'
A metaclass might be useful here.
class Labeller(type):
    def __new__(meta, name, bases, dct):
        dct.setdefault('label', name)
        return super(Labeller, meta).__new__(meta, name, bases, dct)

# Python 2
# class Super(object):
#     __metaclass__ = Labeller

class Super(metaclass=Labeller):
    pass

class Sub(Super):
    pass

class SecondSub(Super):
    label = 'Pie'

class ThirdSub(SecondSub):
    pass
Disclaimer: when providing a custom metaclass for your class, you need to make sure it is compatible with whatever metaclass(es) are used by any class in its ancestry. Generally, this means making sure your metaclass inherits from all the other metaclasses, but it can be nontrivial to do so. In practice, metaclasses aren't so commonly used, so it's usually just a matter of subclassing type, but it's something to be aware of.
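For example, if some base class in the ancestry already uses its own metaclass, you would typically combine the two like this (a sketch; OtherMeta and OtherBase are hypothetical stand-ins):

class OtherMeta(type):
    pass

class OtherBase(metaclass=OtherMeta):
    pass

# the combined metaclass inherits from both, so it satisfies both ancestries
class CombinedMeta(Labeller, OtherMeta):
    pass

class Both(OtherBase, metaclass=CombinedMeta):
    pass

print(Both.label)  # Both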
As of Python 3.6, the cleanest way to achieve this is with the __init_subclass__ hook introduced in PEP 487. It is much simpler (and easier to manage with respect to inheritance) than using a metaclass.
class Base:
    @classmethod
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        if 'label' not in cls.__dict__:  # check if label has been set in the class itself, i.e. not inherited from any of its superclasses
            cls.label = cls.__name__     # if not, default to the class's __name__

class Sub1(Base):
    pass

class Sub2(Base):
    label = 'Custom'

class SubSub(Sub2):
    pass

print(Sub1.label)    # Sub1
print(Sub2.label)    # Custom
print(SubSub.label)  # SubSub
In a class, I want to define N persistent properties. I can implement them as follows:
@property
def prop1(self):
    return self.__prop1

@prop1.setter
def prop1(self, value):
    self.__prop1 = value
    persistenceManagement()

@property
def prop2(self):
    return self.__prop2

@prop2.setter
def prop2(self, value):
    self.__prop2 = value
    persistenceManagement()

[...]

@property
def propN(self):
    return self.__propN

@propN.setter
def propN(self, value):
    self.__propN = value
    persistenceManagement()
Of course, the only thing that differs between these blocks is the property name (prop1, prop2, ..., propN). persistenceManagement() is a function that has to be called when the value of one of these properties changes.
Since these blocks of code are identical except for a single information (i.e., the property name), I suppose there must be some way to replace each of these blocks by single lines declaring the existence of a persistent property with a given name. Something like
def someMagicalPatternFunction(...):
    [...]

someMagicalPatternFunction("prop1")
someMagicalPatternFunction("prop2")
[...]
someMagicalPatternFunction("propN")
...or maybe some decorating trick that I cannot see at the moment. Does someone have an idea how this could be done?
Properties are just one kind of descriptor, and you can create your own descriptor class and use it:
class MyDescriptor(object):
    def __init__(self, name, func):
        self.func = func
        self.attr_name = '__' + name

    def __get__(self, instance, owner):
        # read the value back from the instance, not from the descriptor
        return getattr(instance, self.attr_name)

    def __set__(self, instance, value):
        # store the value on the instance, then run the post-set hook
        setattr(instance, self.attr_name, value)
        self.func(self.attr_name)


def postprocess(attr_name):
    print 'postprocess called after setting', attr_name


class Example(object):
    prop1 = MyDescriptor('prop1', postprocess)
    prop2 = MyDescriptor('prop2', postprocess)


obj = Example()
obj.prop1 = 'answer'  # prints 'postprocess called after setting __prop1'
obj.prop2 = 42        # prints 'postprocess called after setting __prop2'
Optionally you can make it a little easier to use with something like this:
def my_property(name, postprocess=postprocess):
    return MyDescriptor(name, postprocess)


class Example(object):
    prop1 = my_property('prop1')
    prop2 = my_property('prop2')
If you like the decorator @ syntax, you could do it this way (which also saves you from typing the name of the property twice); however, the dummy functions it requires seem a little weird...
def my_property(method):
    name = method.__name__
    return MyDescriptor(name, postprocess)


class Example(object):
    @my_property
    def prop1(self): pass

    @my_property
    def prop2(self): pass
The property class (yes it's a class) is just one possible implementation of the descriptor protocol (which is fully documented here: http://docs.python.org/2/howto/descriptor.html). Just write your own custom descriptor and you'll be done.
Is it possible to access the 'owner' class inside a descriptor during the __init__ function of that descriptor, without passing it in manually as in this example?
class FooDescriptor(object):
    def __init__(self, owner):
        # do things to owner here
        setattr(owner, 'bar_attribute', 'bar_value')


class BarClass(object):
    foo_attribute = FooDescriptor(owner=BarClass)
One way to do something like that is with a metaclass. Just make sure it's really what you want, and don't just copy blindly if you don't understand how it works.
class Descriptor(object):
    pass


class Meta(type):
    def __new__(cls, name, bases, attrs):
        obj = type.__new__(cls, name, bases, attrs)
        # obj is now a type instance

        # this loop looks for Descriptor subclasses
        # and instantiates them, passing the type as the first argument
        for name, attr in attrs.iteritems():
            if isinstance(attr, type) and issubclass(attr, Descriptor):
                setattr(obj, name, attr(obj))

        return obj


class FooDescriptor(Descriptor):
    def __init__(self, owner):
        owner.foo = 42


class BarClass(object):
    __metaclass__ = Meta
    foo_attribute = FooDescriptor  # will be instantiated by the metaclass


print BarClass.foo
If you need to pass additional arguments, you could use e.g. a tuple of (class, args) in the place of the class, or make FooDescriptor a decorator that would return a class that takes only one argument in the ctor.
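A hedged sketch of the (class, args) variant (the MetaWithArgs/FancyDescriptor/BazClass names are my own illustration, building on the Descriptor base above):

class MetaWithArgs(type):
    def __new__(cls, name, bases, attrs):
        obj = type.__new__(cls, name, bases, attrs)
        for attr_name, attr in list(attrs.items()):
            # look for (DescriptorSubclass, extra, args...) tuples
            if (isinstance(attr, tuple) and attr
                    and isinstance(attr[0], type)
                    and issubclass(attr[0], Descriptor)):
                descriptor_cls, extra_args = attr[0], attr[1:]
                setattr(obj, attr_name, descriptor_cls(obj, *extra_args))
        return obj


class FancyDescriptor(Descriptor):
    def __init__(self, owner, value):
        owner.foo = value


class BazClass(object):
    __metaclass__ = MetaWithArgs           # Python 2 spelling, matching the answer above
    foo_attribute = (FancyDescriptor, 42)  # descriptor class plus its extra argument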
Since Python 3.6, you can use the __set_name__ special method:
class FooDescriptor(object):
    def __set_name__(self, owner, name):
        owner.foo = 42


class BarClass(object):
    foo_attribute = FooDescriptor()


# foo_attribute.__set_name__(BarClass, "foo_attribute") is called after the class definition
__set_name__ is automatically called on all descriptors in a class immediately after the class is created.
See PEP 487 for more details.
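A hypothetical quick check of the behaviour described above:

print(BarClass.foo)            # 42 -- set by __set_name__ when BarClass was created
print(BarClass.foo_attribute)  # the FooDescriptor instance itself (it defines no __get__)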