I have a class which, by design, must follow the singleton pattern, so I implemented it using a metaclass. Everything worked nicely until a bug was reported which, in summary, said that deepcopied instances of my singleton class were not the same instance.
I can get around this bug by inheriting from a base singleton-type class, but I'd rather not, for reasons pointed out in this question.
A working example of this issue is presented below:
class SingletonMeta(type):
    def __init__(cls, name, bases, dict):
        super(SingletonMeta, cls).__init__(name, bases, dict)
        cls.instance = None

    def __call__(cls, *args, **kw):
        print "SingletonMeta __call__ was called"
        if cls.instance is None:
            cls.instance = super(SingletonMeta, cls).__call__(*args, **kw)
        return cls.instance


class MyClass1(object):
    __metaclass__ = SingletonMeta
class SingletonBase(object):
    _instance = None

    def __new__(class_, *args, **kwargs):
        print "SingletonBase __new__ was called"
        if not isinstance(class_._instance, class_):
            class_._instance = object.__new__(class_, *args, **kwargs)
        return class_._instance


class MyClass2(SingletonBase):
    pass


from copy import deepcopy as dcp

mm1 = MyClass1()
mm2 = dcp(mm1)
print "mm1 is mm2:", mm1 is mm2

mb1 = MyClass2()
mb2 = dcp(mb1)
print "mb1 is mb2:", mb1 is mb2
Output:
SingletonMeta __call__ was called
mm1 is mm2: False
SingletonBase __new__ was called
SingletonBase __new__ was called
mb1 is mb2: True
Can you give me any pointers as to how one should resolve this issue? I'm running Python 2.7.x.
The docs on the copy module say this:
In order for a class to define its own copy implementation, it can define special methods __copy__() and __deepcopy__().
[...]
The latter is called to implement the deep copy operation; it is passed one argument, the memo dictionary.
[...]
So if you declare these to return self, that ought to do the trick.
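For example, a minimal sketch of defining those hooks directly on the singleton class from the question (the memo argument is ignored because we always hand back the same object):

class MyClass1(object):
    __metaclass__ = SingletonMeta

    def __copy__(self):
        # shallow copies should also return the one and only instance
        return self

    def __deepcopy__(self, memo):
        # deep copies likewise return the existing instance
        return self

If you'd rather not repeat these two methods in every singleton class, the metaclass can inject them for you.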
When you need to customize class creation (not instance creation), you do it in the __new__ method of the metaclass:
def __new__(cls, name, bases, dict):
    dict['__deepcopy__'] = dict['__copy__'] = lambda self, *args: self
    return super(SingletonMeta, cls).__new__(cls, name, bases, dict)
and your test will give
SingletonMeta __call__ was called
mm1 is mm2: True
You need to define __copy__ as well, or shallow copies will still produce new instances.
Glad that my solution in that thread came in handy.
Related
I'm wondering how to create a metaclass in Python that can create other classes that:
Store their instances in an array automatically
Have a special instance, NonMetaClass.all, whose properties:
When set, set all the class's instances with the same key to the same value (e.g., Foo.all.num = 3 makes all instances of Foo have a num of 3)
When accessed (get), returns an array of all of the class's instances' values for that key (e.g., Foo.all.num returns [5, 3, 2])
Cannot be deleted.
When called (if the attribute is a function), call that method on all the instances of a class.
In Python terms, I would like to turn a class that is like this:
class Foo(object):
    BAR = 23

    def __init__(self):
        self.a = 5

    def pointless(self):
        print 'pointless.'

    def change_a(self):
        self.a = 52
Into this:
class Foo(object):
    BAR = 23
    instances = []
    all = # Some black magic to create the special "all" instance

    def __init__(self):
        self.a = 5
        Foo.instances.append(self)

    def pointless(self):
        print 'pointless.'

    def change_a(self):
        self.a = 52
And be able to use it like this:
>>> Foo()
>>> Foo.instances[0]
<__main__.Foo instance at 0x102ff5758>
>>> Foo()
>>> len(Foo.instances)
2
>>> Foo.all.a = 78
78
>>> Foo.all.a
[78, 78]
>>> Foo.all.change_a()
>>> Foo.all.a
[52, 52]
>>>
The only thing a metaclass is actually needed for here is quite simple: creating the instances and all attributes. All it has to do is insert those into the namespace. It also has to wrap the class's __new__ method so that new instances are appended to the instances list.
The interesting part is the behavior wanted from all. It can be implemented with the descriptor protocol and attribute access control, so we have to craft a couple of special classes that return the appropriate objects when requested after the ".".
"All" is the class that will be instantiated as all: it just needs a __get__ method that returns another special object, from the AllAttr class, already bound to the owner class.
"AllAttr" is a special object that, on any attribute access, performs your requirements on the members of the owner class's instances attribute.
And "CallAllList" is a special list subclass that is callable and calls all its members in turn. It is used by AllAttr when the requested attribute on the owner class is itself callable.
class CallAllList(list):
    def __call__(self, *args, **kwargs):
        return [instance(*args, **kwargs) for instance in self]


class AllAttr(object):
    def __init__(self, owner):
        self._owner = owner

    def __getattr__(self, attr):
        method = getattr(self._owner, attr, None)
        cls = CallAllList if callable(method) else list
        return cls(getattr(obj, attr) for obj in self._owner.instances)

    def __setattr__(self, attr, value):
        if attr == "_owner":
            return super(AllAttr, self).__setattr__(attr, value)
        for obj in self._owner.instances:
            setattr(obj, attr, value)


class All(object):
    def __get__(self, instance, owner):
        return AllAttr(owner)

    def __repr__(self):
        return "Representation of all instances of '{}'".format(self.__class__.__name__)


class MetaAll(type):
    def __new__(metacls, name, bases, namespace):
        namespace["all"] = All()
        namespace["instances"] = []
        cls = super(MetaAll, metacls).__new__(metacls, name, bases, namespace)
        original_new = getattr(cls, "__new__")

        def __new__(cls, *args, **kwargs):
            instance = original_new(cls, *args, **kwargs)
            cls.instances.append(instance)
            return instance

        cls.__new__ = __new__
        return cls


class Foo(metaclass=MetaAll):
    pass
The code above is written so that it is compatible with both Python 3 and Python 2, since you appear to still be on Python 2, given the print statements in your example.
The only thing that cannot be written in a form compatible with both is the metaclass declaration itself: just put __metaclass__ = MetaAll inside the body of your Foo class if you are using Python 2. But you really should not be using Python 2 any more; switch to Python 3 as soon as you can.
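A quick sketch of the usage the question asks for, using the Python 3 form above (the attribute values are just illustrative):

f1, f2 = Foo(), Foo()
f1.a, f2.a = 5, 3

print(Foo.all.a)    # [5, 3] -- reading collects the attribute from every instance
Foo.all.a = 78      # writing sets it on every instance
print(Foo.all.a)    # [78, 78]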
update
It turns out that Python 2 has "unbound methods", and the special-casing of __new__ does not work as it does in Python 3: you can't simply assign a function named __new__ to the class afterwards. In order to get the correct __new__ method from the superclasses, the easiest way is to create a disposable class, so that its __mro__ can be searched linearly; otherwise one would have to reimplement the MRO algorithm to find the proper __new__ method.
So, for Python 2, the metaclass should be this:
class MetaAll(type):
    def __new__(metacls, name, bases, namespace):
        namespace["all"] = All()
        namespace["instances"] = []
        if "__new__" in namespace:
            original_new = namespace["__new__"]

            def __new__(cls, *args, **kwargs):
                instance = original_new(cls, *args, **kwargs)
                cls.instances.append(instance)
                return instance
        else:
            # We create a disposable class just to get the '__mro__'
            stub_cls = super(MetaAll, metacls).__new__(metacls, name, bases, {})
            for parent in stub_cls.__mro__[1:]:
                if "__new__" in parent.__dict__:
                    original_new = parent.__dict__["__new__"]
                    break

            def __new__(cls, *args, **kwargs):
                instance = original_new(cls, *args, **kwargs)
                cls.instances.append(instance)
                return instance

        namespace["__new__"] = __new__
        final_cls = super(MetaAll, metacls).__new__(metacls, name, bases, namespace)
        return final_cls


class Foo(object):
    __metaclass__ = MetaAll
(Again, this Python 2 machinery is ancient; just settle for Python 3.6+.)
Ok, I figured out how to do this for Python 2.7 on my own. This is what I believe to be the best solution, though it may not be the only one. It allows you to set, get, and call attributes on Class.all. I've named the metaclass InstanceUnifier, but please comment if you can think of a better (shorter, more descriptive) name.
class InstanceUnifier(type):
    '''
    What we want: A metaclass that can give a class an array of instances and provide a static
    Class.all object that, when a method is called on it, calls the same method on every
    instance of the class.
    '''
    def __new__(cls, name, base_classes, dct):
        dct['all'] = None
        dct['instances'] = []
        return type.__new__(cls, name, base_classes, dct)

    def __init__(cls, name, base_classes, dct):
        class Accessor(object):
            def __getattribute__(self, name):
                array = [getattr(inst, name) for inst in cls.instances]
                if all([callable(item) for item in array]):
                    def proxy_func(*args, **kwargs):
                        for i in range(len(cls.instances)):
                            this = cls.instances[i]
                            func = array[i]
                            func(*args, **kwargs)
                    return proxy_func
                elif all([not callable(item) for item in array]):
                    return array
                else:
                    raise RuntimeError('Some objects in class instance array for key "'+name+'" are callable, some are not.')

            def __setattr__(self, name, value):
                [setattr(inst, name, value) for inst in cls.instances]

            def __delattr__(self, name):
                [delattr(inst, name) for inst in cls.instances]

        cls.all = Accessor()
        return type.__init__(cls, name, base_classes, dct)

    def __call__(cls, *args, **kwargs):
        inst = type.__call__(cls, *args, **kwargs)
        cls.instances.append(inst)
        return inst
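For illustration, here is a short hypothetical usage sketch under Python 2.7 matching the behavior described above; the Foo class is just an example:

class Foo(object):
    __metaclass__ = InstanceUnifier

    def __init__(self):
        self.a = 5

    def change_a(self):
        self.a = 52

f1, f2 = Foo(), Foo()
print Foo.all.a       # [5, 5]
Foo.all.a = 78        # sets the attribute on every instance
print Foo.all.a       # [78, 78]
Foo.all.change_a()    # calls change_a() on every instance
print Foo.all.a       # [52, 52]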
I am attempting to wrap a class from a third-party package in such a way that my new class looks exactly like a subclass of the third-party class. The third-party class does not support inheritance, and it has nontrivial features, such as functions that have a __getitem__ method. I can wrap almost every attribute and method using a solution based on "Wrapping a class whose methods return instances of that class" and "How can I intercept calls to python's "magic" methods in new style classes?". However, I still need to override the __init__ method of the third-party class. How can I do that? Note: I am using new-style classes.
Code so far:
import copy

class WrapperMetaclass(type):
    """
    Works with the `Wrapper` class to create proxies for the wrapped object's magic methods.
    """
    def __init__(cls, name, bases, dct):

        def make_proxy(name):
            def proxy(self, *args):
                return getattr(self._obj, name)
            return proxy

        type.__init__(cls, name, bases, dct)
        if cls.__wraps__:
            ignore = set("__%s__" % n for n in cls.__ignore__.split())
            for name in dir(cls.__wraps__):
                if name.startswith("__"):
                    if name not in ignore and name not in dct:
                        setattr(cls, name, property(make_proxy(name)))


class Wrapper(object):
    """
    Used to provide a (nearly) seamless inheritance-like interface for classes that do not support direct inheritance.
    """
    __metaclass__ = WrapperMetaclass
    __wraps__ = None
    # note that the __init__ method will be ignored by WrapperMetaclass
    __ignore__ = "class mro new init setattr getattr getattribute dict"

    def __init__(self, obj):
        if self.__wraps__ is None:
            raise TypeError("base class Wrapper may not be instantiated")
        elif isinstance(obj, self.__wraps__):
            self._obj = obj
        else:
            raise ValueError("wrapped object must be of %s" % self.__wraps__)

    def __getattr__(self, name):
        if name is '_obj':
            zot = 1
        orig_attr = self._obj.__getattribute__(name)
        if callable(orig_attr) and not hasattr(orig_attr, '__getitem__'):
            def hooked(*args, **kwargs):
                result = orig_attr(*args, **kwargs)
                if result is self._obj:
                    return self
                elif isinstance(result, self.__wraps__):
                    return self.__class__(result)
                else:
                    return result
            return hooked
        else:
            return orig_attr

    def __setattr__(self, attr, val):
        object.__setattr__(self, attr, val)
        if getattr(self._obj, attr, self._obj) is not self._obj:  # update _obj's member if it exists
            setattr(self._obj, attr, getattr(self, attr))


class ClassToWrap(object):
    def __init__(self, data):
        self.data = data

    def theirfun(self):
        new_obj = copy.deepcopy(self)
        new_obj.data += 1
        return new_obj

    def __str__(self):
        return str(self.data)


class Wrapped(Wrapper):
    __wraps__ = ClassToWrap

    def myfun(self):
        new_obj = copy.deepcopy(self)
        new_obj.data += 1
        return new_obj


# can't instantiate Wrapped directly! This is the problem!
obj = ClassToWrap(0)
wr0 = Wrapped(obj)
print wr0
>> 0
print wr0.theirfun()
>> 1
This works, but for truly seamless inheritance-like behavior, I need to instantiate Wrapped directly, e.g.
wr0 = Wrapped(0)
which currently throws
ValueError: wrapped object must be of <class '__main__.ClassToWrap'>
I attempted to override it by defining a new proxy for __init__ in WrapperMetaclass, but rapidly ran into infinite recursion.
My codebase is complex with users at different skill levels, so I can't afford to use monkey-patching or solutions that modify the definition of the example classes ClassToWrap or Wrapped. I am really hoping for an extension to the code above that overrides Wrapped.__init__.
Please note that this question is not simply a duplicate of e.g. Can I exactly mimic inheritance behavior with delegation by composition in Python?. That post does not have any answer that is nearly as detailed as what I'm already providing here.
It sounds like you just want the Wrapper.__init__ method to work differently than it currently does. Rather than taking an already existing instance of the __wraps__ class, it should take the arguments the wrapped class expects in its constructor and build the instance for you. Try something like this:
def __init__(self, *args, **kwargs):
    if self.__wraps__ is None:
        raise TypeError("base class Wrapper may not be instantiated")
    else:
        self._obj = self.__wraps__(*args, **kwargs)
If you want Wrapper to remain the same for some reason, you could put the logic in a new Wrapped.__init__ method instead:
def __init__(self, data):  # I'm explicitly naming the argument here, but you could use *args
    super(Wrapped, self).__init__(self.__wraps__(data))  # and **kwargs to make it extensible
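Either way, the direct instantiation from the question should now go through; a minimal check, assuming the same example classes:

wr0 = Wrapped(0)   # builds the ClassToWrap(0) instance internally
print wr0          # -> 0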
Below is the well-known code for creating a singleton metaclass:
class Singleton(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
            cls.x = 5
        return cls._instances[cls]

class MyClass(metaclass=Singleton):
    pass

m = MyClass()
v = MyClass()
print(m.x)
m.x = 420
print(v.x)
My question is: why do we need to use the __call__ of the type class again to create the instance? Why can't we just call the class, like normal class instantiation, and let __init__ do its thing? Something like this:
class Singleton(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = cls(*args, **kwargs)
            cls.x = 5
        return cls._instances[cls]

class MyClass(metaclass=Singleton):
    pass

m = MyClass()
v = MyClass()
print(m.x)
m.x = 420
print(v.x)
This just runs into infinite recursion anyway.
Because trying to create an instance of a class by simply calling it, as you do in the line cls._instances[cls] = cls(*args, **kwargs), itself invokes the metaclass __call__, which is exactly the method you are calling it from, as is explained here.
Now, one thing: you should not really be using metaclasses just for creating singletons.
The metaclass mechanism in Python is complicated. The problem you've hit in this question shows you are still getting to grips with how plain inheritance and calls to superclass methods work, and metaclasses are an order of magnitude more complicated than that.
Also, beyond being complicated, classes with a custom metaclass can't ordinarily be combined with other classes that have a different custom metaclass, so the general rule is to keep their use to a minimum anyway.
How to create a singleton class:
But for creating a singleton, you can just place all your checks in the class's ordinary __new__ method. There is no need to fiddle with metaclasses; plain class inheritance is enough:
_instances = {}

class Singleton(object):
    def __new__(cls, *args, **kw):
        if cls not in _instances:
            instance = super().__new__(cls)
            _instances[cls] = instance
        return _instances[cls]
And just inherit your singleton classes from this one.
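A quick check of this inheritance-based version (Python 3, as in the question):

class MyClass(Singleton):
    pass

m = MyClass()
v = MyClass()
print(m is v)   # True: both names refer to the single instance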
I saw a lot of methods for making a singleton in Python, and I tried to use the metaclass implementation with Python 3.2 (Windows), but it doesn't seem to return the same instance of my singleton class.
class Singleton(type):
    _instances = {}
    def __call__(cls, *args, **kwargs):
        if cls not in cls._instances:
            cls._instances[cls] = super(Singleton, cls).__call__(*args, **kwargs)
        return cls._instances[cls]

class MyClass(object):
    __metaclass__ = Singleton

a = MyClass()
b = MyClass()
print(a is b)  # False
I'm using the decorator implementation now, which works, but I'm wondering what is wrong with this implementation?
The metaclass syntax has changed in Python 3. See the documentation.
class MyClass(metaclass=Singleton):
    pass
And it works:
>>> MyClass() is MyClass()
True
Is it possible to access the 'owner' class inside a descriptor during the __init__ function of that descriptor, without passing it in manually as in this example?
class FooDescriptor(object):
    def __init__(self, owner):
        # do things to owner here
        setattr(owner, 'bar_attribute', 'bar_value')

class BarClass(object):
    foo_attribute = FooDescriptor(owner=BarClass)
One way to do something like that is with a metaclass. Just make sure it's really what you want, and don't just copy blindly if you don't understand how it works.
class Descriptor(object):
    pass

class Meta(type):
    def __new__(cls, name, bases, attrs):
        obj = type.__new__(cls, name, bases, attrs)
        # obj is now a type instance

        # this loop looks for Descriptor subclasses
        # and instantiates them, passing the type as the first argument
        for name, attr in attrs.iteritems():
            if isinstance(attr, type) and issubclass(attr, Descriptor):
                setattr(obj, name, attr(obj))

        return obj

class FooDescriptor(Descriptor):
    def __init__(self, owner):
        owner.foo = 42

class BarClass(object):
    __metaclass__ = Meta
    foo_attribute = FooDescriptor  # will be instantiated by the metaclass

print BarClass.foo
If you need to pass additional arguments, you could use e.g. a (class, args) tuple in place of the bare class, or make FooDescriptor a decorator that returns a class taking only one argument in the constructor.
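For example, a rough sketch of the (class, args) variant; this is my own illustration, not part of the original answer:

class Meta(type):
    def __new__(cls, name, bases, attrs):
        obj = type.__new__(cls, name, bases, attrs)
        for name, attr in attrs.iteritems():
            # plain Descriptor subclass: instantiate with the owner only
            if isinstance(attr, type) and issubclass(attr, Descriptor):
                setattr(obj, name, attr(obj))
            # (Descriptor subclass, extra args) tuple: forward the extras
            elif (isinstance(attr, tuple) and attr
                  and isinstance(attr[0], type) and issubclass(attr[0], Descriptor)):
                desc_cls, extra_args = attr[0], attr[1:]
                setattr(obj, name, desc_cls(obj, *extra_args))
        return obj

A class attribute could then be declared as, say, foo_attribute = (SomeDescriptor, 1, 2), with the extra values forwarded to the descriptor's constructor.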
Since Python 3.6, you can use the __set_name__ special method:
class FooDescriptor(object):
    def __set_name__(self, owner, name):
        owner.foo = 42

class BarClass(object):
    foo_attribute = FooDescriptor()

# foo_attribute.__set_name__(BarClass, "foo_attribute") is called right after the class definition
__set_name__ is automatically called on all descriptors in a class immediately after the class is created.
See PEP 487 for more details.
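For completeness, a small sketch (my own, not from the answer above) showing that the hook also receives the attribute name, which is handy for descriptors that need to know where they live (Python 3.6+):

class Named(object):
    def __set_name__(self, owner, name):
        # remember the attribute name the descriptor was assigned to
        self.name = name

    def __get__(self, instance, owner):
        return "descriptor bound as %r on %s" % (self.name, owner.__name__)

class Baz(object):
    attr = Named()

print(Baz.attr)  # -> descriptor bound as 'attr' on Baz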