I have a base class extending unittest.TestCase, and I want to patch that base class, such that classes extending this base class will have the patches applied as well.
Code Example:
@patch("some.core.function", mocked_method)
class BaseTest(unittest.TestCase):
    # methods
    pass

class TestFunctions(BaseTest):
    # methods
    pass
Patching the TestFunctions class directly works, but patching the BaseTest class does not change the functionality of some.core.function in TestFunctions.
You probably want a metaclass here: a metaclass simply defines how a class is created.
By default, all classes are created using Python's built-in class type:
>>> class Foo:
...     pass
...
>>> type(Foo)
<class 'type'>
>>> isinstance(Foo, type)
True
So classes are actually instances of type.
Now, we can subclass type to create a custom metaclass (a class that creates classes):
class PatchMeta(type):
    """A metaclass to patch all inherited classes."""
We need to control how our classes are created, so we want to override type.__new__ here and apply the patch decorator to every new class:
class PatchMeta(type):
    """A metaclass to patch all inherited classes."""

    def __new__(meta, name, bases, attrs):
        cls = type.__new__(meta, name, bases, attrs)
        cls = patch("some.core.function", mocked_method)(cls)
        return cls
And now you simply set the metaclass using __metaclass__ = PatchMeta (Python 2 syntax; on Python 3 use class BaseTest(unittest.TestCase, metaclass=PatchMeta)):
class BaseTest(unittest.TestCase):
    __metaclass__ = PatchMeta
    # methods
The issue is this line:
cls = patch("some.core.function", mocked_method)(cls)
So currently we always decorate with arguments "some.core.function" and mocked_method.
Instead you could make it so that it uses the class's attributes, like so:
cls = patch(*cls.patch_args)(cls)
And then add patch_args to your classes:
class BaseTest(unittest.TestCase):
    __metaclass__ = PatchMeta
    patch_args = ("some.core.function", mocked_method)
Edit: As @mgilson mentioned in the comments, patch() modifies the class's methods in place instead of returning a new class. Because of this, we can replace the __new__ with this __init__:
class PatchMeta(type):
    """A metaclass to patch all inherited classes."""

    def __init__(cls, *args, **kwargs):
        super(PatchMeta, cls).__init__(*args, **kwargs)
        patch(*cls.patch_args)(cls)
Which is arguably cleaner.
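As a runnable Python 3 sketch of the whole idea (os.getcwd stands in for the hypothetical some.core.function; note that decorating a class with patch only wraps methods whose names start with "test"):

```python
import os
from unittest import mock

class PatchMeta(type):
    """Apply mock.patch(*cls.patch_args) to every class created with this metaclass."""
    def __init__(cls, *args, **kwargs):
        super().__init__(*args, **kwargs)
        if getattr(cls, "patch_args", None):
            # Decorating the class wraps each method whose name starts with "test"
            mock.patch(*cls.patch_args)(cls)

class BaseTest(metaclass=PatchMeta):
    patch_args = ("os.getcwd", lambda: "/patched")

class ChildTest(BaseTest):
    def test_cwd(self):
        # The patch is active for the duration of this test method
        return os.getcwd()

print(ChildTest().test_cwd())  # "/patched"
```

Because ChildTest inherits the metaclass (and patch_args), its test methods are patched without any per-class boilerplate.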
Generally, I prefer to do this sort of thing in setUp. You can make sure that the patch gets cleaned up after the test is completed by making use of the tearDown method (or, alternatively, by registering the patch's stop method with addCleanup):
class BaseTest(unittest.TestCase):
    def setUp(self):
        super(BaseTest, self).setUp()
        my_patch = patch("some.core.function", mocked_method)
        my_patch.start()
        self.addCleanup(my_patch.stop)

class TestFunctions(BaseTest):
    # methods
    pass
Provided that you're disciplined enough to always call super in your overridden setUp methods, it should work just fine.
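A self-contained version of this setUp approach, again with os.getcwd standing in for the hypothetical some.core.function:

```python
import os
import unittest
from unittest import mock

class BaseTest(unittest.TestCase):
    def setUp(self):
        super().setUp()
        # start() applies the patch; addCleanup guarantees stop() runs
        # even if the test fails or errors out.
        my_patch = mock.patch("os.getcwd", return_value="/patched")
        my_patch.start()
        self.addCleanup(my_patch.stop)

class TestFunctions(BaseTest):
    def test_patched(self):
        self.assertEqual(os.getcwd(), "/patched")

result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestFunctions))
print(result.wasSuccessful())
```

addCleanup is generally preferable to tearDown here: cleanups run even when setUp itself fails partway through.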
Simple Example
The goal is to create an abstract enum class through a metaclass deriving from both abc.ABCMeta and enum.EnumMeta. For example:
import abc
import enum

class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    pass

class A(abc.ABC):
    @abc.abstractmethod
    def foo(self):
        pass

class B(A, enum.IntEnum, metaclass=ABCEnumMeta):
    X = 1

class C(A):
    pass
Now, on Python 3.7, this code is interpreted without error (on 3.6.x and presumably below, it is not). In fact, everything looks great: the MRO shows B deriving from both A and IntEnum.
>>> B.__mro__
(<enum 'B'>, __main__.A, abc.ABC, <enum 'IntEnum'>, int, <enum 'Enum'>, object)
Abstract Enum is not Abstract
However, even though B.foo has not been defined, we can still instantiate B without any issue, and call foo().
>>> B.X
<B.X: 1>
>>> B(1)
<B.X: 1>
>>> B(1).foo()
This seems rather weird, since any other class whose metaclass derives from ABCMeta cannot be instantiated, even if I use a custom metaclass.
>>> class NewMeta(type):
...     pass
...
>>> class AbcNewMeta(abc.ABCMeta, NewMeta):
...     pass
...
>>> class D(metaclass=NewMeta):
...     pass
...
>>> class E(A, D, metaclass=AbcNewMeta):
...     pass
...
>>> E()
TypeError: Can't instantiate abstract class E with abstract methods foo
Question
Why does a class using a metaclass derived from EnumMeta and ABCMeta effectively ignore ABCMeta, while any other class using a metaclass derived from ABCMeta respects it? This is true even if I define the __new__ method myself.
class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    def __new__(cls, name, bases, dct):
        # Commented-out lines reflect other variants that don't work:
        # return abc.ABCMeta.__new__(cls, name, bases, dct)
        # return enum.EnumMeta.__new__(cls, name, bases, dct)
        return super().__new__(cls, name, bases, dct)
I'm rather confused, since this seems to fly in the face of what a metaclass is: the metaclass should define how the class is defined and behaves, and in this case, defining a class using a metaclass that is both abstract and an enumeration creates a class that silently ignores the abstract component. Is this a bug, is this intended, or is there something greater I am not understanding?
As stated in @chepner's answer, what is going on is that the Enum metaclass overrides the normal metaclass's __call__ method, so an Enum class is never instantiated through the normal path, and thus ABCMeta's abstractmethod check never triggers.
However, on class creation, the metaclass's __new__ runs normally, and the abstract-class machinery that marks the class as abstract does create the __abstractmethods__ attribute on the created class.
So, all you have to do for what you intend to work is to further customize your metaclass to perform the abstractness check in __call__:
import abc
import enum

class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    def __call__(cls, *args, **kw):
        if getattr(cls, "__abstractmethods__", None):
            raise TypeError(f"Can't instantiate abstract class {cls.__name__} "
                            f"with abstract methods {set(cls.__abstractmethods__)}")
        return super().__call__(*args, **kw)
This makes the B(1) expression fail with the same error as abstract class instantiation.
Note, however, that an Enum class can't be subclassed further anyway, and simply creating it without the missing abstract methods may already violate what you want to check. That is: in your example above, class B can be declared, and B.X will work, even with foo missing. If you want to prevent that, put the same check in the metaclass's __new__:
import abc
import enum

class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    def __new__(mcls, *args, **kw):
        cls = super().__new__(mcls, *args, **kw)
        if issubclass(cls, enum.Enum) and getattr(cls, "__abstractmethods__", None):
            raise TypeError("...")
        return cls

    def __call__(cls, *args, **kw):
        if getattr(cls, "__abstractmethods__", None):
            raise TypeError(f"Can't instantiate abstract class {cls.__name__} "
                            f"with abstract methods {set(cls.__abstractmethods__)}")
        return super().__call__(*args, **kw)
(Unfortunately, the ABC abstract method check in CPython seems to be performed in native code, outside the ABCMeta.__call__ method; otherwise, instead of mimicking the error, we could just call ABCMeta.__call__ explicitly, overriding super's behavior, instead of hardcoding the TypeError here.)
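Putting the pieces together, a runnable sketch of the __call__ guard against the question's classes might look like:

```python
import abc
import enum

class ABCEnumMeta(abc.ABCMeta, enum.EnumMeta):
    def __call__(cls, *args, **kw):
        # Refuse the value-lookup call if the class still has abstract methods
        if getattr(cls, "__abstractmethods__", None):
            raise TypeError(
                f"Can't instantiate abstract class {cls.__name__} "
                f"with abstract methods {set(cls.__abstractmethods__)}")
        return super().__call__(*args, **kw)

class A(abc.ABC):
    @abc.abstractmethod
    def foo(self):
        pass

class B(A, enum.IntEnum, metaclass=ABCEnumMeta):
    X = 1

# B(1) now fails, but the member itself was still created at class time:
try:
    B(1)
except TypeError as exc:
    print("rejected:", exc)
print(B.X.value)
```

Attribute access (B.X) still works, because members are looked up on the class, never through the metaclass's __call__.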
Calling an enumerated type doesn't create a new instance. Members of the enumerated type are created immediately at class-creation time by the meta class. The __new__ method simply performs lookup, which means ABCMeta is never invoked to prevent instantiation.
B(1).foo() works because, once you have an instance, it doesn't matter if the method was marked as abstract. It's still a real method, and can be called as such.
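A tiny illustration of that point, independent of the abstract-class machinery (Color is a hypothetical enum):

```python
import enum

class Color(enum.IntEnum):
    RED = 1

# Members are created once, at class-creation time.
# Calling the class performs a lookup, not a construction:
print(Color(1) is Color.RED)  # True
```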
I want to create a class hierarchy in which I have a class Block which can be instantiated by itself. Then I have a class List which inherits from Block and contains methods common to all lists, and finally I have classes OrderedList, LableledList etc that inherit from List. I want people to be able to instantiate OrderedList etc, but not List.
In other words, you can instantiate a plain Block and you can instantiate an OrderedList that inherits from List that inherits from Block, but you can't instantiate List.
All attempts to Google this lead to Abstract Base Classes, but none provides an example that fits this case, and I am having trouble extrapolating.
The following conversation with the interpreter should show how this is possible. After inheriting from the Abstract Base Class in Block, you only need to mark List's initializer as an abstractmethod. This prevents instantiation of the class without causing problems for child classes.
>>> import abc
>>> class Block(abc.ABC):
...     def __init__(self, data):
...         self.data = data
...
>>> class List(Block):
...     @abc.abstractmethod
...     def __init__(self, data, extra):
...         super().__init__(data)
...         self.extra = extra
...
>>> class OrderedList(List):
...     def __init__(self, data, extra, final):
...         super().__init__(data, extra)
...         self.final = final
...
>>> instance = Block(None)
>>> instance = List(None, None)
Traceback (most recent call last):
  File "<pyshell#42>", line 1, in <module>
    instance = List(None, None)
TypeError: Can't instantiate abstract class List with abstract methods __init__
>>> instance = OrderedList(None, None, None)
>>>
Your List class should have ABCMeta as its metaclass and make its __init__ method abstract.
from abc import ABCMeta, abstractmethod

class List(metaclass=ABCMeta):
    @abstractmethod
    def __init__(self):
        pass
https://docs.python.org/3/library/abc.html
Inherit from ABC (located in the abc module) and mark the methods that must be implemented by classes inheriting from List as @abstractmethod (a decorator also located in abc):
from abc import ABC, abstractmethod

class List(ABC, Block):
    @abstractmethod
    def size(self):
        return 0
Having an ABC with @abstractmethods defined forbids instantiation.
The "one and obvious" way to do this is to use ABCMeta and mark some methods as abstract as documented on other answers.
But perhaps in your case you don't have a set of methods that must be overridden (let's suppose your __init__ is reusable in some cases, and so are some of the other list methods):
In that case you can create a __new__ method that checks whether the class being instantiated is List itself, and raises if so. To do that, you have to use the magic __class__ variable that is documented only in corners of the Python docs: if you so much as use the __class__ variable in any method body, it automatically takes the value of the class where it was declared, at run time. It is part of the parameterless super mechanism of Python 3.
Thus:
class List(Block):
    def __new__(cls, *args, **kw):
        if cls is __class__:
            raise TypeError(cls.__name__ + " can't be directly instantiated")
        return super().__new__(cls, *args, **kw)
Btw, you should give preference to the ABCMeta abstract methods if your pattern allows it. Note that if your classes use a custom metaclass, it will conflict with ABCMeta; in that case you may need to resort to this trick instead.
(If you don't further customize __new__, then you'd better not pass args and kw upstream in the __new__ method: Python's object.__new__ ignores extra args if __init__ is defined but __new__ is not in the subclasses; if both are defined, it raises an error.)
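A runnable sketch of this approach (Block here is a minimal stand-in for the question's class):

```python
class Block:
    pass

class List(Block):
    def __new__(cls, *args, **kw):
        # __class__ is filled in by the zero-argument-super mechanism:
        # inside this method it always refers to List, even when a
        # subclass inherits the method.
        if cls is __class__:
            raise TypeError(cls.__name__ + " can't be directly instantiated")
        # Per the caveat above, don't forward *args/**kw to object.__new__
        return super().__new__(cls)

class OrderedList(List):
    pass

print(type(OrderedList()).__name__)  # OrderedList
try:
    List()
except TypeError as exc:
    print(exc)
```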
Preamble: I have objects, some of which could be created by the default constructor and left without modification, so such objects can be considered "empty". Sometimes I need to verify whether some object is "empty" or not. It can be done in the following way (the magic methods are implemented in the base class Animal):
>>> a = Bird()
>>> b = Bird()
>>> a == b
True
>>> a == Bird()
True
So the question: is it possible (and if yes then how) to achieve such syntax:
>>> a == Bird.default
True
At least this one (but the previous is sweeter):
>>> a == a.default
True
But with default implemented in the base class Animal (so it isn't repeated in every derived class):
class Animal(object):
    # ... tech stuff:
    # - obj comparison
    # - obj representation
    # - etc.

class Bird(Animal):
    # ... all about birds ...

class Fish(Animal):
    # ... all about fishes ...
Of course I don't want solutions that call Bird() inside the Animal class :)
I'd like to have a kind of templating implemented in the base class that stamps out a default instance of each derived class and stores that unique copy in a class or instance property. I think it could be achieved by playing with metaclasses or so, but I don't know how.
A class's default instance can be considered to be any object instantiated by its class's __init__() (without further modification, of course).
UPDATE
The system is flooded with objects, and I just want a way to separate freshly (default-)created objects (which are useless to display, for example) from already-modified ones. I do it by:
if a == Bird():
    ...
I don't want to create a new object for each comparison; intuitively, I'd like to keep one instance as a reference for instances of the class to compare against. Objects are JSON-like and contain only properties (besides the implicit __str__, __call__, __eq__ methods), so I'd like to keep this style of using built-in Python features and avoid explicitly defined methods like is_empty(), for example. It's like entering an object in the interactive shell and having it printed via __str__: it is implicit, but fun.
To achieve the first solution you should use a metaclass.
For example:
def add_default_meta(name, bases, attrs):
    cls = type(name, bases, attrs)
    cls.default = cls()
    return cls
And use it as (assuming Python 3; in Python 2, set the __metaclass__ attribute in the class body):
class Animal(object, metaclass=add_default_meta):
    # stuff

class NameClass(Animal, metaclass=add_default_meta):
    # stuff
Note that you have to repeat metaclass=... for every subclass of Animal.
If instead of a function you use a class and its __new__ method to implement the metaclass, it can be inherited, i.e:
class AddDefaultMeta(type):
    def __new__(cls, name, bases, attrs):
        cls = super(AddDefaultMeta, cls).__new__(cls, name, bases, attrs)
        cls.default = cls()
        return cls
A different way to achieve the same effect is to use a class decorator:
def add_default(cls):
    cls.default = cls()
    return cls

@add_default
class Bird(Animal):
    # stuff
Again, you must use the decorator for every subclass.
If you want to achieve the second solution, i.e. to check a == a.default, then you can simply reimplement Animal.__new__:
class Animal(object):
    def __new__(cls, *args, **kwargs):
        if not (args or kwargs) and not hasattr(cls, 'default'):
            cls.default = object.__new__(cls)
            return cls.default
        else:
            return object.__new__(cls)
This creates the empty instance the first time the class is instantiated without arguments, and stores it in the default attribute.
This means that you can do both:
a == a.default
and
a == Bird.default
But accessing Bird.default raises AttributeError if you haven't created any Bird instance yet.
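A quick sketch of that behavior with a concrete subclass (Bird's name parameter is illustrative):

```python
class Animal(object):
    def __new__(cls, *args, **kwargs):
        if not (args or kwargs) and not hasattr(cls, 'default'):
            # First argument-less creation becomes the stored default
            cls.default = object.__new__(cls)
            return cls.default
        return object.__new__(cls)

class Bird(Animal):
    def __init__(self, name=None):
        self.name = name

a = Bird()
print(a is Bird.default)             # the empty instance doubles as the default
print(Bird("robin") is Bird.default) # a modified bird is a fresh object
```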
Style note: Bird.Default looks very bad to me. Default is an instance of Bird not a type, hence you should use lowercase_with_underscore according to PEP 8.
In fact, the whole thing looks fishy to me. You could simply have an is_empty() method. It's pretty easy to implement:
class Animal(object):
    def __init__(self, *args, **kwargs):
        # might require a more complex condition
        self._is_empty = not (bool(args) or bool(kwargs))

    def is_empty(self):
        return self._is_empty
Then, when a subclass creates an empty instance that doesn't pass any arguments to the base class, the _is_empty attribute will be True and the inherited method will return True accordingly; in the other cases, some argument is passed to the base class, which sets _is_empty to False.
You can play around with this in order to obtain a more robust condition that works better with your subclasses.
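For instance, a minimal version in action (Bird and the name argument are illustrative):

```python
class Animal(object):
    def __init__(self, *args, **kwargs):
        # Empty means "constructed with no arguments"
        self._is_empty = not (args or kwargs)

    def is_empty(self):
        return self._is_empty

class Bird(Animal):
    pass

print(Bird().is_empty())         # True
print(Bird("robin").is_empty())  # False
```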
Another possible metaclass:
class DefaultType(type):
    def __new__(cls, name, bases, attrs):
        new_cls = super(DefaultType, cls).__new__(cls, name, bases, attrs)
        new_cls.default = new_cls()
        return new_cls
You only need to set the metaclass attribute for the Animal class, as all derived classes will inherit it:
class Animal(object):
    __metaclass__ = DefaultType
    # ...

class Bird(Animal):
    # ...
This allows you to use both:
a == Bird.default
and:
a == a.default
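A Python 3 sketch of the same idea (the metaclass= keyword replaces __metaclass__; the dict-based __eq__ is a stand-in for the comparison logic the question says Animal already implements):

```python
class DefaultType(type):
    def __new__(cls, name, bases, attrs):
        new_cls = super().__new__(cls, name, bases, attrs)
        # Every class built with this metaclass gets its own default instance
        new_cls.default = new_cls()
        return new_cls

class Animal(metaclass=DefaultType):
    def __eq__(self, other):
        # Placeholder comparison: equal if same family and same attributes
        return isinstance(other, Animal) and self.__dict__ == other.__dict__

class Bird(Animal):
    pass

a = Bird()
print(a == Bird.default)  # True
print(a == a.default)     # True
```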
I'm having a load of confusion between the __metaclass__ attribute of a class and actual inheritance, and how __new__ is called in each scenario. My issue comes from digging through some model code in the Django framework.
Let's say I wanted to append an attribute to a class as it's defined in the child's Meta subclass:
class Parent(type):
    def __new__(cls, name, base, attrs):
        meta = attrs.pop('Meta', None)
        new_class = super(Parent, cls).__new__(cls, name, base, attrs)
        new_class.fun = getattr(meta, 'funtime', None)
        return new_class
I don't understand why the actual __new__ method is called in Django's code, but when I try to code something like that, it doesn't work.
From what I've experienced, the following does not actually call the __new__ method of Parent:
class Child(Parent):
    class Meta:
        funtime = 'yaaay'

C = Child()
When I try to do this it complains with the TypeError:
TypeError: __new__() takes exactly 4 arguments (1 given)
However the source code I have been looking at appears to work in that way.
I understand that it could be done with a metaclass:
class Child(object):
    __metaclass__ = Parent
But I don't understand why their way works for them and not for me, since the non-__metaclass__ version would be cleaner for making a distributable module.
Could somebody please point me in the right direction on what I'm missing?
Thanks!
In Django, Model is not a metaclass; the metaclass is actually ModelBase. That's why their way works and yours doesn't.
Moreover, recent Django uses a helper function, six.with_metaclass, to wrap ModelBase.
If we want to follow Django's style, the Parent and Child classes will look like:
def with_metaclass(meta, base=object):
    """Create a base class with a metaclass."""
    return meta("NewBase", (base,), {})

class ParentBase(type):
    def __new__(cls, name, base, attrs):
        meta = attrs.pop('Meta', None)
        new_class = super(ParentBase, cls).__new__(cls, name, base, attrs)
        new_class.fun = getattr(meta, 'funtime', None)
        return new_class

class Parent(with_metaclass(ParentBase)):
    pass

class Child(Parent):
    class Meta:
        funtime = 'yaaay'

c = Child()

>>> c.fun
'yaaay'
Let us focus on Parent. It is almost equivalent to
NewBase = ParentBase("NewBase", (object,), {})

class Parent(NewBase):
    pass
The key is how to understand ParentBase("NewBase", (object,), {}).
Let us recall type().
type(name, bases, dict)
With three arguments, return a new type object. This is essentially a dynamic form of the class statement. The name string is the class name and becomes the __name__ attribute; the bases tuple itemizes the base classes and becomes the __bases__ attribute; and the dict dictionary is the namespace containing definitions for the class body and becomes the __dict__ attribute. For example, the following two statements create identical type objects:
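The two identical statements the documentation refers to (the example was cut off above) are:

```python
# The class statement...
class X:
    a = 1

# ...and its dynamic equivalent via type():
X = type('X', (), dict(a=1))
```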
Since ParentBase is a metaclass, it is a subclass of type. Therefore, ParentBase("NewBase", (object,), {}) is very similar to type("NewBase", (object,), {}). In this case, the only difference is that the dynamically created class is not an instance of type but of ParentBase.
In other words, the metaclass of NewBase is ParentBase. Parent is equivalent to:
class NewBase(object):
    __metaclass__ = ParentBase

class Parent(NewBase):
    pass
Finally, we got a __metaclass__.
In a metaclass that extends type, __new__ is used to create a class.
In a class, __new__ is used to create an instance.
A metaclass is a class that creates classes. You're confusing class inheritance with metaclasses.
Your Child class inherits from Parent, and you want to create an instance of Child; however, Parent being a metaclass means Parent.__new__ isn't meant to create ordinary instances: it creates classes.
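A minimal sketch of the mix-up (hypothetical names): inheriting from a metaclass makes the child a metaclass too, so calling it with zero arguments fails:

```python
class Parent(type):
    def __new__(cls, name, base, attrs):
        return super().__new__(cls, name, base, attrs)

class Child(Parent):   # Child is itself a metaclass now
    pass

try:
    Child()            # a metaclass call needs (name, bases, attrs)
except TypeError as exc:
    print(exc)

# Used correctly, Child creates classes, not ordinary instances:
C = Child("C", (), {})
print(isinstance(C, type))  # True
```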
I want to automatically run a class method defined in a base class on any derived class during the creation of the class. For instance:
class Base(object):
    @classmethod
    def runme(cls):
        print "I am being run"

    def __metaclass__(cls, parents, attributes):
        clsObj = type(cls, parents, attributes)
        clsObj.runme()
        return clsObj

class Derived(Base):
    pass
What happens here is that when Base is created, runme() fires. But nothing happens when Derived is created.
The question is: how can I make runme() also fire when Derived is created?
This is what I've thought so far: if I explicitly set Derived's metaclass to Base's, it works. But I don't want to do that; I basically want Derived to use Base's metaclass without having to set it explicitly.
See this answer. Basically, when calling type(cls,parents,attributes), you are creating a class without passing in the information about what that class's metaclass is. Thus, the class that is returned doesn't have the metaclass you want it to; instead it has metaclass type. (Ironically, by defining __metaclass__ to do as it does, you are explicitly causing your class to not have that metaclass.) Instead of directly calling type, you need to call type.__new__(meta, cls, parents, attrs), where meta is the metaclass.
However, you can't achieve this when you define __metaclass__ inline. From inside your __metaclass__ method, you have no way to refer to that method, because it's a method of a class that hasn't been defined yet. You want to do something like
def __metaclass__(cls, bases, attrs):
    return type.__new__(metaclassGoesHere, cls, bases, attrs)

...but there's nothing you can put in for metaclassGoesHere to make it do the right thing, because what you're trying to refer to is the method inside which you're trying to refer to it.
So just define your metaclass externally.
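A sketch of that advice in Python 3 syntax (RunmeMeta and the created list are illustrative names, not from the question):

```python
created = []

class RunmeMeta(type):
    def __new__(meta, name, bases, attrs):
        cls = super().__new__(meta, name, bases, attrs)
        cls.runme()          # fires for Base and for every subclass
        return cls

class Base(metaclass=RunmeMeta):
    @classmethod
    def runme(cls):
        created.append(cls.__name__)

class Derived(Base):         # inherits the metaclass automatically
    pass

print(created)  # ['Base', 'Derived']
```

Because Derived inherits Base's metaclass, runme() runs at class-creation time for both, with no extra declarations.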
Rename your runme() to __init__(self) and DO NOT override the __init__() method in Derived, and it will be called every time you make an instance of Derived:
class Base(object):
    def __init__(self):
        print "I am being run"

class Derived(Base):
    pass

dummy_instance = Derived()
Copy and paste that; it will print I am being run.
Try this:
class MyMetaclass(type):
    def __new__(cls, name, bases, dct):
        new_cls = super(MyMetaclass, cls).__new__(cls, name, bases, dct)
        new_cls.runme()
        return new_cls
If you want to do something when the class is created, you have to do it in the __init__ method of the __metaclass__.
class foo(object):
    @classmethod
    def method(cls):
        print 'I am being run by', cls.__name__

    class __metaclass__(type):
        def __init__(self, *args, **kwargs):
            type.__init__(self, *args, **kwargs)
            self.method()

class bar(foo): pass
Which prints:
I am being run by foo
I am being run by bar