Assume you're writing an abstract class and one or more of its non-abstract class methods require the concrete class to have a specific class attribute; e.g., if instances of each concrete class can be constructed by matching against a different regular expression, you might want to give your ABC the following:
@classmethod
def parse(cls, s):
    m = re.fullmatch(cls.PATTERN, s)
    if not m:
        raise ValueError(s)
    return cls(**m.groupdict())
(Maybe this could be better implemented with a custom metaclass, but try to ignore that for the sake of the example.)
Now, because overriding of abstract methods & properties is checked at instance creation time, not subclass creation time, trying to use abc.abstractmethod to ensure concrete classes have PATTERN attributes won't work. But surely there should be something there to tell anyone looking at your code "I didn't forget to define PATTERN on the ABC; the concrete classes are supposed to define their own." The question is: which something is the most Pythonic?
Pile of decorators
@property
@abc.abstractmethod
def PATTERN(self):
    pass
(Assume Python 3.4 or higher, by the way.) This can be very misleading to readers, as it implies that PATTERN should be an instance property instead of a class attribute.
Tower of decorators
@property
@classmethod
@abc.abstractmethod
def PATTERN(cls):
    pass
This can be very confusing to readers, as @property and @classmethod normally can't be combined; they only work together here (for a given value of "work") because the method is ignored once it's overridden.
Dummy value
PATTERN = ''
If a concrete class fails to define its own PATTERN, parse will only accept empty input. This option isn't widely applicable, as not all use cases will have an appropriate dummy value.
Error-inducing dummy value
PATTERN = None
If a concrete class fails to define its own PATTERN, parse will raise an error, and the programmer gets what they deserve.
Do nothing
Basically a more hardcore variant of #4. There can be a note in the ABC's docstring somewhere, but the ABC itself shouldn't have anything in the way of a PATTERN attribute.
Other???
You can use the __init_subclass__ method which was introduced in Python 3.6 to make customizing class creation easier without resorting to metaclasses. When defining a new class, it is called as the last step before the class object is created.
In my opinion, the most pythonic way to use this would be to make a class decorator that accepts the attributes to make abstract, thus making it explicit to the user what they need to define.
from custom_decorators import abstract_class_attributes

@abstract_class_attributes('PATTERN')
class PatternDefiningBase:
    pass

class LegalPatternChild(PatternDefiningBase):
    PATTERN = r'foo\s+bar'

class IllegalPatternChild(PatternDefiningBase):
    pass
The traceback might be as follows, and occurs at subclass creation time, not instantiation time.
NotImplementedError                       Traceback (most recent call last)
...
     18     PATTERN = r'foo\s+bar'
     19
---> 20 class IllegalPatternChild(PatternDefiningBase):
     21     pass
...
<ipython-input-11-44089d753ec1> in __init_subclass__(cls, **kwargs)
      9         if cls.PATTERN is NotImplemented:
     10             # Choose your favorite exception.
---> 11             raise NotImplementedError('You forgot to define PATTERN!!!')
     12
     13     @classmethod

NotImplementedError: You forgot to define PATTERN!!!
Before showing how the decorator is implemented, it is instructive to show how you could implement this without the decorator. The nice thing here is that if needed you could make your base class an abstract base class without having to do any work (just inherit from abc.ABC or make the metaclass abc.ABCMeta).
class PatternDefiningBase:
    # Dear programmer: implement this in a subclass OR YOU'LL BE SORRY!
    PATTERN = NotImplemented

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # If the new class did not redefine PATTERN, fail *hard*.
        if cls.PATTERN is NotImplemented:
            # Choose your favorite exception.
            raise NotImplementedError('You forgot to define PATTERN!!!')

    @classmethod
    def sample(cls):
        print(cls.PATTERN)

class LegalPatternChild(PatternDefiningBase):
    PATTERN = r'foo\s+bar'
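To see the timing, here is the base class again as a self-contained snippet (sample() omitted): the failure happens when the offending class statement runs, not when you instantiate.

```python
class PatternDefiningBase:
    # Dear programmer: implement this in a subclass OR YOU'LL BE SORRY!
    PATTERN = NotImplemented

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Fail *hard* if the new class did not redefine PATTERN.
        if cls.PATTERN is NotImplemented:
            raise NotImplementedError('You forgot to define PATTERN!!!')

class LegalPatternChild(PatternDefiningBase):
    PATTERN = r'foo\s+bar'      # fine: defined at class creation

try:
    class IllegalPatternChild(PatternDefiningBase):
        pass
except NotImplementedError as e:
    print(e)  # You forgot to define PATTERN!!!
```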
Here is how the decorator could be implemented.
# custom_decorators.py

def abstract_class_attributes(*names):
    """Class decorator to add one or more abstract attributes."""

    def _func(cls, *names):
        """Function that extends the __init_subclass__ method of a class."""

        # Add each attribute to the class with the value of NotImplemented.
        for name in names:
            setattr(cls, name, NotImplemented)

        # Save the original __init_subclass__ implementation, then wrap
        # it with our new implementation.
        orig_init_subclass = cls.__init_subclass__

        def new_init_subclass(cls, **kwargs):
            """
            New definition of __init_subclass__ that checks that
            attributes are implemented.
            """
            # The default implementation of __init_subclass__ takes no
            # positional arguments, but a custom implementation does.
            # If the user has not reimplemented __init_subclass__ then
            # the first signature will fail and we try the second.
            try:
                orig_init_subclass(cls, **kwargs)
            except TypeError:
                orig_init_subclass(**kwargs)

            # Check that each attribute is defined.
            for name in names:
                if getattr(cls, name, NotImplemented) is NotImplemented:
                    raise NotImplementedError(f'You forgot to define {name}!!!')

        # Bind this new function to the __init_subclass__.
        # For reasons beyond the scope here, we must manually
        # declare it as a classmethod because it is not done automatically
        # as it would be if declared in the standard way.
        cls.__init_subclass__ = classmethod(new_init_subclass)

        return cls

    return lambda cls: _func(cls, *names)
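As a quick sanity check, here is the decorator condensed inline (same logic as the custom_decorators.py module above; Python 3.6+ for the f-strings):

```python
def abstract_class_attributes(*names):
    """Condensed version of the decorator above, inlined for a quick test."""
    def _func(cls):
        for name in names:
            setattr(cls, name, NotImplemented)
        orig_init_subclass = cls.__init_subclass__

        def new_init_subclass(cls, **kwargs):
            # Try the custom-implementation signature first, then the default.
            try:
                orig_init_subclass(cls, **kwargs)
            except TypeError:
                orig_init_subclass(**kwargs)
            for name in names:
                if getattr(cls, name, NotImplemented) is NotImplemented:
                    raise NotImplementedError(f'You forgot to define {name}!!!')

        cls.__init_subclass__ = classmethod(new_init_subclass)
        return cls
    return _func

@abstract_class_attributes('PATTERN')
class PatternDefiningBase:
    pass

class LegalPatternChild(PatternDefiningBase):
    PATTERN = r'foo\s+bar'      # passes the check

try:
    class IllegalPatternChild(PatternDefiningBase):
        pass
except NotImplementedError as e:
    print(e)  # You forgot to define PATTERN!!!
```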
I've been searching for something like this for quite a while, until yesterday I decided to dive into it. I like @SethMMorton's reply a lot; however, two things are missing: allowing an abstract class to have a subclass that is itself abstract, and playing nicely with type hints and static typing tools such as mypy (which makes sense, since back in 2017 these were hardly a thing).
I set out to write a reply here with my own solution, but I realised I needed lots of tests and documentation, so I made it a proper Python module: abstractcp.
Use (as of version 0.9.5):
import re
import abstractcp as acp

class Parser(acp.Abstract):
    PATTERN: str = acp.abstract_class_property(str)

    @classmethod
    def parse(cls, s):
        m = re.fullmatch(cls.PATTERN, s)
        if not m:
            raise ValueError(s)
        return cls(**m.groupdict())

class FooBarParser(Parser):
    PATTERN = r"foo\s+bar"

    def __init__(...): ...

class SpamParser(Parser):
    PATTERN = r"(spam)+eggs"

    def __init__(...): ...
See the page on PyPI or GitHub for full usage.
Alternative Answer
Using dedicated class to annotate class variables
import abc
from typing import Generic, Set, TypeVar, get_type_hints

T = TypeVar('T')

class AbstractClassVar(Generic[T]):
    pass

class Abstract(abc.ABC):
    def __init_subclass__(cls) -> None:
        def get_abstract_members(cls) -> Set[str]:
            """Gets a class's abstract members"""
            abstract_members = set()
            if cls is Abstract:
                return abstract_members
            for base_cls in cls.__bases__:
                abstract_members.update(get_abstract_members(base_cls))
            for (member_name, annotation) in get_type_hints(cls).items():
                if getattr(annotation, '__origin__', None) is AbstractClassVar:
                    abstract_members.add(member_name)
            return abstract_members

        # Implementation checking for abstract class members
        if Abstract not in cls.__bases__:
            for cls_member in get_abstract_members(cls):
                if not hasattr(cls, cls_member):
                    raise NotImplementedError(
                        f"Wrong class implementation {cls.__name__} "
                        f"with abstract class variable {cls_member}")
        return super().__init_subclass__()
Usage
class Foo(Abstract):
    foo_member: AbstractClassVar[str]

class UpperFoo(Foo):
    # Everything should be implemented as intended or else...
    ...
Not implementing the abstract class member foo_member will result in a NotImplementedError.
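Condensing the definitions above into one self-contained sketch: forgetting the member fails as soon as the subclass statement executes.

```python
import abc
from typing import Generic, Set, TypeVar, get_type_hints

T = TypeVar('T')

class AbstractClassVar(Generic[T]):
    pass

class Abstract(abc.ABC):
    def __init_subclass__(cls) -> None:
        def get_abstract_members(c) -> Set[str]:
            """Collects annotated AbstractClassVar members, including inherited ones."""
            members = set()
            if c is Abstract:
                return members
            for base in c.__bases__:
                members |= get_abstract_members(base)
            for name, ann in get_type_hints(c).items():
                if getattr(ann, '__origin__', None) is AbstractClassVar:
                    members.add(name)
            return members

        # Only indirect subclasses must actually provide the values.
        if Abstract not in cls.__bases__:
            for member in get_abstract_members(cls):
                if not hasattr(cls, member):
                    raise NotImplementedError(
                        f"Wrong class implementation {cls.__name__} "
                        f"with abstract class variable {member}")
        return super().__init_subclass__()

class Foo(Abstract):
    foo_member: AbstractClassVar[str]

try:
    class BadFoo(Foo):   # forgets to set foo_member
        pass
except NotImplementedError as e:
    print(e)
```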
Answer was taken from my original answer to this question: enforcement for abstract properties in python3
Related
I have two classes that are supposed to implement the same test cases for two independent libraries (let's call them LibA and LibB). So far I define the test methods to be implemented in an abstract base class which ensures that both test classes implement all desired tests:
from abc import ABC, abstractmethod

class MyTests(ABC):
    @abstractmethod
    def test_foo(self):
        pass

class TestsA(MyTests):
    def test_foo(self):
        pass

class TestsB(MyTests):
    def test_foo(self):
        pass
This works as expected, but what may still happen is that someone working on LibB accidentally adds a test_bar() method to TestsB instead of the base class. The missing test_bar() in the TestsA class would go unnoticed in that case.
Is there a way to prohibit the addition of new methods to an (abstract) base class? The objective is to force the addition of new methods to happen in the base class and thus force the implementation of new methods in all derived classes.
Yes. It can be done through a metaclass, or from Python 3.6 onwards, with a check in __init_subclass__ of the baseclass.
__init_subclass__ is a special method called by the language each time a subclass is created. So it can check whether the new class has any method that is not present in any of the superclasses and raise a TypeError when the subclass is declared. (__init_subclass__ is converted to a classmethod automatically.)
class Base(ABC):
    ...

    def __init_subclass__(cls, *args, **kw):
        super().__init_subclass__(*args, **kw)
        # By inspecting `cls.__dict__` we pick all methods declared
        # directly on the class.
        for name in cls.__dict__:
            attr = getattr(cls, name)
            if not callable(attr):
                continue
            for superclass in cls.__mro__[1:]:
                if name in dir(superclass):
                    break
            else:
                # Method not found in superclasses:
                raise TypeError(
                    f"Method {name} defined in {cls.__name__} "
                    f"does not exist in superclasses")
Note that unlike the TypeError raised by non-implemented abstractmethods, this error is raised at class declaration time, not class instantiation time. If the latter is desired, you have to use a metaclass and move the check to its __call__ method; however, that complicates things, since if a method is added in an intermediate class that is never instantiated, no error is raised even though the method is present in the leaf subclass. I guess what you need is more along the lines of the code above.
It seems that checking isinstance(..., io.IOBase) is the 'correct' way to determine if an object is 'file-like'.
However, when defining my own file-like class, it doesn't seem to work:
import io

class file_like():
    def __init__(self):
        pass

    def write(self, line):
        print("Written:", line)

    def close(self):
        pass

    def flush(self):
        pass

print(isinstance(file_like(), io.IOBase))
# Prints 'False'
How can I make it work?
isinstance(obj, some_class) just iterates up obj's inheritance chain, looking for some_class. Thus isinstance(file_like(), io.IOBase) will be False, as your file_like class doesn't have io.IOBase in its ancestry. file_like doesn't designate an explicit parent, hence it implicitly inherits only from object. That's the only class - besides file_like itself - that will test positive for a file_like instance with isinstance().
What you are doing in file_like is defining the methods expected on a file-like object while not inheriting from any particular "file-like" class. This approach is called duck-typing, and it has many merits in dynamic languages, although it's more popular in others (e.g. Ruby) than Python. Still, if whatever you're providing your file_like instance to follows duck-typing, it should work, provided your file_like does in fact "quack like a file", i.e. behaves sufficiently like a file to not cause errors upon usage at the receiving end.
Of course, if the receiving end is not following duck-typing, for example tries to check types by isinstance() as you do here, this approach will fail.
Finally, a small stylistic nit: don't put empty parens on a class if it doesn't inherit anything explicitly. They are redundant.
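One further option not spelled out above: io.IOBase is itself an abstract base class (its metaclass is abc.ABCMeta), so you can register your class as a virtual subclass, which makes the isinstance() check pass without any inheritance. A sketch:

```python
import io

class file_like:
    def write(self, line):
        print("Written:", line)
    def close(self):
        pass
    def flush(self):
        pass

# io.IOBase is an ABC, so it supports register(). Registration only
# affects isinstance()/issubclass() checks; it does not add any IOBase
# behavior (readable(), seekable(), context manager, ...) to file_like.
io.IOBase.register(file_like)

print(isinstance(file_like(), io.IOBase))  # True
```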
Checking isinstance(something, io.IOBase) only checks if something is an instance of an io.IOBase or a class derived from it — so I don't understand where you got the mistaken idea that it's the "correct" way to determine if an object is "file-like".
A different way to do it is with an Abstract Base Class. Python has a number of built-in ones, but it currently doesn't have one for "file-like" that could be used with isinstance(). However, you can define your own by using the abc module as outlined in PEP 3119.
The Python Module of the Week website has a good explanation of using the abc module to do things like this. And this highly rated answer to the question Correct way to detect sequence parameter? shows a similar way of defining your own ABC.
To illustrate applying it to your case, you could define an ABC like this with all its methods abstract — thereby forcing derived classes to define all of them in order to be instantiated:
from abc import ABCMeta, abstractmethod

class ABCFileLike(metaclass=ABCMeta):
    @abstractmethod
    def __init__(self): pass

    @abstractmethod
    def write(self, line): pass

    @abstractmethod
    def close(self): pass

    @abstractmethod
    def flush(self): pass
You could then derive your own concrete classes from it, making sure to supply implementations of all the abstract methods. (If you don't define them all, then a TypeError will be raised on any attempt to instantiate the class.)
class FileLike(ABCFileLike):
    """ Concrete implementation of a file-like class.
        (Meaning all the abstract methods have an implementation.)
    """
    def __init__(self):
        pass

    def write(self, line):
        print("Written:", line)

    def close(self):
        pass

    def flush(self):
        pass
print(isinstance(FileLike(), ABCFileLike)) # -> True
You can even add existing classes to it by registering them with the new ABC:
import io
print(isinstance(io.IOBase(), ABCFileLike)) # -> False
ABCFileLike.register(io.IOBase)
print(isinstance(io.IOBase(), ABCFileLike)) # -> True
This article has a snippet showing usage of __bases__ to dynamically change the inheritance hierarchy of some Python code, by adding a class to an existing class's collection of base classes. Ok, that's hard to read; code is probably clearer:
class Friendly:
    def hello(self):
        print 'Hello'

class Person: pass

p = Person()
Person.__bases__ = (Friendly,)
p.hello()  # prints "Hello"
That is, Person doesn't inherit from Friendly at the source level; rather, this inheritance relation is added dynamically at runtime by modification of the __bases__ attribute of the Person class. However, if you change Friendly and Person to be new-style classes (by inheriting from object), you get the following error:
TypeError: __bases__ assignment: 'Friendly' deallocator differs from 'object'
A bit of Googling on this seems to indicate some incompatibilities between new-style and old style classes in regards to changing the inheritance hierarchy at runtime. Specifically: "New-style class objects don't support assignment to their bases attribute".
My question, is it possible to make the above Friendly/Person example work using new-style classes in Python 2.7+, possibly by use of the __mro__ attribute?
Disclaimer: I fully realise that this is obscure code. I fully realize that in real production code tricks like this tend to border on unreadable, this is purely a thought experiment, and for funzies to learn something about how Python deals with issues related to multiple inheritance.
Ok, again, this is not something you should normally do, this is for informational purposes only.
Where Python looks for a method on an instance object is determined by the __mro__ attribute of the class which defines that object (the Method Resolution Order attribute). Thus, if we could modify the __mro__ of Person, we'd get the desired behaviour. Something like:
setattr(Person, '__mro__', (Person, Friendly, object))
The problem is that __mro__ is a readonly attribute, and thus setattr won't work. Maybe if you're a Python guru there's a way around that, but clearly I fall short of guru status as I cannot think of one.
A possible workaround is to simply redefine the class:
def modify_Person_to_be_friendly():
    # so that we're modifying the global identifier 'Person'
    global Person

    # now just redefine the class using type(), specifying that the new
    # class should inherit from Friendly and have all attributes from
    # our old Person class
    Person = type('Person', (Friendly,), dict(Person.__dict__))

def main():
    modify_Person_to_be_friendly()
    p = Person()
    p.hello()  # works!
What this doesn't do is modify any previously created Person instances to have the hello() method. For example (just modifying main()):
def main():
    oldperson = Person()
    modify_Person_to_be_friendly()
    p = Person()
    p.hello()
    # works! But:
    oldperson.hello()
    # does not
If the details of the type call aren't clear, then read e-satis' excellent answer on 'What is a metaclass in Python?'.
I've been struggling with this too, and was intrigued by your solution, but Python 3 takes it away from us:
AttributeError: attribute '__dict__' of 'type' objects is not writable
I actually have a legitimate need for a decorator that replaces the (single) superclass of the decorated class. It would require too lengthy a description to include here (I tried, but couldn't get it to a reasonable length and limited complexity -- it came up in the context of the use by many Python applications of a Python-based enterprise server where different applications needed slightly different variations of some of the code.)
The discussion on this page and others like it provided hints that the problem of assigning to __bases__ only occurs for classes with no superclass defined (i.e., whose only superclass is object). I was able to solve this problem (for both Python 2.7 and 3.2) by defining the classes whose superclass I needed to replace as being subclasses of a trivial class:
## T is used so that the other classes are not direct subclasses of object,
## since classes whose base is object don't allow assignment to their __bases__ attribute.
class T: pass

class A(T):
    def __init__(self):
        print('Creating instance of {}'.format(self.__class__.__name__))

## ordinary inheritance
class B(A): pass

## dynamically specified inheritance
class C(T): pass

A()  # -> Creating instance of A
B()  # -> Creating instance of B
C.__bases__ = (A,)
C()  # -> Creating instance of C

## attempt at dynamically specified inheritance starting with a direct subclass
## of object doesn't work
class D: pass
D.__bases__ = (A,)
D()
## Result is:
## TypeError: __bases__ assignment: 'A' deallocator differs from 'object'
I cannot vouch for the consequences, but this code does what you want on py2.7.2:
class Friendly(object):
    def hello(self):
        print 'Hello'

class Person(object): pass

# we can't change the original classes, so we replace them
class newFriendly: pass
newFriendly.__dict__ = dict(Friendly.__dict__)
Friendly = newFriendly

class newPerson: pass
newPerson.__dict__ = dict(Person.__dict__)
Person = newPerson

p = Person()
Person.__bases__ = (Friendly,)
p.hello()  # prints "Hello"
We know that this is possible. Cool. But we'll never use it!
Right off the bat, all the caveats of messing with the class hierarchy dynamically are in effect.
But if it has to be done, then, apparently, there is a hack that gets around the "deallocator differs from 'object'" issue when modifying the __bases__ attribute for new-style classes.
You can define a trivial buffer class:
class Object(object): pass
This puts a heap-allocated class between your classes and the built-in object. That's it; now your new-style classes can inherit from Object and modify __bases__ without any problem.
In my tests this actually worked very well: all existing instances (created before the change of inheritance) of the class and its derived classes felt the effect of the change, including their MRO getting updated.
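A runnable sketch of the trick just described. The names Object, Friendly, and Person are illustrative; note that the Python 2 answers above use print statements, while this sketch is written for Python 3, where the same __bases__ restriction and workaround apply.

```python
# Buffer class so that our classes are not direct subclasses of object.
class Object(object): pass

class Friendly(Object):
    def hello(self):
        return "Hello"

class Person(Object): pass

p = Person()                    # instance created BEFORE the change

# Legal, because Person's current base is Object, not object itself.
Person.__bases__ = (Friendly,)

# The pre-existing instance picks up the new method as well.
print(p.hello())
```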
I needed a solution for this which:
Works with both Python 2 (>= 2.7) and Python 3 (>= 3.2).
Lets the class bases be changed after dynamically importing a dependency.
Lets the class bases be changed from unit test code.
Works with types that have a custom metaclass.
Still allows unittest.mock.patch to function as expected.
Here's what I came up with:
def ensure_class_bases_begin_with(namespace, class_name, base_class):
    """ Ensure the named class's bases start with the base class.

        :param namespace: The namespace containing the class name.
        :param class_name: The name of the class to alter.
        :param base_class: The type to be the first base class for the
            newly created type.
        :return: ``None``.

        Call this function after ensuring `base_class` is
        available, before using the class named by `class_name`.

        """
    existing_class = namespace[class_name]
    assert isinstance(existing_class, type)

    bases = list(existing_class.__bases__)
    if base_class is bases[0]:
        # Already bound to a type with the right bases.
        return
    bases.insert(0, base_class)

    new_class_namespace = existing_class.__dict__.copy()
    # Type creation will assign the correct ‘__dict__’ attribute.
    del new_class_namespace['__dict__']

    metaclass = existing_class.__metaclass__
    new_class = metaclass(class_name, tuple(bases), new_class_namespace)

    namespace[class_name] = new_class
Used like this within the application:
# foo.py

# Type `Bar` is not available at first, so can't inherit from it yet.
class Foo(object):
    __metaclass__ = type

    def __init__(self):
        self.frob = "spam"

    def __unicode__(self): return "Foo"

# … later …

import bar

ensure_class_bases_begin_with(
    namespace=globals(),
    class_name=str('Foo'),  # `str` type differs on Python 2 vs. 3.
    base_class=bar.Bar)
Use like this from within unit test code:
# test_foo.py

""" Unit test for `foo` module. """

import unittest
import mock

import foo
import bar

ensure_class_bases_begin_with(
    namespace=foo.__dict__,
    class_name=str('Foo'),  # `str` type differs on Python 2 vs. 3.
    base_class=bar.Bar)

class Foo_TestCase(unittest.TestCase):
    """ Test cases for `Foo` class. """

    def setUp(self):
        patcher_unicode = mock.patch.object(
            foo.Foo, '__unicode__')
        patcher_unicode.start()
        self.addCleanup(patcher_unicode.stop)

        self.test_instance = foo.Foo()

        patcher_frob = mock.patch.object(
            self.test_instance, 'frob')
        patcher_frob.start()
        self.addCleanup(patcher_frob.stop)

    def test_instantiate(self):
        """ Should create an instance of `Foo`. """
        instance = foo.Foo()
The above answers are good if you need to change an existing class at runtime. However, if you are just looking to create a new class that inherits from some other class, there is a much cleaner solution. I got this idea from https://stackoverflow.com/a/21060094/3533440, but I think the example below better illustrates a legitimate use case.
def make_default(Map, default_default=None):
    """Returns a class which behaves identically to the given
    Map class, except it gives a default value for unknown keys."""
    class DefaultMap(Map):
        def __init__(self, default=default_default, **kwargs):
            self._default = default
            super().__init__(**kwargs)

        def __missing__(self, key):
            return self._default

    return DefaultMap

DefaultDict = make_default(dict, default_default='wug')

d = DefaultDict(a=1, b=2)
assert d['a'] == 1
assert d['b'] == 2
assert d['c'] == 'wug'
Correct me if I'm wrong, but this strategy seems very readable to me, and I would use it in production code. This is very similar to functors in OCaml.
This method isn't technically inheriting at runtime, since __mro__ can't be changed. But what I'm doing here is using __getattr__ to be able to access any attributes or methods from a certain class. (Read the comments in the order of the numbers placed before them; it makes more sense.)
class Sub:
    def __init__(self, f, cls):
        self.f = f
        self.cls = cls

    # 6) this method will pass the self parameter
    # (which is the original class object we passed)
    # and then it will fill in the rest of the arguments
    # using *args and **kwargs
    def __call__(self, *args, **kwargs):
        # 7) the multiple try / except statements
        # are for making sure if an attribute was
        # accessed instead of a function, the __call__
        # method will just return the attribute
        try:
            return self.f(self.cls, *args, **kwargs)
        except TypeError:
            try:
                return self.f(*args, **kwargs)
            except TypeError:
                return self.f

# 1) our base class
class S:
    def __init__(self, func):
        self.cls = func

    def __getattr__(self, item):
        # 5) we are wrapping the attribute we get in the Sub class
        # so we can implement the __call__ method there
        # to be able to pass the parameters in the correct order
        return Sub(getattr(self.cls, item), self.cls)

# 2) class we want to inherit from
class L:
    def run(self, s):
        print("run" + s)

# 3) we create an instance of our base class
# and then pass an instance (or just the class object)
# as a parameter to this instance
s = S(L)  # 4) in this case, I'm using the class object

s.run("1")
So this sort of substitution and redirection will simulate the inheritance of the class we wanted to inherit from. And it even works with attributes or methods that don't take any parameters.
Take the following minimal example:
import abc
class FooClass(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractmethod
    def FooMethod(self):
        raise NotImplementedError()

def main():
    derived_type = type('Derived', (FooClass,), {})

    def BarOverride(self):
        print 'Hello, world!'

    derived_type.FooMethod = BarOverride
    instance = derived_type()
Running main() gets you:
TypeError: Can't instantiate abstract class Derived with abstract methods FooMethod
(The exception occurs on the instance = derived_type() line.)
But FooMethod shouldn't be abstract: I've overridden it with BarOverride. So, why is this raising exceptions?
Disclaimer: Yes, I could use the explicit class syntax, and accomplish the exact same thing. (And even better, I can make it work!) But this is a minimal test case, and the larger example is dynamically creating classes. :-) And I'm curious as to why this doesn't work.
Edit: And to prevent the other obvious non-answer: I don't want to pass BarOverride as the third argument to type: in the real example, BarOverride needs to have derived_type bound to it. It is easier to do this if I can define BarOverride after the creation of derived_type. (If I can't do this, then why not?)
Because the docs say so:
Dynamically adding abstract methods to a class, or attempting to
modify the abstraction status of a method or class once it is created,
are not supported. The abstractmethod() only affects subclasses
derived using regular inheritance; “virtual subclasses” registered
with the ABC’s register() method are not affected.
A metaclass is only called when a class is defined. When abstractmethod has marked a class as abstract that status won't change later.
Jochen is right; the abstract methods are set at class creation and won't be modified just because you reassign an attribute.
You can manually remove it from the list of abstract methods by doing
DerivedType.__abstractmethods__ = frozenset()
or
DerivedType.__abstractmethods__ = frozenset(
    elem for elem in DerivedType.__abstractmethods__ if elem != 'FooMethod')
in addition to your setattr call, so that it no longer thinks FooMethod is abstract.
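A Python 3 rendering of that fix (the question's code is Python 2; here FooClass derives from abc.ABC instead of setting __metaclass__):

```python
import abc

class FooClass(abc.ABC):
    @abc.abstractmethod
    def FooMethod(self):
        raise NotImplementedError()

# type() delegates to FooClass's metaclass (ABCMeta), so the new class
# records FooMethod in its __abstractmethods__.
derived_type = type('Derived', (FooClass,), {})

def BarOverride(self):
    return 'Hello, world!'

derived_type.FooMethod = BarOverride
# Without the next line, derived_type() still raises TypeError:
# the cached abstract-method set is consulted at instantiation time.
derived_type.__abstractmethods__ = frozenset()

instance = derived_type()
print(instance.FooMethod())  # Hello, world!
```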
I know this topic is really old but... That is really a nice question.
It doesn't work because abc can only check for abstract methods during instantiation of types, that is, while type('Derived', (FooClass,), {}) is running. Any setattr done after that is not accessible from abc.
So, setattr won't work, buuut...
Your problem of addressing the name of a class that wasn't previously declared or defined looks solvable:
I wrote a little metaclass that lets you use a placeholder "clazz" for accessing any class that will eventually get the method you are writing outside a class definition.
That way you won't get a TypeError from abc anymore, since you can now define your method BEFORE instantiating your type, and then pass it to type in the dict argument. Then abc will see it as a proper method override.
Aaand, with the new metaclass you can refer to the class object during that method.
And this is super, because now you can use super! =P
I can guess you were worried about that too...
Take a look:
import abc
import inspect

clazz = type('clazz', (object,), {})()

def clazzRef(func_obj):
    func_obj.__hasclazzref__ = True
    return func_obj

class MetaClazzRef(type):
    """Makes the clazz placeholder work.

    Checks which of your functions or methods use the decorator clazzRef
    and swaps its global reference so that "clazz" resolves to the
    desired class, that is, the one where the method is set or defined.
    """
    methods = {}

    def __new__(mcs, name, bases, dict):
        ret = super(MetaClazzRef, mcs).__new__(mcs, name, bases, dict)
        for (k, f) in dict.items():
            if getattr(f, '__hasclazzref__', False):
                if inspect.ismethod(f):
                    f = f.im_func
                if inspect.isfunction(f):
                    for (var, value) in f.func_globals.items():
                        if value is clazz:
                            f.func_globals[var] = ret
        return ret

class MetaMix(abc.ABCMeta, MetaClazzRef):
    pass

class FooClass(object):
    __metaclass__ = MetaMix

    @abc.abstractmethod
    def FooMethod(self):
        print 'Ooops...'
        # raise NotImplementedError()

def main():
    @clazzRef
    def BarOverride(self):
        print "Hello, world! I'm a %s but this method is from class %s!" % (type(self), clazz)
        super(clazz, self).FooMethod()  # Now I have SUPER!!!

    derived_type = type('Derived', (FooClass,), {'FooMethod': BarOverride})
    instance = derived_type()
    instance.FooMethod()

    class derivedDerived(derived_type):
        def FooMethod(self):
            print 'I inherit from derived.'
            super(derivedDerived, self).FooMethod()

    instance = derivedDerived()
    instance.FooMethod()
main()
The output is:
Hello, world! I'm a <class 'clazz.Derived'> but this method is from class <class 'clazz.Derived'>!
Ooops...
I inherit from derived.
Hello, world! I'm a <class 'clazz.derivedDerived'> but this method is from class <class 'clazz.Derived'>!
Ooops...
Well, if you must do it this way, then you could just pass a dummy dict {'FooMethod':None} as the third argument to type. This allows derived_type to satisfy ABCMeta's requirement that all abstract methods be overridden. Later on you can supply the real FooMethod:
def main():
    derived_type = type('Derived', (FooClass,), {'FooMethod': None})

    def BarOverride(self):
        print 'Hello, world!'

    setattr(derived_type, 'FooMethod', BarOverride)
    instance = derived_type()