Enforce/Define python classes with only the specified attributes [duplicate] - python

I have two classes that are supposed to implement the same test cases for two independent libraries (let's call them LibA and LibB). So far I define the test methods to be implemented in an abstract base class which ensures that both test classes implement all desired tests:
from abc import ABC, abstractmethod

class MyTests(ABC):
    @abstractmethod
    def test_foo(self):
        pass

class TestsA(MyTests):
    def test_foo(self):
        pass

class TestsB(MyTests):
    def test_foo(self):
        pass
This works as expected, but what may still happen is that someone working on LibB accidentally adds a test_bar() method to TestsB instead of to the base class. The missing test_bar() in TestsA would then go unnoticed.
Is there a way to prohibit the addition of new test methods outside of the (abstract) base class? The objective is to force new methods to be declared in the base class first, and thus force their implementation in all derived classes.

Yes. It can be done through a metaclass or, from Python 3.6 onwards, with a check in __init_subclass__ of the base class.
__init_subclass__ is a special method called by the language each time a subclass of the class is created. It can therefore check whether the new class has any method that is not present in any of its superclasses and raise a TypeError at the point where the subclass is declared. (__init_subclass__ is converted to a classmethod automatically.)
class Base(ABC):
    ...
    def __init_subclass__(cls, *args, **kw):
        super().__init_subclass__(*args, **kw)

        # By inspecting `cls.__dict__` we pick up only the methods
        # declared directly on the new subclass:
        for name in cls.__dict__:
            attr = getattr(cls, name)
            if not callable(attr):
                continue
            for superclass in cls.__mro__[1:]:
                if name in dir(superclass):
                    break
            else:
                # method not found in any superclass:
                raise TypeError(
                    f"Method {name} defined in {cls.__name__} "
                    "does not exist in superclasses"
                )
Note that unlike the TypeError raised for non-implemented abstractmethods, this error is raised at class declaration time, not class instantiation time. If the latter is desired, you have to use a metaclass and move the check to its __call__ method - but that complicates things: an extra method declared on an intermediate class that is never instantiated itself would go undetected, with the error only surfacing (if at all) when a leaf subclass is instantiated. I guess what you need is more along the lines of the code above.
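A quick usage sketch, assuming the __init_subclass__ check above is added to the question's MyTests base class itself (so that all test methods are declared there):
class TestsA(MyTests):
    def test_foo(self):        # fine: test_foo is declared in MyTests
        pass

class TestsB(MyTests):
    def test_foo(self):
        pass

    def test_bar(self):        # TypeError is raised while this very
        pass                   # "class TestsB" statement is executed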

Related

Does the final decorator help in preventing method overriding?

I am trying to create a final method in my class - one that cannot be overridden by any subclass, just like a final class created with the final decorator cannot be inherited.
from final_class import final

class Dummy:
    def show(self):
        print("show is running from dummy")

    @final
    def display(self):
        print("display from dummy")

class Demo(Dummy):
    def show(self):
        print("show from demo")

    def display(self):
        print("display from demo")

d = Demo()
d.display()
I think we should get an error when accessing the display method from Demo, but when I run the program it gives "display from demo".
So what am I missing? I have checked final annotation and decorators in Python 3.8, but that covers type checking with the typing package, while I was trying it with the final_class package.
As seen in the comments, the third-party final_class.final and typing.final are two different things: the former is a class decorator that prevents, at runtime, a class from being inherited further; the latter ships with Python and is intended to decorate both classes and methods, but has no enforcing behavior during program execution - instead, it makes a compliant static analysis tool raise an error in the type-checking stage.
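For contrast, a minimal sketch of the typing.final variant (available since Python 3.8): a static checker such as mypy will flag the override, but nothing is enforced when the program runs:
from typing import final

class Dummy:
    @final
    def display(self):
        print("display from dummy")

class Demo(Dummy):
    def display(self):             # mypy reports an error here ("cannot override final attribute")
        print("display from demo")

Demo().display()                   # runs fine and prints "display from demo"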
Due to Python's flexibility and dynamism, it is possible to create a final decorator for methods that is enforced at runtime: whenever a subclass is created that overrides a method marked as final anywhere in the inheritance chain, a RuntimeError (or another custom error) can be raised.
The idea is that whenever a new class is created, both methods on the metaclass and the __init_subclass__ method of the bases are called - so, if one is willing to pair such a @final decorator with a custom metaclass or custom base class, the implementation is more or less straightforward.
What is less straightforward is a decorator that works regardless of any specific base class or custom metaclass - and this can also be done: by injecting into the class being constructed an __init_subclass__ method that checks for violations of the final clause.
The complicated part is to co-exist with any pre-existing __init_subclass__ methods, which also need to be called, either on the same class or on any superclass, as well as to emulate the working of super(), since we are creating a method outside the class body. The decorator code can inspect the context from which it is called and inject an __init_subclass__ there, taking some care:
import sys

def final(method):
    f = sys._getframe().f_back   # frame of the class body being executed
    _init_subclass_meth = "super"

    def __init_subclass__(cls, *args, **kw):
        # All of these are read-only, so "nonlocal" is optional:
        # nonlocal final_methods, _init_subclass_meth, _original_init_subclass, __init_subclass__

        # In a normal __init_subclass__, one can know the class in which
        # a method is declared, and call super(), via the `__class__`
        # magic variable. But that won't work for a method defined
        # outside the class body and injected into it.
        # The line below retrieves the equivalent of `__class__`:
        current_class = next(
            supercls for supercls in cls.__mro__
            if getattr(supercls.__dict__.get("__init_subclass__", None), "__func__", None) is __init_subclass__
        )
        for meth_name in cls.__dict__:
            if meth_name in final_methods:
                raise RuntimeError(
                    f"Final method {meth_name} is redeclared in subclass "
                    f"{cls.__name__} from {current_class.__name__}"
                )
        if _init_subclass_meth == "wrap":
            return _original_init_subclass(cls, *args, **kw)
        return super(current_class, cls).__init_subclass__(*args, **kw)

    __init_subclass__._final_mark = True

    if "__init_subclass__" in f.f_locals and not getattr(f.f_locals["__init_subclass__"], "_final_mark", False):
        _init_subclass_meth = "wrap"
        _original_init_subclass = f.f_locals["__init_subclass__"]

    # Locals assignment: this works here because the caller context
    # is a class body, inside which `f_locals` usually refers to a
    # plain dict (unless a custom metaclass changed it).
    # It would normally have no effect in an ordinary frame
    # representing a plain function or method in execution:
    f.f_locals["__init_subclass__"] = __init_subclass__
    final_methods = f.f_locals.setdefault("_final_methods", set())
    final_methods.add(method.__name__)
    return method
class A:
    @final
    def b(self):
        print("final b")
And this will raise an error:
class B(A):
    def b(self):
        # RuntimeError expected
        ...

Prohibit addition of new methods to a Python child class


What are the differences between a `classmethod` and a metaclass method?

In Python, I can create a class method using the @classmethod decorator:
>>> class C:
...     @classmethod
...     def f(cls):
...         print(f'f called with cls={cls}')
...
>>> C.f()
f called with cls=<class '__main__.C'>
Alternatively, I can use a normal (instance) method on a metaclass:
>>> class M(type):
...     def f(cls):
...         print(f'f called with cls={cls}')
...
>>> class C(metaclass=M):
...     pass
...
>>> C.f()
f called with cls=<class '__main__.C'>
As shown by the output of C.f(), these two approaches provide similar functionality.
What are the differences between using @classmethod and using a normal method on a metaclass?
As classes are instances of a metaclass, it is not unexpected that an "instance method" on the metaclass will behave like a classmethod.
However, yes, there are differences - and some of them are more than semantic:
The most important difference is that a method in the metaclass is not "visible" from a class instance. That happens because attribute lookup in Python (simplifying a bit - descriptors may take precedence) searches for an attribute on the instance; if it is not present there, Python then looks in that instance's class, and the search continues up the superclasses of the class, but not into the class of the class. The Python stdlib makes use of this feature in the abc.ABCMeta.register method.
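A quick illustration of that lookup rule, with made-up names:
class Meta(type):
    def describe(cls):
        return f"class {cls.__name__}"

class C(metaclass=Meta):
    pass

C.describe()      # works: found on type(C), i.e. the metaclass
C().describe()    # AttributeError: instance lookup never reaches the metaclass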
That feature can be used for good: names that are about the class itself can be reused for instance attributes without any conflict (a classmethod of the same name, by contrast, would conflict).
Another, more obvious, difference is that a method declared in the metaclass can be available in several otherwise unrelated classes. If you have different class hierarchies, unrelated in what they deal with, but want some common functionality for all of them (say, registering every class in an application registry), with classmethods you would have to come up with a mixin class that is included as a base in both hierarchies. (NB: the mixin may sometimes be a better call than a metaclass anyway.)
A classmethod is a specialized "classmethod" object, while a method in the metaclass is an ordinary function.
So, it happens that the mechanism classmethods use is the descriptor protocol. While normal functions feature a __get__ method that inserts the self argument when they are retrieved from an instance and leaves that argument empty when they are retrieved from the class, a classmethod object has a different __get__, which inserts the class itself (the "owner") as the first parameter in both situations.
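A small sketch of that difference in __get__ behavior, poking at the raw objects stored in the class __dict__ (names are made up):
class C:
    def imethod(self):
        return self

    @classmethod
    def cmethod(cls):
        return cls

raw_func = C.__dict__["imethod"]     # a plain function
raw_cm = C.__dict__["cmethod"]       # a classmethod object

raw_func.__get__(None, C)            # retrieved from the class: the plain function, unchanged
raw_func.__get__(C(), C)             # retrieved from an instance: bound method, self inserted
raw_cm.__get__(None, C)              # bound method with cls=C already inserted
raw_cm.__get__(C(), C)               # still bound to the class C, not the instance
raw_cm.__func__                      # the underlying plain function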
This makes no practical difference most of the time, but if you want access to the method as a plain function - for example to dynamically add a decorator to it - then for a method in the metaclass, Meta.method retrieves the function, ready to be used, while you have to use cls.my_classmethod.__func__ to retrieve it from a classmethod (and then create another classmethod object and assign it back, if you do some wrapping).
Basically, these are the 2 examples:
# Example 1: a "classmethod" defined as a plain method on the metaclass
class M1(type):
    def clsmethod1(cls):
        pass

class CLS1(metaclass=M1):
    pass

def runtime_wrap(cls, method_name, wrapper):
    mcls = type(cls)
    setattr(mcls, method_name, wrapper(getattr(mcls, method_name)))

def wrapper(method):
    def new_method(cls):
        print("wrapper called")
        return method(cls)
    return new_method

runtime_wrap(CLS1, "clsmethod1", wrapper)

# Example 2: an ordinary classmethod - wrapping requires unwrapping the
# classmethod object via __func__ and re-wrapping the result
class CLS2:
    @classmethod
    def classmethod2(cls):
        pass

def runtime_wrap2(cls, method_name, wrapper):
    setattr(cls, method_name, classmethod(
        wrapper(getattr(cls, method_name).__func__)
    ))

runtime_wrap2(CLS2, "classmethod2", wrapper)
In other words: apart from the important difference that a method defined in the metaclass is visible from the instance while a classmethod object is not, the other differences will seem obscure and meaningless at runtime - but that happens because the language does not need to go out of its way with special rules for classmethods. Both ways of declaring a classmethod are possible as a consequence of the language design - one because a class is itself an object, and the other as one possibility among many offered by the descriptor protocol, which allows one to specialize attribute access on an instance and on a class:
The classmethod builtin is defined in native code, but it could just as well be coded in pure Python and would work in exactly the same way. The 5-line class below can be used as a classmethod decorator with no runtime differences from the built-in @classmethod at all (though it is distinguishable through introspection, such as calls to isinstance, and even repr, of course):
class myclassmethod:
    def __init__(self, func):
        self.__func__ = func

    def __get__(self, instance, owner):
        return lambda *args, **kw: self.__func__(owner, *args, **kw)
And, beyond methods, it is interesting to keep in mind that specialized attributes such as a @property on the metaclass will work as specialized class attributes, just the same, with no surprising behavior at all.
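For instance, a short sketch (hypothetical names) of a @property on the metaclass acting as a computed, class-level attribute:
class Meta(type):
    @property
    def qualified_name(cls):
        return f"{cls.__module__}.{cls.__name__}"

class C(metaclass=Meta):
    pass

C.qualified_name        # computed on access, e.g. '__main__.C'
# C().qualified_name    # AttributeError: like metaclass methods, not visible from instances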
When you phrase it like you did in the question, @classmethod and metaclasses may look similar, but they have rather different purposes. The class that is injected in the @classmethod's argument is usually used for constructing an instance (i.e. an alternative constructor). On the other hand, metaclasses are usually used to modify the class itself (e.g. like what Django does with its models DSL).
That is not to say that you can't modify the class inside a classmethod. But then the question becomes why didn't you define the class in the way you want to modify it in the first place? If not, it might suggest a refactor to use multiple classes.
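As an illustration of the alternative-constructor idiom (class and method names are made up):
class Point:
    def __init__(self, x, y):
        self.x, self.y = x, y

    @classmethod
    def from_string(cls, text):
        # cls is injected by @classmethod, so subclasses of Point
        # constructed this way get instances of the right type
        x, y = (float(part) for part in text.split(","))
        return cls(x, y)

p = Point.from_string("1.5,2.0")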
Let's expand the first example a bit.
class C:
    @classmethod
    def f(cls):
        print(f'f called with cls={cls}')
Borrowing from the Python docs, the above will expand to something like the following:
class ClassMethod(object):
    "Emulate PyClassMethod_Type() in Objects/funcobject.c"

    def __init__(self, f):
        self.f = f

    def __get__(self, obj, klass=None):
        if klass is None:
            klass = type(obj)
        def newfunc(*args):
            return self.f(klass, *args)
        return newfunc

class C:
    def f(cls):
        print(f'f called with cls={cls}')
    f = ClassMethod(f)
Note how __get__ can take either an instance or the class (or both), and thus you can do both C.f and C().f. This is unlike the metaclass example you give which will throw an AttributeError for C().f.
Moreover, in the metaclass example, f does not exist in C.__dict__. When looking up the attribute f with C.f, the interpreter looks at C.__dict__ and then after failing to find, looks at type(C).__dict__ (which is M.__dict__). This may matter if you want the flexibility to override f in C, although I doubt this will ever be of practical use.
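That lookup path can be checked directly on the metaclass example from the question:
'f' in C.__dict__          # False - C itself does not define f
'f' in type(C).__dict__    # True  - it lives in M.__dict__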
In your example, the difference would be in some other classes that will have M set as their metaclass.
class M(type):
    def f(cls):
        pass

class C(metaclass=M):
    pass

class C2(metaclass=M):
    pass

C.f()
C2.f()

class M(type):
    pass

class C(metaclass=M):
    @classmethod
    def f(cls):
        pass

class C2(metaclass=M):
    pass

C.f()
# C2 does not have 'f'
Here is more on metaclasses
What are some (concrete) use-cases for metaclasses?
@classmethod and a metaclass are different things.
Everything in Python is an object - and that really means everything.
What is a metaclass?
As said, everything is an object. Classes are objects too; in fact, classes are instances of other, somewhat mysterious objects formally called metaclasses. The default metaclass in Python is type, if none is specified.
By default, all classes defined are instances of type.
Classes are instances of metaclasses.
A few important points to understand the behaviour mentioned above:
Classes are instances of metaclasses.
Just like ordinary objects (instances) get their attributes from their class, a class gets its attributes from its metaclass.
Consider the following code:
class Meta(type):
    def foo(self):
        print(f'foo is called self={self}')
        print('{} is instance of {}: {}'.format(self, Meta, isinstance(self, Meta)))

class C(metaclass=Meta):
    pass

C.foo()
Where:
class C is an instance of class Meta.
"class C" is a class object which is an instance of "class Meta".
Like any other object (instance), "class C" has access to the attributes/methods defined in its class, "class Meta".
So, decoding "C.foo()": "C" is an instance of "Meta", and "foo" is a method being called through that instance of "Meta", which is "C".
The first argument of the method "foo" is a reference to the instance, not to a class, unlike with "classmethod".
We can verify that "class C" is an instance of "class Meta":
isinstance(C, Meta)
What is a classmethod?
Python methods are said to be bound, as Python imposes the restriction that a method has to be invoked with an instance.
Sometimes we might want to invoke a method directly through the class, without creating any instance (much like static members in Java). By default an instance is required to call a method; as a workaround, Python provides the built-in function classmethod to bind a given method to the class instead of an instance.
As class methods are bound to the class, they take at least one argument, which is a reference to the class itself instead of an instance (self).
If the built-in function/decorator classmethod is used, the first argument will be a reference to the class instead of an instance.
class ClassMethodDemo:
    @classmethod
    def foo(cls):
        print(f'cls is ClassMethodDemo: {cls is ClassMethodDemo}')
Since we have used "classmethod", we can call the method "foo" without creating any instance, as follows:
ClassMethodDemo.foo()
The above call prints True, since the first argument cls is indeed a reference to "ClassMethodDemo".
Summary:
Classmethods receive as their first argument "a reference to the class itself (traditionally referred to as cls)".
Methods of metaclasses are not classmethods. Methods of metaclasses receive as their first argument "a reference to the instance (traditionally referred to as self), not the class" - it just happens that this instance is itself a class.

Most Pythonic way to declare an abstract class property

Assume you're writing an abstract class and one or more of its non-abstract class methods require the concrete class to have a specific class attribute; e.g., if instances of each concrete class can be constructed by matching against a different regular expression, you might want to give your ABC the following:
@classmethod
def parse(cls, s):
    m = re.fullmatch(cls.PATTERN, s)
    if not m:
        raise ValueError(s)
    return cls(**m.groupdict())
(Maybe this could be better implemented with a custom metaclass, but try to ignore that for the sake of the example.)
Now, because overriding of abstract methods & properties is checked at instance creation time, not subclass creation time, trying to use abc.abstractmethod to ensure concrete classes have PATTERN attributes won't work — but surely there should be something there to tell anyone looking at your code "I didn't forget to define PATTERN on the ABC; the concrete classes are supposed to define their own." The question is: Which something is the most Pythonic?
1. Pile of decorators
@property
@abc.abstractmethod
def PATTERN(self):
    pass
(Assume Python 3.4 or higher, by the way.) This can be very misleading to readers, as it implies that PATTERN should be an instance property instead of a class attribute.
2. Tower of decorators
@property
@classmethod
@abc.abstractmethod
def PATTERN(cls):
    pass
This can be very confusing to readers, as @property and @classmethod normally can't be combined; they only work together here (for a given value of "work") because the method is ignored once it's overridden.
3. Dummy value
PATTERN = ''
If a concrete class fails to define its own PATTERN, parse will only accept empty input. This option isn't widely applicable, as not all use cases will have an appropriate dummy value.
4. Error-inducing dummy value
PATTERN = None
If a concrete class fails to define its own PATTERN, parse will raise an error, and the programmer gets what they deserve.
5. Do nothing. Basically a more hardcore variant of #4. There can be a note in the ABC's docstring somewhere, but the ABC itself shouldn't have anything in the way of a PATTERN attribute.
6. Other???
You can use the __init_subclass__ method, which was introduced in Python 3.6 to make customizing class creation easier without resorting to metaclasses. When a new subclass is defined, it is called on the base class as part of creating the subclass's class object.
In my opinion, the most pythonic way to use this would be to make a class decorator that accepts the attributes to make abstract, thus making it explicit to the user what they need to define.
from custom_decorators import abstract_class_attributes

@abstract_class_attributes('PATTERN')
class PatternDefiningBase:
    pass

class LegalPatternChild(PatternDefiningBase):
    PATTERN = r'foo\s+bar'

class IllegalPatternChild(PatternDefiningBase):
    pass
The traceback might be as follows, and occurs at subclass creation time, not instantiation time.
NotImplementedError                       Traceback (most recent call last)
...
     18     PATTERN = r'foo\s+bar'
     19
---> 20 class IllegalPatternChild(PatternDefiningBase):
     21     pass
...
<ipython-input-11-44089d753ec1> in __init_subclass__(cls, **kwargs)
      9         if cls.PATTERN is NotImplemented:
     10             # Choose your favorite exception.
---> 11             raise NotImplementedError('You forgot to define PATTERN!!!')
     12
     13     @classmethod

NotImplementedError: You forgot to define PATTERN!!!
Before showing how the decorator is implemented, it is instructive to show how you could implement this without the decorator. The nice thing here is that if needed you could make your base class an abstract base class without having to do any work (just inherit from abc.ABC or make the metaclass abc.ABCMeta).
class PatternDefiningBase:
    # Dear programmer: implement this in a subclass OR YOU'LL BE SORRY!
    PATTERN = NotImplemented

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # If the new class did not redefine PATTERN, fail *hard*.
        if cls.PATTERN is NotImplemented:
            # Choose your favorite exception.
            raise NotImplementedError('You forgot to define PATTERN!!!')

    @classmethod
    def sample(cls):
        print(cls.PATTERN)

class LegalPatternChild(PatternDefiningBase):
    PATTERN = r'foo\s+bar'
Here is how the decorator could be implemented.
# custom_decorators.py

def abstract_class_attributes(*names):
    """Class decorator to add one or more abstract attributes."""

    def _func(cls, *names):
        """Function that extends the __init_subclass__ method of a class."""

        # Add each attribute to the class with the value of NotImplemented.
        for name in names:
            setattr(cls, name, NotImplemented)

        # Save the original __init_subclass__ implementation, then wrap
        # it with our new implementation.
        orig_init_subclass = cls.__init_subclass__

        def new_init_subclass(cls, **kwargs):
            """
            New definition of __init_subclass__ that checks that
            attributes are implemented.
            """
            # The default implementation of __init_subclass__ takes no
            # positional arguments, but a custom implementation does.
            # If the user has not reimplemented __init_subclass__ then
            # the first signature will fail and we try the second.
            try:
                orig_init_subclass(cls, **kwargs)
            except TypeError:
                orig_init_subclass(**kwargs)

            # Check that each attribute is defined.
            for name in names:
                if getattr(cls, name, NotImplemented) is NotImplemented:
                    raise NotImplementedError(f'You forgot to define {name}!!!')

        # Bind this new function to __init_subclass__.
        # For reasons beyond the scope here, we must manually declare it
        # as a classmethod, because that is not done automatically as it
        # would be if it were declared in the standard way.
        cls.__init_subclass__ = classmethod(new_init_subclass)

        return cls

    return lambda cls: _func(cls, *names)
I've been searching for something like this for quite a while, until yesterday I decided to dive into it. I like @SethMMorton's reply a lot; however, two things are missing: allowing an abstract class to have a subclass that is itself abstract, and playing nice with type hints and static typing tools such as mypy (which makes sense, since back in 2017 these were hardly a thing).
I set out to write a reply here with my own solution; however, I realised I needed lots of tests and documentation, so I made it a proper Python module: abstractcp.
Use (as of version 0.9.5):
import re
import abstractcp as acp

class Parser(acp.Abstract):
    PATTERN: str = acp.abstract_class_property(str)

    @classmethod
    def parse(cls, s):
        m = re.fullmatch(cls.PATTERN, s)
        if not m:
            raise ValueError(s)
        return cls(**m.groupdict())

class FooBarParser(Parser):
    PATTERN = r"foo\s+bar"
    def __init__(...): ...

class SpamParser(Parser):
    PATTERN = r"(spam)+eggs"
    def __init__(...): ...
See for full use the page on pypi or github.
Alternative Answer
Using a dedicated class to annotate class variables:
import abc
from typing import Generic, Set, TypeVar, get_type_hints

T = TypeVar('T')

class AbstractClassVar(Generic[T]):
    pass

class Abstract(abc.ABC):
    def __init_subclass__(cls) -> None:
        def get_abstract_members(cls) -> Set[str]:
            """Gets a class's abstract members"""
            abstract_members = set()
            if cls is Abstract:
                return abstract_members
            for base_cls in cls.__bases__:
                abstract_members.update(get_abstract_members(base_cls))
            for (member_name, annotation) in get_type_hints(cls).items():
                if getattr(annotation, '__origin__', None) is AbstractClassVar:
                    abstract_members.add(member_name)
            return abstract_members

        # Implementation checking for abstract class members
        if Abstract not in cls.__bases__:
            for cls_member in get_abstract_members(cls):
                if not hasattr(cls, cls_member):
                    raise NotImplementedError(f"Wrong class implementation {cls.__name__} " +
                                              f"with abstract class variable {cls_member}")
        return super().__init_subclass__()
Usage
class Foo(Abstract):
    foo_member: AbstractClassVar[str]

class UpperFoo(Foo):
    # Everything should be implemented as intended or else...
    ...
Not Implementing the abstract class member foo_member will result in a NotImplementedError.
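For contrast, a hypothetical subclass that does assign the class variable passes the check:
class GoodFoo(Foo):
    foo_member = "implemented"    # satisfies the AbstractClassVar check

GoodFoo.foo_member                # 'implemented' - the class statement ran without error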
Answer was taken from my original answer to this question: enforcement for abstract properties in python3

Can one declare an abstract exception in Python?

I would like to declare a hierarchy of user-defined exceptions in Python. However, I would like my top-level user-defined class (TransactionException) to be abstract. That is, I intend TransactionException to specify methods that its subclasses are required to define. However, TransactionException should never be instantiated or raised.
I have the following code:
from abc import ABCMeta, abstractmethod

class TransactionException(Exception):
    __metaclass__ = ABCMeta

    @abstractmethod
    def displayErrorMessage(self):
        pass
However, the above code allows me to instantiate TransactionException...
a = TransactionException()
In this case a is meaningless, and should instead draw an exception. The following code removes the fact that TransactionException is a subclass of Exception...
from abc import ABCMeta, abstractmethod

class TransactionException():
    __metaclass__ = ABCMeta

    @abstractmethod
    def displayErrorMessage(self):
        pass
This code properly prohibits instantiation but now I cannot raise a subclass of TransactionException because it's not an Exception any longer.
Can one define an abstract exception in Python? If so, how? If not, why not?
NOTE: I'm using Python 2.7, but will happily accept an answer for Python 2.x or Python 3.x.
There's a great answer on this topic by Alex Martelli here. In essence, it comes down to how the constructors of the various base classes (object, list and, I presume, Exception) behave when abstract methods are present.
When an abstract class inherits from object (which is the default, if no other base class is given), instances are created through object's construction machinery, which performs the heavy lifting of checking that all abstract methods have been implemented.
If the abstract class inherits from a different base class, it gets that class's construction behavior instead. Other classes, such as list and Exception, it seems, do not check for abstract method implementations, which is why instantiating them is allowed.
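A small Python 3 sketch of that contrast, as observed on CPython (where the abstract-method check is performed by object's construction machinery, which BaseException bypasses with its own __new__):
from abc import ABC, abstractmethod

class PlainAbstract(ABC):
    @abstractmethod
    def displayErrorMessage(self): ...

class AbstractException(Exception, ABC):
    @abstractmethod
    def displayErrorMessage(self): ...

try:
    PlainAbstract()
except TypeError as exc:
    print(exc)            # Can't instantiate abstract class PlainAbstract ...

AbstractException()       # no error: the abstract-method check is silently skipped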
The other answer provides a suggested workaround for this. Of course, another option that you have is simply to accept that the abstract class will be instantiable, and try to discourage it.
class TransactionException(Exception):
    def __init__(self, *args, **kwargs):
        raise NotImplementedError('you should not be raising this')

class EverythingLostException(TransactionException):
    def __init__(self, msg):
        super(TransactionException, self).__init__(msg)

try:
    raise EverythingLostException('we are doomed!')
except TransactionException:
    print 'check'

try:
    raise TransactionException('we are doomed!')
except TransactionException:
    print 'oops'
My implementation for an abstract exception class, in which the children of the class work out of the box.
class TransactionException(Exception):
    def __init__(self):
        self._check_abstract_initialization(self)

    @staticmethod
    def _check_abstract_initialization(self):
        if type(self) == TransactionException:
            raise NotImplementedError(
                "TransactionException should not be instantiated directly")

class AnotherException(TransactionException):
    pass

TransactionException()   # NotImplementedError: TransactionException should not be instantiated directly
AnotherException()       # passes
Here's a helper function that can be used in such scenario:
def validate_abstract_methods(obj):
    abstract_methods = []
    for name in dir(obj):
        value = getattr(obj, name, None)
        if value is not None and getattr(value, '__isabstractmethod__', False):
            abstract_methods.append(name)
    if abstract_methods:
        abstract_methods.sort()
        raise TypeError(f"Can't instantiate abstract class {obj.__class__.__name__} "
                        f"with abstract methods {', '.join(abstract_methods)}")
This function does roughly the same thing as the abc.ABC machinery - you just need to call it from your class's __init__ method.
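A sketch of how it could be wired in, reusing the question's exception names (the wiring itself is my assumption, not part of the answer above):
from abc import abstractmethod

class TransactionException(Exception):
    def __init__(self, *args):
        validate_abstract_methods(self)   # manual stand-in for the abc.ABC check
        super().__init__(*args)

    @abstractmethod
    def displayErrorMessage(self): ...

class EverythingLostException(TransactionException):
    def displayErrorMessage(self):
        return "everything is lost"

EverythingLostException("we are doomed!")   # fine
TransactionException("we are doomed!")      # TypeError: Can't instantiate abstract class ...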
