There is a family of classes for which I wish to optionally extend and override some functionality. To each of these classes I wish to add the same functionality, as they are all compatible. I am using Python 3.8+. The way I achieved this was by creating the class as a type with the additional functionality and passing the parent class as the bases. As a basic example:
class A:
    def __init__(self, a, **kwargs):
        self.a = a
        self.params = kwargs

class B:
    def __init__(self, b, **kwargs):
        self.b = b
        self.params = kwargs

def extend_class_with_print_params(base_class):
    def print_params(self):
        for k, v in self.params.items():
            print(k, v)

    return type(
        f"Extended{base_class.__name__}",
        (base_class,),
        dict(print_params=print_params),
    )
In the above, I define A and B. The function extend_class_with_print_params adds functionality compatible with both. My actual use case is adding pre-train and post-predict hooks to some instances of specific sklearn predictors, which is why I need the parent to be configurable.
import joblib
from test_classes import *
normal_a = A(a=10)
joblib.dump(normal_a, "normal_a")
normal_a = joblib.load("normal_a")
extended_a = extend_class_with_print_params(A)(a=15, alpha=0.1, beta=0.2)
joblib.dump(extended_a, "extended_a")
extended_a = joblib.load("extended_a")
When dumping extended_a, the following error is thrown:
_pickle.PicklingError: Can't pickle <class 'test_classes.ExtendedA'>: it's not found as test_classes.ExtendedA
As suggested in one of the below posts, I attempted setting new_class_name in globals to point to the new class before returning in the function. This allowed me to successfully dump, but not load the file in a different session, which makes sense since the globals would be reset. In general, I would also prefer not to modify globals anyway.
I have tried but failed to work out a solution using __reduce__ based on the following:
Pickling dynamically generated classes?
How can I pickle a dynamically created nested class in python?
Pickle a dynamically parameterized sub-class
I couldn't see how to apply the methods from those posts to my situation; the content may well be relevant and directly applicable, but I failed to find a way.
I'm also entirely open to changing my pattern (even if it means not dynamically defining the class). In short, I have the following requirements:
Extend and override some arbitrary parent class's functionality, but not in all cases, since it will be optional whether to extend/override the class
The objects must be pickle-able
The objects must be pickled using joblib or the pickle library, not something like cloudpickle
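A sketch of the __reduce__ direction I was attempting, applied to the example above (illustrative only - I have not verified it across sessions): the extended class pickles itself as a call to a module-level reconstructor plus the instance state.
def _reconstruct_extended(base_class):
    # Module-level helper so pickle can find it by import: re-create the
    # dynamic class, then make a bare instance for pickle to restore state into.
    cls = extend_class_with_print_params(base_class)
    return cls.__new__(cls)

def extend_class_with_print_params(base_class):
    def print_params(self):
        for k, v in self.params.items():
            print(k, v)

    def __reduce__(self):
        # Pickle as: _reconstruct_extended(base_class), then apply __dict__.
        return (_reconstruct_extended, (base_class,), self.__dict__)

    return type(
        f"Extended{base_class.__name__}",
        (base_class,),
        {"print_params": print_params, "__reduce__": __reduce__},
    )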
It's probably best to avoid dynamically generating a class. Ideally, you can account for added functionality from the beginning. If you have control over classes A and B, you could do a pattern like this:
from typing import Callable, Optional

class A:
    hook: Optional[Callable] = None  # default so the attribute always exists

    def __init__(self, a, **kwargs):
        self.a = a
        self.params = kwargs

    def print_param_hook(self):
        if self.hook:
            self.hook(self.params.items())
        else:
            raise RuntimeError("No hook function supplied!")

    def set_hook(self, hook: Callable):
        self.hook = hook

def hook1(items):
    for k, v in items:
        print(k, v)

a = A("foo", y="bar")
a.set_hook(hook1)
a.print_param_hook()
Here, A is defined with a pre-existing method that will call a generic function provided by the user. This, of course, constrains what sort of arguments your hook function can take.
Another option is to make a subclass of A and add your method to the subclass. Continuing the above example:
class SubA(A):
    def print_params(self):
        for k, v in self.params.items():
            print(k, v)

subA = SubA("foo", y="bar")
subA.print_params()
Finally, if you must add an arbitrary method to a class, you can do this using setattr:
def attr_hook(self):
    for k, v in self.params.items():
        print(k, v)

setattr(A, 'attr_hook', attr_hook)
new_a = A("foo", y="bar")
new_a.attr_hook()
Note that this will affect every instance of A created, including those created before setattr, which isn't super desirable. You can read more about using setattr in this way in this blog post, including how to make a decorator to make it more seamless.
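The decorator idea might look roughly like this (a sketch of the concept, not the blog post's actual code; register_method and another_hook are names I made up):
def register_method(cls):
    # Class-patching decorator: attach the function to cls under its own name.
    def decorator(func):
        setattr(cls, func.__name__, func)
        return func
    return decorator

@register_method(A)
def another_hook(self):
    # hypothetical extra method, added without touching A's source
    print(len(self.params))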
All of the options are completely pickleable:
import pickle

with open("test.pyc", "wb") as file:
    pickle.dump(new_a, file)

with open("test.pyc", "rb") as file:
    b = pickle.load(file)

b.attr_hook()
I am working on a ctypes drop-in-replacement / extension and ran into an issue I do not fully understand.
I am trying to build a class factory for call-back function decorators similar to CFUNCTYPE and WINFUNCTYPE. Both factories produce classes derived from ctypes._CFuncPtr. Like every ctypes function interface, they have properties like argtypes and restype. I want to extend the classes allowing an additional property named some_param and I thought, why not, let's try this with "getter" and "setter" methods - how hard can it be ...
Because I am trying to use "getter" and "setter" methods (@property) on a property of a class (NOT a property of objects), I ended up writing a metaclass. Because my class is derived from ctypes._CFuncPtr, I think my metaclass must be derived from ctypes._CFuncPtr.__class__ (I could be wrong here).
The example below works, sort of:
import ctypes

class a_class:
    def b_function(self, some_param_parg):

        class c_class_meta(ctypes._CFuncPtr.__class__):
            def __init__(cls, *args):
                super().__init__(*args)  # no idea if this is good ...
                cls._some_param_ = some_param_parg

            @property
            def some_param(cls):
                return cls._some_param_

            @some_param.setter
            def some_param(cls, value):
                if not isinstance(value, list):
                    raise TypeError('some_param must be a list')
                cls._some_param_ = value

        class c_class(ctypes._CFuncPtr, metaclass=c_class_meta):
            _argtypes_ = ()
            _restype_ = None
            _flags_ = ctypes._FUNCFLAG_STDCALL  # change for CFUNCTYPE or WINFUNCTYPE etc ...

        return c_class

d_class = a_class().b_function([1, 2, 3])

print(d_class.some_param)
d_class.some_param = [2, 6]
print(d_class.some_param)
d_class.some_param = {}  # Raises an error - as expected
So far so good. Taking the above any further, however, does NOT work. The following pseudo-code (if used on an actual function from a DLL or shared object) will fail - in fact, it will cause the CPython interpreter to segfault ...
some_routine = ctypes.windll.LoadLibrary('some.dll').some_routine

func_type = d_class(ctypes.c_int16, ctypes.c_int16)  # similar to CFUNCTYPE/WINFUNCTYPE
func_type.some_param = [4, 5, 6]  # my "special" property

some_routine.argtypes = (ctypes.c_int16, func_type)

@func_type
def demo(x):
    return x - 1

some_routine(4, demo)  # segfaults HERE!
I am not entirely sure what goes wrong. ctypes._CFuncPtr is implemented in C, which could be a relevant limitation ... I could also have made a mistake in the implementation of the metaclass. Can someone enlighten me?
(For additional context, I am working on this function.)
Maybe ctypes' metaclass simply won't subclass nicely - since it is itself written in C, it may bypass the routes inheritance imposes for some shortcuts and end up in failures.
Ideally this "bad behavior" would have to be properly documented, filed as bugs against CPython's ctypes, and fixed - to my knowledge there are not many people who can fix ctypes bugs.
On the other hand, having a metaclass just because you want a property-like attribute at class level is overkill.
Python's property itself is just a pre-made, very useful builtin class that implements the descriptor protocol. Any class you create yourself that implements proper __get__ and __set__ methods can replace property (and often, when logic is shared across property-attributes, this leads to shorter, non-duplicated code).
On second thought, unfortunately, descriptor setters will only work for instances, not for classes (which makes sense, since doing cls.attr will already get you the special code-guarded value, and there is no way a __set__ method could be called on it).
So, if you could work with "manually" setting the values in the cls.__dict__ and putting your logic in the __get__ attribute, you could do:
import ctypes

PREFIX = "_cls_prop_"

class ClsProperty:
    def __set_name__(self, owner, name):
        self.name = name

    def __get__(self, instance, owner):
        value = owner.__dict__.get(PREFIX + self.name)
        # Logic to transform/check value goes here:
        if not isinstance(value, list):
            raise TypeError('some_param must be a list')
        return value

def b_function(some_param_arg):
    class c_class(ctypes._CFuncPtr):
        _argtypes_ = ()
        _restype_ = None
        _flags_ = 0  # ctypes._FUNCFLAG_STDCALL - change for CFUNCTYPE or WINFUNCTYPE etc ...
        _some_param_ = ClsProperty()

    setattr(c_class, PREFIX + "_some_param_", some_param_arg)
    return c_class

d_class = b_function([1, 2, 3])
print(d_class._some_param_)

# Set the backing attribute manually, as described above; assigning to
# d_class._some_param_ directly would clobber the descriptor on the class.
setattr(d_class, PREFIX + "_some_param_", [1, 2])
print(d_class._some_param_)
If that does not work, I don't think other approaches trying to extend the ctypes metaclass will work anyway - but if you want a try, instead of a "meta-property", you might customize the metaclass' __setattr__ to do your parameter checking, instead of using property.
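For illustration, that last idea might look something like this minimal sketch (my assumption; combining it with the ctypes metaclass is untested):
class CheckedMeta(type):
    def __setattr__(cls, name, value):
        # Validate class-level assignment of the special attribute.
        if name == "some_param" and not isinstance(value, list):
            raise TypeError('some_param must be a list')
        super().__setattr__(name, value)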
I would like to create a class which defines a particular interface, and then require all subclasses to conform to this interface. For example, I would like to define a class
class Interface:
    def __init__(self, arg1):
        pass

    def foo(self, bar):
        pass
and then be assured that if I am holding any element a which has type A, a subclass of Interface, then I can call a.foo(2) and it will work.
It looked like this question almost addressed the problem, but in that case it is up to the subclass to explicitly change its metaclass.
Ideally what I'm looking for is something similar to Traits and Impls from Rust, where I can specify a particular Trait and a list of methods that trait needs to define, and then I can be assured that any object with that Trait has those methods defined.
Is there any way to do this in Python?
So, first, just to state the obvious - Python has a built-in mechanism to test for the existence of methods and attributes in derived classes - it just does not check their signature.
Second, a nice package to look at is zope.interface. Despite the zope namespace, it is a complete stand-alone package that allows really neat ways of having objects that can expose multiple interfaces, but just when needed - and then frees up the namespaces. It does involve some learning until one gets used to it, but it can be quite powerful and provides very nice patterns for large projects.
It was devised for Python 2, when Python had a lot fewer features than nowadays - and I think it does not perform automatic interface checking (one has to manually call a method to find out whether a class is compliant) - but automating this call would be easy, nonetheless.
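For a flavor of the style, here is a minimal zope.interface sketch (from memory - check the package docs for details):
from zope.interface import Interface, implementer
from zope.interface.verify import verifyClass

class IGreeter(Interface):
    def greet(name):
        """Greet someone by name (interface declarations omit self)."""

@implementer(IGreeter)
class Greeter(object):
    def greet(self, name):
        print("Hello, %s" % name)

verifyClass(IGreeter, Greeter)  # raises if Greeter does not provide the interface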
Third, the linked accepted answer at How to enforce method signature for child classes? almost works, and could be good enough with just one change. The problem with that example is that it hardcodes a call to type to create the new class, and does not pass type.__new__ information about the metaclass itself. Replace the line:
return type(name, baseClasses, d)
with:
return super().__new__(cls, name, baseClasses, d)
And then make the base class - the one defining your required methods - use the metaclass; it will be inherited normally by any subclasses (just use the Python 3 syntax for specifying metaclasses).
Sorry - that example is Python 2, and it requires a change on another line as well; I'd better repost it:
import inspect
from types import FunctionType

class BadSignatureException(Exception):
    pass

# from https://stackoverflow.com/a/23257774/108205
class SignatureCheckerMeta(type):
    def __new__(mcls, name, baseClasses, d):
        # For each method in d, check to see if any base class already
        # defined a method with that name. If so, make sure the
        # signatures are the same.
        for methodName in d:
            f = d[methodName]
            if not isinstance(f, FunctionType):
                # skip non-function entries such as __module__, __qualname__
                continue
            for baseClass in baseClasses:
                try:
                    fBase = getattr(baseClass, methodName)
                    if not inspect.getfullargspec(f) == inspect.getfullargspec(fBase):
                        raise BadSignatureException(str(methodName))
                except AttributeError:
                    # This method was not defined in this base class,
                    # so just go to the next base class.
                    continue
        return super().__new__(mcls, name, baseClasses, d)
On reviewing that, I see that there is no mechanism in it to enforce that a method is actually implemented. I.e. if a method with the same name exists in the derived class, its signature is enforced, but if it does not exist at all in the derived class, the code above won't find out about it (and the method on the superclass will be called - that might be a desired behavior).
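If you also want to enforce that the methods exist, one possible (untested) extension of the metaclass above is an explicit required-names check - note that __required_methods__ here is a convention I just invented, not an existing mechanism:
class StrictSignatureCheckerMeta(SignatureCheckerMeta):
    def __new__(mcls, name, baseClasses, d):
        # Require every name listed by a base class to appear in the subclass.
        for baseClass in baseClasses:
            for required in getattr(baseClass, "__required_methods__", ()):
                if required not in d:
                    raise BadSignatureException(
                        "{} does not implement {!r}".format(name, required))
        return super().__new__(mcls, name, baseClasses, d)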
Fourth - the actual answer:
Although the above will work, it can be a bit rough: any method that overrides a method in any superclass will have to conform to its signature, and even compatible signatures would break. Maybe it would be nicer to build upon the existing ABCMeta and @abstractmethod mechanisms, as those already handle all the corner cases. Note, however, that the example below is based on the code above and checks signatures at class creation time, while Python's abstractmethod machinery checks them when the class is instantiated. Leaving that untouched lets you work with a large class hierarchy in which intermediate classes may keep some abstract methods, and only the final, concrete classes have to implement all of them.
Just use this instead of ABCMeta as the metaclass for your interface classes, and mark the methods you want to check with @abstractmethod as usual.
import inspect
from abc import ABCMeta

class M(ABCMeta):
    def __init__(cls, name, bases, attrs):
        errors = []
        for base_cls in bases:
            for meth_name in getattr(base_cls, "__abstractmethods__", ()):
                orig_argspec = inspect.getfullargspec(getattr(base_cls, meth_name))
                target_argspec = inspect.getfullargspec(getattr(cls, meth_name))
                if orig_argspec != target_argspec:
                    errors.append(
                        f"Abstract method {meth_name!r} not implemented with "
                        f"correct signature in {cls.__name__!r}. Expected {orig_argspec}.")
        if errors:
            raise TypeError("\n".join(errors))
        super().__init__(name, bases, attrs)
You could follow the pyspark pattern, where the method of the base class performs (optional) argument validity checking, and then calls a "non-public" method of the subclass, for example:
class Regressor():
    def fit(self, X, y):
        self._check_arguments(X, y)
        self._fit(X, y)

    def _check_arguments(self, X, y):
        if True:  # placeholder for real validation logic
            pass
        else:
            raise ValueError('Invalid arguments.')

class LinearRegressor(Regressor):
    def _fit(self, X, y):
        # code here
        pass
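Usage would then look like this (dummy data; with the placeholder validation above, any arguments pass):
reg = LinearRegressor()
reg.fit([[1.0], [2.0]], [1.0, 2.0])  # validated in the base class, dispatched to _fit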
This is about multiple inheritance. Parent class A provides a few methods, and parent class B a few additional ones. By creating a class inheriting from both A and B, I could instantiate an object having both method sets.
Now my problem is that I only detect after having instantiated A that the methods from B would be helpful too (or, more strictly stated, that my object is also of class B).
While
aInstance.bMethod = types.MethodType(localFunction, aInstance)
works in principle, it has to be repeated for every bMethod and looks unnecessarily complicated. It also requires stand-alone (local) functions instead of a conceptually cleaner class B. Is there a more streamlined approach?
Update:
I tried an abstract base class with some success, but there, only the methods of one additional class could be added.
What I finally achieved is a little routine, which adds all top-level procedures of a given module:
from types import MethodType
from inspect import ismodule, isfunction, getmembers

# adds all functions found in module as methods to given obj
def classMagic(obj, module):
    assert ismodule(module)
    for name, fn in getmembers(module, isfunction):
        if not name.startswith("__"):
            setattr(obj, name, MethodType(fn, obj))
Functionally this is sufficient, and I'm also pleased with the automatism: all functions are processed, and I don't have separate places for defining a function and adding it as a method, so maintenance is easy. The only remaining issue is reflected by the startswith line - an example of a necessary naming convention for functions that shall not be added.
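Usage looks like this (module and function names are illustrative):
import b_methods  # a module whose top-level functions act as B's methods

aInstance = A()
classMagic(aInstance, b_methods)
aInstance.bMethod()  # assuming b_methods defines bMethod(self)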
If I understand correctly, you want to add mixins to your class at run time. A very common way of adding mixins in Python is through decorators (rather than inheritance), so we can borrow this idea to do something similar at runtime to the object (instead of to the class).
I used functools.partial to freeze the self parameter, to emulate the process of binding a function to an object (i.e. turning a function into a method).
from functools import partial

class SimpleObject():
    pass

def MixinA(obj):
    def funcA1(self):
        print('A1 - propertyA is equal to %s' % self.propertyA)

    def funcA2(self):
        print('A2 - propertyA is equal to %s' % self.propertyA)

    obj.propertyA = 0
    obj.funcA1 = partial(funcA1, self=obj)
    obj.funcA2 = partial(funcA2, self=obj)
    return obj

def MixinB(obj):
    def funcB1(self):
        print('B1')

    obj.funcB1 = partial(funcB1, self=obj)
    return obj

o = SimpleObject()

# need A characteristics?
o = MixinA(o)
# need B characteristics?
o = MixinB(o)
Instead of functools.partial, you can also use types.MethodType as you did in your question; I think that is a better/cleaner solution.
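That variant might look like this sketch (my addition, continuing the example above):
import types

def MixinC(obj):
    def funcC1(self):
        print('C1')

    # real method binding instead of freezing `self` with partial
    obj.funcC1 = types.MethodType(funcC1, obj)
    return obj

o = MixinC(o)
o.funcC1()  # prints "C1"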
I'm seeking advice about design of my code.
Introduction
I have several classes, each representing one file type, e.g. MediaImageFile, MediaAudioFile, and the generic (and also base) class MediaGenericFile.
Each file has two variants: Master and Version, so I created these classes to define their specific behaviour. EDIT: Version represents a resized/cropped/trimmed/etc. variant of the Master file. It's used mainly for previews.
EDIT: The reason why I want to do it dynamically is that this app should be reusable (it's a Django app), and therefore it should be easy to implement other MediaGenericFile subclasses without modifying the original code.
What I want to do
First of all, user should be able to register own MediaGenericFile subclasses without affecting original code.
Whether a file is a version or a master is easily (one regexp) recognizable from the filename.
/path/to/master.jpg -- master
/path/to/.versions/master_version.jpg -- version
Master/Version classes use some methods/properties of MediaGenericFile, like filename (you need to know the filename to generate a new version).
MediaGenericFile extends LazyFile, which is just a lazy File object.
Now I need to put it together…
Used design
Before I started coding the 'versions' feature, I had a factory class MediaFile, which returns the appropriate file type class according to the extension:
>>> MediaFile('path/to/image.jpg')
<<< <MediaImageFile 'path/to/image.jpg'>
Classes Master and Version define new methods which use methods and attributes of MediaGenericFile, etc.
Approach 1
One approach is to dynamically create a new type which inherits from Master (or Version) and MediaGenericFile (or a subclass of it).
class MediaFile(object):
    def __new__(cls, *args, **kwargs):
        ...  # decision about klass
        if version:
            bases = (Version, klass)
            class_name = '{0}Version'.format(klass.__name__)
        else:
            bases = (Master, klass)
            class_name = '{0}Master'.format(klass.__name__)
        new_class = type(class_name, bases, {})
        ...
        return new_class(*args, **kwargs)
Approach 2
The second approach is to create a method contribute_to_instance in Master/Version and call it after creating new_class, but that's trickier than I thought:
import types

class Master(object):
    @classmethod
    def contribute_to_instance(cls, instance):
        methods = (...)
        for m in methods:
            setattr(instance, m, types.MethodType(getattr(cls, m), instance))

class MediaFile(object):
    def __new__(cls, *args, **kwargs):
        ...  # decision about new_class
        obj = new_class(*args, **kwargs)
        if version:
            version_class = Version
        else:
            version_class = Master
        version_class.contribute_to_instance(obj)
        ...
        return obj
However, this doesn't work. There are still problems with calling Master/Version's methods.
Questions
What would be a good way to implement this multiple inheritance?
What is this problem called? :) I was trying to find some solutions, but I simply don't know how to name this problem.
Thanks in advance!
Note to answers
ad larsmans
Comparison and instance checks wouldn't be a problem for my case, because:
Comparisons are redefined anyway
class MediaGenericFile(object):
    def __eq__(self, other):
        return self.name == other.name
I never need to check isinstance(instance, MediaGenericFileVersion). I'm using isinstance(instance, MediaGenericFile) and isinstance(instance, Version), and both work as expected.
Nevertheless, creating a new type per instance sounds to me like a considerable defect.
Well, I could create both variations dynamically in a metaclass and then use them, something like:
>>> MediaGenericFile.version_class
<<< <class MediaGenericFileVersion>
>>> MediaGenericFile.master_class
<<< <class MediaGenericFileMaster>
And then:
class MediaFile(object):
    def __new__(cls, *args, **kwargs):
        ...  # decision about klass
        if version:
            attr_name = 'version_class'
        else:
            attr_name = 'master_class'
        new_class = getattr(klass, attr_name)
        ...
        return new_class(*args, **kwargs)
Final solution
Finally, the design pattern is a factory class. MediaGenericFile subclasses are statically typed, and users can implement and register their own. Master/Version variants are created dynamically (glued together from several mixins) in the metaclass and stored in a 'cache' to avoid the perils mentioned by larsmans.
Thanks everyone for their suggestions. Finally I understand the metaclass concept. Well, at least I think that I understand it. Push origin master…
I'd certainly advise against the first approach of constructing classes in __new__. The problem with it is that you create a new type per instance, which causes overhead and, worse, causes type comparisons to fail:
>>> Ham1 = type("Ham", (object,), {})
>>> Ham2 = type("Ham", (object,), {})
>>> Ham1 == Ham2
False
>>> isinstance(Ham1(), Ham2)
False
>>> isinstance(Ham2(), Ham1)
False
This violates the principle of least surprise because the classes may seem entirely identical:
>>> Ham1
<class '__main__.Ham'>
>>> Ham2
<class '__main__.Ham'>
You can get approach 1 to work properly, though, if you construct the classes at the module level, outside of MediaFile:
classes = {}

for klass in [MediaImageFile, MediaAudioFile]:
    for variant in [Master, Version]:
        # I'd actually do this the other way around,
        # making Master and Version mixins
        bases = (variant, klass)
        name = klass.__name__ + variant.__name__
        classes[name] = type(name, bases, {})
then, in MediaFile.__new__, look the required class up by name in classes. (Alternatively, set the newly constructed classes on the module instead of in a dict.)
I'm not sure how dynamic you want it to be, but using a "factory pattern" (here using a class factory), is fairly readable and understandable and may do what you want. This could serve as a base... MediaFactory could be smarter, and you could register multiple other classes, instead of hard-coding MediaFactoryMaster etc...
class MediaFactory(object):
    __items = {}

    @classmethod
    def make(cls, item):
        return cls.__items[item]

    @classmethod
    def register(cls, item):
        def func(kls):
            cls.__items[item] = kls
            return kls
        return func

class MediaFactoryMaster(MediaFactory, Master): pass
class MediaFactoryVersion(MediaFactory, Version): pass

class MediaFile(object):
    pass

@MediaFactoryMaster.register('jpg')  # adapt to take ['jpg', 'gif', 'png'] ?
class MediaFileImage(MediaFile):
    pass

@MediaFactoryVersion.register('mp3')  # adapt to take ['mp3', 'ogg', 'm4a'] ?
class MediaFileAudio(MediaFile):
    pass
Another possible MediaFactory.make:
@classmethod
def make(cls, fname):
    name, ext = somefunc(fname)
    kls = cls.__items[ext]
    other = Version if is_version else Master  # is_version: determined from fname
    return type('{}{}'.format(kls.__name__, other.__name__), (kls, other), {})
How come you're not using inheritance but are playing around with __new__?
class GenericFile(File):
    """Base class"""

class Master(object):
    """Master Mixin"""

class Versioned(object):
    """Versioning mixin"""

class ImageFile(GenericFile):
    """Image Files"""

class MasterImage(ImageFile, Master):
    """Whatever"""

class VersionedImage(ImageFile, Versioned):
    """Blah blah blah"""
...
It's not clear why you're doing this though. I think there's a weird code smell here. I'd recommend fewer classes with a consistent interface (duck-typing) rather than a dozen classes and isinstance checks throughout the code to make it all work.
Perhaps you can update your question with what you'd like to do in your code and folks can help either identify the real pattern or a suggest a more idiomatic solution.
You do not have to create a new class for each instance. Don't create the new classes in __new__; create them in the metaclass. Define a metaclass in the base class or in the base module. The two "variant" subclasses are easily saved as class attributes of their generic parent, and then __new__ just looks at the filename according to its own rules and decides which subclass to return.
Watch out for __new__ returning a class other than the one "nominated" during the constructor call - you may have to take steps to invoke __init__ from within __new__.
Subclasses will either have to:
"register" themselves with a factory or parent to be found
be imported and then have the parent or factory find them through a recursive search of cls.__subclasses__() - see the sketch after this list (the search might have to happen once per creation, but that's probably not a problem for file handling)
be found through the use of "setuptools" entry_points type tools, but that requires more effort and coordination by the user
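A minimal sketch of that recursive search (my illustration):
def find_subclasses(cls):
    # Recursively collect all direct and indirect subclasses of cls.
    found = []
    for sub in cls.__subclasses__():
        found.append(sub)
        found.extend(find_subclasses(sub))
    return found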
The OOD question you should be asking is "do the various classes of my proposed inheritance share any properties at all?"
The purpose of inheritance is to share common data or methods that the instances naturally have in common. Aside from both being files, what do Image files and Audio files have in common? If you really want to stretch your metaphors, you could conceivably have AudioFile.view() which could present — for example — a visualization of the power spectra of the audio data, but ImageFile.listen() makes even less sense.
I think your question side-steps this language independent conceptual issue in favor of the Python dependent mechanics of an object factory. I don't think you have a proper case of inheritance here, or you've failed to explain what common features your Media objects need to share.
This article has a snippet showing usage of __bases__ to dynamically change the inheritance hierarchy of some Python code, by adding a class to an existing class's collection of classes from which it inherits. Ok, that's hard to read; the code is probably clearer:
class Friendly:
    def hello(self):
        print 'Hello'

class Person: pass

p = Person()
Person.__bases__ = (Friendly,)
p.hello()  # prints "Hello"
That is, Person doesn't inherit from Friendly at the source level; rather, this inheritance relation is added dynamically at runtime by modification of the __bases__ attribute of the Person class. However, if you change Friendly and Person to be new-style classes (by inheriting from object), you get the following error:
TypeError: __bases__ assignment: 'Friendly' deallocator differs from 'object'
A bit of Googling on this seems to indicate some incompatibilities between new-style and old style classes in regards to changing the inheritance hierarchy at runtime. Specifically: "New-style class objects don't support assignment to their bases attribute".
My question, is it possible to make the above Friendly/Person example work using new-style classes in Python 2.7+, possibly by use of the __mro__ attribute?
Disclaimer: I fully realise that this is obscure code. I fully realize that in real production code tricks like this tend to border on unreadable, this is purely a thought experiment, and for funzies to learn something about how Python deals with issues related to multiple inheritance.
Ok, again, this is not something you should normally do, this is for informational purposes only.
Where Python looks for a method on an instance object is determined by the __mro__ attribute of the class which defines that object (the Method Resolution Order attribute). Thus, if we could modify the __mro__ of Person, we'd get the desired behaviour. Something like:
setattr(Person, '__mro__', (Person, Friendly, object))
The problem is that __mro__ is a readonly attribute, and thus setattr won't work. Maybe if you're a Python guru there's a way around that, but clearly I fall short of guru status as I cannot think of one.
A possible workaround is to simply redefine the class:
def modify_Person_to_be_friendly():
    # so that we're modifying the global identifier 'Person'
    global Person

    # now just redefine the class using type(), specifying that the new
    # class should inherit from Friendly and have all attributes from
    # our old Person class
    Person = type('Person', (Friendly,), dict(Person.__dict__))

def main():
    modify_Person_to_be_friendly()
    p = Person()
    p.hello()  # works!
What this doesn't do is modify any previously created Person instances to have the hello() method. For example (just modifying main()):
def main():
    oldperson = Person()
    modify_Person_to_be_friendly()
    p = Person()
    p.hello()          # works! But:
    oldperson.hello()  # does not
If the details of the type call aren't clear, then read e-satis' excellent answer on 'What is a metaclass in Python?'.
I've been struggling with this too, and was intrigued by your solution, but Python 3 takes it away from us:
AttributeError: attribute '__dict__' of 'type' objects is not writable
I actually have a legitimate need for a decorator that replaces the (single) superclass of the decorated class. It would require too lengthy a description to include here (I tried, but couldn't get it to a reasonable length and limited complexity - it came up in the context of the use by many Python applications of a Python-based enterprise server, where different applications needed slightly different variations of some of the code.)
The discussion on this page and others like it provided hints that the problem of assigning to __bases__ only occurs for classes with no superclass defined (i.e., whose only superclass is object). I was able to solve this problem (for both Python 2.7 and 3.2) by defining the classes whose superclass I needed to replace as subclasses of a trivial class:
## T is used so that the other classes are not direct subclasses of object,
## since classes whose base is object don't allow assignment to their __bases__ attribute.
class T: pass

class A(T):
    def __init__(self):
        print('Creating instance of {}'.format(self.__class__.__name__))

## ordinary inheritance
class B(A): pass

## dynamically specified inheritance
class C(T): pass

A()                 # -> Creating instance of A
B()                 # -> Creating instance of B
C.__bases__ = (A,)
C()                 # -> Creating instance of C

## attempt at dynamically specified inheritance starting with a direct subclass
## of object doesn't work
class D: pass
D.__bases__ = (A,)
D()
## Result is:
## TypeError: __bases__ assignment: 'A' deallocator differs from 'object'
I cannot vouch for the consequences, but this code does what you want on py2.7.2.
class Friendly(object):
    def hello(self):
        print 'Hello'

class Person(object): pass

# we can't change the original classes, so we replace them
class newFriendly: pass
newFriendly.__dict__ = dict(Friendly.__dict__)
Friendly = newFriendly

class newPerson: pass
newPerson.__dict__ = dict(Person.__dict__)
Person = newPerson

p = Person()
Person.__bases__ = (Friendly,)
p.hello()  # prints "Hello"
We know that this is possible. Cool. But we'll never use it!
Right off the bat, all the caveats of messing with the class hierarchy dynamically are in effect.
But if it has to be done, then apparently there is a hack that gets around the "deallocator differs from 'object'" issue when modifying the __bases__ attribute of new-style classes.
You can define a class
class Object(object): pass
which is a trivial subclass of object (its metaclass is still the built-in type).
That's it: now new-style classes that derive from Object instead of object can modify their __bases__ without any problem.
In my tests this actually worked very well: all existing instances (created before changing the inheritance) of the class and its derived classes felt the effect of the change, including their mro getting updated.
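A sketch of the trick, mirroring the earlier Friendly/Person example (I have not re-verified this beyond the behavior described above):
class Object(object): pass

class Friendly(Object):
    def hello(self):
        print('Hello')

class Person(Object): pass

p = Person()
Person.__bases__ = (Friendly,)  # no "deallocator differs" error this time
p.hello()  # prints "Hello"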
I needed a solution for this which:
Works with both Python 2 (>= 2.7) and Python 3 (>= 3.2).
Lets the class bases be changed after dynamically importing a dependency.
Lets the class bases be changed from unit test code.
Works with types that have a custom metaclass.
Still allows unittest.mock.patch to function as expected.
Here's what I came up with:
def ensure_class_bases_begin_with(namespace, class_name, base_class):
    """ Ensure the named class's bases start with the base class.

        :param namespace: The namespace containing the class name.
        :param class_name: The name of the class to alter.
        :param base_class: The type to be the first base class for the
            newly created type.
        :return: ``None``.

        Call this function after ensuring `base_class` is
        available, before using the class named by `class_name`.

        """
    existing_class = namespace[class_name]
    assert isinstance(existing_class, type)

    bases = list(existing_class.__bases__)
    if base_class is bases[0]:
        # Already bound to a type with the right bases.
        return
    bases.insert(0, base_class)

    new_class_namespace = existing_class.__dict__.copy()
    # Type creation will assign the correct '__dict__' attribute.
    del new_class_namespace['__dict__']

    metaclass = existing_class.__metaclass__
    new_class = metaclass(class_name, tuple(bases), new_class_namespace)

    namespace[class_name] = new_class
Used like this within the application:
# foo.py

# Type `Bar` is not available at first, so can't inherit from it yet.
class Foo(object):
    __metaclass__ = type

    def __init__(self):
        self.frob = "spam"

    def __unicode__(self):
        return "Foo"

# … later …

import bar

ensure_class_bases_begin_with(
    namespace=globals(),
    class_name=str('Foo'),  # `str` type differs on Python 2 vs. 3.
    base_class=bar.Bar)
Use like this from within unit test code:
# test_foo.py

""" Unit test for `foo` module. """

import unittest
import mock

import foo
import bar

ensure_class_bases_begin_with(
    namespace=foo.__dict__,
    class_name=str('Foo'),  # `str` type differs on Python 2 vs. 3.
    base_class=bar.Bar)

class Foo_TestCase(unittest.TestCase):
    """ Test cases for `Foo` class. """

    def setUp(self):
        patcher_unicode = mock.patch.object(foo.Foo, '__unicode__')
        patcher_unicode.start()
        self.addCleanup(patcher_unicode.stop)

        self.test_instance = foo.Foo()

        patcher_frob = mock.patch.object(self.test_instance, 'frob')
        patcher_frob.start()
        self.addCleanup(patcher_frob.stop)

    def test_instantiate(self):
        """ Should create an instance of `Foo`. """
        instance = foo.Foo()
The above answers are good if you need to change an existing class at runtime. However, if you are just looking to create a new class that inherits from some other class, there is a much cleaner solution. I got this idea from https://stackoverflow.com/a/21060094/3533440, but I think the example below better illustrates a legitimate use case.
def make_default(Map, default_default=None):
    """Returns a class which behaves identically to the given
    Map class, except it gives a default value for unknown keys."""
    class DefaultMap(Map):
        def __init__(self, default=default_default, **kwargs):
            self._default = default
            super().__init__(**kwargs)

        def __missing__(self, key):
            return self._default

    return DefaultMap

DefaultDict = make_default(dict, default_default='wug')

d = DefaultDict(a=1, b=2)
assert d['a'] == 1
assert d['b'] == 2
assert d['c'] == 'wug'
Correct me if I'm wrong, but this strategy seems very readable to me, and I would use it in production code. This is very similar to functors in OCaml.
This method isn't technically inheriting at runtime, since __mro__ can't be changed. But what I'm doing here is using __getattr__ to be able to access any attributes or methods of a certain class. (Read the comments in the order of the numbers placed before them; it makes more sense.)
class Sub:
    def __init__(self, f, cls):
        self.f = f
        self.cls = cls

    # 6) this method will pass the self parameter
    # (which is the original class object we passed)
    # and then it will fill in the rest of the arguments
    # using *args and **kwargs
    def __call__(self, *args, **kwargs):
        # 7) the multiple try / except statements
        # are for making sure if an attribute was
        # accessed instead of a function, the __call__
        # method will just return the attribute
        try:
            return self.f(self.cls, *args, **kwargs)
        except TypeError:
            try:
                return self.f(*args, **kwargs)
            except TypeError:
                return self.f

# 1) our base class
class S:
    def __init__(self, func):
        self.cls = func

    def __getattr__(self, item):
        # 5) we are wrapping the attribute we get in the Sub class
        # so we can implement the __call__ method there
        # to be able to pass the parameters in the correct order
        return Sub(getattr(self.cls, item), self.cls)

# 2) class we want to inherit from
class L:
    def run(self, s):
        print("run" + s)

# 3) we create an instance of our base class
# and then pass an instance (or just the class object)
# as a parameter to this instance
s = S(L)  # 4) in this case, I'm using the class object

s.run("1")
So this sort of substitution and redirection will simulate the inheritance of the class we wanted to inherit from. And it even works with attributes or methods that don't take any parameters.