Using metaclass to automatically assign member variables of a class - python

In a Python class, I would like to automatically assign member variables to be the same as the __init__ function arguments, like this:
class Foo(object):
    def __init__(self, arg1, arg2=1):
        self.arg1 = arg1
        self.arg2 = arg2
I would like to keep explicit argument names in the __init__ function for the sake of code clarity.
I don't want to use decorators for the same reason.
Is it possible to achieve this using a custom metaclass?

First, a disclaimer. Python object creation and initialization can be complicated and highly dynamic. This means that it can be difficult to come up with a solution that works for the corner cases. Moreover, the solutions tend to use some darker magic, and so when they inevitably do go wrong they can be hard to debug.
Second, the fact that your class has so many initialization parameters might be a hint that it has too many parameters. Some of them are probably related and can fit together in a smaller class. For example, if I'm building a car, it's better to have:
class Car:
    def __init__(self, tires, engine):
        self.tires = tires
        self.engine = engine

class Tire:
    def __init__(self, radius, winter=False):
        self.radius = radius
        self.winter = winter

class Engine:
    def __init__(self, big=True, loud=True):
        self.big = big
        self.loud = loud
as opposed to
class Car:
    def __init__(self, tire_radius, winter_tires=False,
                 engine_big=True, engine_loud=True):
        self.tire_radius = tire_radius
        self.winter_tires = winter_tires
        self.engine_big = engine_big
        self.engine_loud = engine_loud
All of that said, here is a solution. I haven't used this in my own code, so it isn't "battle-tested". But it at least appears to work in the simple case. Use at your own risk.
First, metaclasses aren't necessary here, and we can use a simple decorator on the __init__ method. I think this is more readable anyway, since it is clear that we are only modifying the behavior of __init__, and not something deeper about class creation.
import inspect
import functools

def unpack(__init__):
    sig = inspect.signature(__init__)

    @functools.wraps(__init__)
    def __init_wrapped__(self, *args, **kwargs):
        bound = sig.bind(self, *args, **kwargs)
        bound.apply_defaults()
        # first entry is the instance, should not be set
        # discard it, taking only the rest
        attrs = list(bound.arguments.items())[1:]
        for attr, value in attrs:
            setattr(self, attr, value)
        return __init__(self, *args, **kwargs)

    return __init_wrapped__
This decorator uses the inspect module to retrieve the signature of the __init__ method. Then we simply loop through the attributes and use setattr to assign them.
In use, it looks like:
class Foo(object):
    @unpack
    def __init__(self, a, b=88):
        print('This still runs!')
so that
>>> foo = Foo(42)
This still runs!
>>> foo.a
42
>>> foo.b
88
I am not certain that every introspection tool will see the right signature of the decorated __init__. In particular, I'm not sure if Sphinx will do the "right thing". But at least the inspect module's signature function will return the signature of the wrapped function, as you can verify:
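For example (a quick check, assuming the Foo class decorated with @unpack above):
>>> import inspect
>>> inspect.signature(Foo.__init__)
<Signature (self, a, b=88)>
This works because functools.wraps sets __wrapped__ on the wrapper, and inspect.signature follows it by default.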
If you really want a metaclass solution, it's simple enough (but less readable and more magic, IMO). You need only write a class factory that applies the unpack decorator:
def unpackmeta(clsname, bases, dct):
    dct['__init__'] = unpack(dct['__init__'])
    return type(clsname, bases, dct)

class Foo(metaclass=unpackmeta):
    def __init__(self, a, b=88):
        print('This still runs!')
The output will be the same as the above example.

Related

Can I create a class that inherits from another class passed as an argument?

Like the question posted here, I want to create a class that inherits from another class passed as an argument.
class A():
    def __init__(self, args):
        stuff

class B():
    def __init__(self, args):
        stuff

class C():
    def __init__(self, cls, args):
        self.inherit(cls, args)

args = ...  # arguments to create instances of A and B
class_from_A = C(A, args)  # instance of C inherited from A
class_from_B = C(B, args)  # instance of C inherited from B
I want to do this so that I can keep track of calls I make to different web APIs. The idea is that I am just adding my own functionality to any API-type object. The problem with the solution to the linked question is that I don't want to go through an additional "layer" to use the API-type object: I want to say obj.get_data() instead of obj.api.get_data().
I've tried looking into how super() works but haven't come across anything that would help (although I could've easily missed something). Any help would be nice, and I'm open to other approaches for what I'm trying to do; however, just out of curiosity, I'd like to know if this is possible.
I don't think it's possible because __init__ is called after __new__ which is where you would specify base classes, but I think you can achieve your goal of tracking api calls using a metaclass. Since you didn't give any examples of what tracking the calls means, I'll leave you with an example metaclass which counts method calls. You can adapt it to your needs.
Another alternative would be to subclass A and B with methods that track whatever you need, and just return super().whatever(). I think I'd prefer that method unless A and B contain too many methods worth managing like that.
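For instance, here is a minimal sketch of that subclassing approach; A, get_data and the calls list are hypothetical names standing in for your real API class and whatever "tracking" means for you:
class A:
    def get_data(self):
        return {"payload": 123}

class TrackedA(A):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.calls = []                        # simple call log

    def get_data(self, *args, **kwargs):
        self.calls.append("get_data")          # record the call
        return super().get_data(*args, **kwargs)

obj = TrackedA()
obj.get_data()                                 # still obj.get_data(), no extra layer
print(obj.calls)                               # ['get_data']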
Here's an implementation from python-course.eu, by Bernd Klein. Click the link for more detail.
class FuncCallCounter(type):
    """ A Metaclass which decorates all the methods of the
        subclass using call_counter as the decorator
    """

    @staticmethod
    def call_counter(func):
        """ Decorator for counting the number of function
            or method calls to the function or method func
        """
        def helper(*args, **kwargs):
            helper.calls += 1
            return func(*args, **kwargs)
        helper.calls = 0
        helper.__name__ = func.__name__
        return helper

    def __new__(cls, clsname, superclasses, attributedict):
        """ Every method gets decorated with the decorator call_counter,
            which will do the actual call counting
        """
        for attr in attributedict:
            if callable(attributedict[attr]) and not attr.startswith("__"):
                attributedict[attr] = cls.call_counter(attributedict[attr])
        return type.__new__(cls, clsname, superclasses, attributedict)
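A short usage sketch (the Api class and get_data method are made up for illustration):
class Api(metaclass=FuncCallCounter):
    def get_data(self):
        return 42

a = Api()
a.get_data()
a.get_data()
print(Api.get_data.calls)   # 2 - the counter lives on the decorated function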

Inserting a dictionary into a python class using a decorator

I am trying to design a wrapper that can accept the name of a class (not an object) and insert a dictionary into each instance of the class. Below is a snippet of how I achieve this when I am wrapping an existing function.
from functools import wraps

def insert_fn_into_class(cls):
    """Decorator function that consumes a function and inserts it into the cls class."""
    def decorator(func):
        @wraps(func)
        def wrapper(self, *args, **kwargs):
            return func(*args, **kwargs)
        setattr(cls, f'prefix_{func.__name__}', wrapper)
    return decorator
How can I use a similar template to decorate a dictionary and insert it into the cls class. Since dictionaries are not callable, how would such a wrapper be designed?
Update
Thank you for the constructive feedback on this.
A fellow stack-overflow user rightly pointed out that I failed to explain WHY I would want to do this. So here goes:
I am trying to build a framework that can essentially consume a bunch of user defined functions and extend an existing class. So I have a class A, and user-defined functions f_1, f_2, ..., f_n which I want to inject into class A such that an instance of this class obj_a = A() can call these functions such as obj_a.f_1(). I have managed to achieve this using a decorator function similar to the code snippet above.
Now, every user defined function has some repetitive code that I think I can do away with if all instances of my base class A can have access to a user defined dictionary. My thought process for achieving this was to try and modify my existing wrapper function to add it to the class. However, I am aware that dictionaries are not callable and hence, the question.
I hope this was sufficiently elaborate.
Update 2
Looks like there was scope for some more elaboration. Here is an example of what the various components I described earlier look like.
module_a.py
class BaseClass():
    def __init__(self):
        ...

    def base_class_method(self):
        ...

    def run(self):
        getattr(self, 'prefix_user_fn_1')()

def extend(cls=BaseClass):
    def decorator(func):
        def wrapper(self, *args, **kwargs):
            return func(*args, **kwargs)
        setattr(cls, f'prefix_{func.__name__}', wrapper)
    return decorator
user_defined_functions.py
from module_a import BaseClass, extend

@extend()
def user_fn_1():
    dict_ = {'a': 'b', 'c': 'd'}
    ...

@extend()
def user_fn_2():
    dict_ = {'my': 'dict', 'c': 'd'}
    ...
main.py
from module_a import BaseClass
b = BaseClass()
b.run()
Each user function contains a subset of a commonly used dictionary. To take advantage of this, I think it would be convenient if that dictionary could be accessed as one of BaseClass's dynamically injected attributes.
modified_user_defined_functions.py
# THIS WILL NOT WORK FOR OBVIOUS REASONS
from module_a import BaseClass, extend, common_config, insert_config_into_class

# @common_config ?? Is this even a good idea?
dict_ = {'my': 'dict', 'a': 'b', 'c': 'd'}
insert_config_into_class(BaseClass, dict_)

@extend()
def user_fn_1(self):
    print(self.dict_)
    # do something with self.dict_
    ...
To incorporate this, I might have to change the run method on BaseClass to look like this:
modified_module_a.py
...
class BaseClass():
...
def run(self):
getattr(self, 'prefix_user_fn_1')(self)
Update 3 - Potential Solution
def shared_config(data, cls=BaseClass):
    setattr(cls, 'SHARED_DICT', data)
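A quick usage sketch, assuming the shared_config helper and BaseClass above:
shared_config({'my': 'dict', 'a': 'b', 'c': 'd'})
b = BaseClass()
print(b.SHARED_DICT['a'])   # 'b' - visible from every instance as a class attribute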
This works, and it also partly answers my own question: I was probably overengineering by writing a decorator for this when a simple function can achieve it. However, one aspect of the original question still remains: is this a good approach?
The way you're asking the question hints that you don't quite understand how Python handles object references.
Decorators aren't meant to be used with non-callable objects, so the clarification
accept the name of a class (not an object)
is not really clear. And there's no difference between attaching a dict or a function to the class: either way you're only manipulating an object reference.
Apart from all that, take a look at this snippet:
def patcher(cls):
    def wrapper(*args, **kwargs):
        instance = cls(*args, **kwargs)
        instance.payload = {'my': 'dict'}
        return instance
    return wrapper

@patcher
class A:
    pass
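A quick check of what this does (note that the name A is now bound to the wrapper function, while the instances it returns are still of the original class):
a = A()
print(a.payload)   # {'my': 'dict'}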
Does it cover your needs?
I think I can say that maybe I was overengineering my solution
Well, kind of indeed... FWIW the whole thing looks quite overcomplicated.
First things first: if you define your dict as a global (module) name, all functions within that module can directly access it:
# simple.py
SHARED_DATA = {"foo": "bar", "answer": 42}

def func1():
    print(SHARED_DATA["foo"])

def func2():
    print(SHARED_DATA["answer"])
so I don't really see the point of "injecting" this in a class imported from another module just to access it from those functions.
Now with your decorator:
def extend(cls=BaseClass):
    def decorator(func):
        def wrapper(self, *args, **kwargs):
            return func(*args, **kwargs)
        setattr(cls, f'prefix_{func.__name__}', wrapper)
    return decorator
If the goal is to make the function an attribute of the class without turning it into an instance method, you can just use staticmethod:
def extend(cls=BaseClass):
    def decorator(func):
        setattr(cls, f'prefix_{func.__name__}', staticmethod(func))
        return func
    return decorator
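For instance, a hedged sketch assuming BaseClass and this extend variant (user_fn_1 is just an illustrative name):
@extend()
def user_fn_1():
    return 'hello'

print(BaseClass().prefix_user_fn_1())   # 'hello' - no self is injected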
Now if you have other (unexplained) reason to make SHARED_DICT a class attribute (of BaseClass or another class), you can indeed provide a configuration function:
# module_a
def configure(cls, data):
    cls.SHARED_DATA = data

# less_simple.py
from module_a import BaseClass, configure

SHARED_DATA = {"foo": "bar", "answer": 42}
configure(BaseClass, SHARED_DATA)

def func1():
    print(SHARED_DATA["foo"])

def func2():
    print(SHARED_DATA["answer"])
But note that you still don't need to go through BaseClass or self to access this dict from the module's functions.
You could of course pass either the class or the instance to your user-defined functions, but here again there is no need for a bizarre contraption: either make them classmethods, or just set them directly as attributes of the class and Python will take care of injecting the class or instance as the first argument (here is an example making them instance methods):
# module_a
def configure(cls, data):
    cls.SHARED_DATA = data

def extend(cls=BaseClass):
    def decorator(func):
        setattr(cls, f'prefix_{func.__name__}', func)
        return func
    return decorator

# rube_goldberg.py
from module_a import BaseClass, configure, extend

SHARED_DATA = {"foo": "bar", "answer": 42}
configure(BaseClass, SHARED_DATA)

@extend()
def func1(self):
    print(SHARED_DATA["foo"])

@extend()
def func2(self):
    print(SHARED_DATA["answer"])
Note that this is still totally useless from the decorated functions' point of view: they don't need self at all to access SHARED_DATA, but now they cannot be executed without a BaseClass instance as the first argument, so the user cannot test them directly.
Now maybe there are other things you didn't mention in your question, but so far it looks like you're trying really hard to make simple things complicated ;-)

Using a base class function that takes parameters as a decorator for derived class function

I feel like I have a pretty good grasp on using decorators when dealing with regular functions, but between using methods of base classes for decorators in derived classes, and passing parameters to said decorators, I cannot figure out what to do next.
Here is a snippet of code.
class ValidatedObject:
    ...
    def apply_validation(self, field_name, code):
        def wrap(self, f):
            self._validations.append(Validation(field_name, code, f))
            return f
        return wrap

class test(ValidatedObject):
    ....
    @apply_validation("_name", "oh no!")
    def name_validation(self, name):
        return name == "jacob"
If I try this as is, I get an "apply_validation is not found" error.
If I try it with @self.apply_validation, I get a "self isn't found" error.
I've also been messing around with making apply_validation a class method without success.
Would someone please explain what I'm doing wrong, and the best way to fix this? Thank you.
The issue you're having is that apply_validation is a method, which means you need to call it on an instance of ValidatedObject. Unfortunately, at the time it is being called (during the definition of the test class), there is no appropriate instance available. You need a different approach.
The most obvious one is to use a metaclass that searches through its instance dictionaries (which are really class dictionaries) and sets up the _validations variable based on what it finds. You can still use a decorator, but it probably should be a global function, or perhaps a static method, and it will need to work differently. Here's some code, that uses a metaclass and a decorator that adds function attributes:
class ValidatedMeta(type):
    def __new__(meta, name, bases, dct):
        validations = [Validation(f._validation_field_name, f._validation_code, f)
                       for f in dct.values()
                       if hasattr(f, "_validation_field_name")]
        dct["_validations"] = validations
        return super(ValidatedMeta, meta).__new__(meta, name, bases, dct)

def apply_validation(field_name, code):
    def decorator(f):
        f._validation_field_name = field_name
        f._validation_code = code
        return f
    return decorator

class ValidatedObject(metaclass=ValidatedMeta):
    pass

class test(ValidatedObject):
    @apply_validation("_name", "oh no!")
    def name_validation(self, name):
        return name == "jacob"
After this code runs, test._validations will be [Validation("_name", "oh no!", test.name_validation)]. Note that the method that gets passed to Validation is unbound, so you'll need to pass it a self argument yourself when you call it (or perhaps drop the self argument and change the decorator created in apply_validation to return staticmethod(f)).
This code may not do what you want if you have validation methods defined at several levels of an inheritance hierarchy. The metaclass as written above only checks the immediate class's dict for methods with the appropriate attributes. If you need it include inherited methods in _validations too, you may need to modify the logic in ValidatedMeta.__new__. Probably the easiest way to go is to look for _validations attributes in the bases and concatenate the lists together.
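A hedged sketch of that variant, reusing the names from above; only the lines collecting validations from the bases are new:
class ValidatedMeta(type):
    def __new__(meta, name, bases, dct):
        validations = [Validation(f._validation_field_name, f._validation_code, f)
                       for f in dct.values()
                       if hasattr(f, "_validation_field_name")]
        # also pick up validations already collected on the base classes
        for base in bases:
            validations.extend(getattr(base, "_validations", []))
        dct["_validations"] = validations
        return super(ValidatedMeta, meta).__new__(meta, name, bases, dct)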
Just an example of using a decorator on a class method:
from functools import wraps

def VALIDATE(dec):
    @wraps(dec)
    def _apply_validation(self, name):
        self.validate(name)
        return dec(self, name)
    return _apply_validation

class A:
    def validate(self, name):
        if name != "aamir":
            raise Exception('Invalid name "%s"' % name)

class B(A):
    @VALIDATE
    def name_validation(self, name):
        return name

b = B()
b.name_validation('jacob')  # should raise exception

python: subclass a metaclass

For putting methods of various classes into a global registry I'm using a decorator with a metaclass. The decorator tags the method, and the metaclass puts the function into the registry:
class ExposedMethod(object):
    def __init__(self, decoratedFunction):
        self._decoratedFunction = decoratedFunction

    def __call__(__self, *__args, **__kw):
        return __self._decoratedFunction(*__args, **__kw)

class ExposedMethodDecoratorMetaclass(type):
    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.iteritems():
            if isinstance(obj, ExposedMethod):
                WorkerFunctionRegistry.addWorkerToWorkerFunction(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class MyClass(object):
    __metaclass__ = ExposedMethodDecoratorMetaclass

    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
I've now come to the point where two function registries are needed. My first thought was to subclass the metaclass and put the other registry in; for that, only the __new__ method has to be rewritten.
Since rewriting means redundant code, this is not what I really want. So it would be nice if anyone could suggest a way to put an attribute inside the metaclass that can be read when __new__ is executed. With that, the right registry could be picked without having to rewrite __new__.
Your ExposedMethod instances do not behave as normal instance methods but rather like static methods -- the fact that you're giving one of them a self argument hints that you're not aware of that. You may need to add a __get__ method to the ExposedMethod class to make it a descriptor, just like function objects are -- see here for more on descriptors.
But there is a much simpler way, since functions can have attributes...:
def ExposedMethod(registry=None):
    def decorate(f):
        f.registry = registry
        return f
    return decorate
and in a class decorator (simpler than a metaclass! requires Python 2.6 or better -- in 2.5 or earlier you'll need to stick w/the metaclass or explicitly call this after the class statement, though the first part of the answer and the functionality of the code below are still perfectly fine):
def RegisterExposedMethods(cls):
    for name, f in vars(cls).iteritems():
        if not hasattr(f, 'registry'):
            continue
        registry = f.registry
        if registry is None:
            registry = cls.registry
        registry.register(name, cls.__name__)
    return cls
So you can do:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod(WorkerFunctionRegistry)
    def myCoolExposedMethod(self):
        pass
and the like. This is easily extended to allowing an exposed method to have several registries, get the default registry elsewhere than from the class (it could be in the class decorator, for example, if that works better for you) and avoids getting enmeshed with metaclasses without losing any functionality. Indeed that's exactly why class decorators were introduced in Python 2.6: they can take the place of 90% or so of practical uses of metaclasses and are much simpler than custom metaclasses.
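A hedged sketch of the "several registries" variant mentioned above (the same function-attribute idea, just storing a list of registries; the registry objects and their register method are assumed from the question):
def ExposedMethod(*registries):
    def decorate(f):
        f.registries = list(registries)
        return f
    return decorate

def RegisterExposedMethods(cls):
    for name, f in vars(cls).iteritems():
        for registry in getattr(f, 'registries', []):
            registry.register(name, cls.__name__)
    return cls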
You can use a class attribute to point to the registry you want to use in the specialized metaclasses, e.g. :
class ExposedMethodDecoratorMetaclassBase(type):
    registry = None

    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.items():
            if isinstance(obj, ExposedMethod):
                mcs.registry.register(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class WorkerExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = WorkerFunctionRegistry

class RetiredExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = RetiredFunctionRegistry
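Usage then looks like the question's original class, just with the specialized metaclass picked per registry (a sketch, assuming the ExposedMethod decorator class and WorkerFunctionRegistry from the question, and that the registry exposes a register method as used by the base metaclass):
class MyWorkerClass(object):
    __metaclass__ = WorkerExposedMethodDecoratorMetaclass

    @ExposedMethod
    def myCoolExposedMethod(self):
        pass

# creating the class calls WorkerFunctionRegistry.register('myCoolExposedMethod', 'MyWorkerClass')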
Thank you both for your answers. Both helped a lot in finding a proper solution for my problem.
My final solution to the problem is the following:
def ExposedMethod(decoratedFunction):
    decoratedFunction.isExposed = True
    return decoratedFunction

class RegisterExposedMethods(object):
    def __init__(self, decoratedClass, registry):
        self._decoratedClass = decoratedClass
        for name, f in vars(self._decoratedClass).iteritems():
            if hasattr(f, "isExposed"):
                registry.addComponentClassToComponentFunction(name, self._decoratedClass.__name__)
        # cloak us as the original class
        self.__class__.__name__ = decoratedClass.__name__

    def __call__(self, *__args, **__kw):
        return self._decoratedClass(*__args, **__kw)

    def __getattr__(self, name):
        return getattr(self._decoratedClass, name)
On a class I wish to expose methods from, I do the following:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
The class decorator is now very easy to subclass. Here is an example:
class DiscoveryRegisterExposedMethods(RegisterExposedMethods):
    def __init__(self, decoratedClass):
        RegisterExposedMethods.__init__(self,
                                        decoratedClass,
                                        DiscoveryFunctionRegistry())
With that the comment of Alex
Your ExposedMethod instances do not behave as normal instance methods ...
is no longer true, since the method is simply tagged and not wrapped.

How to find class of bound method during class construction in Python 3.1?

I want to write a decorator that enables methods of classes to become visible to other parties; the problem I am describing is, however, independent of that detail. The code will look roughly like this:
def CLASS_WHERE_METHOD_IS_DEFINED(method):
    ???

def foobar(method):
    print(CLASS_WHERE_METHOD_IS_DEFINED(method))

class X:
    @foobar
    def f(self, x):
        return x ** 2
My problem here is that the very moment the decorator, foobar(), gets to see the method, it is not yet a bound method; instead, it gets to see a plain, unbound version of it. Maybe this can be resolved by using another decorator on the class that will take care of whatever has to be done to the bound method. The next thing I will try is to simply earmark the decorated method with an attribute when it goes through the decorator, and then use a class decorator or a metaclass to do the postprocessing. If I get that to work, then I do not have to solve this riddle, which still puzzles me:
Can anyone, in the above code, fill out meaningful lines under CLASS_WHERE_METHOD_IS_DEFINED so that the decorator can actually print out the class where f is defined, the moment it gets defined? Or is that possibility precluded in Python 3?
When the decorator is called, it's called with a function as its argument, not a method -- therefore it will avail nothing to the decorator to examine and introspect its method as much as it wants to, because it's only a function and carries no information whatsoever about the enclosing class. I hope this solves your "riddle", although in the negative sense!
Other approaches might be tried, such as deep introspection on nested stack frames, but they're hacky, fragile, and sure not to carry over to other implementations of Python 3 such as pynie; I would therefore heartily recommend avoiding them, in favor of the class-decorator solution that you're already considering and is much cleaner and more solid.
As I mentioned in some other answers, since Python 3.6 the solution to this problem is very easy thanks to object.__set_name__ which gets called with the class object that is being defined.
We can use it to define a decorator that has access to the class in the following way:
class class_decorator:
    def __init__(self, fn):
        self.fn = fn

    def __set_name__(self, owner, name):
        # do something with "owner" (i.e. the class)
        print(f"decorating {self.fn} and using {owner}")

        # then replace ourself with the original method
        setattr(owner, name, self.fn)
Which can then be used as a normal decorator:
>>> class A:
...     @class_decorator
...     def hello(self, x=42):
...         return x
...
decorating <function A.hello at 0x7f9bedf66bf8> and using <class '__main__.A'>
>>> A.hello
<function __main__.A.hello(self, x=42)>
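And since __set_name__ put the original function back on the class, it behaves like a normal method from then on:
>>> A().hello()
42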
This is a very old post, but introspection isn't the way to solve this problem, because it can be more easily solved with a metaclass and a bit of clever class construction logic using descriptors.
import types

# a descriptor used as a decorator
class foobar(object):
    owned_by = None

    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        # a proxy for `func` that gets used when
        # `foobar` is referenced from a class
        return self.func(*args, **kwargs)

    def __get__(self, inst, cls=None):
        if inst is not None:
            # return a bound method when `foobar`
            # is referenced from an instance
            # (Python 3 form; Python 2's MethodType also took the class)
            return types.MethodType(self.func, inst)
        else:
            return self

    def init_self(self, name, cls):
        print("I am named '%s' and owned by %r" % (name, cls))
        self.named_as = name
        self.owned_by = cls

    def init_cls(self, cls):
        print("I exist in the mro of %r instances" % cls)
        # don't set `self.owned_by` here because
        # this descriptor exists in the mro of
        # many classes, but is only owned by one.

print('')
The key to making this work is the metaclass - it searches through the attributes defined on the classes it creates to find foobar descriptors. Once it does, it passes them information about the classes they are involved in through the descriptor's init_self and init_cls methods.
init_self is called only for the class the descriptor is defined on; this is where modifications to foobar itself should be made, because the method is called only once. init_cls, on the other hand, is called for all classes which have access to the decorated method; this is where modifications to the classes that can reference foobar should be made.
import inspect

class MetaX(type):
    def __init__(cls, name, bases, classdict):
        # The classdict contains all the attributes
        # defined on **this** class - no attribute in
        # the classdict is inherited from a parent.
        for k, v in classdict.items():
            if isinstance(v, foobar):
                v.init_self(k, cls)

        # getmembers retrieves all attributes,
        # including those inherited from parents
        for k, v in inspect.getmembers(cls):
            if isinstance(v, foobar):
                v.init_cls(cls)
example
# for compatibility
import six

class X(six.with_metaclass(MetaX, object)):
    def __init__(self):
        self.value = 1

    @foobar
    def f(self, x):
        return self.value + x**2

class Y(X): pass

# PRINTS:
# I am named 'f' and owned by <class '__main__.X'>
# I exist in the mro of <class '__main__.X'> instances
# I exist in the mro of <class '__main__.Y'> instances

print('CLASS CONSTRUCTION OVER\n')
print(Y().f(3))

# PRINTS:
# 10
