I am making a metaclass where I customize the __new__ method to customize how the new class will be created according to the provided values in kwargs. This probably makes more sense in an example:
class FooMeta(type):
    def __new__(cls, name, bases, kwargs):
        # do something with the kwargs...
        # for example:
        if 'foo' in kwargs:
            kwargs.update({
                'fooattr': 'foovalue'
            })
        return super(FooMeta, cls).__new__(cls, name, bases, kwargs)
My problem is how to make this compatible with both Python 2 and 3. Six is a great compatibility library; however, it does not solve my problem. You would use it as:
class FooClass(six.with_metaclass(FooMeta, FooBase)):
    pass
This does not work because six creates a new base class using the given metaclass. Below is six's code (link) (as of 1.3):
def with_metaclass(meta, base=object):
    return meta("NewBase", (base,), {})
As a result, my __new__ method will be called without any kwargs, essentially "breaking" my function. The question is: how can I accomplish the behavior I want without breaking compatibility with both Python 2 and 3? Note that I don't need Python 2.6 support, so anything that works only on Python 2.7.x and 3.x is fine with me.
Background
I need this to work in Django. I want to create a model factory metaclass which depending on the model attributes will customize how the model is created.
class ModelMeta(ModelBase):
    def __new__(cls, name, bases, kwargs):
        # depending on kwargs a different model is constructed
        # ...

class FooModel(six.with_metaclass(ModelMeta, models.Model)):
    # some attrs here to be passed to the metaclass
If I understand you right, there is no problem and it already works. Did you actually check whether your metaclass has the desired effect? Because I think it already does.
The class returned by with_metaclass is not meant to play the role of your class FooClass. It is a dummy class used as the base class of your class. Since this base class is of metaclass FooMeta, your derived class will also have metaclass FooMeta and so the metaclass will be called again with the appropriate arguments.
class FooMeta(type):
    def __new__(cls, name, bases, attrs):
        # do something with the attrs...
        # for example:
        if 'foo' in attrs:
            attrs['fooattr'] = 'foovalue'
        return super(FooMeta, cls).__new__(cls, name, bases, attrs)

class FooBase(object):
    pass

class FooClass(with_metaclass(FooMeta, FooBase)):
    foo = "Yes"
>>> FooClass.fooattr
'foovalue'
Incidentally, it is confusing to call the third argument of your metaclass kwargs. It isn't really keyword arguments; it is the __dict__ of the class to be created, that is, the class's attributes and their values. In my experience this argument is conventionally called attrs or perhaps dct (see e.g., here and here).
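A self-contained version of the above (inlining six's one-line with_metaclass so the snippet runs without the library) confirms that the metaclass fires again for the derived class:

```python
# Inline equivalent of six.with_metaclass (as of 1.3).
def with_metaclass(meta, base=object):
    return meta("NewBase", (base,), {})

class FooMeta(type):
    def __new__(cls, name, bases, attrs):
        if 'foo' in attrs:
            attrs['fooattr'] = 'foovalue'
        return super(FooMeta, cls).__new__(cls, name, bases, attrs)

class FooBase(object):
    pass

# FooMeta runs twice: once for "NewBase" (no 'foo' in attrs, so nothing
# happens) and once for FooClass itself, where 'foo' IS in attrs.
class FooClass(with_metaclass(FooMeta, FooBase)):
    foo = "Yes"

print(FooClass.fooattr)  # -> foovalue
```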
Related
Given a Python 2 metaclass defined as:
class Meta(type):
    def __init__(cls, name, bases, dct):
        pass
what is the purpose of the dct argument? Based on what I've found, it contains the same information as cls.__dict__. The difference is that making changes to dct has no effect on the concrete class, but adding things to cls.__dict__ (i.e. by cls.__dict__['a'] = True or cls.a = True) will have an effect.
class Meta(type):
    def __init__(cls, name, bases, dct):
        cls.a = True
        dct['b'] = True

class Test(object):
    __metaclass__ = Meta

t = Test()
print t.a
print t.b  # Raises an AttributeError
Are there situations where they are different?
The dct parameter provides you with the original class namespace; it is not really used in the metaclass __init__, as the class was already created in __new__. The dct and cls.__dict__ might be equal, but they are not the same dictionary. It can be useful in situations where you need the information contained in the original namespace, without any manipulation performed by the __new__ method or by parent metaclasses.
They might be different if the __new__ method is adding attributes to the class -- after its creation -- that weren't in the original namespace.
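A minimal, runnable illustration of that case (the attribute names are illustrative): __new__ injects an attribute after the namespace was captured, so __init__ sees it in cls.__dict__ but not in dct:

```python
class Meta(type):
    def __new__(mcs, name, bases, dct):
        cls = super(Meta, mcs).__new__(mcs, name, bases, dct)
        cls.added_later = True  # injected after the namespace was captured
        return cls

    def __init__(cls, name, bases, dct):
        super(Meta, cls).__init__(name, bases, dct)
        # The class dict has the attribute; the original namespace does not.
        cls.seen_in_class_dict = 'added_later' in cls.__dict__  # True
        cls.seen_in_dct = 'added_later' in dct                  # False

# Explicit-call form works in both Python 2 and 3.
Test = Meta('Test', (object,), {})
print(Test.seen_in_class_dict, Test.seen_in_dct)  # -> True False
```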
I have been using the following (Jython 2.7) code to decorate functions in some classes:
import sys
import inspect
from decorator import decorator

def useless_decorator(method, *args, **kwargs):
    # Does nothing yet :D
    return method(*args, **kwargs)

class UselessMetaClass(type):
    def __new__(cls, clsname, bases, dict):
        for name, method in dict.items():
            if not name.startswith('_') and inspect.isroutine(method):
                dict[name] = decorator(useless_decorator, method)
        return type.__new__(cls, clsname, bases, dict)

class Useless(object):
    __metaclass__ = UselessMetaClass
The goal is to decorate all public functions (i.e. ones with names that don't start with an underscore) with the useless_decorator. Of course, this behaviour is only expected in classes that inherit from Useless.
Unfortunately I've been running into metaclass conflict errors. I've had great difficulty debugging them and I think they're occurring for reasons beyond my control (due to a third party library I'm using: Sikuli).
But, maybe I don't need to use a metaclass at all! Does anyone know a way to simulate my above code without using a metaclass?
I.E., Is there any other way to apply a decorator to all functions in a class?
(P.S. I know I could manually decorate each function, but that's not the solution I'm looking for)
Converting your metaclass to a class decorator should be straightforward. A class decorator simply receives the class as an argument and returns the (modified) class:
def useless_class_decorator(cls):
    for name, method in cls.__dict__.items():
        if not name.startswith('_') and inspect.isroutine(method):
            setattr(cls, name, decorator(useless_decorator, method))
    return cls
The main difference here is that you can't directly change cls.__dict__, as for new-style classes that is a dictproxy which does not support assignment, so you have to use setattr on the class instead. Then you simply create your class:
@useless_class_decorator
class Useless(object):
    def method_to_decorate(self, *args, **kwargs):
        ...
However, this won't affect subclasses of Useless; those would also have to be decorated with the class decorator. If that's not acceptable, then a metaclass may be the better option...
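A runnable sketch of this approach, substituting functools.wraps for the third-party decorator module (an assumption; swap decorator.decorator back in if you rely on its exact signature preservation):

```python
import functools
import inspect

def useless_decorator(method):
    @functools.wraps(method)
    def wrapper(*args, **kwargs):
        # Does nothing yet, just forwards the call.
        return method(*args, **kwargs)
    return wrapper

def useless_class_decorator(cls):
    # list() so we don't mutate the class dict while iterating over it.
    for name, method in list(cls.__dict__.items()):
        if not name.startswith('_') and inspect.isroutine(method):
            setattr(cls, name, useless_decorator(method))
    return cls

@useless_class_decorator
class Useless(object):
    def method_to_decorate(self):
        return 42

print(Useless().method_to_decorate())  # -> 42
```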
This is a Python 3.x version of the How to pass arguments to the metaclass from the class definition? question, listed separately by request since the answer is significantly different from Python 2.x.
In Python 3.x, how do I pass arguments to a metaclass's __prepare__, __new__, and __init__ functions so a class author can give input to the metaclass on how the class should be created?
As my use case, I'm using metaclasses to enable automatic registration of classes and their subclasses into PyYAML for loading/saving YAML files. This involves some extra runtime logic not available in PyYAML's stock YAMLObjectMetaClass. In addition, I want to allow class authors to optionally specify the tag/tag-format-template that PyYAML uses to represent the class and/or the function objects to use for construction and representation. I've already figured out that I can't use a subclass of PyYAML's YAMLObjectMetaClass to accomplish this ("because we don't have access to the actual class object in __new__", as my code comment puts it), so I'm writing my own metaclass that wraps PyYAML's registration functions.
Ultimately, I want to do something along the lines of:
from myutil import MyYAMLObjectMetaClass

class MyClass(metaclass=MyYAMLObjectMetaClass):
    __metaclassArgs__ = ()
    __metaclassKargs__ = {"tag": "!MyClass"}
...where __metaclassArgs__ and __metaclassKargs__ would be arguments going to the __prepare__, __new__, and __init__ methods of MyYAMLObjectMetaClass when the MyClass class object is getting created.
Of course, I could use the "reserved attribute names" approach listed in the Python 2.x version of this question, but I know there is a more elegant approach available.
After digging through Python's official documentation, I found that Python 3.x offers a native method of passing arguments to the metaclass, though not without its flaws.
Simply add additional keyword arguments to your class declaration:
class C(metaclass=MyMetaClass, myArg1=1, myArg2=2):
    pass
...and they get passed into your metaclass like so:
class MyMetaClass(type):
    @classmethod
    def __prepare__(metacls, name, bases, **kargs):
        # kargs = {"myArg1": 1, "myArg2": 2}
        return super().__prepare__(name, bases, **kargs)

    def __new__(metacls, name, bases, namespace, **kargs):
        # kargs = {"myArg1": 1, "myArg2": 2}
        # DO NOT send "**kargs" to "type.__new__". It won't catch them and
        # you'll get a "TypeError: type() takes 1 or 3 arguments" exception.
        return super().__new__(metacls, name, bases, namespace)

    def __init__(cls, name, bases, namespace, myArg1=7, **kargs):
        # myArg1 = 1  (an example of capturing a metaclass arg as a named parameter)
        # kargs = {"myArg2": 2}
        super().__init__(name, bases, namespace)
        # DO NOT send "**kargs" to "type.__init__" in Python 3.5 and older. You'll get a
        # "TypeError: type.__init__() takes no keyword arguments" exception.
You have to leave kargs out of the calls to type.__new__ and type.__init__ (in Python 3.5 and older; see "UPDATE" below) or you will get a TypeError due to passing too many arguments. This means that, when passing in metaclass arguments this way, we always have to implement MyMetaClass.__new__ and MyMetaClass.__init__ to keep our custom keyword arguments from reaching the base class type.__new__ and type.__init__ methods. type.__prepare__ seems to handle the extra keyword arguments gracefully (hence why I pass them through in the example, just in case there's some functionality I don't know about that relies on **kargs), so overriding __prepare__ is optional.
UPDATE
In Python 3.6, it appears type was adjusted so that type.__init__ now handles extra keyword arguments gracefully. You'll still need to define __new__ and keep the extra keywords out of the type.__new__ call (otherwise you get a TypeError: __init_subclass__() takes no keyword arguments exception).
Breakdown
In Python 3, you specify a metaclass via keyword argument rather than class attribute:
class MyClass(metaclass=MyMetaClass):
    pass
This statement roughly translates to:
MyClass = metaclass(name, bases, namespace, **kargs)
...where metaclass is the value of the "metaclass" argument you passed in, name is the string name of your class ('MyClass'), bases is the tuple of base classes you passed in (a zero-length tuple () in this case), namespace is the mapping produced by executing the class body (initially created by __prepare__), and kargs is any uncaptured keyword arguments (an empty dict {} in this case).
Breaking this down further, the statement roughly translates to:
namespace = metaclass.__prepare__(name, bases, **kargs)  # `metaclass` passed implicitly since it's a classmethod
MyClass = metaclass.__new__(metaclass, name, bases, namespace, **kargs)
metaclass.__init__(MyClass, name, bases, namespace, **kargs)
...where kargs is always the dict of uncaptured keyword arguments we passed in to the class definition.
Breaking down the example I gave above:
class C(metaclass=MyMetaClass, myArg1=1, myArg2=2):
    pass
...roughly translates to:
namespace = MyMetaClass.__prepare__('C', (), myArg1=1, myArg2=2)
# namespace = {'__module__': '__main__', '__qualname__': 'C'}
C = MyMetaClass.__new__(MyMetaClass, 'C', (), namespace, myArg1=1, myArg2=2)
MyMetaClass.__init__(C, 'C', (), namespace, myArg1=1, myArg2=2)
Most of this information came from Python's Documentation on "Customizing Class Creation".
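The whole flow above can be verified in a few lines; here the metaclass simply records the extra keyword arguments it receives (the `captured` attribute is illustrative, not part of the machinery):

```python
class MyMetaClass(type):
    def __new__(metacls, name, bases, namespace, **kargs):
        # Do NOT forward **kargs to type.__new__ (pre-3.6 this raises).
        cls = super().__new__(metacls, name, bases, namespace)
        cls.captured = kargs  # illustrative: stash what we received
        return cls

    def __init__(cls, name, bases, namespace, **kargs):
        # Swallow **kargs here too so they never reach type.__init__.
        super().__init__(name, bases, namespace)

class C(metaclass=MyMetaClass, myArg1=1, myArg2=2):
    pass

print(C.captured)  # -> {'myArg1': 1, 'myArg2': 2}
```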
Here's a version of the code from my answer to that other question about metaclass arguments, updated so that it works in both Python 2 and 3. It does essentially the same thing as the with_metaclass() function in Benjamin Peterson's six module: it explicitly creates a new base class using the desired metaclass on the fly, whenever needed, thereby avoiding errors due to the metaclass syntax differences between the two versions of Python (the explicit-call way of invoking a metaclass didn't change).
from __future__ import print_function
from pprint import pprint

class MyMetaClass(type):
    def __new__(cls, class_name, parents, attrs):
        if 'meta_args' in attrs:
            meta_args = attrs['meta_args']
            attrs['args'] = meta_args[0]
            attrs['to'] = meta_args[1]
            attrs['eggs'] = meta_args[2]
            del attrs['meta_args']  # clean up
        return type.__new__(cls, class_name, parents, attrs)

# Creates base class on-the-fly using syntax which is valid in both
# Python 2 and 3.
class MyClass(MyMetaClass("NewBaseClass", (object,), {})):
    meta_args = ['spam', 'and', 'eggs']

myobject = MyClass()
pprint(vars(MyClass))
print(myobject.args, myobject.to, myobject.eggs)
Output:
dict_proxy({'to': 'and', '__module__': '__main__', 'args': 'spam',
'eggs': 'eggs', '__doc__': None})
spam and eggs
In Python 3, you specify a metaclass via keyword argument rather than class attribute:
It's worth noting that this style is not backward compatible with Python 2. If you want to support both Python 2 and 3, you should use:
from six import with_metaclass
# or
from future.utils import with_metaclass

class Form(with_metaclass(MyMetaClass, object)):
    pass
Here's the simplest way to pass arguments to a metaclass in Python 3:
Python 3.x
class MyMetaclass(type):
    def __new__(mcs, name, bases, namespace, **kwargs):
        return super().__new__(mcs, name, bases, namespace)

    def __init__(cls, name, bases, namespace, custom_arg='default'):
        super().__init__(name, bases, namespace)
        print('Argument is:', custom_arg)

class ExampleClass(metaclass=MyMetaclass, custom_arg='something'):
    pass
You can also create a base class for metaclasses that only use __init__ with extra arguments:
class ArgMetaclass(type):
    def __new__(mcs, name, bases, namespace, **kwargs):
        return super().__new__(mcs, name, bases, namespace)

class MyMetaclass(ArgMetaclass):
    def __init__(cls, name, bases, namespace, custom_arg='default'):
        super().__init__(name, bases, namespace)
        print('Argument:', custom_arg)

class ExampleClass(metaclass=MyMetaclass, custom_arg='something'):
    pass
Preamble: I have objects; some of them can be created by the default constructor and left unmodified, so such objects can be considered "empty". Sometimes I need to verify whether some object is "empty" or not. It can be done in the following way (the magic methods are implemented in the base class Animal):
>>> a = Bird()
>>> b = Bird()
>>> a == b
True
>>> a == Bird()
True
So the question: is it possible (and if yes then how) to achieve such syntax:
>>> a == Bird.default
True
At least this one (but the previous is sweeter):
>>> a == a.default
True
But: with default implemented in the base class Animal (to avoid repeating it in all derived classes):
class Animal(object):
    ... tech stuff ...
    - obj comparison
    - obj representation
    - etc

class Bird(Animal):
    ... all about birds ...

class Fish(Animal):
    ... all about fishes ...
Of course I don't need solutions that involve calling Bird() inside the Animal class :)
I'd like to have a kind of templating implemented in the base class which will stamp out a default instance of each derived class and store that unique copy in a class or instance property. I think it could be achieved by playing with metaclasses or so, but I don't know how.
A class's default instance can be considered to be any object instantiated by the __init__() of its class (without further modification of the object, of course).
UPDATE
The system is flooded with objects and I just want a way to tell freshly (default-)created objects (which are useless to display, for example) apart from ones that have already been modified somehow. I do it by:
if a == Bird():
    . . .
I don't want to create a new object just for the comparison; intuitively, I'd like to keep one instance as a reference standard for instances of that class to compare against. The objects are JSON-like and contain only properties (besides the implicit __str__, __call__ and __eq__ methods), so I'd like to keep this style of using built-in Python features and avoid explicitly defined methods like is_empty(). It's like entering an object in the interactive shell and having it printed via __str__: implicit, but fun.
To achieve the first solution you should use a metaclass.
For example:
def add_default_meta(name, bases, attrs):
    cls = type(name, bases, attrs)
    cls.default = cls()
    return cls
And use it as (assuming Python 3; in Python 2, set the __metaclass__ attribute in the class body):
class Animal(object, metaclass=add_default_meta):
    # stuff

class NameClass(Animal, metaclass=add_default_meta):
    # stuff
Note that you have to repeat the metaclass=... for every subclass of Animal.
If instead of a function you use a class with a __new__ method to implement the metaclass, it is inherited, i.e.:
class AddDefaultMeta(type):
    def __new__(cls, name, bases, attrs):
        cls = super(AddDefaultMeta, cls).__new__(cls, name, bases, attrs)
        cls.default = cls()
        return cls
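A quick check that the class-based metaclass is indeed inherited by subclasses (using the explicit-call form of class creation so the snippet runs on both Python 2 and 3; binding names like mcs is my own convention here):

```python
class AddDefaultMeta(type):
    def __new__(mcs, name, bases, attrs):
        cls = super(AddDefaultMeta, mcs).__new__(mcs, name, bases, attrs)
        cls.default = cls()  # stamp out the default instance
        return cls

# Equivalent to `__metaclass__ = AddDefaultMeta` (py2)
# or `metaclass=AddDefaultMeta` (py3).
Animal = AddDefaultMeta('Animal', (object,), {})

class Bird(Animal):
    pass  # no metaclass=... needed: it is inherited from Animal

print(isinstance(Bird.default, Bird))  # -> True
```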
A different way to achieve the same effect is to use a class decorator:
def add_default(cls):
    cls.default = cls()
    return cls

@add_default
class Bird(Animal):
    # stuff
Again, you must use the decorator for every subclass.
If you want to achieve the second solution, i.e. to check a == a.default, then you can simply reimplement Animal.__new__:
class Animal(object):
    def __new__(cls, *args, **kwargs):
        if not (args or kwargs) and not hasattr(cls, 'default'):
            cls.default = object.__new__(cls)
            return cls.default
        else:
            return object.__new__(cls)
This creates the empty instance when the first instance of the class is created, and stores it in the default attribute.
This means that you can do both:
a == a.default
and
a == Bird.default
But accessing Bird.default gives AttributeError if you didn't create any Bird instance.
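Exercised directly, this looks as follows. It assumes Animal defines a value-based __eq__, as the question states; the trivial __dict__ comparison below is a stand-in for the question's real implementation:

```python
class Animal(object):
    def __new__(cls, *args, **kwargs):
        if not (args or kwargs) and not hasattr(cls, 'default'):
            cls.default = object.__new__(cls)
            return cls.default
        return object.__new__(cls)

    def __eq__(self, other):
        # Stand-in for the question's existing value comparison.
        return self.__dict__ == other.__dict__

class Bird(Animal):
    pass

a = Bird()  # first instance: becomes (and is stored as) Bird.default
print(a == Bird.default)  # -> True
print(a == a.default)     # -> True
```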
Style note: Bird.Default looks very bad to me. Default is an instance of Bird not a type, hence you should use lowercase_with_underscore according to PEP 8.
In fact, the whole thing looks fishy to me. You could simply have an is_empty() method. It's pretty easy to implement:
class Animal(object):
    def __init__(self, *args, **kwargs):
        # might require more complex condition
        self._is_empty = not (bool(args) or bool(kwargs))

    def is_empty(self):
        return self._is_empty
Then, when a subclass creates an empty instance, no arguments are passed to the base class, so the _is_empty attribute is True and the inherited method returns True accordingly; in the other cases some argument is passed to the base class, which sets _is_empty to False.
You can play around with this in order to obtain a more robust condition that works better with your subclasses.
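For example, with a subclass that forwards only the arguments it was actually given (the Bird signature is illustrative):

```python
class Animal(object):
    def __init__(self, *args, **kwargs):
        # Might require a more complex condition.
        self._is_empty = not (args or kwargs)

    def is_empty(self):
        return self._is_empty

class Bird(Animal):
    def __init__(self, name=None):
        # Forward only the arguments that were actually given,
        # so a default Bird() stays "empty".
        super(Bird, self).__init__(*(() if name is None else (name,)))

print(Bird().is_empty())           # -> True
print(Bird('sparrow').is_empty())  # -> False
```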
Another possible metaclass:
class DefaultType(type):
    def __new__(cls, name, bases, attrs):
        new_cls = super(DefaultType, cls).__new__(cls, name, bases, attrs)
        new_cls.default = new_cls()
        return new_cls
You only need to set the __metaclass__ attribute on the Animal class, as all derived classes will inherit it:
class Animal(object):
    __metaclass__ = DefaultType
    # ...

class Bird(Animal):
    # ...
This allows you to use both:
a == Bird.default
and:
a == a.default
For putting methods of various classes into a global registry, I'm using a decorator with a metaclass. The decorator tags the method, and the metaclass puts the function in the registry:
class ExposedMethod(object):
    def __init__(self, decoratedFunction):
        self._decoratedFunction = decoratedFunction

    def __call__(__self, *__args, **__kw):
        return __self._decoratedFunction(*__args, **__kw)

class ExposedMethodDecoratorMetaclass(type):
    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.iteritems():
            if isinstance(obj, ExposedMethod):
                WorkerFunctionRegistry.addWorkerToWorkerFunction(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class MyClass(object):
    __metaclass__ = ExposedMethodDecoratorMetaclass

    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
I've now come to the point where two function registries are needed. My first thought was to subclass the metaclass and put the other registry in; for that, the __new__ method simply has to be rewritten.
Since rewriting means redundant code, this is not what I really want. So it would be nice if anyone could name a way to put an attribute inside the metaclass which can be read when __new__ is executed. With that, the right registry could be chosen without having to rewrite __new__.
Your ExposedMethod instances do not behave as normal instance methods but rather like static methods -- the fact that you're giving one of them a self argument hints that you're not aware of that. You may need to add a __get__ method to the ExposedMethod class to make it a descriptor, just like function objects are -- see here for more on descriptors.
But there is a much simpler way, since functions can have attributes...:
def ExposedMethod(registry=None):
    def decorate(f):
        f.registry = registry
        return f
    return decorate
and in a class decorator (simpler than a metaclass! requires Python 2.6 or better; in 2.5 or earlier you'll need to stick with the metaclass or explicitly call this after the class statement, though the first part of the answer and the functionality of the code below are still perfectly fine):
def RegisterExposedMethods(cls):
    for name, f in vars(cls).iteritems():
        if not hasattr(f, 'registry'):
            continue
        registry = f.registry
        if registry is None:
            registry = cls.registry
        registry.register(name, cls.__name__)
    return cls
So you can do:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod(WorkerFunctionRegistry)
    def myCoolExposedMethod(self):
        pass
and the like. This is easily extended to allow an exposed method to have several registries, to get the default registry from somewhere other than the class (it could live in the class decorator, for example, if that works better for you), and it avoids getting enmeshed with metaclasses without losing any functionality. Indeed, that's exactly why class decorators were introduced in Python 2.6: they can take the place of 90% or so of practical uses of metaclasses and are much simpler than custom metaclasses.
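A runnable Python 3 sketch of this approach; StubRegistry and its register method are stand-ins, since the question doesn't show the registry's real API (and items() replaces iteritems()):

```python
class StubRegistry(object):
    """Stand-in for the question's WorkerFunctionRegistry."""
    def __init__(self):
        self.entries = []

    def register(self, method_name, class_name):
        self.entries.append((method_name, class_name))

worker_registry = StubRegistry()

def ExposedMethod(registry=None):
    def decorate(f):
        f.registry = registry  # just tag the function
        return f
    return decorate

def RegisterExposedMethods(cls):
    for name, f in vars(cls).items():  # iteritems() in Python 2
        if not hasattr(f, 'registry'):
            continue
        registry = f.registry
        if registry is None:
            registry = cls.registry
        registry.register(name, cls.__name__)
    return cls

@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod(worker_registry)
    def myCoolExposedMethod(self):
        pass

print(worker_registry.entries)  # -> [('myCoolExposedMethod', 'MyClass')]
```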
You can use a class attribute to point to the registry you want to use in the specialized metaclasses, e.g. :
class ExposedMethodDecoratorMetaclassBase(type):
    registry = None

    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.items():
            if isinstance(obj, ExposedMethod):
                mcs.registry.register(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class WorkerExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = WorkerFunctionRegistry

class RetiredExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = RetiredFunctionRegistry
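A self-contained check of that pattern with a stub registry standing in for the real ones (the explicit metaclass call replaces the py2 __metaclass__ attribute so the snippet runs on Python 3):

```python
class StubRegistry(object):
    """Stand-in for the real registries."""
    def __init__(self):
        self.entries = []

    def register(self, method_name, class_name):
        self.entries.append((method_name, class_name))

WorkerFunctionRegistry = StubRegistry()

class ExposedMethod(object):
    def __init__(self, fn):
        self.fn = fn

class ExposedMethodDecoratorMetaclassBase(type):
    registry = None

    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.items():
            if isinstance(obj, ExposedMethod):
                # mcs is the concrete (sub)metaclass, so the right
                # registry attribute is looked up.
                mcs.registry.register(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class WorkerExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = WorkerFunctionRegistry

WorkerClass = WorkerExposedMethodDecoratorMetaclass(
    'WorkerClass', (object,), {'work': ExposedMethod(lambda self: None)})

print(WorkerFunctionRegistry.entries)  # -> [('work', 'WorkerClass')]
```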
Thank you both for your answers. Both helped a lot in finding a proper way to handle my request.
My final solution to the problem is the following:
def ExposedMethod(decoratedFunction):
    decoratedFunction.isExposed = True
    return decoratedFunction

class RegisterExposedMethods(object):
    def __init__(self, decoratedClass, registry):
        self._decoratedClass = decoratedClass
        for name, f in vars(self._decoratedClass).iteritems():
            if hasattr(f, "isExposed"):
                registry.addComponentClassToComponentFunction(name, self._decoratedClass.__name__)
        # cloak us as the original class
        self.__class__.__name__ = decoratedClass.__name__

    def __call__(self, *__args, **__kw):
        return self._decoratedClass(*__args, **__kw)

    def __getattr__(self, name):
        return getattr(self._decoratedClass, name)
On a Class I wish to expose methods from I do the following:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
The class decorator is now very easy to be subclassed. Here is an example:
class DiscoveryRegisterExposedMethods(RegisterExposedMethods):
    def __init__(self, decoratedClass):
        RegisterExposedMethods.__init__(self,
                                        decoratedClass,
                                        DiscoveryFunctionRegistry())
With that the comment of Alex
Your ExposedMethod instances do not behave as normal instance methods ...
is no longer true, since the method is simply tagged and not wrapped.
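For completeness, a Python 3 adaptation of the final solution (items() instead of iteritems(); the StubRegistry and the manual wrapping call are stand-ins, since the real registry API and the registry-binding subclass aren't reproduced here):

```python
def ExposedMethod(decoratedFunction):
    decoratedFunction.isExposed = True  # just tag, don't wrap
    return decoratedFunction

class StubRegistry(object):
    """Stand-in for the real function registry."""
    def __init__(self):
        self.entries = []

    def addComponentClassToComponentFunction(self, name, class_name):
        self.entries.append((name, class_name))

registry = StubRegistry()

class RegisterExposedMethods(object):
    def __init__(self, decoratedClass, registry):
        self._decoratedClass = decoratedClass
        for name, f in vars(decoratedClass).items():
            if hasattr(f, 'isExposed'):
                registry.addComponentClassToComponentFunction(
                    name, decoratedClass.__name__)
        # cloak us as the original class
        self.__class__.__name__ = decoratedClass.__name__

    def __call__(self, *args, **kw):
        return self._decoratedClass(*args, **kw)

    def __getattr__(self, name):
        return getattr(self._decoratedClass, name)

class MyClass(object):
    @ExposedMethod
    def myCoolExposedMethod(self):
        pass

# Applied explicitly here; the original binds the registry via a subclass.
MyClass = RegisterExposedMethods(MyClass, registry)
print(registry.entries)  # -> [('myCoolExposedMethod', 'MyClass')]
```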