How to get the class name of a staticmethod - python

So there is a static method in the base class, and subclasses should use it.
However, I need to know which subclass calls the static method.
The code is like this:
class BaseClass():
    @staticmethod
    def getname():
        # some magic
        pass

class SubClassA(BaseClass):
    pass

class SubClassB(BaseClass):
    pass

SubClassA.getname()  # hope to see 'SubClassA'
SubClassB.getname()  # hope to see 'SubClassB'
Or, is this even possible?

Not possible with a staticmethod. It is possible, however, with a classmethod:
class A(object):
    @classmethod
    def f(cls):
        print cls
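Applied to the hierarchy from the question, a minimal sketch (Python 2 print syntax, like the rest of the examples here) would be:
class BaseClass(object):
    @classmethod
    def getname(cls):
        # cls is whichever class the method was called on
        return cls.__name__

class SubClassA(BaseClass):
    pass

class SubClassB(BaseClass):
    pass

print SubClassA.getname()  # 'SubClassA'
print SubClassB.getname()  # 'SubClassB'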

By definition, a staticmethod is not provided with a reference to either an invoking class or instance -- that's what "static" means in Python.
You can do some hacky things to achieve this with either metaclasses or decorators.
Note that I'm using your example of just getting the class's name, but you can adapt this to any kind of function you want to expose as a static method. You would define that function inside getnameable below, in place of getname, and make use of the some_class argument in whatever way you need (and you'd use a different name than all the "getname" stuff I use here).
With decorators, you could do this:
def getnameable(some_class):
    # Note: this could be any kind of static method that you want, not
    # just one for getting the name. And it can use `some_class` in
    # whatever way is needed.
    def getname():
        return some_class.__name__
    some_class.getname = staticmethod(getname)
    return some_class
then this works:
In [334]: @getnameable
   .....: class SubClassA(BaseClass):
   .....:     pass
   .....:

In [335]: SubClassA.getname()
Out[335]: 'SubClassA'
but note that if you implemented this in BaseClass directly, then the class name which gets bound would be BaseClass, even in the children classes. So in this case, you'd need to put the decorator on every class you wanted.
Metaclasses offer a way around that, by making this business of decorating the class part of class creation (not instance creation, mind you).
class GetNameableMeta(type):
    def __new__(cls, name, bases, attrs):
        temp_class = super(GetNameableMeta, cls).__new__(cls, name, bases, attrs)
        return getnameable(temp_class)

class BaseClass(object):
    __metaclass__ = GetNameableMeta

class SubClassA(BaseClass):
    pass
Testing it out:
In [338]: SubClassA.getname()
Out[338]: 'SubClassA'
In [339]: BaseClass.getname()
Out[339]: 'BaseClass'
Notice how much high-falutin code we needed to write to do this, when we had several reasonable alternatives:
Just ask directly for the __name__ attribute.
Make it a classmethod to begin with.
Use the decorators just where we need this and give up the inheritance part.
I suspect that in "the essence of Python" any of these is better than the machinery I describe above, since it makes for simpler, easier to understand code.
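The first of those alternatives, for instance, needs no machinery at all:
    print SubClassA.__name__   # 'SubClassA'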

Related

Using a base class function that takes parameters as a decorator for derived class function

I feel like I have a pretty good grasp on using decorators when dealing with regular functions, but between using methods of base classes for decorators in derived classes, and passing parameters to said decorators, I cannot figure out what to do next.
Here is a snippet of code.
class ValidatedObject:
    ...
    def apply_validation(self, field_name, code):
        def wrap(self, f):
            self._validations.append(Validation(field_name, code, f))
            return f
        return wrap

class test(ValidatedObject):
    ....
    @apply_validation("_name", "oh no!")
    def name_validation(self, name):
        return name == "jacob"
If I try this as is, I get an error that "apply_validation" is not found.
If I try it with @self.apply_validation, I get an error that "self" isn't found.
I've also been messing around with making apply_validation a class method without success.
Would someone please explain what I'm doing wrong, and the best way to fix this? Thank you.
The issue you're having is that apply_validation is a method, which means you need to call it on an instance of ValidatedObject. Unfortunately, at the time it is being called (during the definition of the test class), there is no appropriate instance available. You need a different approach.
The most obvious one is to use a metaclass that searches through its instance dictionaries (which are really class dictionaries) and sets up the _validations variable based on what it finds. You can still use a decorator, but it probably should be a global function, or perhaps a static method, and it will need to work differently. Here's some code that uses a metaclass and a decorator that adds function attributes:
class ValidatedMeta(type):
    def __new__(meta, name, bases, dct):
        validations = [Validation(f._validation_field_name, f._validation_code, f)
                       for f in dct.values() if hasattr(f, "_validation_field_name")]
        dct["_validations"] = validations
        return super(ValidatedMeta, meta).__new__(meta, name, bases, dct)

def apply_validation(field_name, code):
    def decorator(f):
        f._validation_field_name = field_name
        f._validation_code = code
        return f
    return decorator

class ValidatedObject(metaclass=ValidatedMeta):
    pass

class test(ValidatedObject):
    @apply_validation("_name", "oh no!")
    def name_validation(self, name):
        return name == "jacob"
After this code runs, test._validations will be [Validation("_name", "oh no!", test.name_validation)]. Note that the method that is passed to Validation is unbound, so you'll need to pass it a self argument yourself when you call it (or perhaps drop the self argument and change the decorator created in apply_validation to return staticmethod(f)).
This code may not do what you want if you have validation methods defined at several levels of an inheritance hierarchy. The metaclass as written above only checks the immediate class's dict for methods with the appropriate attributes. If you need it to include inherited methods in _validations too, you may need to modify the logic in ValidatedMeta.__new__. Probably the easiest way to go is to look for _validations attributes in the bases and concatenate the lists together.
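A minimal sketch of that extension (not from the original answer; it simply concatenates any _validations lists found on the bases):
class ValidatedMeta(type):
    def __new__(meta, name, bases, dct):
        own = [Validation(f._validation_field_name, f._validation_code, f)
               for f in dct.values() if hasattr(f, "_validation_field_name")]
        inherited = []
        for base in bases:
            # Pick up validations declared on (or inherited by) each base class.
            inherited.extend(getattr(base, "_validations", []))
        dct["_validations"] = inherited + own
        return super(ValidatedMeta, meta).__new__(meta, name, bases, dct)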
Just an example of using a decorator on a class method:
from functools import wraps

def VALIDATE(dec):
    @wraps(dec)
    def _apply_validation(self, name):
        self.validate(name)
        return dec(self, name)
    return _apply_validation

class A:
    def validate(self, name):
        if name != "aamir":
            raise Exception, 'Invalid name "%s"' % name

class B(A):
    @VALIDATE
    def name_validation(self, name):
        return name

b = B()
b.name_validation('jacob')  # should raise exception

Is there a way to apply a decorator to a Python method that needs information about the class?

When you decorate a method, it is not yet bound to the class, and therefore doesn't have the im_class attribute yet. I am looking for a way to get the information about the class inside the decorator. I tried this:
import types

def decorator(method):
    def set_signal(self, name, value):
        print name
        if name == 'im_class':
            print "I got the class"
    method.__setattr__ = types.MethodType(set_signal, method)
    return method

class Test(object):
    @decorator
    def bar(self, foo):
        print foo
But it doesn't print anything.
I can imagine doing this:
class Test(object):
    @decorator(klass=Test)
    def bar(self, foo):
        print foo
But if I can avoid it, it would make my day.
__setattr__ is only called on explicit object.attribute = assignments; building a class does not use attribute assignment but builds a dictionary (Test.__dict__) instead.
To access the class you have a few different options though:
Use a class decorator instead; it'll be passed the completed class after it has been built, and you can decorate individual methods on that class by replacing them (decorated) in the class. You can use a combination of a function decorator and a class decorator to mark which methods are to be decorated:
def methoddecoratormarker(func):
    func._decorate_me = True
    return func

def realmethoddecorator(func):
    # do something with func.
    # Note: it is still an unbound function here, not a method!
    return func

def classdecorator(klass):
    for name, item in klass.__dict__.items():
        if getattr(item, '_decorate_me', False):
            setattr(klass, name, realmethoddecorator(item))
    return klass
You could use a metaclass instead of a class decorator to achieve the same, of course.
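A rough sketch of that metaclass variant (the DecoratingMeta name is made up here; it reuses realmethoddecorator from above):
class DecoratingMeta(type):
    def __new__(mcs, name, bases, attrs):
        # Wrap every marked function before the class object is created.
        for key, value in attrs.items():
            if getattr(value, '_decorate_me', False):
                attrs[key] = realmethoddecorator(value)
        return super(DecoratingMeta, mcs).__new__(mcs, name, bases, attrs)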
Cheat, and use sys._getframe() to retrieve the class from the calling frame:
import sys

def methoddecorator(func):
    callingframe = sys._getframe(1)
    classname = callingframe.f_code.co_name
    return func  # hand the function back unchanged
Note that all you can retrieve is the name of the class; the class itself is still being built at this time. You can add items to callingframe.f_locals (a mapping) and they'll be made part of the new class object.
Access self whenever the method is called. self is a reference to the instance after all, and self.__class__ is going to be, at the very least, a sub-class of the original class the function was defined in.
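A minimal sketch of that last option (the decorator name is made up; the class is simply read off self at call time):
def self_checking_decorator(func):
    def wrapper(self, *args, **kw):
        print self.__class__  # the actual class (or subclass) of the instance
        return func(self, *args, **kw)
    return wrapper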
My strict answer would be: It's not possible, because the class does not yet exist when the decorator is executed.
The longer answer would depend on your exact requirements. As I wrote, you cannot access the class if it does not yet exist. One solution would be to mark the decorated method to be "transformed" later, and then use a metaclass or class decorator to apply your modifications after the class has been created.
Another option involves some magic. Look at the implementation of the implements function in zope.interface. It has some access to information about the class that is currently being parsed. I don't know if that will be enough for your use case.
You might want to take a look at descriptors. They let you implement a __get__ that is used when an attribute is accessed, and can return different things depending on the object and its type.
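For instance, a minimal descriptor sketch (ClassAwareMethod is a hypothetical name, not an existing API):
class ClassAwareMethod(object):
    def __init__(self, func):
        self.func = func
    def __get__(self, obj, objtype=None):
        # objtype is the class the attribute was looked up on.
        print objtype
        return lambda *args, **kw: self.func(obj, *args, **kw)

class Test(object):
    @ClassAwareMethod
    def bar(self, foo):
        print foo

Test().bar(42)  # prints the class, then 42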
Use method decorators to add some marker attributes to the interesting methods, and use a metaclass which iterates over the methods, finds the marker attributes, and does the logic. The metaclass code is run when the class is created, so it has a reference to the newly created class.
class MyMeta(type):
    def __new__(...):
        ...
        cls = ...
        # ... iterate over dir(cls), find methods having .is_decorated, act on them
        return cls

def decorator(f):
    f.is_decorated = True
    return f

class MyBase(object):
    __metaclass__ = MyMeta

class MyClass(MyBase):
    @decorator
    def bar(self, foo):
        print foo
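Filled in, that sketch might look roughly like this (illustrative only; "act on them" is reduced to a print):
class MyMeta(type):
    def __new__(mcs, name, bases, attrs):
        cls = super(MyMeta, mcs).__new__(mcs, name, bases, attrs)
        for attr_name in dir(cls):
            method = getattr(cls, attr_name)
            if getattr(method, 'is_decorated', False):
                # The newly created class is available here, so "acting" on the
                # marked method can use it (registration, wrapping, etc.).
                print "found decorated method %s on %s" % (attr_name, cls.__name__)
        return cls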
If you worry that the programmer of MyClass forgets to use MyBase, you can forcibly set the metaclass in decorator, by examining the globals dictionary of the caller stack frame (sys._getframe()).

In Python can one implement mixin behavior without using inheritance?

Is there a reasonable way in Python to implement mixin behavior similar to that found in Ruby -- that is, without using inheritance?
class Mixin(object):
    def b(self): print "b()"
    def c(self): print "c()"

class Foo(object):
    # Somehow mix in the behavior of the Mixin class,
    # so that all of the methods below will run and
    # the issubclass() test will be False.
    def a(self): print "a()"

f = Foo()
f.a()
f.b()
f.c()
print issubclass(Foo, Mixin)
I had a vague idea to do this with a class decorator, but my attempts led to confusion. Most of my searches on the topic have led in the direction of using inheritance (or in more complex scenarios, multiple inheritance) to achieve mixin behavior.
def mixer(*args):
    """Decorator for mixing mixins"""
    def inner(cls):
        for a, k in ((a, k) for a in args for k, v in vars(a).items() if callable(v)):
            setattr(cls, k, getattr(a, k).im_func)
        return cls
    return inner

class Mixin(object):
    def b(self): print "b()"
    def c(self): print "c()"

class Mixin2(object):
    def d(self): print "d()"
    def e(self): print "e()"

@mixer(Mixin, Mixin2)
class Foo(object):
    # Somehow mix in the behavior of the Mixin class,
    # so that all of the methods below will run and
    # the issubclass() test will be False.
    def a(self): print "a()"

f = Foo()
f.a()
f.b()
f.c()
f.d()
f.e()
print issubclass(Foo, Mixin)
output:
a()
b()
c()
d()
e()
False
You can add the methods as functions:
Foo.b = Mixin.b.im_func
Foo.c = Mixin.c.im_func
I am not that familiar with Python, but from what I know about Python metaprogramming, you could actually do it pretty much the same way it is done in Ruby.
In Ruby, a module basically consists of two things: a pointer to a method dictionary and a pointer to a constant dictionary. A class consists of three things: a pointer to a method dictionary, a pointer to a constant dictionary and a pointer to the superclass.
When you mix in a module M into a class C, the following happens:
an anonymous class α is created (this is called an include class)
α's method dictionary and constant dictionary pointers are set equal to M's
α's superclass pointer is set equal to C's
C's superclass pointer is set to α
In other words: a fake class which shares its behavior with the mixin is injected into the inheritance hierarchy. So, Ruby actually does use inheritance for mixin composition.
I left out a couple of subtleties above: first off, the module doesn't actually get inserted as C's superclass, it gets inserted as C's superclasses' (which is C's singleton class) superclass. And secondly, if the mixin itself has mixed in other mixins, then those also get wrapped into fake classes which get inserted directly above α, and this process is applied recursively, in case the mixed in mixins in turn have mixins.
Basically, the whole mixin hierarchy gets flattened into a straight line and spliced into the inheritance chain.
AFAIK, Python actually allows you to change a class's superclass(es) after the fact (something which Ruby does not allow you to do), and it also gives you access to a class's dict (again, something that is impossible in Ruby), so you should be able to implement this yourself.
EDIT: Fixed what could (and probably should) be construed as a bug. Now it builds a new dict and then updates that from the class's dict. This prevents mixins from overwriting methods that are defined directly on the class. The code is still untested but should work. I'm busy ATM so I'll test it later. It worked fine except for a syntax error. In retrospect, I decided that I don't like it (even after my further improvements) and much prefer my other solution even if it is more complicated. The test code for that one applies here as well but I wont duplicate it.
You could use a metaclass factory:
import inspect

def add_mixins(*mixins):
    Dummy = type('Dummy', mixins, {})
    d = {}
    for mixin in reversed(inspect.getmro(Dummy)):
        d.update(mixin.__dict__)

    class WithMixins(type):
        def __new__(meta, classname, bases, classdict):
            d.update(classdict)
            return super(WithMixins, meta).__new__(meta, classname, bases, d)

    return WithMixins
then use it like:
class Foo(object):
    __metaclass__ = add_mixins(Mixin1, Mixin2)
    # rest of the stuff
This one is based on the way it's done in Ruby, as explained by Jörg W Mittag. All of the wall of code after if __name__ == '__main__' is test/demo code. There's actually only 13 lines of real code to it.
import inspect

def add_mixins(*mixins):
    Dummy = type('Dummy', mixins, {})
    d = {}
    # Now get all the class attributes. Use reversed so that conflicts
    # are resolved with the proper priority. This rules out the possibility
    # of the mixins calling methods from their base classes that get overridden
    # using super but is necessary for the subclass check to fail. If that wasn't a
    # requirement, we would just use Dummy above (or use MI directly and
    # forget all the metaclass stuff).
    for base in reversed(inspect.getmro(Dummy)):
        d.update(base.__dict__)

    # Create the mixin class. This should be equivalent to creating the
    # anonymous class in Ruby.
    Mixin = type('Mixin', (object,), d)

    class WithMixins(type):
        def __new__(meta, classname, bases, classdict):
            # The check below prevents an inheritance cycle from forming which
            # leads to a TypeError when trying to inherit from the resulting
            # class.
            if not any(issubclass(base, Mixin) for base in bases):
                # This should be the equivalent of setting the superclass
                # pointers in Ruby.
                bases = (Mixin,) + bases
            return super(WithMixins, meta).__new__(meta, classname, bases,
                                                   classdict)

    return WithMixins
if __name__ == '__main__':

    class Mixin1(object):
        def b(self): print "b()"
        def c(self): print "c()"

    class Mixin2(object):
        def d(self): print "d()"
        def e(self): print "e()"

    class Mixin3Base(object):
        def f(self): print "f()"

    class Mixin3(Mixin3Base): pass

    class Foo(object):
        __metaclass__ = add_mixins(Mixin1, Mixin2, Mixin3)
        def a(self): print "a()"

    class Bar(Foo):
        def f(self): print "Bar.f()"

    def test_class(cls):
        print "Testing {0}".format(cls.__name__)
        f = cls()
        f.a()
        f.b()
        f.c()
        f.d()
        f.e()
        f.f()
        print (issubclass(cls, Mixin1) or
               issubclass(cls, Mixin2) or
               issubclass(cls, Mixin3))

    test_class(Foo)
    test_class(Bar)
You could decorate the class's __getattr__ to check in the mixin. The problem is that all methods of the mixin would always require an object of the type of the mixin as their first parameter, so you would have to decorate __init__ as well to create a mixin object. I believe you could achieve this using a class decorator.
from functools import partial

class Mixin(object):
    @staticmethod
    def b(self): print "b()"
    @staticmethod
    def c(self): print "c()"

class Foo(object):
    def __init__(self, mixin_cls):
        self.delegate_cls = mixin_cls

    def __getattr__(self, attr):
        if hasattr(self.delegate_cls, attr):
            return partial(getattr(self.delegate_cls, attr), self)

    def a(self): print "a()"

f = Foo(Mixin)
f.a()
f.b()
f.c()
print issubclass(Foo, Mixin)
This basically uses the Mixin class as a container for ad-hoc functions (not methods) that behave like methods by taking an object instance (self) as the first argument. __getattr__ will redirect missing calls to these method-like functions.
This passes your simple tests as shown below, but I cannot guarantee it will do all the things you want. Run more thorough tests to make sure.
$ python mixin.py
a()
b()
c()
False
Composition? It seems like that would be the simplest way to handle this: either wrap your object in a decorator or just import the methods as an object into your class definition itself. This is what I usually do: put the methods that I want to share between classes in a file and then import the file. If I want to override some behavior I import a modified file with the same method names as the same object name. It's a little sloppy, but it works.
For example, if I want the init_covers behavior from this file (bedg.py)
import cove as cov

def init_covers(n):
    n.covers.append(cov.Cover((set([n.id]))))
    id_list = []
    for a in n.neighbors:
        id_list.append(a.id)
    n.covers.append(cov.Cover((set(id_list))))

def update_degree(n):
    for a in n.covers:
        a.degree = 0
        for b in n.covers:
            if a != b:
                a.degree += len(a.node_list.intersection(b.node_list))
In my bar class file I would do: import bedg as foo
and then if I want to change my foo behaviors in another class that inherited bar, I write
import bild as foo
Like I say, it is sloppy.

python: subclass a metaclass

For putting methods of various classes into a global registry I'm using a decorator with a metaclass. The decorator tags the function, and the metaclass puts it in the registry:
class ExposedMethod(object):
    def __init__(self, decoratedFunction):
        self._decoratedFunction = decoratedFunction

    def __call__(__self, *__args, **__kw):
        return __self._decoratedFunction(*__args, **__kw)

class ExposedMethodDecoratorMetaclass(type):
    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.iteritems():
            if isinstance(obj, ExposedMethod):
                WorkerFunctionRegistry.addWorkerToWorkerFunction(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class MyClass(object):
    __metaclass__ = ExposedMethodDecoratorMetaclass

    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
I've now come to the point where two function registries are needed. The first thought was to subclass the metaclass and put the other registry in. For that, the __new__ method simply has to be rewritten.
Since rewriting means redundant code, this is not what I really want. So it would be nice if anyone could name a way to put an attribute inside the metaclass which can be read when __new__ is executed. With that, the right registry could be put in without having to rewrite __new__.
Your ExposedMethod instances do not behave as normal instance methods but rather like static methods -- the fact that you're giving one of them a self argument hints that you're not aware of that. You may need to add a __get__ method to the ExposedMethod class to make it a descriptor, just like function objects are -- see here for more on descriptors.
But there is a much simpler way, since functions can have attributes...:
def ExposedMethod(registry=None):
    def decorate(f):
        f.registry = registry
        return f
    return decorate
and in a class decorator (simpler than a metaclass! requires Python 2.6 or better -- in 2.5 or earlier you'll need to stick w/the metaclass or explicitly call this after the class statement, though the first part of the answer and the functionality of the code below are still perfectly fine):
def RegisterExposedMethods(cls):
    for name, f in vars(cls).iteritems():
        if not hasattr(f, 'registry'):
            continue
        registry = f.registry
        if registry is None:
            registry = cls.registry
        registry.register(name, cls.__name__)
    return cls
So you can do:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod(WorkerFunctionRegistry)
    def myCoolExposedMethod(self):
        pass
and the like. This is easily extended to allowing an exposed method to have several registries, get the default registry elsewhere than from the class (it could be in the class decorator, for example, if that works better for you) and avoids getting enmeshed with metaclasses without losing any functionality. Indeed that's exactly why class decorators were introduced in Python 2.6: they can take the place of 90% or so of practical uses of metaclasses and are much simpler than custom metaclasses.
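One way that extension could look (purely illustrative, not part of the answer above): let the decorator accept any number of registries and have the class decorator register with each of them.
def ExposedMethod(*registries):
    def decorate(f):
        # Remember every registry this method should end up in.
        f.registries = registries
        return f
    return decorate

def RegisterExposedMethods(cls):
    for name, f in vars(cls).iteritems():
        for registry in getattr(f, 'registries', ()):
            registry.register(name, cls.__name__)
    return cls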
You can use a class attribute to point to the registry you want to use in the specialized metaclasses, e.g. :
class ExposedMethodDecoratorMetaclassBase(type):
    registry = None

    def __new__(mcs, name, bases, dct):
        for obj_name, obj in dct.items():
            if isinstance(obj, ExposedMethod):
                mcs.registry.register(obj_name, name)
        return type.__new__(mcs, name, bases, dct)

class WorkerExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = WorkerFunctionRegistry

class RetiredExposedMethodDecoratorMetaclass(ExposedMethodDecoratorMetaclassBase):
    registry = RetiredFunctionRegistry
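Usage would then look roughly like this (a sketch, assuming the ExposedMethod descriptor class from the question and registries with a register method):
class MyWorkerClass(object):
    __metaclass__ = WorkerExposedMethodDecoratorMetaclass

    @ExposedMethod
    def myCoolExposedMethod(self):
        pass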
Thank you both for your answers. Both helped a lot in finding a proper way to handle my request.
My final solution to the problem is the following:
def ExposedMethod(decoratedFunction):
    decoratedFunction.isExposed = True
    return decoratedFunction

class RegisterExposedMethods(object):
    def __init__(self, decoratedClass, registry):
        self._decoratedClass = decoratedClass
        for name, f in vars(self._decoratedClass).iteritems():
            if hasattr(f, "isExposed"):
                registry.addComponentClassToComponentFunction(name, self._decoratedClass.__name__)
        # cloak us as the original class
        self.__class__.__name__ = decoratedClass.__name__

    def __call__(self, *__args, **__kw):
        return self._decoratedClass(*__args, **__kw)

    def __getattr__(self, name):
        return getattr(self._decoratedClass, name)
On a class whose methods I wish to expose, I do the following:
@RegisterExposedMethods
class MyClass(object):
    @ExposedMethod
    def myCoolExposedMethod(self):
        pass
The class decorator is now very easy to be subclassed. Here is an example:
class DiscoveryRegisterExposedMethods(RegisterExposedMethods):
    def __init__(self, decoratedClass):
        RegisterExposedMethods.__init__(self,
                                        decoratedClass,
                                        DiscoveryFunctionRegistry())
With that the comment of Alex
Your ExposedMethod instances do not behave as normal instance methods ...
is no longer true, since the method is simply tagged and not wrapped.

Python: Class factory using user input as class names

I want to add class attributes to a superclass dynamically. Furthermore, I want to create classes that inherit from this superclass dynamically, and the names of those subclasses should depend on user input.
There is a superclass "Unit", to which I can add attributes at runtime. This already works.
def add_attr(cls, name, value):
    setattr(cls, name, value)

class Unit(object):
    pass

class Archer(Unit):
    pass

myArcher = Archer()
add_attr(Unit, 'strength', 5)
print "Strength of myArcher: " + str(myArcher.strength)
Unit.strength = 2
print "Strength of myArcher: " + str(myArcher.strength)
This leads to the desired output:
Strength of myArcher: 5
Strength of myArcher: 2
But now I don't want to predefine the subclass Archer, but I'd rather let the user decide how to call this subclass. I've tried something like this:
class Meta(type, subclassname):
    def __new__(cls, subclassname, bases, dct):
        return type.__new__(cls, subclassname, Unit, dct)

factory = Meta()
factory.__new__("Soldier")
but no luck. I guess I haven't really understood what __new__ does here.
What I want as a result here is
class Soldier(Unit):
    pass
being created by the factory. And if I call the factory with the argument "Knight", I'd like a class Knight, subclass of Unit, to be created.
Any ideas? Many thanks in advance!
Bye
-Sano
To create a class from a name, use the class statement and assign the name. Observe:
def meta(name):
    class cls(Unit):
        pass
    cls.__name__ = name
    return cls
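For example (a quick sketch of how this factory would be used with the question's Unit class):
Soldier = meta("Soldier")
print Soldier.__name__           # Soldier
print issubclass(Soldier, Unit)  # True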
Now I suppose I should explain myself, and so on. When you create a class using the class statement, it is done dynamically; it is equivalent to calling type().
For example, the following two snippets do the same thing:
class X(object): pass
X = type("X", (object,), {})
The name of a class-- the first argument to type-- is assigned to __name__, and that's basically the end of that (the only time __name__ is itself used is probably in the default __repr__() implementation). To create a class with a dynamic name, you can in fact call type like so, or you can just change the class name afterward. The class syntax exists for a reason, though-- it's convenient, and it's easy to add to and change things later. If you wanted to add methods, for example, it would be
class X(object):
    def foo(self): print "foo"

def foo(self): print "foo"
X = type("X", (object,), {'foo': foo})
and so on. So I would advise using the class statement-- if you had known you could do so from the beginning, you likely would have done so. Dealing with type and so on is a mess.
(You should not, by the way, call type.__new__() by hand, only type())
Have a look at the type() builtin function.
knight_class = type('Knight', (Unit,), {})
First parameter: Name of new class
Second parameter: Tuple of parent classes
Third parameter: dictionary of class attributes.
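Tying it to user input, as the question asks, might look like this (a sketch using the Python 2 raw_input built-in):
name = raw_input("Name of the new unit class: ")
new_class = type(name, (Unit,), {})
print new_class.__name__            # whatever the user typed
print issubclass(new_class, Unit)   # True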
But in your case, if the subclasses don't implement a different behaviour, maybe giving the Unit class a name attribute is sufficient.
