In Python 3, if __new__ returns a value that is not an instance of cls, the __init__ method is never called. So I can, for example, do this:
class Foo:
    @staticmethod
    def bar(n):
        return n * 5

    def __new__(cls, n):
        return Foo.bar(n)

print(Foo(3))  # => 15
I was under the impression that the order was __call__ (if it's an instance) -> __new__ -> __init__.
However, in Python 2, this seems to raise a TypeError: this constructor takes no arguments due to the lack of an __init__. I can fix that by inheriting from object. So, running this:
class Foo:
    def __new__(cls, *args, **kwargs):
        print("new called")

    def __init__(self, *args, **kwargs):
        print("init called")

Foo()
"""
Python2: "init called"
Python3: "new called"
"""
In Python 2, I even messed around with metaclasses.
Meta = type("Meta", (type,), dict(__call__=lambda self, x: x * 5))

class Foo(object):
    __metaclass__ = Meta

print(Foo(4))  # => 20
But this does not work in Python 3 because the init/new methods seem to be reversed.
Is there any Python2/3 compatible way of doing this?
Solution:
This is the way I did it. I don't like it, but it works:
class Foo(object):
    @staticmethod
    def __call__(i):
        return i * 5

    def __new__(cls, i):
        return Foo.__call__(i)
Surely there is a more pythonic way of doing this.
In Python 2, __new__ is only honoured for new-style classes, so you need to define your class as class Foo(object). Then your first example will work in both Python 2 and Python 3.
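For instance, here is a minimal sketch of the question's first example rewritten as a new-style class (the only assumption is that the multiply-by-five behaviour from the question is all you need); it behaves the same on Python 2 and Python 3:

class Foo(object):  # inheriting from object makes this a new-style class in Python 2
    @staticmethod
    def bar(n):
        return n * 5

    def __new__(cls, n):
        # returning something that is not an instance of cls means __init__ is never called
        return Foo.bar(n)

print(Foo(3))  # => 15 on both Python 2 and Python 3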
Related
I am trying to find a good way of returning a (new) class object from a class method, in a way that can be extended as well.
I have a class (classA) which has, among other methods, a method that returns a new classA object after some processing:
class classA:
    def __init__(self, params): ...

    def methodX(self, **kwargs):
        # process data
        return classA(new_params)
Now, I am extending this class to another classB. I need methodX to do the same, but return classB this time, instead of classA
class classB(classA):
    def __init__(self, params):
        super().__init__(params)
        self.newParams = XYZ

    def methodX(self, **kwargs):
        ???
This may be something trivial but I simply cannot figure it out. In the end I don't want to rewrite methodX each time the class gets extended.
Thank you for your time.
Use the __class__ attribute like this:
class A:
    def __init__(self, **kwargs):
        self.kwargs = kwargs

    def methodX(self, **kwargs):
        # do stuff with kwargs
        return self.__class__(**kwargs)

    def __repr__(self):
        return f'{self.__class__}({self.kwargs})'

class B(A):
    pass

a = A(foo='bar')
ax = a.methodX(gee='whiz')
b = B(yee='haw')
bx = b.methodX(cool='beans')

print(a)
print(ax)
print(b)
print(bx)
Another option is to call type(self), which returns the class of the concrete instance:

class classA:
    def __init__(self, x):
        self.x = x

    def createNew(self, y):
        t = type(self)
        return t(y)

class classB(classA):
    def __init__(self, params):
        super().__init__(params)

a = classA(1)
newA = a.createNew(2)
b = classB(1)
newB = b.createNew(2)
print(type(newB))
# <class '__main__.classB'>
I want to propose what I think is the cleanest approach, albeit similar to existing answers. The problem feels like a good fit for a class method:
class A:
    @classmethod
    def method_x(cls, **kwargs):
        return cls(<init params>)
Using the @classmethod decorator ensures that the first input (traditionally named cls) will refer to the class to which the method belongs, rather than the instance.
(usually we call the first method input self and this refers to the instance to which the method belongs)
Because cls refers to A, rather than an instance of A, we can call cls() as we would call A().
However, in a class that inherits from A, cls will instead refer to the child class, as required:
class A:
    def __init__(self, x):
        self.x = x

    @classmethod
    def make_new(cls, **kwargs):
        y = kwargs["y"]
        return cls(y)  # returns A(y) here

class B(A):
    def __init__(self, x):
        super().__init__(x)
        self.z = 3 * x

inst = B(1).make_new(y=7)
print(inst.x, inst.z)
And now you can expect that print statement to produce 7 21.
That inst.z exists should confirm for you that the make_new call (which was only defined on A and inherited unaltered by B) has indeed made an instance of B.
However, there's something I must point out. Inheriting the unaltered make_new method only works because the __init__ method on B has the same call signature as the one on A. If this weren't the case, the call to cls would have to be altered.
This can be circumvented by allowing **kwargs on the __init__ method and passing generic **kwargs into cls() in the parent class:
class A:
    def __init__(self, **kwargs):
        self.x = kwargs["x"]

    @classmethod
    def make_new(cls, **kwargs):
        return cls(**kwargs)

class B(A):
    def __init__(self, x, w):
        super().__init__(x=x)
        self.w = w

inst = B(1, 2).make_new(x="spam", w="spam")
print(inst.x, inst.w)
Here we were able to give B a different (more restrictive!) signature.
This illustrates a general principle, which is that parent classes will typically be more abstract/less specific than their children.
It follows that, if you want two classes that substantially share behaviour but which do quite specific different things, it will be better to create three classes: one rather abstract one that defines the behaviour-in-common, and two children that give you the specific behaviours you want.
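As a rough sketch of that three-class layout (all the names here are made up for illustration):

class Base:
    # behaviour shared by every concrete variant
    def __init__(self, **kwargs):
        self.params = kwargs

    @classmethod
    def make_new(cls, **kwargs):
        return cls(**kwargs)

class VariantA(Base):
    def describe(self):
        return "A with %s" % self.params

class VariantB(Base):
    def describe(self):
        return "B with %s" % self.params

print(VariantA(x=1).make_new(x=2).describe())  # A with {'x': 2}
print(VariantB(y=3).make_new(y=4).describe())  # B with {'y': 4}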
Let's say I have this class:
class Test(object):
    def __init__(self, a):
        self.a = a

    def test(self, b):
        if isinstance(self, Test):
            return self.a + b
        else:
            return self + b
This would ideally in my world do this:
>>> Test.test(1,2)
3
>>> Test(1).test(2)
3
Now this doesn't work because you get this error:
TypeError: unbound method test() must be called with Test instance as first argument (got int instance instead)
In Python 3 this works fine, and I have the sneaking suspicion this is possible with a decorator in Python 2, but my Python-fu isn't strong enough to get that to work.
Plot twist: so what happens when I need something on self when it's not called statically?
If you want something that will actually receive self if called on an instance, but can also be called on the class, writing your own descriptor type may be advisable:
import types

class ClassOrInstanceMethod(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __get__(self, instance, owner):
        if instance is None:
            instance = owner
        return self.wrapped.__get__(instance, owner)

class demo(object):
    @ClassOrInstanceMethod
    def foo(self):
        # self will be the class if this is called on the class
        print(self)
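A quick usage sketch (assuming the ClassOrInstanceMethod descriptor above is in scope):

demo.foo()    # called on the class: prints <class '__main__.demo'>
demo().foo()  # called on an instance: prints something like <__main__.demo object at 0x...>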
For the original version of your question, you could just write it like any other static method, with @staticmethod. Calling a static method on an instance works the same as calling it on the class:
class Test(object):
    @staticmethod
    def test(a, b):
        return a + b
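A quick sketch of both call styles with the static version:

print(Test.test(1, 2))    # 3
print(Test().test(1, 2))  # 3 -- the same call works on an instance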
I have a class B that inherits from A:
class A():
    def do_something(self, x):
        """Prints x."""
        print(x)

class B(A):
    def something_else(self, x):
        print("This isn't the same.")
I'd like to achieve a few things:
I'd like B.do_something to inherit the docstring from A.do_something. I think functools.wraps is the recommended solution: is that right?
Let's say there are some methods of A that return an instance of A. If I call those methods from B, I'd like them to return an instance of B. So far, I'm overloading each function manually.
def method_of_A(self, *args, **kwargs):
    return A(super(self.__class__, self).method_of_A(*args, **kwargs))
There's likely a better way - especially given that I have to do this for a large number of classes. Is there some way to check if a function is defined within B and, if not but available in A, have it decorated/wrapped to return an instance of B? EDIT: I can't make changes to A's codebase.
Are there solutions that are Py2 and Py3 compatible?
Thanks very much for any suggestions.
Yes, you can use functools.wraps to copy the function name and docstring. You can return an instance of the current class using self.__class__:
import functools

class A(object):
    def func(self):
        return self.__class__()

class B(A):
    @functools.wraps(A.func)
    def func(self):
        return super(B, self).func()

>>> b = B()
>>> obj = b.func()
>>> print type(obj)
<class '__main__.B'>
Is there some way to check if a function is defined within B and, if not but available in A, have it decorated/wrapped to return an instance of B?
You may be able to do this using metaclasses, assuming A isn't already using a custom metaclass that you're not able to inherit from (for example, if it is only defined in C and hasn't been exposed to Python). The way you use metaclasses is slightly different in Python 2 and 3.
import functools
import types

class MetaB(type):
    def __new__(cls, name, bases, attrs):
        if A in bases:
            for attr, value in A.__dict__.items():
                if isinstance(value, types.FunctionType) and attr not in attrs:
                    new_func = MetaB.make_wrapper_func(value)
                    attrs[attr] = new_func
        return super(MetaB, cls).__new__(cls, name, bases, attrs)

    @staticmethod
    def make_wrapper_func(func):
        @functools.wraps(func)
        def _func(self, *args, **kwargs):
            value = func(self, *args, **kwargs)
            if isinstance(value, A):
                value = self.__class__(value)
            return value
        return _func

class B(A):
    __metaclass__ = MetaB
    ...
In Python 3, metaclasses are used a little differently:
class B(A, metaclass=MetaB):
    ...
This assumes you can create an object of the B() type just by passing an instance of A() to the constructor for it (i.e. return self.__class__(value)). That was just a guess. I'd have to know a little more about your object to know how to translate an A object to a B object, but the general method would be the same. This solution also only works on regular class methods. It's not going to work on some other stuff like classmethods and staticmethods or other types of descriptor objects. You certainly could make it work for all those, your metaclass would just need to be a little more complex.
Let's say there are some methods of A that return an instance of A. If I call those methods from B, I'd like them to return an instance of B. So far, I'm overloading each function manually.
Use a classmethod.
class A(object):
    @classmethod
    def f(cls):
        return cls
When b (an instance of B) calls f, it will return B.
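A quick sketch of that, assuming B simply subclasses A:

class B(A):
    pass

print(B().f())  # <class '__main__.B'>
print(A().f())  # <class '__main__.A'>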
I am trying to make a Python decorator that adds attributes to methods of a class so that I can access and modify those attributes from within the method itself. The decorator code is:
from types import MethodType

class attribute(object):
    def __init__(self, **attributes):
        self.attributes = attributes

    def __call__(self, function):
        class override(object):
            def __init__(self, function, attributes):
                self.__function = function
                for att in attributes:
                    setattr(self, att, attributes[att])

            def __call__(self, *args, **kwargs):
                return self.__function(*args, **kwargs)

            def __get__(self, instance, owner):
                return MethodType(self, instance, owner)

        retval = override(function, self.attributes)
        return retval
I tried this decorator on the toy example that follows.
class bar(object):
    @attribute(a=2)
    def foo(self):
        print self.foo.a
        self.foo.a = 1
Though I am able to access the value of attribute 'a' from within foo(), I can't set it to another value. Indeed, when I call bar().foo(), I get the following AttributeError.
AttributeError: 'instancemethod' object has no attribute 'a'
Why is this? More importantly how can I achieve my goal?
Edit
Just to be more specific, I am trying to find a simple way to implement static variables that live inside class methods. Continuing from the example above, I would like to instantiate b = bar(), call both the foo() and doo() methods, and then access b.foo.a and b.doo.a later on.
class bar(object):
    @attribute(a=2)
    def foo(self):
        self.foo.a = 1

    @attribute(a=4)
    def doo(self):
        self.foo.a = 3
The best way to do this is to not do it at all.
First of all, there is no need for an attribute decorator; you can just assign it yourself:
class bar(object):
    def foo(self):
        print self.foo.a
        self.foo.a = 1
    foo.a = 2
However, this still encounters the same errors. You need to do:
self.foo.__dict__['a'] = 1
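In context, a minimal Python 2 sketch of that workaround (same assumptions as the snippet above):

class bar(object):
    def foo(self):
        print self.foo.a            # reading through the method wrapper is fine
        self.foo.__dict__['a'] = 1  # writing has to go through the function's __dict__
    foo.a = 2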
You can instead use a metaclass...but that gets messy quickly.
On the other hand, there are cleaner alternatives.
You can use defaults:
def foo(self, a=[1]):  # the mutable default acts as the per-method storage
    print a[0]
    a[0] = 2

foo.func_defaults = foo.func_defaults[:-1] + ([2],)  # replace the default from outside
Of course, my preferred way is to avoid this altogether and use a callable class ("functor" in C++ words):
class bar(object):
    def __init__(self):
        self.foo = self.foo_method(self)

    class foo_method(object):
        def __init__(self, bar):
            self.bar = bar
            self.a = 2

        def __call__(self):
            print self.a
            self.a = 1
Or just use classic class attributes:
class bar(object):
    def __init__(self):
        self.a = 1

    def foo(self):
        print self.a
        self.a = 2
If it's that you want to hide a from derived classes, use whatever private attributes are called in Python terminology:
class bar(object):
    def __init__(self):
        self.__a = 1  # this will be implicitly mangled to _bar__a

    def foo(self):
        print self.__a
        self.__a = 2
EDIT: You want static attributes?
class bar(object):
    a = 1

    def foo(self):
        print self.a
        self.a = 2
EDIT 2: If you want static attributes visible to only the current function, you can use PyExt's modify_function:
import pyext

def wrap_mod(*args, **kw):
    def inner(f):
        return pyext.modify_function(f, *args, **kw)
    return inner

class bar(object):
    @wrap_mod(globals={'a': [1]})
    def foo(self):
        print a[0]
        a[0] = 2
It's slightly ugly and hackish. But it works.
My recommendation would be just to use double underscores:
class bar(object):
    __a = 1

    def foo(self):
        print self.__a
        self.__a = 2
Although this is visible to the other functions, it's invisible to anything else (actually, it's there, but it's mangled).
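As a quick illustration of the mangling (this assumes CPython's default name-mangling scheme):

b = bar()
print(b._bar__a)  # 1 -- the "private" attribute is still reachable under its mangled name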
FINAL EDIT: Use this:
import pyext

def wrap_mod(*args, **kw):
    def inner(f):
        return pyext.modify_function(f, *args, **kw)
    return inner

class bar(object):
    @wrap_mod(globals={'a': [1]})
    def foo(self):
        print a[0]
        a[0] = 2
    foo.a = foo.func_globals['a']

b = bar()
b.foo()  # prints 1
b.foo()  # prints 2

# external access
b.foo.a[0] = 77
b.foo()  # prints 77
While you can accomplish your goal by replacing self.foo.a = 1 with self.foo.__dict__['a'] = 1, it is generally not recommended.
If you are using Python 2 (and not Python 3), then whenever you retrieve a method from an instance, a new instance-method object is created which is a wrapper to the original function defined in the class body.
The instance method is a rather transparent proxy to the function - you can retrieve the function's attributes through it, but not set them - that is why setting an item in self.foo.__dict__ works.
Alternatively, you can reach the function object itself using self.foo.im_func - the im_func attribute of an instance method points to the underlying function.
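For example, a minimal Python 2 sketch that writes through im_func instead of using the decorator (the attribute name a here just mirrors the question):

class bar(object):
    def foo(self):
        print self.foo.a        # reads proxy through the instance method
        self.foo.im_func.a = 2  # writes must target the underlying function
    foo.a = 1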
Based on other contributors' answers, I came up with the following workaround. First, wrap a dictionary in a class that resolves non-existent attributes to the wrapped dictionary, as in the following code.
class DictWrapper(object):
    def __init__(self, d):
        self.d = d

    def __getattr__(self, key):
        return self.d[key]
Credits to Lucas Jones for this code.
Then implement an addstatic decorator with a statics attribute that will store the static attributes.
from types import MethodType

class addstatic(object):
    def __init__(self, **statics):
        self.statics = statics

    def __call__(self, function):
        class override(object):
            def __init__(self, function, statics):
                self.__function = function
                self.statics = DictWrapper(statics)

            def __call__(self, *args, **kwargs):
                return self.__function(*args, **kwargs)

            def __get__(self, instance, objtype):
                return MethodType(self, instance)

        retval = override(function, self.statics)
        return retval
The following code is an example of how the addstatic decorator can be used on methods.
class bar(object):
    @addstatic(a=2, b=3)
    def foo(self):
        self.foo.statics.a += 1
        self.foo.statics.b += 2
Then, playing with an instance of the bar class yields:
>>> b = bar()
>>> b.foo.statics.a
2
>>> b.foo.statics.b
3
>>> b.foo()
>>> b.foo.statics.a
3
>>> b.foo.statics.b
5
The reason for using this statics dictionary follows jsbueno's answer, which suggests that what I want would require overloading the dot operator of an instance method wrapping the foo function, which I am not sure is possible. Of course, the method's attribute could be set in self.foo.__dict__, but since that is not recommended (as suggested by brainovergrow), I came up with this workaround. I am not certain this would be recommended either and I guess it is up for comments.
I need to refactor existing code by collapsing a method that's copy-and-pasted between various classes that inherit from one another into a single method.
So I produced the following code:
class A(object):
    def rec(self):
        return 1

class B(A):
    def rec(self):
        return self.rec_gen(B)

    def rec_gen(self, rec_class):
        return super(rec_class, self).rec() + 1

class C(B):
    def rec(self):
        return self.rec_gen(C)

if __name__ == '__main__':
    b = B(); c = C()
    print c.rec()
    print b.rec()
And the output:
3
2
What still bothers me is that in the 'rec' method I need to tell 'rec_gen' the context of the class in which it's running. Is there a way for 'rec_gen' to figure it out by itself at runtime?
This capability has been added to Python 3 - see PEP 3135. In a nutshell:
class B(A):
    def rec(self):
        return super().rec() + 1
I think you've created the convoluted rec()/rec_gen() setup because you couldn't automatically find the class, but in case you want that anyway the following should work:
class A(object):
    def rec(self):
        return 1

class B(A):
    def rec(self):
        # __class__ is a cell that is only created if super() is in the method
        super()
        return self.rec_gen(__class__)

    def rec_gen(self, rec_class):
        return super(rec_class, self).rec() + 1

class C(B):
    def rec(self):
        # __class__ is a cell that is only created if super() is in the method
        super()
        return self.rec_gen(__class__)
The simplest solution in Python 2 is to use a private member to hold the super object:
class B(A):
    def __init__(self):
        self.__super = super(B, self)

    def rec(self):
        return self.__super.rec() + 1
But that still suffers from the need to specify the actual class in one place, and if you happen to have two identically-named classes in the class hierarchy (e.g. from different modules) this method will break.
There were a couple of us who made recipes for automatic resolution for Python 2 prior to the existence of PEP 3135 - my method is at self.super on ActiveState. Basically, it allows the following:
class B(A, autosuper):
    def rec(self):
        return self.super().rec() + 1
or in the case that you're calling a parent method with the same name (the most common case):
class B(A, autosuper):
    def rec(self):
        return self.super() + 1
Caveats to this method:
It's quite slow. I have a version sitting around somewhere that does bytecode manipulation to improve the speed a lot.
It's not consistent with PEP 3135 (although it was a proposal for the Python 3 super at one stage).
It's quite complex.
It's a mix-in base class.
I don't know if the above would enable you to meet your requirements. With a small change to the recipe though you could find out what class you're in and pass that to rec_gen() - basically extract the class-finding code out of _getSuper() into its own method.
An alternative solution for Python 2.x would be to use a metaclass to automatically define the rec method in all your subclasses:
class RecGen(type):
    def __new__(cls, name, bases, dct):
        new_cls = super(RecGen, cls).__new__(cls, name, bases, dct)
        if bases != (object,):
            def rec(self):
                return super(new_cls, self).rec() + 1
            new_cls.rec = rec
        return new_cls

class A(object):
    __metaclass__ = RecGen

    def rec(self):
        return 1

class B(A):
    pass

class C(B):
    pass
Note that if you're just trying to get something like the number of parent classes, it would be easier to use self.__class__.__mro__ directly:
class A(object):
    def rec(self):
        return len(self.__class__.__mro__) - 1

class B(A):
    pass

class C(B):
    pass
I'm not sure exactly what you're trying to achieve, but if it is just to have a method that returns a different constant value for each class then use class attributes to store the value. It isn't clear at all from your example that you need to go anywhere near super().
class A(object):
    REC = 1

    def rec(self):
        return self.REC

class B(A):
    REC = 2

class C(B):
    REC = 3

if __name__ == '__main__':
    b = B(); c = C()
    print c.rec()
    print b.rec()