I need a magic variable in my class and have started writing it:
class myclass:
    def __init__(self, name, *args):
        self.name = name
        ????

    def myFunc():
        for i in args:
            print(i)
I just could not find a proper explanation of how to write a class with magic variables in the constructor and use them later. Do I have to create a self. member out of it (and if so, how), or can I neglect that and just use args in myFunc as written?
*args is not called a magic variable; it is an arbitrary argument list (also known as variadic arguments). It is used to send an arbitrary number of arguments to a function, and those arguments are wrapped in a tuple, as in the example below:
In [9]: def f(a, *args):
   ...:     print(a)
   ...:     print(args)
   ...:
In [10]: f(1,2,3,4)
1
(2, 3, 4)
So in order to access these variables later, you do what you would do for any class instance variable: assign them via self.args = args and access them via self.args.
Also note that, by convention, we use CamelCase for class names, so the class name becomes MyClass, and snake_case for function names, so the function name becomes my_func.
class MyClass:
    def __init__(self, name, *args):
        self.name = name
        # assign the variadic arguments to an instance variable
        self.args = args

    def my_func(self):
        # access the variadic arguments via self
        for i in self.args:
            print(i)

obj = MyClass('Joe', 1, 2, 3)
obj.my_func()
The output will be
1
2
3
If you assign the args to self.args, you can access them from other methods. However, the method must accept the instance of the class as its first argument (note how myFunc takes self below).
class myclass:
    def __init__(self, name, *args):
        self.name = name
        self.args = args

    def myFunc(self):
        for i in self.args:
            print(i)
I am trying to design a class structure that allows the user to define their own class that overloads predefined methods in other classes. In this case the user would create the C class to overload the "function" method in D. The user-created C class has common logic for other user-created classes A and B, so they inherit from C to overload "function", but they also inherit from D to use D's other methods. The issue I am having is how to pass "value" from A and B to D while not passing it to C. What I currently have written produces an error, as C does not take "value" as an argument.
I know that I could add "value" (or *args) to C's __init__ method and its super() call, but I don't want to have to know what inputs other classes need in order to add new classes like A and B. Also, if I swap the order of C and D, I no longer get an error, but then C's overloaded "function" is not used. Is there an obvious way around this?
class D(SomethingElse):
    def __init__(self, value, **kwargs):
        super(D, self).__init__(**kwargs)
        self.value = value

    def function(self):
        return self.value

    def other_method(self):
        pass

class C(object):
    def __init__(self):
        super(C, self).__init__()

    def function(self):
        return self.value*2

class B(C, D):
    def __init__(self, value, **kwargs):
        super(B, self).__init__(value, **kwargs)

class A(C, D):
    def __init__(self, value, **kwargs):
        super(A, self).__init__(value, **kwargs)

a = A(3)
print(a.function())
# desired output: 6
Essentially, there are two things you need to do to make your __init__ methods play nicely with multiple inheritance in Python:
1. Always take a **kwargs parameter, and always call super().__init__(**kwargs), even if you think you are the base class. Just because your superclass is object doesn't mean you are last (before object) in the method resolution order.
2. Don't pass your parent class's __init__ arguments explicitly; only pass them via **kwargs. Your parent class isn't necessarily the next one after you in the method resolution order, so positional arguments might be passed to the wrong __init__ method.
This is called "co-operative subclassing". Let's try with your example code:
class D:
    def __init__(self, value, **kwargs):
        self.value = value
        super().__init__(**kwargs)

    def function(self):
        return self.value

class C:
    # add a **kwargs parameter
    def __init__(self, **kwargs):
        # pass kwargs on to super().__init__
        super().__init__(**kwargs)

    def function(self):
        return self.value * 2

class B(C, D):
    # don't take the parent class's value arg explicitly
    def __init__(self, **kwargs):
        # pass the value arg via kwargs
        super().__init__(**kwargs)

class A(C, D):
    # don't take the parent class's value arg explicitly
    def __init__(self, **kwargs):
        # pass the value arg via kwargs
        super().__init__(**kwargs)
Demo:
>>> a = A(value=3)
>>> a.value
3
>>> a.function()
6
Note that value must be passed to the A constructor as a keyword argument, not as a positional argument. It's also recommended to set self.value = value before calling super().__init__.
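To illustrate the keyword-argument point, passing value positionally now fails, since A.__init__ only accepts keyword arguments (the exact error wording varies between Python versions):
>>> A(3)
Traceback (most recent call last):
  ...
TypeError: __init__() takes 1 positional argument but 2 were given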
I've also simplified class C(object): to class C:, and super(C, self) to just super() since these are equivalent in Python 3.
So I'm trying to understand the point of A and B. I'm guessing that maybe you want to mix in the superclass behavior and sometimes have local behavior. So suppose A just mixes behaviors together, and B has some local behavior and state.
If you don't need your own state, you probably don't need an __init__. So for A and C just omit __init__.
class SomethingElse(object):
    def __init__(self, *args, **kwargs):
        self.args = args
        self.kwargs = kwargs

class D(SomethingElse):
    def __init__(self, value, *args, **kwargs):
        super(D, self).__init__(*args, **kwargs)
        self.value = value

    def function(self):
        return self.value

    def other_method(self):
        return self.__dict__

class C(object):
    # no state of its own, so no __init__ needed:
    # def __init__(self):
    #     super(C, self).__init__()

    def function(self):
        return self.value*2

class B(C, D):
    def __init__(self, value, bstate, *args, **kwargs):
        super(B, self).__init__(value, *args, **kwargs)
        self.bstate = bstate

    def __repr__(self):
        return (self.__class__.__name__ + ' ' +
                self.bstate + ' ' + str(self.other_method()))

class A(C, D):
    pass
>>> a = A(3)
>>> b = B(21, 'extra')
>>> a.function()
6
>>> b.function()
42
>>> repr(a)
'<xx.A object at 0x107cf5e10>'
>>> repr(b)
"B extra {'args': (), 'bstate': 'extra', 'value': 21, 'kwargs': {}}"
I've kept Python 2 syntax assuming you might still be using it, but as another answer points out, Python 3 simplifies the super() syntax, and you really should be using Python 3 now.
If you swap C and D, you are changing the Python method resolution order, and that will indeed change the method to which a call to A.function resolves.
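As a quick check (assuming the class definitions above), you can inspect the resolution order directly; with class A(D, C) instead, D would come before C and D.function would win:
>>> [cls.__name__ for cls in A.__mro__]
['A', 'C', 'D', 'SomethingElse', 'object']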
I have a class member which accepts a function:
class A:
    def func(self, method):
        ...
I want to set a default method since that behavior is desired 99% of the time.
This default behavior is static since it does not depend on any members of the class. However, I would like this default method to be private and invisible to the user. Is there any way of accomplishing that?
This is what I have tried:
class A:
    @staticmethod
    def __meth(x):
        pass

    def func(self, method=__meth):
        pass
Error: 'staticmethod' object is not callable
class A:
    @staticmethod
    def __meth(x):
        pass

    def func(self, method=A.__meth):
        pass
Error: NameError: name 'A' is not defined
class A:
    @staticmethod
    def __meth(x):
        pass

    def func(self, method=self.__meth):
        pass
Error: NameError: name 'self' is not defined
I am using Python 3.5 and do not want to rely on newer features.
It's fairly idiomatic to use None as the default and assign it as needed:
class A:
    @staticmethod
    def __meth(x):
        print(x)

    def func(self, method=None):
        if method is None:
            method = self.__meth
        method("x")
The problem starts with your default parameter: default parameter values are evaluated while the class definition is being read, at which point the name A is not yet defined.
You should handle it like a normal default parameter:
class A:
    @staticmethod
    def __meth(x):
        print('meth')

    def func(self, method=None):
        if method is None:
            self.__meth(1)
        else:
            method()

def foo():
    print('foo')

a = A()
a.func()
a.func(foo)
Output:
meth
foo
You can delay name resolution by putting it into a lambda:
class A:
    @staticmethod
    def __meth(x):
        pass

    def func(self, method=lambda s: A.__meth(s)):
        pass
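A minimal variant (with a print added by me so the call is visible) confirms that the lambda body is only evaluated when func() is called, by which point the name A exists:
class A:
    @staticmethod
    def __meth(x):
        print(x)

    def func(self, method=lambda s: A.__meth(s)):
        method("x")

A().func()  # prints: x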
I have a decorator to check the argument types passed to a function:
def accepts(*types):
    def check_accepts(f):
        assert len(types) == f.func_code.co_argcount
        def new_f(*args, **kwds):
            for (a, t) in zip(args, types):
                assert isinstance(a, t), \
                    "arg %r does not match %s" % (a, t)
            return f(*args, **kwds)
        new_f.func_name = f.func_name
        return new_f
    return check_accepts
Now, in a class (in classA.py), I want a method to only accept arguments of the same class:
class ClassA:
    @accepts(WHATTOPUTHERE)
    def doSomething(otherObject):
        pass  # do something
In other classes I can just put ClassA in place of WHATTOPUTHERE, but inside classA.py, ClassA is not known. How can I pass the current class to the @accepts() function?
Use the function-based version of the decorator and apply it after the class definition:
class ClassA:
    # self added so the method works when called on an instance;
    # both self and otherObject are then type-checked
    def doSomething(self, otherObject):
        pass  # do something

# ClassA is defined by the time this line runs, so it can be used for the check
ClassA.doSomething = accepts(ClassA, ClassA)(ClassA.doSomething)
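Usage would then look something like this (the exact assertion message depends on the class's repr):
>>> a1, a2 = ClassA(), ClassA()
>>> a1.doSomething(a2)        # both arguments are ClassA instances: passes
>>> a1.doSomething("nope")    # fails the isinstance check
AssertionError: arg 'nope' does not match ...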
Another way would be to write a metaclass that applies the decorator automatically after class creation:
class Meta(type):
    def __new__(mcs, clsname, bases, dct):
        # create the class first so it can be passed to accepts();
        # inside __new__, mcs is the metaclass itself, not the new class
        new_cls = type.__new__(mcs, clsname, bases, dct)
        fields = ('doSomething',)  # methods on which to apply the decorator
        for name in fields:
            if name in dct:
                setattr(new_cls, name, accepts(new_cls, new_cls)(dct[name]))
        return new_cls

class ClassA(object):
    __metaclass__ = Meta

    def doSomething(self, otherObject):
        pass
Instead of manually doing things like new_f.func_name = f.func_name, use functools.wraps. This also preserves things like the docstring, the argument list, etc.
from functools import wraps

def accepts(*types):
    def check_accepts(f):
        assert len(types) == f.func_code.co_argcount
        @wraps(f)
        def new_f(*args, **kwds):
            for (a, t) in zip(args, types):
                assert isinstance(a, t), \
                    "arg %r does not match %s" % (a, t)
            return f(*args, **kwds)
        return new_f
    return check_accepts
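With @wraps in place, the decorated function keeps its metadata (a quick check):
>>> @accepts(int, int)
... def add(a, b):
...     """Add two ints."""
...     return a + b
...
>>> add.__name__
'add'
>>> add.__doc__
'Add two ints.'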
Wouldn't adding a self parameter to doSomething and then referencing args[0] inside check_accepts solve your problem (see the sketch below)? If doSomething is supposed to be a class method, you can still outsource this self. How?
- add a self to some dummy method inside the class
- make a decorator which populates a variable that accepts can somehow reach (like metadata)
- make sure to call this additional method before doSomething()
You've got the class instance! Enjoy!
NOTE: This is not the only way to store metadata like this and use it later; you can do it as you wish.
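A minimal sketch of the args[0] idea (the decorator name and the check-against-own-class behaviour are my assumptions, not code from the question):
def accepts_own_class(f):
    # args[0] is self, so the remaining positional arguments can be
    # checked against self's own class without naming ClassA in classA.py
    def new_f(*args, **kwds):
        cls = type(args[0])
        for a in args[1:]:
            assert isinstance(a, cls), \
                "arg %r does not match %s" % (a, cls)
        return f(*args, **kwds)
    return new_f

class ClassA(object):
    @accepts_own_class
    def doSomething(self, otherObject):
        pass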
There is an existing module I use containing a class that has methods with string arguments that take the form:
existing_object.existing_method("arg1")
or
existing_object.existing_method("arg1:arg2")
The arguments are in a hierarchical structure.
I would like to create a module that objectifies the arguments and makes them methods of the class of the imported module such that use would look like this:
my_object.arg1.my_method()
or
my_object.arg1.arg2.my_method()
my_method() would call existing_method(), passing it "arg1:arg2" as an argument.
If someone could point me in the right direction to get started I'd appreciate it.
You can do this with a custom __getattr__ that returns special method caller instances:
class MethodCaller(object):
    def __init__(self, args, parent):
        self.args = args
        self.parent = parent

    def __getattr__(self, name):
        return MethodCaller(self.args + (name,), self.parent)

    def my_method(self):
        return self.parent.existing_method(':'.join(self.args))

class MyClass(object):
    def __getattr__(self, name):
        return MethodCaller((name,), self)

    def existing_method(self, arg):
        print arg
Example:
>>> MyClass().arg1.my_method()
arg1
>>> MyClass().arg1.arg2.my_method()
arg1:arg2
>>> MyClass().foo.bar.my_method()
foo:bar
Thinking about this more clearly, I realized that what I really wanted was to be able to use IPython's introspection of modules to navigate the hierarchy. This meant that I simply needed to create objects like this:
class Foo():
    def __init__(self, arg):
        self.arg = arg

    def my_method(self):
        # forward to the existing module's method
        existing_object.existing_method(self.arg)

arg1 = Foo("arg1")
arg1.arg2 = Foo("arg1:arg2")
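Tab completion in IPython then walks the hierarchy, and the calls forward as intended (assuming existing_object from the original module is in scope):
arg1.my_method()        # calls existing_object.existing_method("arg1")
arg1.arg2.my_method()   # calls existing_object.existing_method("arg1:arg2")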
Is it possible to access the arguments which were passed to __init__, without explicitly having to store them?
e.g.
class thing(object):
    def __init__(self, name, data):
        pass  # do something useful here

t = thing('test', [1, 2, 3])
print t.__args__   # doesn't exist; desired output: ('test', [1, 2, 3])
The use-case for this is creating a super-class which can automatically store the arguments used to create an instance of a class derived from it, without having to pass all the arguments explicitly to the super's __init__. Maybe there's an easier way to do it!
No, you have to store them. Otherwise they are gone after __init__() returns, like all local variables.
If you don't want to pass all arguments on explicitly, you can use **kwargs:
class Base(object):
    def __init__(self, name, data):
        # store name and data
        self.name = name
        self.data = data

class Derived(Base):
    def __init__(self, **kwargs):
        Base.__init__(self, **kwargs)
Derived(name="Peter", data=42)
This is not entirely recommended, but here is a wrapper that automatically stores parameter variables:
from functools import wraps

def init_wrapper(f):
    @wraps(f)
    def wrapper(self, *args, **kwargs):
        # positional parameter names, excluding self
        func_parameters = f.func_code.co_varnames[1:f.func_code.co_argcount]
        # deal with default arguments
        diff = len(func_parameters) - len(args)
        if diff > 0:
            args += f.func_defaults[-diff:]
        # set instance variables
        for pos, arg in enumerate(func_parameters):
            setattr(self, arg, args[pos])
        f(self, *args, **kwargs)  # no need to return anything from __init__()
    return wrapper
Usage:
class A(object):
    @init_wrapper
    def __init__(self, a, b, c):
        print a + b + c
Example:
>>> a = A(1, 2, 3)
6
>>> a.a
1
>>> a.b
2
>>> a.c
3
In a word: No.
What you could do is:
def __init__(self, *args, **kwargs):
    self.args = args
    self.kwargs = kwargs
If you find yourself needing to do this a lot, you could also use a decorator to abstract the task.
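A hedged sketch of such a decorator (the name store_args and the attribute names are mine, not a standard API):
from functools import wraps

def store_args(init):
    @wraps(init)
    def wrapper(self, *args, **kwargs):
        # stash the raw constructor arguments on the instance
        self._init_args = args
        self._init_kwargs = kwargs
        return init(self, *args, **kwargs)
    return wrapper

class Thing(object):
    @store_args
    def __init__(self, name, data):
        pass

t = Thing('test', [1, 2, 3])
print(t._init_args)    # ('test', [1, 2, 3])
print(t._init_kwargs)  # {}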
I think that you are looking for arbitrary argument lists and keyword arguments combined with super().__init__.
Give "Python's Super is nifty, but you can't use it" a read before you start down this path though.