I'm trying to implement a configuration system contained within a module. The core configuration variable is a class instance and a global variable in this module. It seems that when I import this variable, I cannot use it as a class for some reason.
Consider this minimal example:
foomodule.py:
class FooClass:
    number = 5

    def bar(self):
        return self.number

foo = FooClass
foo.number = 5
main.py
from foomodule import foo
print foo.bar()
Running main.py results in a cryptic error message:
Traceback (most recent call last):
  File "main.py", line 2, in <module>
    print foo.bar()
TypeError: unbound method bar() must be called with FooClass instance as first argument (got nothing instead)
But I am calling it with a FooClass instance which I'd think should be the self argument like it usually is. What am I doing wrong here?
You only bound foo to the class; you didn't make it an instance:
foo = FooClass # only creates an additional reference
Call the class:
foo = FooClass() # creates an instance of FooClass
In Python you usually don't use accessor methods; just reference foo.number in your main module, rather than use foo.bar() to obtain it.
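For reference, a minimal sketch of the corrected module and its use, keeping the file names from the question:

# foomodule.py
class FooClass:
    number = 5

    def bar(self):
        return self.number

foo = FooClass()   # calling the class creates an instance

# main.py
from foomodule import foo
print foo.bar()    # 5
print foo.number   # 5 -- no accessor method needed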
In your example foo is just an alias for FooClass. I assume that your actual problem is more complicated than your snippet. However, if you really need a class method, you can decorate it with the @classmethod decorator.
class FooClass(object):
    number = 5

    @classmethod
    def bar(cls):
        return cls.number
To use the class you could do:
from foomodule import FooClass
FooClass.bar()
Or you can access the class attribute directly:
FooClass.number
I'm trying to use super in a subclass which is wrapped in another class using a class decorator:
def class_decorator(cls):
    class WrapperClass(object):
        def make_instance(self):
            return cls()
    return WrapperClass

class MyClass(object):
    def say(self, x):
        print(x)

@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super(MySubclass, self).say(x.upper())
However, the call to super fails:
>>> MySubclass().make_instance().say('hello')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 4, in say
TypeError: super(type, obj): obj must be an instance or subtype of type
The problem is that, when say is called, MySubclass doesn't refer to the original class anymore, but to the return value of the decorator.
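A quick check (hypothetical interactive session, using the names from the question) makes the rebinding visible:

>>> MySubclass.__name__                    # the global name now refers to the wrapper
'WrapperClass'
>>> inst = MySubclass().make_instance()
>>> type(inst).__name__                    # instances still come from the original class
'MySubclass'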
One possible solution would be to store the value of MySubclass before decorating it:
class MySubclass(MyClass):
    def say(self, x):
        super(_MySubclass, self).say(x.upper())

_MySubclass = MySubclass
MySubclass = class_decorator(MySubclass)
This works, but isn't intuitive and would need to be repeated for each decorated subclass. I'm looking for a way that doesn't need additional boilerplate for each decorated subclass -- adding more code in one place (say, the decorator) would be OK.
Update: In Python 3 this isn't a problem, since you can use __class__ (or the super variant without arguments), so the following works:
@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super().say(x.upper())
Unfortunately, I'm stuck with Python 2.7 for this project.
The problem is that your decorator returns a different class than Python (or anyone who uses your code) expects; super not working is just one of the many unfortunate consequences:
>>> isinstance(MySubclass().make_instance(), MySubclass)
False
>>> issubclass(MySubclass, MyClass)
False
>>> pickle.dumps(MySubclass().make_instance())
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
_pickle.PicklingError: Can't pickle <class '__main__.MySubclass'>: it's not the same object as __main__.MySubclass
This is why a class decorator should modify the class instead of returning a different one. The correct implementation would look like this:
def class_decorator(wrapped_cls):
    @classmethod
    def make_instance(cls):
        return cls()
    wrapped_cls.make_instance = make_instance
    return wrapped_cls
Now super and everything else will work as expected:
>>> MySubclass().make_instance().say('hello')
HELLO
The problem occurs because, at the time MySubclass.say() is called, the global name MySubclass no longer refers to the class defined in your code as class MySubclass. It refers to WrapperClass, which is not related to the original MySubclass in any way.
If you are using Python 3, you can get around this by NOT passing any arguments to super, like this:
super().say(x.upper())
I don't really know why you use this specific construct, but it does look strange that a subclass of MyClass that defines a say() method in its source code ends up as something that does not have that method at all, which is the case in your code.
Note you could change the class WrapperClass line to make it read
class WrapperClass(cls):
This will make your wrapper a subclass of the class you just decorated. It doesn't help with your super(MySubclass, self) call; you still need to remove the arguments (which is OK only on Python 3), but at least an instance created as x = MySubclass() would have a say method, as one would expect at first glance. A sketch of this variant follows below.
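For illustration, a sketch of that variant, reusing MyClass from the question (this is a guess at the full decorator, not the original code):

def class_decorator(cls):
    class WrapperClass(cls):               # the wrapper now subclasses the decorated class
        def make_instance(self):
            return cls()                   # still returns an undecorated instance
    return WrapperClass

class MyClass(object):
    def say(self, x):
        print(x)

@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        # x = MySubclass(); x.say('hello') now finds say(), but the two-argument
        # super still misbehaves on Python 2; on Python 3, super().say(x.upper())
        # resolves correctly.
        super(MySubclass, self).say(x.upper())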
EDIT: I've come up with a way around this, but it really looks odd and has the disadvantage of making the 'wrapped' class know that it is being wrapped (and it becomes reliant on that, making it unusable if you remove the decorator):
def class_decorator(cls):
    class WrapperClass(object):
        def make_instance(self):
            i = cls()
            i._wrapped = cls
            return i
    return WrapperClass

class MyClass(object):
    def say(self, x):
        print(x)

@class_decorator
class MySubclass(MyClass):
    def say(self, x):
        super(self._wrapped, self).say(x.upper())

# make_instance returns inst of the original class, non-decorated
i = MySubclass().make_instance()
i.say('hello')
In essence, _wrapped saves a class reference as it was at declaration time, consistent with using the regular super(this_class_name, self) builtin call.
I want to call a method from the parent class in a child class.
I use XX.__init__() in my child class and call the Press function from the parent class, but it fails when I run the following code:
Func.py
class PC:
    def __init__(self):
        PCKeyDis = {}
        self.PCKeyDis = PCKeyDis

    def Press(self, key):
        KeyDis = self.PCKeyDis
        if len(key) == 1 and key.islower():
            key = key.upper()
        win32api.keybd_event(KeyDis[key], 0, 0, 0)
        time.sleep(0.1)
        win32api.keybd_event(KeyDis[key], 0, win32con.KEYEVENTF_KEYUP, 0)

class PCFunc(PC):
    def __init__(self):
        pass

    def Sentence(self, string):
        PC.__init__()
        strlist = list(string)
        for i in xrange(len(strlist)):
            if strlist[i] == ' ':
                strlist[i] = 'Space'
            PC.Press(strlist[i])  # use this function
action.py
import Func
import win32gui
PC = Func.PC()
PCFunc = Func.PCFunc ()
win32gui.SetForegroundWindow(win32gui.FindWindow(winclass,winnm))
PCFunc.Sentence(path)
I get:
unbound method Sentence() must be called with PCFunc instance as first argument (got str instance instead)
If you want to call the constructor of the base class, then you do it on instantiation in the __init__() method, not in the Sentence() method:
def __init__(self):
    super(PCFunc, self).__init__()    # requires PC to be a new-style class (inherit from object)
Since Sentence() is an instance method, you need to call it via an instance of the class (like the error tells you):
pc_func = PCFunc()
pc_func.Sentence(var)
Here you are calling the method with an undefined variable:
PCFunc.Sentence(path)
Instead you need to give a string as parameter, so either write Sentence('path'), or define the variable first:
path = 'my path'
pc_func.Sentence(path)
Do not use the same name as the class name for an instance of the class:
PCFunc = Func.PCFunc ()
Otherwise the variable name storing the instance overwrites the class name.
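Putting those fixes together, the calling code might look like this (a sketch only; winclass, winnm and the win32 setup are placeholders taken from the question):

import Func
import win32gui

pc_func = Func.PCFunc()        # the instance name no longer shadows the class
win32gui.SetForegroundWindow(win32gui.FindWindow(winclass, winnm))
path = 'my path'
pc_func.Sentence(path)         # instance method called on an instance, with a defined string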
Apart from that, it is unclear what your code is actually supposed to do. Have a look at the Python code conventions as a first step to making your code more readable. Then do some research about classes and inheritance.
The code you posted does not produce the error you posted. Here is an example that will produce that error:
class Dog:
    def do_stuff(self, string):
        print string

d = Dog()
d.do_stuff('hello')
Dog.do_stuff(d, 'goodbye')
Dog.do_stuff('goodbye')
--output:--
hello
goodbye
Traceback (most recent call last):
  File "1.py", line 9, in <module>
    Dog.do_stuff('goodbye')
TypeError: unbound method do_stuff() must be called with Dog instance as first argument (got str instance instead)
An __init__() function can also produce that error:
class Dog:
    def __init__(self):
        pass
    def do_stuff(self, string):
        print(string)

Dog.__init__()
--output:--
Traceback (most recent call last):
  File "1.py", line 7, in <module>
    Dog.__init__()
TypeError: unbound method __init__() must be called with Dog instance as first argument (got nothing instead)
In the line:
d.do_stuff('hello')
the fragment d.do_stuff causes Python to create and return a bound method object, which is then immediately executed by the call operator () with the argument 'hello'. The bound method is bound to the instance d, hence the reason it is called a bound method. A bound method automatically passes the instance it contains to the method when the method is executed.
On the other hand, when you write:
Dog.do_stuff(....)
the fragment Dog.do_stuff causes Python to create and return an unbound method. An unbound method does not contain an instance, so when an unbound method is executed by the call operator (), you must pass an instance manually. (In Python 3, things changed: you can pass anything as the first argument; an instance of the class isn't required.)
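A small illustration of the difference, using the Dog class from above (__self__ and __func__ are the standard attributes of a bound method in Python 2.6+ and Python 3):

d = Dog()
bound = d.do_stuff                                  # a bound method object, not called yet
print bound.__self__ is d                           # True: the instance is stored on the method
print bound.__func__ is Dog.__dict__['do_stuff']    # True: the underlying plain function
bound('hello')                                      # same as Dog.do_stuff(d, 'hello')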
I am trying to define a class within a class. I am not trying to solve a real world problem with this. I am simply trying to learn Python with this code.
class Foo:
    class Bar:
        def __init__(self):
            self.x = 'Bar'

    def __init__(self):
        self.x = 'Foo'
        self.bar = Bar()

    def print(self):
        print('self.x:', self.x)
        print('self.bar.x:', self.bar.x)

foo = Foo()
foo.print()
When I try to execute this code, I get this error:
Traceback (most recent call last):
  File "demo.py", line 14, in <module>
    foo = Foo()
  File "demo.py", line 8, in __init__
    self.bar = Bar()
NameError: name 'Bar' is not defined
What went wrong? When we can define and use functions defined within a function, why not the same for classes? Am I making a mistake in using the Bar class?
Use Foo.Bar() instead, because Bar is not a global name.
You need to access Bar through Foo (Bar is not defined at module level).
Replace following line:
self.bar = Bar()
with:
self.bar = Foo.Bar()
or
self.bar = self.Bar()
According to Class definition:
The class’s suite is then executed in a new execution frame (see section Naming and binding), using a newly created local namespace and the original global namespace. (Usually, the suite contains only function definitions.) When the class’s suite finishes execution, its execution frame is discarded but its local namespace is saved. A class object is then created using the inheritance list for the base classes and the saved local namespace for the attribute dictionary. The class name is bound to this class object in the original local namespace.
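For completeness, a minimal sketch of the corrected example, qualifying Bar through the class as suggested above:

class Foo:
    class Bar:
        def __init__(self):
            self.x = 'Bar'

    def __init__(self):
        self.x = 'Foo'
        self.bar = Foo.Bar()    # or self.Bar(); a bare Bar() is not a global name

    def print(self):
        print('self.x:', self.x)
        print('self.bar.x:', self.bar.x)

foo = Foo()
foo.print()    # prints: self.x: Foo  /  self.bar.x: Bar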
For one of the projects I am currently working on, I was thinking of creating a class that could not be instantiated by a client and would only be supplied as an instance through a particular interface, i.e. the client would not be able to create further instances out of it by some hackery such as:
>>> try:
...     raise WindowsError
... except:
...     foo = sys.exc_info()
...
>>> foo
(<type 'exceptions.WindowsError'>, WindowsError(), <traceback object at 0x0000000005503A48>)
>>> type(foo[2])()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: cannot create 'traceback' instances
once he has one.
I was successfully able to create a class that couldn't be instantiated, i.e.:
>>> class Foo():
...     def __init__(self):
...         raise TypeError("cannot create 'Foo' instances")
...
>>> bar = Foo()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 3, in __init__
TypeError: cannot create 'Foo' instances
>>> bar
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'bar' is not defined
But how could I use this very same definition to create an instance of the class?
Of course I could do something like this:
>>> class Foo():
...     def __init__(self, instantiate=False):
...         if not instantiate:
...             raise TypeError("cannot create 'Foo' instances")
but I don't find it elegant enough, nor does it completely prevent the client from further instantiating it. And no, I am not going down the road of building a C++ module for it.
Any suggestions on how to achieve such a thing? import abc?
A brief rationale to answer Martijn's question and for completeness:
Actually, you could consider the instances of the particular (and related) classes in question as nodes in a tree: both the parent and the children remain connected, dependent on and cognizant of each other, and share a single unique root throughout any instance of Python (ensured by the package in use). Any state change in a particular node causes the others, and the database to which they are connected, to update themselves accordingly. Apart from that, I was curious to know how such a thing could be put in place (the traceback class was teasing me).
What you're doing is a bad idea; you shouldn't do it.
I'm sure there's another, better solution.
If you do decide to go your way anyway (you shouldn't), here's how you can create an object without using __init__():
Objects in Python are created with the __new__() method. The __init__() method only modifies the object that was already created by __new__(); for example, it usually initializes some attributes for the object.
When declaring something like x = Foo() what happens is this:
x = object.__new__(Foo) gets called first and creates the object.
Foo.__init__(x) gets called second, it simply initializes some attributes etc. to the already existing object.
This means that you are not required to call Foo() (and as a result, call __init__() too). Instead, you can just call __new__() directly:
class Foo(object):
    def __init__(self):
        raise TypeError("Cannot create 'Foo' instances.")
>>> x = object.__new__(Foo)
>>> x
<__main__.Foo object at 0x02B074F0>
Our x is now an instance of Foo (without any attributes, that is), and it can use any methods defined in the Foo class.
If you want, you can create your own replacement for __init__() to initialize attributes:
def init_foo(foo, name):
    foo.name = name
>>> init_foo(x, "Mike")
>>> x.name
'Mike'
This could of course be Foo's instance method too:
class Foo(object):
    def __init__(self):
        raise TypeError("Cannot create 'Foo' instances.")

    def init(self, name):
        self.name = name
>>> x = object.__new__(Foo)
>>> x.init("Mike")
>>> x.name
'Mike'
Going a step further, you can even use a classmethod to create your object with only one call:
class Foo(object):
    def __init__(self):
        raise TypeError("Cannot create 'Foo' instances.")

    @classmethod
    def new(cls, name):
        obj = object.__new__(cls)
        obj.name = name
        return obj
>>> x = Foo.new("Mike")
>>> x.name
'Mike'
I have a wrapper function that returns a function. Is there a way to programmatically set the docstring of the returned function? If I could write to __doc__ I'd do the following:
def wrapper(a):
    def add_something(b):
        return a + b
    add_something.__doc__ = 'Adds ' + str(a) + ' to `b`'
    return add_something
Then I could do
>>> add_three = wrapper(3)
>>> add_three.__doc__
'Adds 3 to `b`'
However, since __doc__ is read-only, I can't do that. What's the correct way?
Edit: OK, I wanted to keep this simple, but of course this is not what I'm actually trying to do. Even though __doc__ is in general writable, in my case it isn't.
I am trying to create testcases for unittest automatically. I have a wrapper function that creates a class object that is a subclass of unittest.TestCase:
import unittest
def makeTestCase(filename, my_func):
    class ATest(unittest.TestCase):
        def testSomething(self):
            # Running test in here with data in filename and function my_func
            data = loadmat(filename)
            result = my_func(data)
            self.assertTrue(result > 0)
    return ATest
If I create this class and try to set the docstring of testSomething I get an error:
>>> def my_func(): pass
>>> MyTest = makeTestCase('some_filename', my_func)
>>> MyTest.testSomething.__doc__ = 'This should be my docstring'
AttributeError: attribute '__doc__' of 'instancemethod' objects is not writable
An instancemethod gets its docstring from its __func__. Change the docstring of __func__ instead. (The __doc__ attribute of functions is writable.)
>>> class Foo(object):
...     def bar(self):
...         pass
...
>>> Foo.bar.__func__.__doc__ = "A super docstring"
>>> help(Foo.bar)
Help on method bar in module __main__:
bar(self) unbound __main__.Foo method
A super docstring
>>> foo = Foo()
>>> help(foo.bar)
Help on method bar in module __main__:
bar(self) method of __main__.Foo instance
A super docstring
From the 2.7 docs:
User-defined methods
A user-defined method object combines a class, a class instance (or None) and any callable object (normally a user-defined function).
Special read-only attributes: im_self is the class instance object, im_func is the function object; im_class is the class of im_self for bound methods or the class that asked for the method for unbound methods; __doc__ is the method’s documentation (same as im_func.__doc__); __name__ is the method name (same as im_func.__name__); __module__ is the name of the module the method was defined in, or None if unavailable.
Changed in version 2.2: im_self used to refer to the class that defined the method.
Changed in version 2.6: For 3.0 forward-compatibility, im_func is also available as __func__, and im_self as __self__.
I would pass the docstring into the factory function and use type to manually construct the class.
def make_testcase(filename, myfunc, docstring):
    def test_something(self):
        data = loadmat(filename)
        result = myfunc(data)
        self.assertTrue(result > 0)
    clsdict = {'test_something': test_something,
               '__doc__': docstring}
    return type('ATest', (unittest.TestCase,), clsdict)
MyTest = make_testcase('some_filename', my_func, 'This is a docstring')
This is an addition to the fact that the __doc__ attribute of classes of type type cannot be changed. The interesting point is that this is only true as long as the class is created using type. As soon as you use a metaclass you can actually just change __doc__.
The example uses the abc (abstract base class) module. It works by using the special ABCMeta metaclass:
import abc

class MyNewClass(object):
    __metaclass__ = abc.ABCMeta

MyNewClass.__doc__ = "Changing the docstring works !"
help(MyNewClass)
will result in
"""
Help on class MyNewClass in module __main__:
class MyNewClass(__builtin__.object)
| Changing the docstring works !
"""
Just use decorators. Here's your case:
def add_doc(value):
    def _doc(func):
        func.__doc__ = value
        return func
    return _doc
import unittest

def makeTestCase(filename, my_func):
    class ATest(unittest.TestCase):
        @add_doc('This should be my docstring')
        def testSomething(self):
            # Running test in here with data in filename and function my_func
            data = loadmat(filename)
            result = my_func(data)
            self.assertTrue(result > 0)
    return ATest
def my_func(): pass
MyTest = makeTestCase('some_filename', my_func)
print MyTest.testSomething.__doc__
> 'This should be my docstring'
Here's a similar use case: Python dynamic help and autocomplete generation
__doc__ is not writable only when your object is of type 'type'.
In your case, add_three is a function and you can just set __doc__ to any string.
In the case where you're trying to automatically generate unittest.TestCase subclasses, you may get more mileage out of overriding their shortDescription method.
This is the method that strips the underlying docstring down to the first line, as seen in normal unittest output; overriding it was enough to give us control over what showed up in reporting tools like TeamCity, which was what we needed.
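As a sketch of that approach (hypothetical names; shortDescription is the standard unittest.TestCase hook that normally returns the first line of the test method's docstring):

import unittest

def make_testcase(filename, my_func, description):
    class ATest(unittest.TestCase):
        def shortDescription(self):
            return description             # what test runners and reporting tools display

        def test_something(self):
            result = my_func(filename)     # stand-in for the question's loadmat-based body
            self.assertTrue(result > 0)
    return ATest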