I have some questions about inheriting from the Thread class.
import threading
import time

class MyThread(threading.Thread):
    def __init__(self, num):
        threading.Thread.__init__(self)
        self.num = num

    def run(self):
        print("Thread", self.num)
        time.sleep(1)
Why can't I override only the run method?
The Python documentation for threading mentions that "The Thread class represents an activity that is run in a separate thread of control. There are two ways to specify the activity: by passing a callable object to the constructor, or by overriding the run() method in a subclass."
Why does the above example code override the constructor also?
The constructor is overridden so that the num parameter can be passed from the place where the MyThread instance is created to the run method.
Note that you never call run directly, so you cannot pass any parameters to it unless you store them in the constructor.
If you do not need to pass parameters, you can override only the run method, as shown below.
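For example, a minimal sketch of a subclass that overrides only run (SimpleThread is just an illustrative name, not from the question):
import threading
import time

class SimpleThread(threading.Thread):
    def run(self):
        # No per-thread parameters are needed, so the inherited constructor is enough.
        print("Thread running")
        time.sleep(1)

t = SimpleThread()
t.start()   # start() arranges for run() to be called in a new thread
t.join()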
Given:
def my_decorator(func):
    def wrapper():
        print("Something is happening before the function is called.")
        # Some code that will execute the decorated method
        func()
    return wrapper

class parent:
    def parentMethod(self):
        pass

class child(parent):
    @my_decorator
    def childMethod(self):
        print("Child method called")

if __name__ == '__main__':
    childInstance = child()
Is there a way to execute the decorated method without calling it explicitly?
If that doesn't make sense: basically, I want every method in the child class that is decorated with @my_decorator to be executed when an instance of the child class is created.
If both questions above are invalid, is there a way to get the methods that are decorated with @my_decorator (the names of the methods) without having to call the decorated methods? As a parallel example, the same way you can get the name of the class that inherits from the parent class with self.__class__.__name__.
It can be done rather easily with the inspect module, by having the decorator add a custom attribute to the method, and using that in __init__:
import inspect

def my_decorator(func):
    print("Something is happening without the function being called.")
    func.deco = True
    return func

class parent:
    def parentMethod(self):
        pass

class child(parent):
    def __init__(self):
        for name, m in inspect.getmembers(self.__class__, inspect.isfunction):
            if hasattr(m, 'deco'):
                m(self)

    @my_decorator
    def childMethod(self):
        print("Child method called")

if __name__ == '__main__':
    childInstance = child()
It gives:
Something is happening without the function being called.
Child method called
The first message is displayed when the child class is defined, the second one when childInstance is initialized.
You can definitely execute every decorated method on instantiation with some clever meta-programming and reflection. However, this will be complex, difficult to debug, and quite painful to deal with when you (or your colleagues) come back to the code.
You're far better off just calling the methods explicitly in the __init__.
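A minimal sketch of that explicit approach, reusing the classes from the question (no decorator or reflection involved):
class parent:
    def parentMethod(self):
        pass

class child(parent):
    def __init__(self):
        # Call the "setup" methods explicitly instead of discovering them via reflection.
        self.childMethod()

    def childMethod(self):
        print("Child method called")

childInstance = child()   # prints "Child method called"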
If you really want to go down the route of introspecting the class, try these references:
How to get all methods of a python class with given decorator
Introspection to get decorator names on a method?
In the example below, when should I use each approach for inheritance? I think both are valid, so why do I sometimes have to use super() if the other way works as well?
class User:
    def __init__(self):
        self._user = "User A"

class UserA(User):
    _user = "User B"

    def __init__(self):
        super().__init__()

class UserB(User):
    pass
You are correct: both are valid. The difference is:
UserA: you are overriding the __init__ method of the ancestor. This is practical if you want to add something during the initialization process. However, you still want to initialize the ancestor, and this can be done via super().__init__(), despite having overridden the __init__ method.
UserB: you are fully using the __init__ of the ancestor you are inheriting from (by not overriding the __init__ method). This can be used if nothing extra needs to be done during initialization.
The super() builtin returns a proxy object (a temporary object of the superclass) that allows us to access methods of the base class. For example:
class Mammal(object):
    def __init__(self, mammalName):
        print(mammalName, 'is a warm-blooded animal.')

class Dog(Mammal):
    def __init__(self):
        print('Dog has four legs.')
        super().__init__('Dog')
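A quick usage example of the above (not in the original answer): Dog.__init__ runs first and then delegates to Mammal.__init__ via super():
d = Dog()
# Output:
# Dog has four legs.
# Dog is a warm-blooded animal.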
self represents the instance of the class. By using self we can access the attributes and methods of the class in Python.
I'm using the DroneKit API in Python for controlling a drone using a companion computer. I'm trying to create a class, Vehicle, which inherits from the Vehicle class in DroneKit. The purpose of this class is for me to override some methods present in DroneKit that don't work with PX4 as well as adding a few methods of my own, whilst still having access to all of the methods available by default.
The issue is that you don't create a Vehicle object directly using DroneKit – you call the connect() function, which returns a Vehicle object.
My question is, how do I create an instance of my class?
The accepted method seems to be to call the parent's __init__(), like so:
class Vehicle(dronekit_Vehicle):
    def __init__(self, stuff):
        dronekit_Vehicle.__init__(self, stuff)
But like I said, you don't create a Vehicle object directly in DroneKit, e.g. vehicle = Vehicle(stuff); instead you call vehicle = connect(stuff), which eventually returns a Vehicle object but also does a bunch of other stuff.
The only way I can think of is
class Vehicle(dronekit_Vehicle):
    def __init__(self, stuff):
        self.vehicle = connect(stuff)
And then having to use self.vehicle.function() to access the default DroneKit commands and attributes, which is a huge pain.
How do I make this work?
The way objects are defined has nothing to do with connect. connect is merely a convenience function that wraps some logic around the object creation:
def connect(...):
    handler = MAVConnection(...)
    return Vehicle(handler)
with Vehicle.__init__() being defined as
def __init__(self, handler):
    super(Vehicle, self).__init__()
    self._handler = handler
    ...
So as long as you pass on the handler in your constructor:
import dronekit

class MyVehicle(dronekit.Vehicle):
    def __init__(self, handler):
        super(MyVehicle, self).__init__(handler)
Your class will work with connect():
connect(..., vehicle_class=MyVehicle)
This is a feature I miss in several languages, and I wonder if anyone has any idea how it can be done in Python.
The idea is that I have a base class:
class Base(object):
    def __init__(self):
        self.my_data = 0

    def my_rebind_function(self):
        pass
and a derived class:
class Child(Base):
    def __init__(self):
        super().__init__()
        # Do some stuff here
        self.my_rebind_function()  # <==== This is the line I want to get rid of

    def my_rebind_function(self):
        # Do stuff with self.my_data
        pass
As can be seen above, I have a rebind function which I want called after Child.__init__ has done its job. And I want this done for all inherited classes, so it would be great if it were handled by the base class, so I do not have to retype that line in every child class.
It would be nice if the language had something like a __finally__ method, operating similarly to how finally works with exceptions: it would run after all __init__ methods (of all derived classes) have been run. The call order would then be something like:
Base1.__init__()
...
BaseN.__init__()
LeafChild.__init__()
LeafChild.__finally__()
BaseN.__finally__()
...
Base1.__finally__()
And then object construction is finished. This is also kind of similar to unit testing with setup, run and teardown functions.
You can do this with a metaclass, like this:
class Meta(type):
    def __call__(cls, *args, **kwargs):
        print("start Meta.__call__")
        instance = super().__call__(*args, **kwargs)
        instance.my_rebind_function()
        print("end Meta.__call__\n")
        return instance

class Base(metaclass=Meta):
    def __init__(self):
        print("Base.__init__()")
        self.my_data = 0

    def my_rebind_function(self):
        pass

class Child(Base):
    def __init__(self):
        super().__init__()
        print("Child.__init__()")

    def my_rebind_function(self):
        print("Child.my_rebind_function")
        # Do stuff with self.my_data
        self.my_data = 999

if __name__ == '__main__':
    c = Child()
    print(c.my_data)
By overriding the metaclass's __call__ you can hook in after all __init__ (and __new__) methods of the class tree have run and before the instance is returned. This is the place to call your rebind function. To make the call order clear I added some print statements. The output looks like this:
start Meta.__call__
Base.__init__()
Child.__init__()
Child.my_rebind_function
end Meta.__call__
999
If you want to read on and get deeper into the details, I can recommend the following great article: https://blog.ionelmc.ro/2015/02/09/understanding-python-metaclasses/
I may still not fully understand, but this seems to do what (I think) you want:
class Base(object):
    def __init__(self):
        print("Base.__init__() called")
        self.my_data = 0
        self.other_stuff()
        self.my_rebind_function()

    def other_stuff(self):
        """ empty """

    def my_rebind_function(self):
        """ empty """

class Child(Base):
    def __init__(self):
        super(Child, self).__init__()

    def other_stuff(self):
        print("In Child.other_stuff() doing other stuff I want done in Child class")

    def my_rebind_function(self):
        print("In Child.my_rebind_function() doing stuff with self.my_data")

child = Child()
Output:
Base.__init__() called
In Child.other_stuff() doing other stuff I want done in Child class
In Child.my_rebind_function() doing stuff with self.my_data
If you want a “rebind” function to be invoked after each instance of a type that inherits from Base is instantiated, then I would say this “rebind” function can live outside the Base class (or any class inheriting from it).
You can have a factory function that gives you the object you need when you invoke it (for example give_me_a_processed_child_object()). This factory function basically instantiates an object and does something to it before returning it to you.
Putting logic in __init__ is not a good idea because it obscures logic and intent. When you write kid = Child(), you don't expect many things to happen in the background, especially things that act on the instance of Child you just created. What you expect is a fresh instance of Child.
A factory function, however, transparently does something to an object and returns it to you. This way you know you're getting an already processed instance.
Finally, you wanted to avoid adding “rebind” methods to your Child classes, which you now can, since all that logic can be placed in your factory function.
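A minimal sketch of such a factory, reusing Base and Child from the question and the give_me_a_processed_child_object name from above:
def give_me_a_processed_child_object():
    # Build the instance first, then run the post-construction step on it.
    obj = Child()
    obj.my_rebind_function()
    return obj

kid = give_me_a_processed_child_object()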
How can one prevent accidentally overriding a superclass's key attribute or method? For example, I defined a class that subclasses threading.Thread and declared two datetime attributes called start and end. The start attribute shadowed the Thread.start() method, and that took me a while to discover.
code example:
import threading

class b(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        self.start = None
        self.end = None

    def run(self):
        print('thread started')
The assignment self.start = None shadowed threading.Thread.start(), so calling start() on the thread throws an exception since start is None.
Is there a Python coding style that avoids this type of error (accidentally overriding a base class's attribute or method)?
I am new to Python and can see myself making this type of mistake again.
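One simple way to see the clash (just an illustration, not necessarily the idiomatic safeguard) is to check the base class's namespace with dir() before choosing attribute names:
import threading

# 'start' already exists on the base class, so reusing that name shadows the method:
print('start' in dir(threading.Thread))       # True  -> would shadow Thread.start()
print('begin_time' in dir(threading.Thread))  # False -> safe to use as an attribute name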