Class A has an attribute that is an instance of another class, B.
class A:
    def __init__(self, b):
        self.b = b

    def get_b_secret(self, job_id):
        x, y = self.b.get_secret(job_id)
        return x, y
class B:
    def __init__(self, key, url):
        self.key = key
        self.url = url

    def get_secret(self, job_id):
        # logic to get secret
        # return a tuple
        return x, y
I want to write a unit test for method get_b_secret of class A by mocking B class as a whole.
@patch('b.B')
def test_get_b_secret(self, mock_b):
    mock_b.b.get_secret.return_value = ('x', 'y')
    obj = A(mock_b)
    expected = ('x', 'y')
    self.assertEqual(obj.get_b_secret('001'), expected)
I realized that by mocking class B, I am not really instantiating B into an instance inside A's instance. That's why when I debug the test, A's get_b_secret is returning a MagicMock object instead.
I found this article about mocking objects. But in that example, the outer class's __init__ doesn't take the inner class object as an argument, so it is a little different. What is the best way to do it?
If your example code is correct, then you don't need to mock class B at all. You just need to pass a mock with a get_secret function into class A when you initialize it.
mock = MagicMock()
mock.get_secret.return_value = ('x', 'y')
obj = A(mock)
....
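Putting that together, a complete, runnable test might look like this (a sketch; class A is reproduced from the question so the snippet is self-contained):

```python
from unittest.mock import MagicMock

class A:
    def __init__(self, b):
        self.b = b

    def get_b_secret(self, job_id):
        x, y = self.b.get_secret(job_id)
        return x, y

# no need to patch class B: hand A a mock that behaves like a B instance
mock_b = MagicMock()
mock_b.get_secret.return_value = ('x', 'y')

obj = A(mock_b)
result = obj.get_b_secret('001')
print(result)  # ('x', 'y')
mock_b.get_secret.assert_called_once_with('001')
```

Because A only ever calls self.b.get_secret, any object exposing that method works, and the MagicMock also lets you assert how it was called.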
I have two Python classes
class A:
    """
    This is a class retaining some constants
    """
    C = 1

class B:
    VAR = None

    def __init__(self):
        B.VAR = A

    def f(self, v=VAR):
        print(v.C)

clb = B()
clb.f()
AttributeError: 'NoneType' object has no attribute 'C'
So what I am trying to do is populate the B::VAR class variable in B::__init__() with a reference to class A, and after that, in B::f(), have access to A::C through the default argument v (which retains VAR).
I intend to use v as a default value for the code inside B::f() and, if needed, change it when calling the function.
Is my scenario possible?
Thank you,
Yes, this is possible:
class A:
    """
    This is a class retaining some constants
    """
    C = 1

class B:
    VAR = None

    def __init__(self):
        self.VAR = A

    def f(self, v=None):
        if v is None:
            v = self.VAR
        print(v.C)

clb = B()
clb.f()
Your issue is that the default argument v=VAR is an old reference to B.VAR, which is None, not the updated value on the object clb.
The old version of f() has a default value for v that points to None, because default arguments are computed once, when the method is defined, that is, when the class B statement runs, before any clb instance exists; at that point VAR is a class attribute still set to None.
My suggestion is to look v up at runtime through self, whose VAR is changed in __init__ to A.
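The evaluation-time rule is easy to demonstrate in isolation (a minimal example, separate from the classes above):

```python
VALUE = None

def f(v=VALUE):   # the default is captured NOW, while VALUE is still None
    return v

VALUE = 42        # rebinding the module-level name later changes nothing

print(f())        # None -- the stale default
print(f(VALUE))   # 42 -- passing the current value explicitly works
```

This is the same mechanism that makes mutable default arguments a classic pitfall: the default object is created exactly once, at def time.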
class A:
    C = 1

class B:
    VAR = None

    def __init__(self):
        B.VAR = A

    @classmethod
    def f(cls):
        print(cls.VAR.C)

clb = B()
clb.f()
This is another way to do it. However, I'm wondering what it is you're actually trying to do, because this seems really strange
Being new to OOP, I wanted to know if there is any way of inheriting one of multiple classes based on how the child class is called in Python. The reason I am trying to do this is because I have multiple methods with the same name but in three parent classes which have different functionality. The corresponding class will have to be inherited based on certain conditions at the time of object creation.
For example, I tried to make Class C inherit A or B based on whether any arguments were passed at the time of instantiating, but in vain. Can anyone suggest a better way to do this?
class A:
    def __init__(self, a):
        self.num = a

    def print_output(self):
        print('Class A is the parent class, the number is', self.num)

class B:
    def __init__(self):
        self.digits = []

    def print_output(self):
        print('Class B is the parent class, no number given')

class C(A if kwargs else B):  # NameError: kwargs does not exist at class-definition time
    def __init__(self, **kwargs):
        if kwargs:
            super().__init__(kwargs['a'])
        else:
            super().__init__()

temp1 = C(a=7)
temp2 = C()
temp1.print_output()
temp2.print_output()
The required output would be 'Class A is the parent class, the number is 7' followed by 'Class B is the parent class, no number given'.
Thanks!
Whether you're just starting out with OOP or have been doing it for a while, I would suggest you get a good book on design patterns. A classic is Design Patterns by Gamma, Helm, Johnson, and Vlissides.
Instead of using inheritance, you can use composition with delegation. For example:
class A:
    def do_something(self):
        pass  # some implementation

class B:
    def do_something(self):
        pass  # some implementation

class C:
    def __init__(self, use_A):
        # assign an instance of A or B depending on whether argument use_A is True
        self.instance = A() if use_A else B()

    def do_something(self):
        # delegate to the A or B instance:
        self.instance.do_something()
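With trivial bodies filled in (the return strings here are placeholders, not from the original answer), the pattern behaves like this:

```python
class A:
    def do_something(self):
        return 'A did it'

class B:
    def do_something(self):
        return 'B did it'

class C:
    def __init__(self, use_A):
        # hold an instance of A or B rather than inheriting from either
        self.instance = A() if use_A else B()

    def do_something(self):
        # delegate to whichever instance was chosen at construction time
        return self.instance.do_something()

print(C(True).do_something())   # A did it
print(C(False).do_something())  # B did it
```

The choice of behavior is made per object at construction time, which is exactly what the conditional-inheritance attempt was after.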
Update
In response to a comment made by Lev Barenboim, the following demonstrates how composition with delegation can be made to look more like regular inheritance. If class C has assigned an instance of class A to self.instance, then attributes of A such as x can be accessed internally as self.x as well as self.instance.x (assuming class C does not define attribute x itself). Likewise, if you create an instance of C named c, you can refer to that attribute as c.x, as if class C had inherited from class A.
The basis for doing this lies with the builtin methods __getattr__ and __getattribute__. __getattr__ can be defined on a class and will be called whenever an attribute is referenced but not found by the normal lookup. __getattribute__ can be called on an object to retrieve an attribute by name.
Note that in the following example, class C no longer even has to define method do_something if all it does is delegate to self.instance:
class A:
    def __init__(self, x):
        self.x = x

    def do_something(self):
        print('I am A')

class B:
    def __init__(self, x):
        self.x = x

    def do_something(self):
        print('I am B')

class C:
    def __init__(self, use_A, x):
        # assign an instance of A or B depending on whether argument use_A is True
        self.instance = A(x) if use_A else B(x)

    # called when an attribute is not found:
    def __getattr__(self, name):
        # assume it is implemented by self.instance
        return self.instance.__getattribute__(name)

    # something unique to class C:
    def foo(self):
        print('foo called: x =', self.x)

c = C(True, 7)
print(c.x)
c.foo()
c.do_something()
# This will throw an exception:
print(c.y)
Prints:
7
foo called: x = 7
I am A
Traceback (most recent call last):
File "C:\Ron\test\test.py", line 34, in <module>
print(c.y)
File "C:\Ron\test\test.py", line 23, in __getattr__
return self.instance.__getattribute__(name)
AttributeError: 'A' object has no attribute 'y'
I don't think you can make the base class depend on values passed to the class from inside itself.
Rather, you can define a factory method like this :
class A:
    def sayClass(self):
        print("Class A")

class B:
    def sayClass(self):
        print("Class B")

def make_C_from_A_or_B(make_A):
    class C(A if make_A else B):
        def sayClass(self):
            super().sayClass()
            print("Class C")
    return C()

make_C_from_A_or_B(True).sayClass()
which outputs:
Class A
Class C
Note: You can find information about the factory pattern, with an example I found good enough, in this article (about a parser factory).
I have this class, Foo, whose functions use dataframes I initialize in its constructor. I want to test its functions in my test class, FooTest.
from src.shared.utils import get_spark_dataframe

class Foo(object):
    def __init__(self, x, y):
        self.a = get_spark_dataframe(x, y.some_db, "table_a")
        self.b = get_spark_dataframe(x, y.some_db, "table_b")

    def some_foo_function(self):
        return self.a.join(self.b, ['pk'])
I want to mock out this get_spark_dataframe function and replace it with my own, since I'm merely interested in replacing the dataframes in the class with fake dataframes I define in my test class.
def get_spark_dataframe(x, db_name, table_name):
    return x.get_table(db=db_name, table=table_name).toDF()
This is vaguely what my test class looks like:
class FooTest(PysparkTest):
    def setUp(self):
        self.a_df = self.spark.createDataFrame([Row(...)])
        self.b_df = self.spark.createDataFrame([Row(...)])
        self.x = None
        self.y = None

    def mock_get_spark_dataframe(self, x, db_name, table_name):
        if table_name == "table_a":
            return self.a_df
        elif table_name == "table_b":
            return self.b_df

    @patch('src.shared.utils.get_spark_dataframe', side_effect=mock_get_spark_dataframe)
    def test_some_foo_function(self, mock_get_spark_dataframe):
        foo = Foo(self.x, self.y)
        return_value = foo.some_foo_function()
        ...
Am I doing something incorrectly? It doesn't seem like my mock function is being used when I create the Foo object. The real get_spark_dataframe function seems to be getting called, and it complains about x being None. Am I using side_effect incorrectly?
Try these changes in your code. Because Foo's module imports the function by name (from src.shared.utils import get_spark_dataframe), patching src.shared.utils.get_spark_dataframe does not replace the name already bound in Foo's module. Switching to a qualified lookup defers the attribute access to call time, so the patch takes effect:

import src.shared.utils as utils

class Foo(object):
    def __init__(self, x, y):
        self.a = utils.get_spark_dataframe(x, ...
        ...

and

class FooTest(PysparkTest):
    ...
    @patch('src.shared.utils.get_spark_dataframe', ...
    ...
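The underlying rule is "patch where the name is looked up". Here is a self-contained sketch of the same idea (utils below is a stand-in namespace, not the real src.shared.utils module, and the fake return values are placeholders):

```python
from unittest import mock

class utils:
    """Stand-in for the src.shared.utils module."""
    @staticmethod
    def get_spark_dataframe(x, db_name, table_name):
        raise RuntimeError("real implementation; must not run in tests")

class Foo:
    def __init__(self, x, y):
        # qualified lookup: the attribute is resolved at call time,
        # so patching utils.get_spark_dataframe takes effect here
        self.a = utils.get_spark_dataframe(x, "some_db", "table_a")
        self.b = utils.get_spark_dataframe(x, "some_db", "table_b")

fakes = {"table_a": "fake_a_df", "table_b": "fake_b_df"}
with mock.patch.object(utils, "get_spark_dataframe",
                       side_effect=lambda x, db, t: fakes[t]):
    foo = Foo(None, None)

print(foo.a, foo.b)  # fake_a_df fake_b_df
```

If the real implementation had been called, the RuntimeError would fire; instead the side_effect routes each table name to its fake frame.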
I'm trying to verify that the implementation of Base.run_this calls the methods of derived class (derived_method_[1st|2nd|3rd]) in correct order. As the output shows, the test is not working. How can I fix this?
class Base(object):
    __metaclass__ = abc.ABCMeta

    def __init__(self, parameters):
        self.parameters = parameters

    @abc.abstractmethod
    def must_implement_this(self):
        return

    def run_this(self):
        self.must_implement_this()
        if self.parameters:
            first = getattr(self, "derived_method_1st")
            first()
            second = getattr(self, "derived_method_2nd")
            second()
            third = getattr(self, "derived_method_3rd")
            third()

class Derived(Base):
    def must_implement_this(self):
        pass

    def derived_method_1st(self):
        pass

    def derived_method_2nd(self):
        pass

    def derived_method_3rd(self):
        pass

mocked = MagicMock(wraps=Derived(True))
mocked.run_this()
mocked.assert_has_calls([call.derived_method_1st(), call.derived_method_2nd(), call.derived_method_3rd()])
Output
AssertionError: Calls not found.
Expected: [call.derived_method_1st(), call.derived_method_2nd(), call.derived_method_3rd()]
Actual: [call.run_this()]
wraps doesn't work well with instances. What happens here is that mocked.run_this returns a new mock object that 'wraps' Derived(True).run_this, where the latter is a method bound to the original Derived() instance.
As such, that method will call self.derived_method_* methods that are bound to that original instance, not to the mock.
You could patch in the run_this method on a spec mock instead:
mock = MagicMock(spec=Derived)
instance = mock()
instance.run_this = Derived.run_this.__get__(instance) # bind to mock instead
instance.parameters = True # adjust as needed for the test
instance.run_this()
Demo:
>>> mock = MagicMock(spec=Derived)
>>> instance = mock()
>>> instance.run_this = Derived.run_this.__get__(instance) # bind to mock instead
>>> instance.parameters = True # adjust as needed for the test
>>> instance.run_this()
>>> instance.mock_calls
[call.must_implement_this(),
call.derived_method_1st(),
call.derived_method_2nd(),
call.derived_method_3rd()]
I would like to create a class whose f method depends on the "mode" the object of the class was created in.
The code below doesn't work, but I hope it gives you an idea of what I am trying to do. My idea is to have a dictionary in which I define the settings for each mode (in this case, the function or method to assign to self.f), so that rather than using many if/elif statements in the __init__ function I just assign the correct values using the dictionary.
class A(object):
    _methods_dict = {
        'a': A.f1,
        'b': A.f2
    }

    def __init__(self, mode='a'):
        self.f = _methods_dict[mode]

    def f1(self, x):
        return x

    def f2(self, x):
        return x**2
I can't figure out why this does not work; how would you fix it?
Also, are there better (and more Pythonic) approaches to get the same kind of functionality?
Store the name of the two functions, then use getattr() to retrieve the bound method in __init__:
class A(object):
    _methods_dict = {
        'a': 'f1',
        'b': 'f2'
    }

    def __init__(self, mode='a'):
        self.f = getattr(self, self._methods_dict[mode])

    def f1(self, x):
        return x

    def f2(self, x):
        return x ** 2
Alternatively, just proxy the method:
class A(object):
    _methods_dict = {
        'a': 'f1',
        'b': 'f2'
    }

    def __init__(self, mode='a'):
        self._mode = mode

    @property
    def f(self):
        return getattr(self, self._methods_dict[self._mode])

    def f1(self, x):
        return x

    def f2(self, x):
        return x ** 2
The f property just returns the correct bound method for the current mode. Using a property simplifies call signature handling, and gives users the actual method to introspect if they so wish.
Either method has the same end-result:
>>> a1 = A()
>>> a2 = A('b')
>>> a1.f(10)
10
>>> a2.f(10)
100
The difference lies in what is stored in the instance; the first method stores bound methods:
>>> vars(a1)
{'f': <bound method A.f1 of <__main__.A object at 0x10aa1ec50>>}
>>> vars(a2)
{'f': <bound method A.f2 of <__main__.A object at 0x10aa1ed50>>}
versus just the mode string in the other:
>>> vars(a1)
{'_mode': 'a'}
>>> vars(a2)
{'_mode': 'b'}
That may not seem like much of a difference, but the latter method creates instances that can be pickled and deep-copied without problems.
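To illustrate that last point, here is the property-based variant (reusing the class from the second snippet above) round-tripping through pickle and deepcopy; this works because the instance dict holds only a plain string:

```python
import copy
import pickle

class A(object):
    _methods_dict = {'a': 'f1', 'b': 'f2'}

    def __init__(self, mode='a'):
        self._mode = mode

    @property
    def f(self):
        return getattr(self, self._methods_dict[self._mode])

    def f1(self, x):
        return x

    def f2(self, x):
        return x ** 2

a2 = A('b')
restored = pickle.loads(pickle.dumps(a2))  # only the '_mode' string is serialized
print(restored.f(10))                      # 100
print(copy.deepcopy(a2).f(10))             # 100
```

The bound-method variant would fail here, since bound methods stored in the instance dict are not picklable.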
You could just make two separate classes:
class Base(object):
    # place here all attributes shared in common among the Modes
    pass

class ModeA(Base):
    def f(self, x):
        return x

class ModeB(Base):
    def f(self, x):
        return x**2

def make_mode(mode, *args, **kwargs):
    mode_dict = {'a': ModeA, 'b': ModeB}
    return mode_dict[mode](*args, **kwargs)
a = make_mode('a')
print(a.f(10))
# 10
b = make_mode('b')
print(b.f(10))
# 100
To answer your first question ("why does this not work"): the class object A is only created and bound to the module name A after the whole class statement (yes, class is an executable statement) block has ended, so you cannot refer to either the name or the class object itself within that block.