Python Mixin - Unresolved Attribute Reference [PyCharm]

I am using a mixin to separate a range of functionality into a different class. This mixin is only supposed to be mixed into one specific child class:
class Mixin:
    def complex_operation(self):
        return self.foo.capitalize()

class A(Mixin):
    def __init__(self):
        self.foo = 'foo'
In my method Mixin.complex_operation, PyCharm gives the warning 'Unresolved Attribute Reference foo'.
Am I using the mixin pattern correctly? Is there a better way? (I would like to have type hints and autocompletion in my mixins, and I would like to have multiple mixins.)

Declare the necessary fields in the Mixin like:
class Mixin:
    foo: str

    def complex_operation(self):
        return self.foo.capitalize()
This way the mixin actually declares the fields a class must have in order to use it. The type hint will also produce warnings if an extending class puts an incompatible type into the declared field.
edit: Replaced foo = None with foo: str as suggested by @valex
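For illustration, a minimal sketch of how a type checker such as PyCharm or mypy reacts once the annotation is in place (class B and the int value are made up):
class A(Mixin):
    def __init__(self):
        self.foo = 'foo'  # fine: matches the declared 'foo: str'

class B(Mixin):
    def __init__(self):
        self.foo = 42     # flagged: int is not compatible with 'foo: str'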

I see a few options.
1) Type annotations (I think this is the cleanest solution):
class Mixin:
    foo: str

    def complex_operation(self):
        return self.foo.capitalize()
2) Default None (@ikamen's option):
class Mixin:
    foo = None

    def complex_operation(self):
        return self.foo.capitalize()
3) Suppress the unresolved reference error for the class or for a specific line (I think this is dirtier than the first two):
# noinspection PyUnresolvedReferences
class Mixin:
    def complex_operation(self):
        return self.foo.capitalize()

class Mixin:
    def complex_operation(self):
        # noinspection PyUnresolvedReferences
        return self.foo.capitalize()
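Since the question also mentions wanting several mixins: the annotation approach composes naturally, with each mixin declaring only the attributes it relies on. A minimal sketch (FooMixin, BarMixin, and bar are made-up names for illustration):
class FooMixin:
    foo: str

    def shout_foo(self) -> str:
        return self.foo.upper()

class BarMixin:
    bar: int

    def double_bar(self) -> int:
        return self.bar * 2

class A(FooMixin, BarMixin):
    def __init__(self) -> None:
        self.foo = 'foo'
        self.bar = 21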

So, just to compile my thoughts from the comments for everyone else:
The problem is keeping the two classes intrinsically connected while separating functionality. Here are my solutions:
1) Make a module
Have another file, say mixin.py, that has complex_operation as a function. Instead of accepting self as a parameter, have it take a string:
# mixin.py
def complex_operation(foo: str) -> str: return foo.capitalize()

# main.py
from mixin import complex_operation

class A:
    def __init__(self): self.foo = "foo"

print(complex_operation(A().foo))
2) Make a class to accept another class as a parameter
In Mixin's __init__ function, add a parameter to accept an A, and then use that in its methods:
# mixin.py
class Mixin:
    def __init__(self, a: "A"): self.a = a  # "A" as a string, since the class lives in main.py
    def complex_operation(self): return self.a.foo.capitalize()

# main.py
from mixin import Mixin

class A:
    def __init__(self): self.foo = "foo"

print(Mixin(A()).complex_operation())

Related

Python: creating a class instance via static method vs class method

Let's say I have a class and would like to implement a method which creates an instance of that class. What I have is 2 options:
static method,
class method.
An example:
class DummyClass:
    def __init__(self, json):
        self.dict = json

    @staticmethod
    def from_json_static(json):
        return DummyClass(json)

    @classmethod
    def from_json_class(cls, json):
        return cls(json)
Both of the methods work:
dummy_dict = {"dummy_var": 124}
dummy_instance = DummyClass({"test": "abc"})
dummy_instance_from_static = dummy_instance.from_json_static(dummy_dict)
print(dummy_instance_from_static.dict)
> {'dummy_var': 124}
dummy_instance_from_class = DummyClass.from_json_class(dummy_dict)
print(dummy_instance_from_class.dict)
> {'dummy_var': 124}
What I often see in other people's code is the classmethod design instead of the staticmethod one. Why is this the case?
Or, rephrasing the question to possibly get a more comprehensive answer: what are the pros and cons of creating a class instance via classmethod vs staticmethod in Python?
Two big advantages of the @classmethod approach:
First, you don't hard-code the name. Given modern refactoring tools in IDEs, this isn't as big of a deal, but it is nice not to have your code break if you change the name of your Foo class to Bar:
class Bar:
    @staticmethod
    def make_me():
        return Foo()
Another advantage (at least, you should understand the difference!) is how this behaves with inheritance:
class Foo:
    @classmethod
    def make_me_cm(cls):
        return cls()

    @staticmethod
    def make_me_sm():
        return Foo()

class Bar(Foo):
    pass
print(Bar.make_me_cm()) # it's a Bar instance
print(Bar.make_me_sm()) # it's a Foo instance
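Applied to the from_json example from the question, the same difference shows up (SpecialDummyClass is a made-up subclass for illustration):
class SpecialDummyClass(DummyClass):
    pass

print(type(SpecialDummyClass.from_json_class({"x": 1})))   # SpecialDummyClass
print(type(SpecialDummyClass.from_json_static({"x": 1})))  # DummyClass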

Mock class instances without calling `__init__` and mock their respective attributes

I have a class MyClass with a complex __init__ function.
This class has a method my_method(self) which I would like to test.
my_method only needs attribute my_attribute from the class instance.
Is there a way I can mock class instances without calling __init__ and by setting the attributes of each class instance instead?
What I have:
# my_class.py
from utils import do_something

class MyClass(object):
    def __init__(self, *args, **kwargs):
        # complicated function which I would like to bypass when initiating a mocked instance class
        pass

    def my_method(self):
        return do_something(self.my_attribute)
What I tried
import my_class
from unittest import mock

@mock.patch("my_class.MyClass")
def test_my_method(class_mock, attribute):
    instance = class_mock.return_value
    instance.my_attribute = attribute
    example_instance = my_class.MyClass()
    out_my_method = example_instance.my_method()
    # then perform some assertions on `out_my_method`
However, this still makes use of __init__, which I hope we can bypass or mock.
As I mentioned in the comments, one way to test a single method without having to create an instance is:
MyClass.my_method(any_object_with_my_attribute)
The problem with this, as with both options in quamrana's answer, is that we have now expanded the scope of any future change just because of the tests. If a change to my_method requires access to an additional attribute, we now have to change both the implementation and something else (the SuperClass, the MockMyClass, or in this case any_object_with_my_attribute_and_another_one).
Let's have a more concrete example:
import json

class MyClass:
    def __init__(self, filename):
        with open(filename) as f:
            data = json.load(f)
        self.foo = data["foo"]
        self.bar = data["bar"]
        self.baz = data["baz"]

    def my_method(self):
        return self.foo ** 2
Here, any test that requires an instance of MyClass is painful because of the file access in __init__. A more testable implementation splits apart the detail of how the data is accessed and the initialisation of a valid instance:
class MyClass:
    def __init__(self, foo, bar, baz):
        self.foo = foo
        self.bar = bar
        self.baz = baz

    def my_method(self):
        return self.foo ** 2

    @classmethod
    def from_json(cls, filename):
        with open(filename) as f:
            data = json.load(f)
        return cls(data["foo"], data["bar"], data["baz"])
You have to refactor MyClass("path/to/file") to MyClass.from_json("path/to/file"), but wherever you already have the data (e.g. in your tests) you can use e.g. MyClass(1, 2, 3) to create the instance without requiring a file (you only need to consider the file in the tests of from_json itself). This makes it clearer what the instance actually needs, and allows the introduction of other ways to construct an instance without changing the interface.
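With that refactoring, a test for my_method needs neither a file nor any mocking; a minimal sketch:
def test_my_method():
    instance = MyClass(foo=3, bar=None, baz=None)
    assert instance.my_method() == 9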
There are at least two options I can see:
Extract a super class:
class SuperClass:
    def __init__(self, attribute):
        self.my_attribute = attribute

    def my_method(self):
        return do_something(self.my_attribute)

class MyClass(SuperClass):
    def __init__(self, *args, **kwargs):
        super().__init__(attribute)  # I don't know where attribute comes from
        # complicated function which I would like to bypass when initiating a mocked instance class
Your tests can instantiate SuperClass and call my_method().
Inherit from MyClass as is and make your own simple __init__():
class MockMyClass(MyClass):
    def __init__(self, attribute):
        self.my_attribute = attribute
Now your test code can instantiate MockMyClass with the required attribute and call my_method()
For instance, you could write the test as follows
def test_my_method(attribute):
    class MockMyClass(MyClass):
        def __init__(self, attribute):
            self.my_attribute = attribute

    out_my_method = MockMyClass(attribute).my_method()
    # perform assertions on out_my_method
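If you really want to create an instance without running __init__ at all, rather than subclassing or refactoring, one common sketch is to allocate the object directly with __new__ and set the attribute by hand. This bypasses whatever invariants __init__ would normally establish, so it is best confined to tests:
def test_my_method(attribute):
    instance = MyClass.__new__(MyClass)  # allocate without running __init__
    instance.my_attribute = attribute
    out_my_method = instance.my_method()
    # perform assertions on out_my_method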

super not working with class decorators?

Let's define a simple class decorator function, which creates a subclass and only adds 'Dec' to the original class name:
def decorate_class(klass):
    new_class = type(klass.__name__ + 'Dec', (klass,), {})
    return new_class
Now apply it on a simple subclass definition:
class Base(object):
    def __init__(self):
        print 'Base init'

@decorate_class
class MyClass(Base):
    def __init__(self):
        print 'MyClass init'
        super(MyClass, self).__init__()
Now, if you try to instantiate the decorated MyClass, it ends up in an infinite loop:
c = MyClass()
# ...
# File "test.py", line 40, in __init__
# super(MyClass, self).__init__()
# RuntimeError: maximum recursion depth exceeded while calling a Python object
It seems super can't handle this case and does not skip the current class in the inheritance chain.
The question: how do I correctly use a class decorator on classes that use super?
Bonus question: how do I get the final class from the proxy object created by super? I.e. from the expression super(Base, self).__init__, determine the parent class that defines the __init__ being called.
If you just want to change the class's .__name__ attribute, make a decorator that does that.
from __future__ import print_function

def decorate_class(klass):
    klass.__name__ += 'Dec'
    return klass
class Base(object):
    def __init__(self):
        print('Base init')

@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super(MyClass, self).__init__()

c = MyClass()
cls = c.__class__
print(cls, cls.__name__)
Python 2 output
MyClass init
Base init
<class '__main__.MyClassDec'> MyClassDec
Python 3 output
MyClass init
Base init
<class '__main__.MyClass'> MyClassDec
Note the difference in the repr of cls. (I'm not sure why you'd want to change a class's name though, it sounds like a recipe for confusion, but I guess it's ok for this simple example).
As others have said, an @decorator isn't intended to create a subclass. You can do it in Python 3 by using the arg-less form of super (i.e., super().__init__()). And you can make it work in both Python 3 and Python 2 by explicitly supplying the parent class rather than using super.
from __future__ import print_function

def decorate_class(klass):
    name = klass.__name__
    return type(name + 'Dec', (klass,), {})

class Base(object):
    def __init__(self):
        print('Base init')

@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        Base.__init__(self)

c = MyClass()
cls = c.__class__
print(cls, cls.__name__)
Python 2 & 3 output
MyClass init
Base init
<class '__main__.MyClassDec'> MyClassDec
Finally, if we just call decorate_class using normal function syntax rather than as an @decorator, we can use super.
from __future__ import print_function

def decorate_class(klass):
    name = klass.__name__
    return type(name + 'Dec', (klass,), {})

class Base(object):
    def __init__(self):
        print('Base init')

class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super(MyClass, self).__init__()

MyClassDec = decorate_class(MyClass)

c = MyClassDec()
cls = c.__class__
print(cls, cls.__name__)
The output is the same as in the last version.
Since your decorator returns an entirely new class with a different name, the original class no longer exists under the name MyClass at all. This is not the case class decorators are intended for. They are intended to add functionality to an existing class, not to replace it outright with some other class.
Still, if you are using Python 3, the solution is simple:
@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super().__init__()
Otherwise, I doubt there is any straightforward solution; you just need to change your implementation. When you rename the class, you need to rewrite __init__ as well to use the new name.
The problem is that your decorator creates a subclass of the original one. That means that super(MyClass, self) now points to... the original class itself!
I cannot even explain how the zero-argument form of super manages to do the job in Python 3; I could not find anything explicit in the reference manual. I assume it must use the class in which it appears at the time of definition. But I cannot imagine a way to get that result in Python 2.
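For what it's worth, a small Python 3 sketch of why the zero-argument form works: the compiler gives any method that uses super() or __class__ an implicit __class__ closure cell bound to the class created by the class statement, so rebinding the module-level name to the decorated subclass does not affect it.
def decorate_class(klass):
    return type(klass.__name__ + 'Dec', (klass,), {})

class Base:
    def __init__(self):
        print('Base init')

@decorate_class
class MyClass(Base):
    def __init__(self):
        print('MyClass init')
        super().__init__()  # reads the implicit __class__ cell, not the name MyClass

original = MyClass.__bases__[0]            # the undecorated class
cell = original.__init__.__closure__[0]    # the __class__ cell added by the compiler
print(cell.cell_contents is original)      # True
MyClass()                                  # MyClass init / Base init -- no recursion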
If you want to be able to use super in the decorated class in Python 2, you should not create a derived class, but directly modify the original class in place.
For example, here is a decorator that prints a line before and after calling any method:
import types

def decorate_class(klass):
    def make_wrapper(name, method):  # factory function avoids the late-binding closure bug
        def meth(*args, **kwargs):   # define a wrapper
            print "Before", name
            method(*args, **kwargs)
            print "After", name
        return meth
    for name, method in klass.__dict__.items():      # iterate the class attributes
        if isinstance(method, types.FunctionType):   # identify the methods
            setattr(klass, name, make_wrapper(name, method))  # tell the class to use the wrapper
    return klass
With your example it gives as expected:
>>> c = MyClass()
Before __init__
MyClass init
Base init
After __init__

How to incorporate type checking in an abstract base class in Python

When I define a class, I like to include type checking (using assert) of the input variables. I am now defining a 'specialized' class Rule which inherits from an abstract base class (ABC) BaseRule, similar to the following:
import abc

class BaseRule(object):
    __metaclass__ = abc.ABCMeta

    @abc.abstractproperty
    def resources(self):
        pass

class Rule(BaseRule):
    def __init__(self, resources):
        assert all(isinstance(resource, Resource) for resource in resources)  # type checking
        self._resources = resources

    @property
    def resources(self):
        return self._resources

class Resource(object):
    def __init__(self, domain):
        self.domain = domain

if __name__ == "__main__":
    resources = [Resource("facebook.com")]
    rule = Rule(resources)
The assert statement in the __init__ function of the Rule class ensures that the resources input is a list (or other iterable) of Resource objects. However, this would also be the case for other classes which inherit from BaseRule, so I would like to incorporate this assertion in the abstractproperty somehow. How might I go about this?
See the mypy documentation on abstract base classes and type annotations: https://mypy.readthedocs.io/en/latest/class_basics.html#abstract-base-classes-and-multiple-inheritance
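As a minimal sketch of the annotation-driven style those docs describe, reusing the question's classes (a static checker such as mypy or PyCharm verifies the types instead of the runtime assert):
import abc
from typing import Sequence

class Resource:
    def __init__(self, domain: str) -> None:
        self.domain = domain

class BaseRule(abc.ABC):
    @property
    @abc.abstractmethod
    def resources(self) -> Sequence[Resource]: ...

class Rule(BaseRule):
    def __init__(self, resources: Sequence[Resource]) -> None:
        self._resources = resources

    @property
    def resources(self) -> Sequence[Resource]:
        return self._resources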
Make your base class have a non-abstract property that calls separate abstract getter and setter methods. The property can do the validation you want before calling the setter. Other code (such as the __init__ method of a derived class) that wants to trigger the validation can do so by doing its assignment via the property:
class BaseRule(object):
    __metaclass__ = abc.ABCMeta

    @property
    def resources(self):  # this property isn't abstract and shouldn't be overridden
        return self._get_resources()

    @resources.setter
    def resources(self, value):
        assert all(isinstance(resource, Resource) for resource in value)
        self._set_resources(value)

    @abc.abstractmethod
    def _get_resources(self):  # these methods should be, instead
        pass

    @abc.abstractmethod
    def _set_resources(self, value):
        pass

class Rule(BaseRule):
    def __init__(self, resources):
        self.resources = resources  # assign via the property to get type-checking!

    def _get_resources(self):
        return self._resources

    def _set_resources(self, value):
        self._resources = value
You might even consider moving the __init__ method from Rule into the BaseRule class, since it doesn't need any knowledge about Rule's concrete implementation.
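A sketch of that last suggestion, moving __init__ up into the base class so every subclass gets the validating assignment for free (the property and abstract getter/setter stay exactly as above):
class BaseRule(object):
    __metaclass__ = abc.ABCMeta

    def __init__(self, resources):
        self.resources = resources  # goes through the validating property

    # ... resources property and abstract _get_resources/_set_resources as above ...

class Rule(BaseRule):
    def _get_resources(self):
        return self._resources

    def _set_resources(self, value):
        self._resources = value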

Class instance as static attribute

Python 3 doesn't allow you to reference a class inside its body (except in methods):
class A:
    static_attribute = A()

    def __init__(self):
        ...
This raises a NameError in the second line because 'A' is not defined.
Alternatives
I have quickly found one workaround:
class A:
    @property
    @classmethod
    def static_property(cls):
        return A()

    def __init__(self):
        ...
Although this isn't exactly the same since it returns a different instance every time (you could prevent this by saving the instance to a static variable the first time).
Are there simpler and/or more elegant alternatives?
EDIT:
I have moved the question about the reasons for this restriction to a separate question
The expression A() can't be run until the class A has been defined. In your first block of code, the definition of A is not complete at the point you are trying to execute A().
Here is a simpler alternative:
class A:
    def __init__(self):
        ...

A.static_attribute = A()
When you define a class, Python immediately executes the code within the definition. Note that's different than defining a function where Python compiles the code, but doesn't execute it.
That's why this will create an error:
class MyClass(object):
    a = 1 / 0
But this won't:
def my_func():
    a = 1 / 0
In the body of A's class definition, A is not yet defined, so you can't reference it until after it's been defined.
There are several ways you can accomplish what you're asking, but it's not clear to me why this would be useful in the first place, so if you can provide more details about your use case, it'll be easier to recommend which path to go down.
The simplest would be what khelwood posted:
class A(object):
    pass

A.static_attribute = A()
Because this is modifying class creation, using a metaclass could be appropriate:
class MetaA(type):
    def __new__(mcs, name, bases, attrs):
        cls = super(MetaA, mcs).__new__(mcs, name, bases, attrs)
        cls.static_attribute = cls()
        return cls

class A(object):
    __metaclass__ = MetaA
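Note that __metaclass__ is the Python 2 spelling; in Python 3 the metaclass is passed as a keyword in the class header. A sketch of the same idea:
class MetaA(type):
    def __new__(mcs, name, bases, attrs):
        cls = super().__new__(mcs, name, bases, attrs)
        cls.static_attribute = cls()  # instantiate the class as soon as it is created
        return cls

class A(metaclass=MetaA):
    pass

print(A.static_attribute)  # <__main__.A object at 0x...>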
Or you could use descriptors to have the instance lazily created or if you wanted to customize access to it further:
class MyDescriptor(object):
    def __get__(self, instance, owner):
        owner.static_attribute = owner()
        return owner.static_attribute

class A(object):
    static_attribute = MyDescriptor()
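Usage sketch: the first access through the class runs the descriptor, which creates the instance and overwrites itself with it, so later accesses return the cached object:
a1 = A.static_attribute  # descriptor fires: creates an A() and caches it on the class
a2 = A.static_attribute  # now a plain class attribute, same object
print(a1 is a2)          # True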
Using the property decorator is a viable approach, but it would need to be done something like this:
class A:
    _static_attribute = None

    @property
    def static_attribute(self):
        if A._static_attribute is None:
            A._static_attribute = A()
        return A._static_attribute

    def __init__(self):
        pass

a = A()
print(a.static_attribute)  # -> <__main__.A object at 0x004859D0>
b = A()
print(b.static_attribute)  # -> <__main__.A object at 0x004859D0>
You can use a class decorator:
def set_static_attribute(cls):
    cls.static_attribute = cls()
    return cls

@set_static_attribute
class A:
    pass
Now:
>>> A.static_attribute
<__main__.A at 0x10713a0f0>
Applying the decorator on top of the class makes it more explicit than setting static_attribute after a potentially long class definition. The applied decorator "belongs" to the class definition, so if you move the class around in your source code you are more likely to move it along with the class than you would be with a separate attribute assignment outside the class.
