Constrain method access to an interface? - python

I have a Python system consisting of around 9-10 classes, all implementing most of a fat duck-typed interface and used interchangeably by a large collection of modules. I'm trying to refactor the classes into a core, explicit (i.e. ABC) interface plus peripheral functionality, following separation of responsibilities, but to do that I need to be able to tell when the consumer modules call methods outside the core interface.
Suppose I have an ABC with abstract methods:
from abc import ABCMeta, abstractmethod

class MyABC:
    __metaclass__ = ABCMeta

    @abstractmethod
    def foo(self):
        pass
I also have a class implementing those abstract methods as well as other methods:
class MyClass(MyABC):
    def foo(self):
        pass

    def bar(self):
        pass

instance = MyClass()

>>> isinstance(instance, MyABC)
True
How can I ensure that when I pass instance to a method do_something it only uses the methods that are part of MyABC (in this case foo) and not any other methods (bar)? In a static-typed language (e.g. C++) I could pass do_something a pointer of the ABC type; is there some sort of wrapper available in Python that will restrict method access similarly?

Short and simple answer: no.
There is no enforced concept of private methods or attributes in Python, as described here in detail.
In Python this is handled by convention.
And if you really want to go into the deep internals, check out this thread.
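As a quick illustration of "handled by convention": even name-mangled double-underscore attributes stay reachable from outside (the class and attribute names below are made up for the demo):

```python
class Account:
    def __init__(self):
        self._internal = 1   # single underscore: private by convention only
        self.__mangled = 2   # double underscore: name-mangled, but not hidden

acct = Account()
print(acct._internal)          # nothing prevents this access
print(acct._Account__mangled)  # mangling just renames, it does not enforce
```

Mangling is obfuscation against accidental clashes, not access control.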

This is what I came up with:
class ABCGuard(object):
    def __init__(self, obj, *abcs):
        if any(not isinstance(obj, abc) for abc in abcs):
            raise ValueError("{0} must implement {1}"
                             .format(obj.__class__.__name__,
                                     ', '.join(abc.__name__ for abc in abcs
                                               if not isinstance(obj, abc))))
        self.__obj = obj
        self.__abcs = abcs
        classname = '{0}{{{1}}}'.format(obj.__class__.__name__,
                                        ', '.join(abc.__name__ for abc in abcs))
        self.__class__ = type(classname, (ABCGuard, ) + abcs, {})

    def __getattribute__(self, name):
        if name.startswith('_ABCGuard__') or (name.startswith('__') and
                                              name.endswith('__')):
            return super(ABCGuard, self).__getattribute__(name)
        elif any(name in abc.__abstractmethods__ for abc in self.__abcs):
            return getattr(self.__obj, name)
        else:
            raise AttributeError("%r object has no attribute %r" %
                                 (self.__class__.__name__, name))

    def __dir__(self):
        return [x for abc in self.__abcs for x in abc.__abstractmethods__]
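For comparison, here is a much smaller sketch of the same idea: a proxy that forwards only the ABC's abstract methods and rejects everything else. The `InterfaceOnly` name and the return values are invented for the demo; the `MyABC`/`MyClass` shapes match the question.

```python
from abc import ABC, abstractmethod

class MyABC(ABC):
    @abstractmethod
    def foo(self):
        ...

class MyClass(MyABC):
    def foo(self):
        return "foo"
    def bar(self):
        return "bar"

class InterfaceOnly:
    """Forward only the abstract methods of abc_cls; hide the rest."""
    def __init__(self, obj, abc_cls):
        if not isinstance(obj, abc_cls):
            raise ValueError("object does not implement the interface")
        self._obj = obj
        self._allowed = abc_cls.__abstractmethods__

    def __getattr__(self, name):
        # only reached when normal lookup fails, i.e. for forwarded names
        if name in self._allowed:
            return getattr(self._obj, name)
        raise AttributeError(name)

guarded = InterfaceOnly(MyClass(), MyABC)
print(guarded.foo())   # "foo" -- part of MyABC, forwarded
try:
    guarded.bar()      # not in MyABC -> AttributeError
except AttributeError:
    print("bar is blocked")
```

Unlike ABCGuard, this proxy does not pass `isinstance` checks against the ABC; it only gates attribute access.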

Related

Multiple Inheritance Dependency - Base requires AbstractBaseClass

The gist of the question: when inheriting from multiple classes, how can I guarantee that if one class is inherited, a complementary Abstract Base Class (ABC) is also used by the child object.
I've been messing around with Python's inheritance, trying to see what kind of cool stuff I can do, and I came up with this pattern, which is kind of interesting.
I've been trying to use this to make implementing and testing objects that interface with my cache easier. I've got three modules:
ICacheable.py
Cacheable.py
SomeClass.py
ICacheable.py
import abc

class ICacheable(abc.ABC):
    @property
    @abc.abstractmethod
    def CacheItemIns(self):
        return self.__CacheItemIns

    @CacheItemIns.setter
    @abc.abstractmethod
    def CacheItemIns(self, value):
        self.__CacheItemIns = value
        return

    @abc.abstractmethod
    def Load(self):
        """docstring"""
        return

    @abc.abstractmethod
    def _deserializeCacheItem(self):
        """docstring"""
        return

    @abc.abstractmethod
    def _deserializeNonCacheItem(self):
        """docstring"""
        return
Cacheable.py
class Cacheable:
    def _getFromCache(self, itemName, cacheType,
                      cachePath=None):
        """docstring"""
        kwargs = {"itemName": itemName,
                  "cacheType": cacheType,
                  "cachePath": cachePath}
        lstSearchResult = CacheManager.SearchCache(**kwargs)
        if lstSearchResult[0]:
            self.CacheItemIns = lstSearchResult[1]
            self._deserializeCacheItem()
        else:
            cacheItem = CacheManager.NewItem(**kwargs)
            self.CacheItemIns = cacheItem
            self._deserializeNonCacheItem()
        return
SomeClass.py
from ICacheable import ICacheable
from Cacheable import Cacheable

class SomeClass(Cacheable, ICacheable):
    __valueFromCache1: str = ""
    __valueFromCache2: str = ""
    __CacheItemIns: dict = {}

    @property
    def CacheItemIns(self):
        return self.__CacheItemIns

    @CacheItemIns.setter
    def CacheItemIns(self, value):
        self.__CacheItemIns = value
        return

    def __init__(self, itemName, cacheType):
        self.__valueFromCache1
        self.__valueFromCache2
        # Call method inherited from Cacheable
        self._getFromCache(itemName, cacheType)
        return

    def _deserializeCacheItem(self):
        """docstring"""
        self.__valueFromCache1 = self.CacheItemIns["val1"]
        self.__valueFromCache2 = self.CacheItemIns["val2"]
        return

    def _deserializeNonCacheItem(self):
        """docstring"""
        self.__valueFromCache1 = ...  # some external function
        self.__valueFromCache2 = ...  # some external function
        return
So this example works, but the scary thing is that there is no guarantee that a class inheriting Cacheable also inherits ICacheable. That seems like a design flaw, as Cacheable is useless on its own. However, the ability to abstract things away from my subclass/child class with this is powerful. Is there a way to guarantee Cacheable's dependency on ICacheable?
If you explicitly do not want inheritance, you can register classes as virtual subclasses of an ABC.
@ICacheable.register
class Cacheable:
    ...
That means every subclass of Cacheable is automatically treated as a subclass of ICacheable as well. This is mostly useful if you have an efficient implementation that would be slowed down by having non-functional Abstract Base Classes to traverse, e.g. for super calls.
However, ABCs are not just interfaces, and it is fine to inherit from them. In fact, part of the benefit of an ABC is that it forces subclasses to implement all abstract methods. An intermediate helper class such as Cacheable is fine not implementing every method, as long as it is never instantiated. However, any non-virtual subclass that is instantiated must be concrete.
>>> class FailClass(Cacheable, ICacheable):
...     ...
...
>>> FailClass()
TypeError: Can't instantiate abstract class FailClass with abstract methods CacheItemIns, Load, _deserializeCacheItem, _deserializeNonCacheItem
Note that if you
- always subclass as class AnyClass(Cacheable, ICacheable):
- never instantiate Cacheable
then that is functionally equivalent to Cacheable inheriting from ICacheable. The Method Resolution Order (i.e. the inheritance diamond) is the same.
>>> AnyClass.__mro__
(__main__.AnyClass, __main__.Cacheable, __main__.ICacheable, abc.ABC, object)
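Both points can be checked in a few lines; the sketch below uses a trimmed-down ICacheable with a single abstract method, and the class names RegisteredCache and Incomplete are invented for the demo:

```python
import abc

class ICacheable(abc.ABC):
    @abc.abstractmethod
    def Load(self):
        ...

# Virtual subclass: no inheritance, but isinstance/issubclass checks pass,
# and note that nothing is enforced on the registered class.
@ICacheable.register
class RegisteredCache:
    pass

print(issubclass(RegisteredCache, ICacheable))  # True

# Real (non-virtual) subclass: must implement every abstract method
# before it can be instantiated.
class Incomplete(ICacheable):
    pass

try:
    Incomplete()
except TypeError as exc:
    print(exc)  # can't instantiate abstract class Incomplete
```

Registration buys you the type checks without the enforcement; real inheritance buys you both.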

Inherited Abstract Classes in python

In python, can I define an interface (abstract class) by inheritance from another abstract class?
If I try:
import abc

ABC = abc.ABCMeta('ABC', (object,), {})

class interface(ABC):
    @abc.abstractmethod
    def method(self, message):
        return

class InterfaceExtended(ABC, interface):
    @abc.abstractmethod
    def NewMethod(self, message):
        return
I get an error on the "InterfaceExtended" class:
TypeError: Error when calling the metaclass bases
Cannot create a consistent method resolution
order (MRO) for bases ABC, Interface
Don't inherit from ABC in your second class; the interface it derives from already inherits from ABC.
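With that fix the extension works as expected; the derived interface accumulates the abstract methods of both levels:

```python
import abc

# Dynamically created base with ABCMeta as metaclass, as in the question
ABC = abc.ABCMeta('ABC', (object,), {})

class interface(ABC):
    @abc.abstractmethod
    def method(self, message):
        ...

class InterfaceExtended(interface):  # no ABC here; interface already has it
    @abc.abstractmethod
    def NewMethod(self, message):
        ...

print(sorted(InterfaceExtended.__abstractmethods__))
# ['NewMethod', 'method']
```

Since `interface` already carries the ABCMeta metaclass, every subclass of it is abstract-aware without repeating the base.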

Intercepting __getattr__ in a Python 2.3 old-style mixin class?

I have a large Python 2.3 based installation with 200k LOC. As part of a migration project I need to intercept all attribute lookups on all old-style classes.
Old legacy code:
class Foo(Bar):
    ...
My idea is to inject a common mixin class like
class Foo(Bar, Mixin):
    ...

class Mixin:
    def __getattr__(self, k):
        print repr(self), k
        return Foo.__getattr__(self, k)
However I always run into infinite recursion, because Foo.__getattr__ resolves to Mixin.__getattr__.
Is there any way to fix the code for Python 2.3 old-style classes?
If you are already injecting mixins, why not add object as a parent to make them new-style classes?
class Foo(Mixin, Bar, object):
    ...
And then use super:
class Mixin(object):
    def __getattr__(self, k):
        print repr(self), k
        return super(Mixin, self).__getattr__(k)
Assuming that none of the classes in your code base implements __setattr__ or __getattr__, one approach is to intercept __setattr__ in your Mixin, writing the value to a reserved attribute, then reading it back in __getattr__:
class Mixin:
    def __setattr__(self, attr, value):
        # write the value into some special reserved space
        namespace = self.__dict__.setdefault("_namespace", {})
        namespace[attr] = value

    def __getattr__(self, attr):
        # reject special methods so e.g. __repr__ can't recurse
        if attr.startswith("__") and attr.endswith("__"):
            raise AttributeError
        # do whatever you wish to do here ...
        print repr(self), attr
        # read the value from the reserved space
        namespace = self.__dict__.get("_namespace", {})
        return namespace[attr]
Example:
class Foo(Mixin):
    def __init__(self):
        self.x = 1
Then
>>> Foo().x
<__main__.Foo instance at 0x10c4dad88> x
Clearly this won't work if any of your Foo classes implement __setattr__ or __getattr__ themselves.

How to enforce method signature for child classes?

Languages like C# and Java have method overloading, which means that a child-class method whose signature does not exactly match the parent's will not override the parent method.
How do we enforce method signatures in child classes in Python? The following code sample shows that the child class overrides the parent method with a different method signature:
>>> class A(object):
...     def m(self, p=None):
...         raise NotImplementedError('Not implemented')
...
>>> class B(A):
...     def m(self, p2=None):
...         print p2
...
>>> B().m('123')
123
While this is not super important, and may even be by design in Python (e.g. *args, **kwargs), I am asking for the sake of clarity whether this is possible.
Please Note:
I have tried @abstractmethod and ABCs already.
Below is a complete running example showing how to use a metaclass to make sure that subclass methods have the same signatures as their base classes. Note the use of the inspect module. The way I'm using it here it makes sure that the signatures are exactly the same, which might not be what you want.
import inspect

class BadSignatureException(Exception):
    pass

class SignatureCheckerMeta(type):
    def __new__(cls, name, baseClasses, d):
        # For each method in d, check to see if any base class already
        # defined a method with that name. If so, make sure the
        # signatures are the same.
        for methodName in d:
            f = d[methodName]
            for baseClass in baseClasses:
                try:
                    fBase = getattr(baseClass, methodName).__func__
                    if not inspect.getargspec(f) == inspect.getargspec(fBase):
                        raise BadSignatureException(str(methodName))
                except AttributeError:
                    # This method was not defined in this base class,
                    # so just go to the next base class.
                    continue
        return type(name, baseClasses, d)
def main():
    class A(object):
        def foo(self, x):
            pass

    try:
        class B(A):
            __metaclass__ = SignatureCheckerMeta
            def foo(self):
                """This override shouldn't work because the signature is wrong"""
                pass
    except BadSignatureException:
        print("Class B can't be constructed because of a bad method signature")
        print("This is as it should be :)")

    try:
        class C(A):
            __metaclass__ = SignatureCheckerMeta
            def foo(self, x):
                """This is ok because the signature matches A.foo"""
                pass
    except BadSignatureException:
        print("Class C couldn't be constructed. Something went wrong")

if __name__ == "__main__":
    main()
Update of the accepted answer to work with Python 3.5.
import inspect
from types import FunctionType

class BadSignatureException(Exception):
    pass

class SignatureCheckerMeta(type):
    def __new__(cls, name, baseClasses, d):
        # For each method in d, check to see if any base class already
        # defined a method with that name. If so, make sure the
        # signatures are the same.
        for methodName in d:
            f = d[methodName]
            if not isinstance(f, FunctionType):
                continue
            for baseClass in baseClasses:
                try:
                    fBase = getattr(baseClass, methodName)
                    if not inspect.getargspec(f) == inspect.getargspec(fBase):
                        raise BadSignatureException(str(methodName))
                except AttributeError:
                    # This method was not defined in this base class,
                    # so just go to the next base class.
                    continue
        return type(name, baseClasses, d)
def main():
    class A(object):
        def foo(self, x):
            pass

    try:
        class B(A, metaclass=SignatureCheckerMeta):
            def foo(self):
                """This override shouldn't work because the signature is wrong"""
                pass
    except BadSignatureException:
        print("Class B can't be constructed because of a bad method signature")
        print("This is as it should be :)")

    try:
        class C(A, metaclass=SignatureCheckerMeta):
            def foo(self, x):
                """This is ok because the signature matches A.foo"""
                pass
    except BadSignatureException:
        print("Class C couldn't be constructed. Something went wrong")

if __name__ == "__main__":
    main()
By design, the language doesn't support checking the signatures. For an interesting read, check out:
http://grokbase.com/t/python/python-ideas/109qtkrzsd/abc-what-about-the-method-arguments
From this thread, it does sound like you may be able to write a decorator to check the signature, with abc.same_signature(method1, method2), but I've never tried that.
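There is no abc.same_signature in the standard library, but a rough sketch of such a decorator using inspect.signature could look like the following; the matches_base name is invented for the demo:

```python
import inspect

def matches_base(base_cls):
    """Fail at class-definition time if the decorated method's signature
    differs from the same-named method on base_cls."""
    def check(method):
        base_method = getattr(base_cls, method.__name__)
        if inspect.signature(method) != inspect.signature(base_method):
            raise TypeError("%s does not match %s.%s"
                            % (method.__name__, base_cls.__name__,
                               method.__name__))
        return method
    return check

class A:
    def m(self, p=None):
        raise NotImplementedError('Not implemented')

class B(A):
    @matches_base(A)
    def m(self, p=None):       # same signature: accepted
        return p

try:
    class C(A):
        @matches_base(A)
        def m(self, p2=None):  # different parameter name: rejected
            return p2
except TypeError as exc:
    print(exc)
```

Note this is stricter than overload resolution in Java or C#: it compares parameter names and defaults too, since that is all inspect.signature can see without type annotations.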
The reason it is being overridden is that they actually have the same method signature. What is written there is akin to doing something like this in Java:
public class A
{
    public void m(String p)
    {
        throw new Exception("Not implemented");
    }
}

public class B extends A
{
    public void m(String p2)
    {
        System.out.println(p2);
    }
}
Note that even though the parameter names are different, the types are the same, and thus they have the same signature. In strongly typed languages like this, we get to say explicitly what the types are going to be ahead of time.
In Python the type of the parameter is determined dynamically at run time when you use the method. This makes it impossible for the Python interpreter to tell which method you actually wished to call when you say B().m('123'). Because neither of the method signatures specifies which type of parameter it expects, they simply say "I'm looking for a call with one parameter." So it makes sense that the deepest method (and the most relevant to the actual object you have) is called, which is class B's method, because the object is an instance of class B.
If you want to process only certain types in a child class method, and pass all others along to the parent class, it can be done like this:
class A(object):
    def m(self, p=None):
        raise NotImplementedError('Not implemented')

class B(A):
    def m(self, p2=None):
        if isinstance(p2, int):
            print p2
        else:
            super(B, self).m(p2)
Then using B gives you the desired output. That is, class B processes ints and passes any other type along to its parent class.
>>> b = B()
>>> b.m(2)
2
>>> b.m("hello")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 6, in m
File "<stdin>", line 3, in m
NotImplementedError: Not implemented
I use metaclasses for other purposes in my code, so I rolled a version that uses a class decorator instead. The version below works with Python 3 and also supports decorated methods (yes, this creates a potential loophole, but if you use decorators that change the actual signature, shame on you). To make it work with Python 2, change inspect.isfunction to inspect.ismethod.
import inspect
from functools import wraps

class BadSignatureException(Exception):
    pass

def enforce_signatures(cls):
    for method_name, method in inspect.getmembers(cls, predicate=inspect.isfunction):
        if method_name == "__init__":
            continue
        for base_class in inspect.getmro(cls):
            if base_class is cls:
                continue
            try:
                base_method = getattr(base_class, method_name)
            except AttributeError:
                continue
            if not inspect.signature(method) == inspect.signature(base_method):
                raise BadSignatureException("%s.%s does not match base class %s.%s"
                                            % (cls.__name__, method_name,
                                               base_class.__name__, method_name))
    return cls

if __name__ == "__main__":
    class A:
        def foo(self, x):
            pass

    def test_decorator(f):
        @wraps(f)
        def decorated_function(*args, **kwargs):
            pass
        return decorated_function

    @enforce_signatures
    class B(A):
        @test_decorator
        def foo(self):
            """This override shouldn't work because the signature is wrong"""
            pass
mypy, and I expect other static type checkers, will complain if methods on your subclass have a signature different from the methods they override. It seems to me the best way to enforce method signatures on child classes is to run mypy (or a similar checker) as part of your workflow.

How to access the subclasses of a class as properties?

So I have a .py file containing a class whose subclasses can be accessed as properties. All these subclasses are defined beforehand. I also need all the subclasses to have the same ability (having their own subclasses accessible as properties). The biggest problem I've been facing is that I don't know how to access the current class within my implementation of __getattr__(), so that'd be a good place to start.
Here's some Python-plus-pseudocode with what I've tried so far. I'm pretty sure it won't work, since __getattr__() seems to work only on instances of a class. If that is the case, sorry; I am not as familiar with OOP in Python as I would like.
class A(object):
    def __getattr__(self, name):
        subclasses = [c.__name__ for c in current_class.__subclasses__()]
        if name in subclasses:
            return name
        raise AttributeError
If I've understood your question properly, you can do what you want by using a custom metaclass that adds a classmethod to its instances. Here's an example:
class SubclassAttributes(type):
    def __getattr__(cls, name):  # classmethod of instances
        for subclass in cls.__subclasses__():
            if subclass.__name__ == name:
                return subclass
        else:
            raise TypeError('Class {!r} has no subclass '
                            'named {!r}'.format(cls.__name__, name))

class Base(object):
    __metaclass__ = SubclassAttributes  # Python 2 metaclass syntax

#class Base(object, metaclass=SubclassAttributes):  # Python 3 metaclass syntax
#    """ nothing to see here """

class Derived1(Base): pass
class Derived2(Base): pass

print(Base.Derived1)  # -> <class '__main__.Derived1'>
print(Base.Derived2)  # -> <class '__main__.Derived2'>
print(Base.Derived3)  # -> TypeError: Class 'Base' has no subclass named 'Derived3'
For something that works in both Python 2 and 3, define the class as shown below. It derives Base from a class that has SubclassAttributes as its metaclass. This is similar to what the six module's with_metaclass() function does:
class Base(type.__new__(type('TemporaryMeta', (SubclassAttributes,), {}),
                        'TemporaryClass', (), {})): pass
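Under Python 3 the whole thing can be exercised directly; this is the same metaclass as above, using the metaclass= keyword and a raise after the loop instead of for/else:

```python
class SubclassAttributes(type):
    # __getattr__ on a metaclass fires for attribute lookups on the
    # class itself, which is what makes Base.Derived1 work.
    def __getattr__(cls, name):
        for subclass in cls.__subclasses__():
            if subclass.__name__ == name:
                return subclass
        raise TypeError('Class {!r} has no subclass '
                        'named {!r}'.format(cls.__name__, name))

class Base(metaclass=SubclassAttributes):
    pass

class Derived1(Base): pass
class Derived2(Base): pass

print(Base.Derived1)   # the Derived1 class object
try:
    Base.Derived3      # no such subclass
except TypeError as exc:
    print(exc)
```

Normal lookup on Base fails for 'Derived1', so the metaclass's __getattr__ takes over and searches __subclasses__().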
class A(object):
    def __getattr__(self, key):
        for subclass in self.__class__.__subclasses__():
            if subclass.__name__ == key:
                return subclass
        raise AttributeError(key)
Out of curiosity, what is this designed to be used for?
>>> class A(object):
...     pass
...
>>> foo = A()
>>> foo.__class__
<class '__main__.A'>
