This list shows the methods your class needs to implement to be regarded as a Sequence: __getitem__, __len__, __contains__, __iter__, __reversed__, index, and count. So why does this minimal implementation not work, i.e. why is issubclass(S, Sequence) False?
from collections.abc import Iterable, Sized, Container, Sequence

class S(object):
    def __getitem__(self, item):
        raise IndexError
    def __len__(self):
        return 0
    def __contains__(self, item):
        return False
    def __iter__(self):
        return iter(())
    def __reversed__(self):
        return self
    def index(self, item):
        raise IndexError
    def count(self, item):
        return 0
issubclass(S, Iterable) # True :-)
issubclass(S, Sized) # True :-)
issubclass(S, Container) # True :-)
issubclass(S, Sequence) # False :-(
Is there an additional method I need to implement that I overlooked? Did I misunderstand abstract base classes? Subclassing Sequence makes issubclass return True, of course, but that kind of defeats the idea behind ABCs, doesn't it?
Use the source, Luke!
Sequence does not implement its own __subclasshook__, and all the implementations of __subclasshook__ from the parents of Sequence have checks like this:
class Iterable:
    ...
    @classmethod
    def __subclasshook__(cls, C):
        if cls is Iterable:  # <<<<
            if _hasattr(C, "__iter__"):
                return True
        return NotImplemented
You can however explicitly register() your class as a Sequence:
Sequence.register(S)
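After registering, both issubclass() and isinstance() report the virtual subclass relationship (a small sketch using the S class from the question):

from collections.abc import Sequence

Sequence.register(S)
print(issubclass(S, Sequence))    # True
print(isinstance(S(), Sequence))  # True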
As for the reason why Sequence does not implement __subclasshook__, see issue 16728 (whose title was initially "collections.abc.Sequence should provide __subclasshook__"). The issue can be summarized by saying that a sequence can be many things, depending on the needs of whoever uses it:
Many algorithms that require a sequence only need __len__ and __getitem__. [...] collections.abc.Sequence is a much richer interface.
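For illustration, a small sketch of that leaner notion of "sequence" (the Squares class is hypothetical): defining only __len__ and __getitem__ already supports iteration, membership tests, and reversed() via the old sequence protocol, yet issubclass(Squares, Sequence) stays False:

class Squares:
    def __len__(self):
        return 5

    def __getitem__(self, i):
        if not 0 <= i < 5:
            raise IndexError(i)
        return i * i

s = Squares()
print(list(s))            # [0, 1, 4, 9, 16], iteration falls back to __getitem__
print(9 in s)             # True, membership falls back to iteration
print(list(reversed(s)))  # [16, 9, 4, 1, 0], uses __len__ and __getitem__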
Is it possible to define that a class needs a specific constructor?
from typing import Iterable, Protocol

class Constructible(Protocol):
    def __init__(self, i: int):  # how do I do this?
        raise NotImplementedError

    def get_value(self):
        raise NotImplementedError

def map_is(cs: Iterable[Constructible], i: int):
    mapped = tuple(C(i) for C in cs)
    values = tuple(c.get_value() for c in mapped)
    # both the constructor and a member method are used
    return mapped, values

# implementors (omitting __hash__ and __eq__ for brevity)
class X(Constructible):
    def __init__(self, i):
        self.i = i

    def get_value(self):
        return self.i

class Sq(Constructible):
    def __init__(self, i):
        self.i = i

    def get_value(self):
        return self.i * self.i

cs, values = map_is((X, Sq), 5)
assert values == (5, 25)
When specifying it like this, I get
$ mypy constr.py
constr.py:12: error: "Constructible" not callable
Found 1 error in 1 file (checked 1 source file)
Is this even possible? Or should I revert to a factory function @classmethod def construct(i: int) -> Self?
As explained by @jonrsharpe, you do not pass an iterable of Constructible instances to map_is, but an iterable of classes. That means you should define the function this way:
from typing import Iterable, Type

def map_is(cs: Iterable[Type[Constructible]], i: int):
    return (C(i) for C in cs)
That is enough for mypy to validate the code.
But there is an unrelated problem: you never declared any __hash__ or __eq__ special method. That means that in assert values == (X(5), Sq(5)) the equality used is the one defined on the object class (the same as is). So after the above fix the code executes, but the assertion still fails with an AssertionError: the objects have the same value, yet they are distinct objects.
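For completeness, a minimal sketch of such a value-based equality (the __eq__/__hash__ pair below is illustrative, not part of the original question):

class X(Constructible):
    def __init__(self, i):
        self.i = i

    def get_value(self):
        return self.i

    def __eq__(self, other):
        # value-based equality: same class and same payload
        return type(other) is type(self) and other.i == self.i

    def __hash__(self):
        return hash((type(self), self.i))

assert X(5) == X(5)  # compares values now, not object identity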
I've read in the abc Python module docs that a Sequence is something that implements the following: __getitem__, __len__, __contains__, __iter__, __reversed__, index, and count.
Yet, when I run the following example it prints False:
from collections import abc

class Sequence():
    def __getitem__(self):
        pass
    def __len__(self):
        pass
    def index(self):
        pass
    def count(self):
        pass
    def __contains__(self):
        pass
    def __iter__(self):
        pass
    def __reversed__(self):
        pass

print(isinstance(Sequence(), abc.Sequence))  # False
When I was doing similar experiments with abc.Collection or abc.Reversible I got the results I expected; for example, I created a dummy class that implements __contains__, __iter__, and __len__, and it was correctly detected as an instance of abc.Collection.
Do you have any idea what's wrong with Sequence?
EDIT 1:
from collections import abc

class CustomIterable:
    def __iter__(self):
        pass

print(isinstance(CustomIterable(), abc.Iterable))  # True
Even though my custom iterable does not inherit from abc.Iterable, it is still recognized as an Iterable because it implements the __iter__ special method.
I had a similar question today.
From what I can gather, collections.abc.Iterable implements a custom __subclasshook__() method, whereas collections.abc.Sequence does not:
# _collections_abc.py
class Iterable(metaclass=ABCMeta):
    __slots__ = ()

    @abstractmethod
    def __iter__(self):
        while False:
            yield None

    @classmethod
    def __subclasshook__(cls, C):
        if cls is Iterable:
            return _check_methods(C, "__iter__")  # True if __iter__ implemented
        return NotImplemented
What this means is that if a class Foo defines the required __iter__ method then isinstance(Foo(), collections.abc.Iterable) will return True:
from collections.abc import Iterable

class Foo:
    def __iter__(self):
        return iter([])

assert isinstance(Foo(), Iterable)
I'm not sure why collections.abc.Iterable implements a custom __subclasshook__() method but collections.abc.Sequence does not.
Your custom Sequence class is different from the abc.Sequence class, so isinstance will return False.
If you're looking for True, your custom class needs to inherit from abc.Sequence:

class Sequence(abc.Sequence):
    ...
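A minimal sketch of that approach (MySeq is an illustrative name): inheriting from abc.Sequence means you only have to implement __getitem__ and __len__, and the ABC supplies __contains__, __iter__, __reversed__, index, and count as mixin methods:

from collections import abc

class MySeq(abc.Sequence):
    def __init__(self, data):
        self._data = list(data)

    def __getitem__(self, i):
        return self._data[i]

    def __len__(self):
        return len(self._data)

s = MySeq("abc")
print(isinstance(s, abc.Sequence))  # True
print(s.index("b"))                 # 1, a mixin method inherited for free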
In a framework, I often want to provide a base class that the framework user subclasses. The base class provides controlled access to the subclass's implementation. One way to accomplish this is to provide unimplemented methods with different names, for example by adding an underscore as prefix:
class Base:
    def method(self, arg):
        # ...
        result = self._method(arg)
        # ...
        return result

    def _method(self, arg):
        raise NotImplementedError
However, this scheme only works for one level of inheritance. With more levels, the different method names make it hard to keep an overview of what's going on. Moreover, the framework user has to override different methods depending on which base class they choose:
class Base:
    def method(self, arg):
        # ...
        result = self._method_sub(arg)
        # ...
        return result

    def _method_sub(self, arg):
        raise NotImplementedError

class Intermediate(Base):
    def _method_sub(self, arg):
        # ...
        result = self._method_sub_sub(arg)
        # ...
        return result

    def _method_sub_sub(self, arg):
        raise NotImplementedError
Calling super methods does not help when the base method needs to access return values of the child method. I feel object orientation is slightly flawed here, missing a child keyword that would forward calls to the child class. What solutions exist for this problem?
Does this give you what you want?
import abc

class Base(abc.ABC):
    def calculate(self):
        result = self.doCalculate()
        if 3 < result < 7:  # do whatever validation you want
            return result
        else:
            raise ValueError()

    @abc.abstractmethod
    def doCalculate(self):
        pass

class Intermediate(Base):
    pass  # still abstract: doCalculate is not implemented yet

class Leaf(Intermediate):
    def doCalculate(self):
        return 5

leaf = Leaf()
print(leaf.calculate())
I think the question focuses on the different points at which an intermediate class can extend behavior. The intermediate class obviously needs to refine the "control" part here.
1st Solution
Mostly this can be done in the classical way by just overriding the "safe" method; particularly when both Base and Intermediate are abstract classes provided by the framework, things can be organized this way.
The final "silly" implementation class, which does the spade work, overrides the unsafe method.
Think of this example:
class DoublePositive:
    def double(self, x):
        assert x > 0
        return self._double(x)

    def _double(self, x):
        raise NotImplementedError

class DoubleIntPositive(DoublePositive):
    def double(self, x):
        assert isinstance(x, int)
        return DoublePositive.double(self, x)

class DoubleImplementation(DoubleIntPositive):
    def _double(self, x):
        return 2 * x
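A quick usage sketch of the classes above: the checks of both levels run before the actual work happens:

impl = DoubleImplementation()
print(impl.double(3))  # 6: passes the int check, then the positivity check
# impl.double(-1)      # would fail the assert in DoublePositive.double
# impl.double(1.5)     # would fail the isinstance check in DoubleIntPositive.double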
2nd Solution
Calling virtual child class methods, i.e. extending behavior at "inner" execution points in a non-classical manner, can be done via introspection in Python: by stepping down the class __bases__ or the method resolution order __mro__ with a helper function.
Example:
def child_method(cls, meth, _scls=None):
    # Find the override of `meth` defined on the child of `cls`
    # along the inheritance chain of the instance's class.
    scls = _scls or meth.__self__.__class__
    for base in scls.__bases__:
        if base is cls:
            cmeth = getattr(scls, meth.__name__, None)
            if cmeth is getattr(cls, meth.__name__, None):
                return child_method(scls, meth)  # not overridden: next child
            if cmeth is not None:
                return cmeth.__get__(meth.__self__)
    for base in scls.__bases__:
        r = child_method(cls, meth, base)  # next base
        if r is not None:
            return r
    if _scls is None:
        raise AttributeError("child method %r missing" % meth.__name__)
    return None

class Base(object):
    def double(self, x):
        assert x > 0
        return Base._double(self, x)

    def _double(self, x):
        return child_method(Base, self._double)(x)

class Inter(Base):
    def _double(self, x):
        assert isinstance(x, float)
        return child_method(Inter, self._double)(x)

class Impl(Inter):
    def _double(self, x):
        return 2.0 * x
The helper function child_method() here is thus a kind of opposite of Python's super().
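A quick usage sketch (assuming the classes above): each level's check runs, and the call is forwarded down the hierarchy rather than up:

impl = Impl()
print(impl.double(2.5))  # 5.0: Base checks positivity, Inter checks float, Impl doubles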
3rd Solution
If calls should be chainable flexibly, things can be organized as an explicit handler chain, as sketched below. Think of self.addHandler(self.__privmeth) in the __init__() chain, or even via a tricky metaclass. Study e.g. the urllib2 handler chains.
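A hedged sketch of that idea (addHandler and the class names are illustrative, not from a real library): each class registers its own private step in __init__(), and the base class runs the chain in order:

class Base:
    def __init__(self):
        self._handlers = []
        self.addHandler(self.__check)   # name-mangled: private to Base

    def addHandler(self, handler):
        self._handlers.append(handler)

    def double(self, x):
        for handler in self._handlers:  # run every registered step in order
            x = handler(x)
        return x

    def __check(self, x):
        assert x > 0
        return x

class Inter(Base):
    def __init__(self):
        super().__init__()
        self.addHandler(self.__check)   # mangled differently: private to Inter

    def __check(self, x):
        assert isinstance(x, float)
        return x

class Impl(Inter):
    def __init__(self):
        super().__init__()
        self.addHandler(lambda x: 2.0 * x)  # the actual work, last in the chain

print(Impl().double(2.5))  # 5.0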
I am working on a graph library in Python and I am defining my vertex this way:
class Vertex:
    def __init__(self, key, value):
        self._key = key
        self._value = value

    @property
    def key(self):
        return self._key

    @key.setter
    def key(self, newKey):
        self._key = newKey

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, newValue):
        self._value = newValue

    def _testConsistency(self, other):
        if type(self) != type(other):
            raise Exception("Need two vertexes here!")

    def __lt__(self, other):
        self._testConsistency(other)
        if self.index < other.index:
            return True
        return False
......
Do I really have to define __lt__, __eq__, __ne__, ... all by myself? It is so verbose. Is there a simpler way to get around this?
Cheers.
Please don't suggest __cmp__, since it is gone in Python 3.
functools.total_ordering can help you out here. It's meant to be used as a class decorator. You define __eq__() and one of __lt__(), __le__(), __gt__(), or __ge__(), and it fills in the rest.
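For instance, a sketch applied to a class like the Vertex from the question (assuming ordering by an index attribute):

from functools import total_ordering

@total_ordering
class Vertex:
    def __init__(self, index):
        self.index = index

    def __eq__(self, other):
        return self.index == other.index

    def __lt__(self, other):
        return self.index < other.index

# __le__, __gt__, __ge__ (and __ne__) now work automatically:
assert Vertex(1) < Vertex(2)
assert Vertex(2) >= Vertex(1)
assert Vertex(3) != Vertex(1)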
As a side note:
Instead of writing this
if self.index < other.index:
    return True
return False
write this:
return self.index < other.index
It's cleaner that way. :-)
Using functools.total_ordering, you only need to define __eq__() and one of the ordering operators. In Python < 3.2 you're out of luck: something has to define these operators as individual methods. You may be able to save some code by writing a simpler version of total_ordering yourself, though, if you need it in several places.
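A hedged sketch of such a do-it-yourself helper (illustrative only, and far less thorough than the stdlib version; it derives the remaining comparisons from __lt__ and __eq__):

def simple_total_ordering(cls):
    # derive the other rich comparisons from __lt__ and __eq__
    cls.__le__ = lambda self, other: self < other or self == other
    cls.__gt__ = lambda self, other: not (self < other or self == other)
    cls.__ge__ = lambda self, other: not self < other
    cls.__ne__ = lambda self, other: not self == other
    return cls

@simple_total_ordering
class Point:
    def __init__(self, x):
        self.x = x

    def __eq__(self, other):
        return self.x == other.x

    def __lt__(self, other):
        return self.x < other.x

assert Point(1) <= Point(1)
assert Point(2) > Point(1)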
I have the following code in django.template:
class Template(object):
    def __init__(self, template_string, origin=None, name='<Unknown Template>'):
        try:
            template_string = smart_unicode(template_string)
        except UnicodeDecodeError:
            raise TemplateEncodingError("Templates can only be constructed from unicode or UTF-8 strings.")
        if settings.TEMPLATE_DEBUG and origin is None:
            origin = StringOrigin(template_string)
        self.nodelist = compile_string(template_string, origin)
        self.name = name

    def __iter__(self):
        for node in self.nodelist:
            for subnode in node:
                yield subnode

    def render(self, context):
        "Display stage -- can be called many times"
        return self.nodelist.render(context)
The part I am confused about is below. How does this __iter__ method work? I can't find any corresponding next method.
def __iter__(self):
    for node in self.nodelist:
        for subnode in node:
            yield subnode
This is the only way that I know how to implement __iter__:
class a(object):
    def __init__(self, x=10):
        self.x = x

    def __iter__(self):
        return self

    def __next__(self):  # called `next` in Python 2
        if self.x > 0:
            self.x -= 1
            return self.x
        else:
            raise StopIteration

ainst = a()
for item in ainst:
    print(item)
In your answers, try to use code examples rather than text, because my English is not very good.
From the docs:
If a container object's __iter__() method is implemented as a generator, it will automatically return an iterator object (technically, a generator object) supplying the __iter__() and __next__() methods.
Here is your provided example using a generator:
class A:
    def __init__(self, x=10):
        self.x = x

    def __iter__(self):
        for i in reversed(range(self.x)):
            yield i

a = A()
for item in a:
    print(item)
That __iter__ method returns a Python generator (see the documentation), as it uses the yield keyword.
The generator provides the next() method automatically; quoting the documentation:
What makes generators so compact is that the __iter__() and next() methods are created automatically.
EDIT:
Generators are really useful. If you are not familiar with them, I suggest you read up on them and play around with some test code.
Here is some more info on iterators and generators from Stack Overflow.