Python inherited class variables - python

I suppose I'm misunderstanding how type inheritance works in Python.
When I define a variable inside a Parent class, every Child class that inherits
from Parent references the same variable from Parent.
class Parent(object):
    store = dict()

class ChildA(Parent):
    pass

class ChildB(Parent):
    pass

ChildA.store['key1'] = 'val'
ChildB.store['key2'] = 'val'
print ChildB.store['key1'] == ChildA.store['key2']
What I'm trying to achieve is for a separate store dictionary instance to be created in every Child class inherited from Parent, so that referencing ChildB.store['key1'] would raise KeyError.
I have tried to use __new__ to create the dictionary instance while the type is being created:
class NewParent(object):
    def __new__(cls, *args, **kwargs):
        rv = super(NewParent, cls).__new__(cls, *args, **kwargs)
        rv.store = dict()
        return rv
But it seems that __new__ only runs when a Child instance is created, so referencing the variable via the type (e.g. Child.store) raises AttributeError.
So is there any way to achieve the behavior I want?

You want to use a metaclass, which lets you initialize a class definition, sort of like how a constructor lets you initialize an instance. For more details, see http://eli.thegreenplace.net/2011/08/14/python-metaclasses-by-example/.
Example:
#!/usr/bin/env python2
class ParentMeta(type):
    def __new__(meta, name, bases, dct):
        dct['store'] = dict()
        return super(ParentMeta, meta).__new__(meta, name, bases, dct)

class Parent(object):
    __metaclass__ = ParentMeta

class ChildA(Parent):
    pass

class ChildB(Parent):
    pass

ChildA.store['key1'] = 'val'
ChildB.store['key2'] = 'val'
print ChildB.store['key1'] == ChildA.store['key2']
will result in
Traceback (most recent call last):
  File "test.py", line 20, in <module>
    print ChildB.store['key1'] == ChildA.store['key2']
KeyError: 'key1'
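On Python 3.6 and later, the same per-subclass dictionary can be had without writing a metaclass, using __init_subclass__ (a sketch, assuming Python 3; note that only the subclasses get a store here, not Parent itself):

```python
class Parent(object):
    # __init_subclass__ (Python 3.6+) runs once for each subclass as it is
    # created, so every subclass gets its own independent store dict.
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        cls.store = dict()

class ChildA(Parent):
    pass

class ChildB(Parent):
    pass

ChildA.store['key1'] = 'val'
ChildB.store['key2'] = 'val'
print('key1' in ChildB.store)  # False - the dicts are separate
```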

Related

Can't find classmethod over class name

I am trying to invoke a classmethod via the class name, but an AttributeError occurs.
When I use @singleton, I can't call classname.functionname; it must be classname().functionname.
Why does this happen?
def singleton(cls):
    '''
    Singleton
    :param cls:
    :return:
    '''
    _instance = {}
    def _singleton(*args, **kargs):
        if cls not in _instance:
            _instance[cls] = cls(*args, **kargs)
        # print(type(_instance[cls]))  # <class '__main__.Coco'>
        return _instance[cls]
    return _singleton

@singleton
class Coco():
    # def __new__(cls, *args, **kwargs):
    #     if not hasattr(Coco, "_instance"):
    #         Coco._instance = object.__new__(cls)
    #         print(type(Coco._instance))
    #     return Coco._instance
    def __init__(self):
        print('coco')

    @classmethod
    def get_info(cls):
        print('coco is 18 ages old')

# print(Coco().get_info())
print(Coco.get_info())
Exception
Traceback (most recent call last):
  File "/Users/coco/Automation/AutoTestRes/scripts/python/coco.py", line 36, in <module>
    print(Coco.get_info())
AttributeError: 'function' object has no attribute 'get_info'
When you use a decorator in Python, like this:
@decorator_name
class class_name:
    ...
..., this is equivalent to doing this:
class class_name:
    ...
class_name = decorator_name(class_name)
This means that the value of the variable class_name is no longer necessarily a class, but instead it is whatever the return value of decorator_name is.
In your case, the class decorator singleton returns the function _singleton, not the actual class. So when you say:
print(Coco.get_info())
..., this is effectively the same as saying:
print(_singleton.get_info())
...where _singleton is the inner function returned by the decorator.
Therefore, you get an AttributeError: the function, which now has the name Coco, does not have that attribute.
To access the attribute of the class, you need to run the function because this will return an instance of the class, which will have the attribute.
It is no longer possible to access the class itself from the global scope.
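If keeping the name bound to the real class matters, one common alternative (a sketch, not the only fix) is to implement the singleton as a metaclass rather than a function decorator, so Coco remains a class and its classmethods stay reachable:

```python
class SingletonMeta(type):
    """Singleton via a metaclass: the name Coco stays bound to the real
    class, so Coco.get_info() keeps working. Names follow the question."""
    _instances = {}

    def __call__(cls, *args, **kwargs):
        # __call__ runs when the class is instantiated; cache the instance.
        if cls not in SingletonMeta._instances:
            SingletonMeta._instances[cls] = super().__call__(*args, **kwargs)
        return SingletonMeta._instances[cls]

class Coco(metaclass=SingletonMeta):
    @classmethod
    def get_info(cls):
        return 'coco is 18 years old'

print(Coco.get_info())   # works: Coco is still a class
print(Coco() is Coco())  # True: only one instance is ever created
```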

Dynamically altering object types on instantiation

I have a metaclass that, at the moment, simply returns an object of a class.
From this metaclass, I derived two different classes. The difference between the classes is a dictionary that gets passed to __init__ for one of the classes but not for the other one.
If this dict contains only one single entry, I want Python to return an instance of the other class instead of the one that was actually called.
class _MetaClass(type):
    def __new__(cls, name, bases, dct):
        return super(_MetaClass, cls).__new__(cls, name, bases, dct)

class Class1(object):
    __metaclass__ = _MetaClass
    def __init__(self, someargs):
        pass

class Class2(object):
    __metaclass__ = _MetaClass
    def __init__(self, someargs, kwargs):
        if len(kwargs) == 1:
            return Class1(someargs)
        else:
            pass

TestInstance = Class2("foo", {"bar": "foo"})  # should now be a Class1 instance because there is only one item in the dict
If the dict, like in this case, has len(dct) == 1, it should create an instance of Class1 with "foo" passed to its __init__, instead of returning an instance of Class2 as it normally would.
I tried to implement the __new__ and __init__ methods for the metaclass, but I could not figure out how to check the arguments that are actually passed on new class instantiation.
You could create a handler method to deal with the issue:
def handler(someargs, yourDict):
    if len(yourDict) == 1:
        return Class1(someargs)
    else:
        return Class2(someargs, yourDict)
and then call the handler instead of the constructor:
TestInstance = handler("foo", yourDict)
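The substitution the question asks for can also be done in the metaclass itself: __init__ cannot return a different object, but a metaclass __call__ can (a Python 3 sketch reusing the question's class names):

```python
class SwitchMeta(type):
    # A metaclass __call__ runs before the instance's __new__/__init__,
    # so it can decide to build a different class entirely.
    def __call__(cls, *args):
        if cls is Class2 and len(args) == 2 and len(args[1]) == 1:
            return Class1(args[0])  # single-entry dict: build Class1 instead
        return super().__call__(*args)

class Class1(metaclass=SwitchMeta):
    def __init__(self, someargs):
        self.someargs = someargs

class Class2(metaclass=SwitchMeta):
    def __init__(self, someargs, kwargs):
        self.someargs = someargs
        self.kwargs = kwargs

TestInstance = Class2("foo", {"bar": "foo"})
print(type(TestInstance).__name__)  # Class1
```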

__classcell__ generates error in Python 3.6 when the metaclass calls multiple super().__new__ from inherited class

Here is executable code which works in Python 2.7 but results in an error in Python 3.6:
import six

class AMeta(type):
    def __new__(cls, name, bases, attrs):
        module = attrs.pop('__module__')
        new_attrs = {'__module__': module}
        classcell = attrs.pop('__classcell__', None)
        if classcell is not None:
            new_attrs['__classcell__'] = classcell
        new = super(AMeta, cls).__new__(
            cls, name, bases, new_attrs)
        new.duplicate = False
        legacy = super(AMeta, cls).__new__(
            cls, 'Legacy' + name, (new,), new_attrs)
        legacy.duplicate = True
        return new

@six.add_metaclass(AMeta)
class A():
    def pr(cls):
        print('a')

class B():
    def pr(cls):
        print('b')

class C(A, B):
    def pr(cls):
        super(C, cls).pr()  # not shown with new_attrs
        B.pr(cls)
        print('c')  # not shown with new_attrs

c = C()
c.pr()
# Expected result
# a
# b
# c
I get the following error:
Traceback (most recent call last):
  File "test.py", line 28, in <module>
    class C(A,B):
TypeError: __class__ set to <class '__main__.LegacyC'> defining 'C' as <class '__main__.C'>
C inherits from A, which is generated with the metaclass AMeta. They are test classes, and AMeta's goal is to execute all the tests with 2 different file folders: the default one and the legacy one.
I found a way to remove this error by removing classcell from attrs, then returning new = super(AMeta, cls).__new__(cls, name, bases, attrs) (not new_attrs), but it doesn't seem right, and if it is, I'd like to know why.
The use of new_attrs resulted from this SO question and from the documentation, which states basically the opposite: when modifying attrs, make sure to propagate __classcell__, because failing to do so is deprecated in Python 3.6 and will become an error in Python 3.8.
Note that in this case, it removes the pr definitions because they weren't passed in new_attrs, and thus prints 'b' instead of 'abc', but that is irrelevant for this problem.
Is there a way to call multiple super().__new__ inside the __new__ of a metaclass AMeta, and then instantiate a class C that inherits from a class inheriting from A?
Without nesting inheritance, the error doesn't appear, like this:
import six

class AMeta(type):
    def __new__(cls, name, bases, attrs):
        new = super(AMeta, cls).__new__(
            cls, name, bases, attrs)
        new.duplicate = False
        legacy = super(AMeta, cls).__new__(
            cls, 'Duplicate' + name, (new,), attrs)
        legacy.duplicate = True
        return new

@six.add_metaclass(AMeta)
class A():
    def pr(cls):
        print('a')

a = A()
a.pr()
# Result
# a
Then maybe it is A's role to do something to fix it?
Thanks in advance.
I could figure out what your problem is, and how to work around it.
The problem is that you are passing the same cell object to both copies of your class: the original and the legacy one.
As it exists in two classes at once, it conflicts with the other place it is in use: super() will pick the wrong ancestor class when called.
cell objects are picky; they are created in native code and can't normally be created or configured on the Python side. I could figure out a way of creating the class copy by having a helper that returns a fresh cell object, and passing that as __classcell__.
(I also tried simply running copy.copy/copy.deepcopy on the classcell object, before resorting to my cellfactory below - it does not work.)
In order to reproduce the problem and figure out a solution, I made a simpler version of your metaclass, Python 3 only.
from types import FunctionType

legacies = []

def classcellfactory():
    class M1(type):
        def __new__(mcls, name, bases, attrs, classcellcontainer=None):
            if isinstance(classcellcontainer, list):
                classcellcontainer.append(attrs.get("__classcell__", None))
    container = []
    class T1(metaclass=M1, classcellcontainer=container):
        def __init__(self):
            super().__init__()
    return container[0]

def cellfactory():
    x = None
    def helper():
        nonlocal x
    return helper.__closure__[0]

class M(type):
    def __new__(mcls, name, bases, attrs):
        cls1 = super().__new__(mcls, name + "1", bases, attrs)
        new_attrs = attrs.copy()
        if "__classcell__" in new_attrs:
            new_attrs["__classcell__"] = cellclass = cellfactory()
            for name, obj in new_attrs.items():
                if isinstance(obj, FunctionType) and obj.__closure__:
                    new_method = FunctionType(obj.__code__, obj.__globals__, obj.__name__, obj.__defaults__, (cellclass, ))
                    new_attrs[name] = new_method
        cls2 = super().__new__(mcls, name + "2", bases, new_attrs)
        legacies.append(cls2)
        return cls1

class A(metaclass=M):
    def meth(self):
        print("at A")

class B(A):
    pass

class C(B, A):
    def meth(self):
        super().meth()

C()
So, not only do I create a nested function in order to have the Python runtime create a separate cell object, which I then use in the cloned class - the methods that make use of the cell also have to be re-created with a new __closure__ that points to the new cell variable.
Without recreating the methods, they won't work in the cloned class - as super() in the cloned class's methods expects the cell to point to the cloned class itself, but it points to the original one.
Fortunately, methods in Python 3 are plain functions - that makes the code simpler. However, this code won't run in Python 2 - so just enclose it in an if block so it doesn't run on Python 2. As the __classcell__ attribute does not even exist there, there is no problem at all.
After pasting the code above in a Python shell I can run both methods and super() works:
In [142]: C().meth()
at A
In [143]: legacies[-1]().meth()
at A
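The cell-minting trick at the heart of the answer can be isolated into a few lines (a sketch; the names are mine). On Python 3.8+, types.CellType() can also create a cell directly, but the closure trick works on earlier versions too:

```python
def cellfactory():
    # nonlocal forces helper to close over x, which makes CPython
    # allocate a brand-new cell object that we can then reuse.
    x = None
    def helper():
        nonlocal x
    return helper.__closure__[0]

c1, c2 = cellfactory(), cellfactory()
print(type(c1).__name__)  # cell
print(c1 is c2)           # False - every call mints a distinct cell
```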

Access base class variable from metaclass

I am trying to read a base class variable from a metaclass, to override a class variable using this code:
class TypeBaseMeta(type):
    def __new__(cls, name, bases, namespace, **kwds):
        for base in bases:
            namespace['__validators__'] = base['__validators__'] + namespace['__validators__']
        return type.__new__(cls, name, bases, namespace, **kwds)

class TypeBase(metaclass=TypeBaseMeta):
    __validators__ = ('presence')

    def __init__(self, *args, **kwargs):
        pass

    def validate_presence(self, flag):
        if self.data:
            return True

class String(TypeBase):
    __validators__ = ('length')

    def validate_length(self, range):
        if len(self.data) in range(*range):
            return True
but I got this error:
Traceback (most recent call last):
  File "types.py", line 18, in <module>
    class String(TypeBase):
  File "types.py", line 4, in __new__
    namespace['__validators__'] = base['__validators__'] + namespace['__validators__']
TypeError: 'TypeBaseMeta' object is not subscriptable
I know that a subscriptable object must have __getitem__() and behave like dictionaries and lists, but I have no idea what is causing this error.
__validators__ is an attribute of the superclass, not a dict item, so it should be accessed with base.__validators__. (That is, change base['__validators__'] to base.__validators__. Don't change namespace['__validators__'].)
The reason you access the attribute of the current class with namespace['__validators__'] is because that class doesn't exist yet (it is being created by the metaclass). Right now all you have is a dict of its attributes. But the superclass (base) was already created, and is a real class whose attributes are accessed in the normal way, with ..
As Dunes points out in a comment, your code has another problem, which is that you should be writing ('presence',) and ('length',) for your validators, to create tuples. Otherwise they are just strings, and the subclass's __validators__ will be set to the single string 'presencelength'.
The __validators__ variable in the superclasses is not accessible as if it were a dictionary - you have to fetch it from the class's __dict__ attribute, or use getattr.
- namespace['__validators__'] = base['__validators__'] + namespace['__validators__']
+ namespace['__validators__'] = base.__dict__.get('__validators__', ()) + namespace['__validators__']
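Putting both fixes together (attribute access plus proper one-element tuples), a corrected sketch behaves as intended:

```python
class TypeBaseMeta(type):
    def __new__(cls, name, bases, namespace, **kwds):
        for base in bases:
            # base is a class, so read its attribute from __dict__
            # (getattr(base, '__validators__', ()) would also work)
            namespace['__validators__'] = (base.__dict__.get('__validators__', ())
                                           + namespace.get('__validators__', ()))
        return type.__new__(cls, name, bases, namespace, **kwds)

class TypeBase(metaclass=TypeBaseMeta):
    __validators__ = ('presence',)  # note the comma: a 1-tuple, not a string

class String(TypeBase):
    __validators__ = ('length',)

print(String.__validators__)  # ('presence', 'length')
```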

counterpart of __getattr__

I am trying to find a way to set dict values encapsulated in a class. For example, using __getattr__ I can return the internal dict value; however, __setattr__ is called even when the attribute exists, making my implementation ugly. The example below is simplified; my actual class inherits from a Subject class (the subject part of the observer pattern).
I am trying to achieve something like this:
obj = Example()
obj.username = 'spidername'  # all OK, username is a key in the internal dict
# but company is not a key in the internal dict, so
obj.company = 'ABC'  # will raise AttributeError
and I am asking if there is a better way than the way I am doing it below:
class Example(object):
    def __init__(self, table=None):
        self._fields = {}
        self._table = table

    def _set_fields(self):
        """
        this method will be implemented by
        subclasses and used to set field names and values,
        i.e.
        self._fields['username'] = Field(default='unknown', is_primary=False)
        """
        raise NotImplementedError

    def __getattr__(self, name):
        """
        great: this method is only called when "name"
        is not an attribute of this class
        """
        if name in self._fields:
            return self._fields[name].value
        return None

    def __setattr__(self, name, value):
        """
        not so great: this method is called even for
        attributes that exist in this class.
        is there a better way to do the following?
        this can be in __init__, but it's still ugly
        """
        attribs = ['_fields', '_table']
        if name in attribs:
            super(Example, self).__setattr__(name, value)
        else:
            if name in self._fields:
                self._fields[name].value = value
            else:
                raise AttributeError
EDIT: adjusted comment in code, added missing quotes
The problem is that the attributes don't exist when they are first assigned. In __init__, when you first assign a dict to _fields, _fields is not an attribute. It only becomes an existing attribute after it's been assigned. You could use __slots__ if you know in advance what the attributes are, but my guess is that you don't. So my suggestion would be to insert these into the instance dict manually:
class Example(object):
    def __init__(self, table=None):
        self.__dict__['_fields'] = {}
        self.__dict__['_table'] = table
    ...
    def __setattr__(self, name, value):
        if name in self._fields:
            self._fields[name].value = value
        else:
            raise AttributeError
However, with this implementation, the only way you can add or change instance attributes later would be through __dict__. But I assume this is not likely.
FWIW, your overall goal can be achieved directly just by using __slots__:
>>> class Example(object):
        __slots__ = ['username']

>>> obj = Example()
>>> obj.username = 'spiderman'
>>> obj.company = 'ABC'
Traceback (most recent call last):
  File "<pyshell#18>", line 1, in <module>
    obj.company = 'ABC'
AttributeError: 'Example' object has no attribute 'company'
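Combining the __dict__ trick with the question's field layout gives a complete, runnable sketch (the Field class here is a hypothetical stand-in for whatever the original subclasses define):

```python
class Field(object):
    # hypothetical stand-in for the Field class used in the question
    def __init__(self, default=None, is_primary=False):
        self.value = default
        self.is_primary = is_primary

class Example(object):
    def __init__(self):
        # write through __dict__ so __setattr__ is bypassed during setup
        self.__dict__['_fields'] = {'username': Field(default='unknown')}

    def __getattr__(self, name):
        # only invoked when normal attribute lookup fails
        if name in self._fields:
            return self._fields[name].value
        raise AttributeError(name)

    def __setattr__(self, name, value):
        if name in self._fields:
            self._fields[name].value = value
        else:
            raise AttributeError(name)

obj = Example()
obj.username = 'spidername'
print(obj.username)  # spidername
try:
    obj.company = 'ABC'
except AttributeError:
    print('AttributeError raised')
```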
