python getting object properties with __dict__

Using py3, I have an object that uses the @property decorator:
class O(object):
    def __init__(self):
        self._a = None

    @property
    def a(self):
        return 1
Accessing the attribute a via __dict__ (as _a) doesn't return the property-decorated value but the initialized value None:
o = O()
print(o.a, o.__dict__['_a'])
>>> 1 None
Is there a generic way to make this work? I mostly need this for:
def __str__(self):
    return ' '.join('{}: {}'.format(key, val) for key, val in self.__dict__.items())

Of course self.__dict__["_a"] will return self._a (well, actually it's the other way round: self._a will return self.__dict__["_a"], but anyway), not self.a. The only thing the property does here is automatically invoke its getter (your a(self) function) so you don't have to type the parens; otherwise it's just a plain method call.
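A quick way to see what's going on is to check where the property object actually lives:

```python
class O(object):
    def __init__(self):
        self._a = None

    @property
    def a(self):
        return 1

o = O()
print(type(O.__dict__['a']))    # <class 'property'> -- stored on the class
print(o.__dict__)               # {'_a': None}       -- only _a is on the instance
print(O.__dict__['a'].fget(o))  # 1 -- invoking the getter, same as o.a
```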
If you want something that works with properties too, you'll have to collect those manually from dir(self.__class__) and getattr(self.__class__, name), i.e.:
def __str__(self):
    # in py3, .items() returns a view, so build a list we can append to
    # (in py2, .items() already returns a list)
    attribs = list(self.__dict__.items())
    for name in dir(self.__class__):
        obj = getattr(self.__class__, name)
        if isinstance(obj, property):
            val = obj.__get__(self, self.__class__)
            attribs.append((name, val))
    return ' '.join('{}: {}'.format(key, val) for key, val in attribs)
Note that this won't prevent _a from appearing in attribs - if you want to avoid that, you'll also have to filter protected names out of the attribs list (all protected names, since you asked for something generic):
def __str__(self):
    attribs = [(k, v) for k, v in self.__dict__.items() if not k.startswith("_")]
    for name in dir(self.__class__):
        # a protected property is somewhat uncommon but
        # let's stay consistent with plain attribs
        if name.startswith("_"):
            continue
        obj = getattr(self.__class__, name)
        if isinstance(obj, property):
            val = obj.__get__(self, self.__class__)
            attribs.append((name, val))
    return ' '.join('{}: {}'.format(key, val) for key, val in attribs)
Also note that this won't handle other computed attributes (property is just one generic implementation of the descriptor protocol). At this point, your best bet for something that's still as generic as possible but can be customised if needed is to implement the above as a mixin class with a couple of hooks for specialization:
class PropStrMixin(object):
    # add other descriptor types you want to include in the
    # attribs list
    _COMPUTED_ATTRIBUTES_CLASSES = [property]

    def _get_attr_list(self):
        attribs = [(k, v) for k, v in self.__dict__.items() if not k.startswith("_")]
        for name in dir(self.__class__):
            # a protected property is somewhat uncommon but
            # let's stay consistent with plain attribs
            if name.startswith("_"):
                continue
            obj = getattr(self.__class__, name)
            # isinstance expects a tuple of classes, not unpacked arguments
            if isinstance(obj, tuple(self._COMPUTED_ATTRIBUTES_CLASSES)):
                val = obj.__get__(self, self.__class__)
                attribs.append((name, val))
        return attribs

    def __str__(self):
        attribs = self._get_attr_list()
        return ' '.join('{}: {}'.format(key, val) for key, val in attribs)
class YourClass(SomeParent, PropStrMixin):
    # here you can add to _COMPUTED_ATTRIBUTES_CLASSES
    _COMPUTED_ATTRIBUTES_CLASSES = PropStrMixin._COMPUTED_ATTRIBUTES_CLASSES + [SomeCustomDescriptor]
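Here is a runnable sketch of the mixin in action; the Point class and its x, y, norm names are made up purely for illustration:

```python
class PropStrMixin(object):
    # descriptor types to include in the attribs list
    _COMPUTED_ATTRIBUTES_CLASSES = [property]

    def _get_attr_list(self):
        attribs = [(k, v) for k, v in self.__dict__.items() if not k.startswith("_")]
        for name in dir(self.__class__):
            if name.startswith("_"):
                continue
            obj = getattr(self.__class__, name)
            # isinstance takes a tuple of classes
            if isinstance(obj, tuple(self._COMPUTED_ATTRIBUTES_CLASSES)):
                attribs.append((name, obj.__get__(self, self.__class__)))
        return attribs

    def __str__(self):
        return ' '.join('{}: {}'.format(k, v) for k, v in self._get_attr_list())

class Point(PropStrMixin):
    def __init__(self, x, y):
        self.x = x
        self.y = y

    @property
    def norm(self):
        return (self.x ** 2 + self.y ** 2) ** 0.5

p = Point(3, 4)
print(p)  # x: 3 y: 4 norm: 5.0
```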

A property is basically a "computed attribute". In general, the property's value is not stored anywhere; it is computed on demand. That's why you cannot find it in the __dict__.
The @property decorator replaces the class method with a descriptor object which then calls the original method as its getter. This happens at the class level.
The lookup for o.a starts at the instance. It does not exist there, so the class is checked next. O.a exists and is a descriptor (because it has the special methods for the descriptor protocol), so the descriptor's getter is called and the returned value is used.
(EDITED)
There is no general way to dump the name:value pairs for the descriptors. The classes including the bases must be inspected; this part is not difficult. However, retrieving the values is equivalent to a function call and may have unexpected and undesirable side effects. For a different perspective, I'd like to quote a comment by bruno desthuilliers here: "property get should not have unwanted side effects (if it does then there's an obvious design error)".
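Enumerating the descriptor names, without triggering their getters, is the safe half of the job. A sketch (Base and Derived are made-up names) that walks the MRO:

```python
class Base:
    @property
    def base_prop(self):
        return "base"

class Derived(Base):
    @property
    def derived_prop(self):
        return "derived"

# vars(klass) exposes each class's own namespace, so walking the MRO
# finds property objects defined on the class and on its bases,
# without ever calling their (possibly side-effecting) getters
props = {name
         for klass in Derived.__mro__
         for name, obj in vars(klass).items()
         if isinstance(obj, property)}
print(props)  # {'base_prop', 'derived_prop'} (set order may vary)
```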

You can also update self._a in the getter, since the getter's return value should always reflect what is stored in self._a:
class O(object):
    def __init__(self):
        self._a = self.a

    @property
    def a(self):
        self._a = 1
        return self._a
A bit redundant, maybe, but setting self._a = None initially is useless in this case.
In case you need a setter, this would also be compatible, given you remove the first line in the getter:
@a.setter
def a(self, value):
    self._a = value

Related

Overriding getters and setters for attributes from a list of strings

The aim is to provide some strings in a list as attributes of a class. The class shall have not only attributes, but also the respective getter and setter methods. In some other class inherited from that some of those setters need to be overridden.
To this end I came up with the following. Using setattr in a loop over the list of strings, an attribute and the respective methods are created. Concerning this first part, the code works as expected.
However I am not able to override the setters in an inheriting class.
class Base():
    attributes = ["attr{}".format(i) for i in range(100)]

    def __init__(self):
        _get = lambda a: lambda: getattr(self, a)
        _set = lambda a: lambda v: setattr(self, a, v)
        for attr in self.attributes:
            setattr(self, attr, None)
            setattr(self, "get_" + attr, _get(attr))
            setattr(self, "set_" + attr, _set(attr))

class Child(Base):
    def __init__(self):
        super().__init__()
        #setattr(self, "set_attr4", set_attr4)

    # Here I want to override one of the setters to perform typechecking
    def set_attr4(self, v):
        print("This being printed would probably solve the problem.")
        if type(v) == bool:
            super().set_attr4(v)
        else:
            raise ValueError("attr4 must be a boolean")

if __name__ == "__main__":
    b = Base()
    b.attr2 = 5
    print(b.get_attr2())
    b.set_attr3(55)
    print(b.get_attr3())
    c = Child()
    c.set_attr4("SomeString")
    print(c.get_attr4())
The output here is
5
55
SomeString
The expected output would however be
5
55
This being printed would probably solve the problem.
ValueError: attr4 must be a boolean
So somehow the set_attr4 method is never called; which I guess is expected, because __init__ is called after the class structure is read in. But I am at a loss on how else to override those methods. I tried to add setattr(self, "set_attr4", set_attr4) (the commented line in the code above), but to no avail.
Or more generally, there is property, which is usually used for creating getters and setters. But I don't think I understand how to apply it in a case where the getters and setters are created dynamically and need to be overridden by a child.
Is there any solution to this?
Update due to comments: It was pointed out by several people that using getters/setters in python may not be a good style and that they may usually not be needed. While this is definitely something to keep in mind, the background of this question is that I'm extending an old existing code which uses getters/setters throughout. I hence do not wish to change the style and let the user (this project only has some 20 users in total, but still...) suddenly change the way they access properties within the API.
However any future reader of this may consider that the getter/setter approach is at least questionable.
Metaclasses to the rescue!
class Meta(type):
    def __init__(cls, name, bases, dct):
        for attr in cls.attributes:
            # only install accessors that don't already exist, so a
            # subclass's hand-written override is left alone
            if not hasattr(cls, attr):
                setattr(cls, attr, None)
                setattr(cls, f'get_{attr}', cls._get(attr))
                setattr(cls, f'set_{attr}', cls._set(attr))

class Base(metaclass=Meta):
    attributes = ["attr{}".format(i) for i in range(100)]
    _get = lambda a: lambda self: getattr(self, a)
    _set = lambda a: lambda self, v: setattr(self, a, v)
    # the rest of your code goes here
This is pretty self-explanatory: make attributes, _get, _set class variables (so that you can access them without class instantiation), then let the metaclass set everything up for you.
(In the original code, Base.__init__ runs when the instance is created, i.e. after the Child class body has been executed, so the dynamically created setters override what was specified there.)
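A runnable py3 sketch of this approach, with a smaller attribute list and made-up demo values, confirms that a subclass's hand-written setter survives:

```python
class Meta(type):
    def __init__(cls, name, bases, dct):
        super().__init__(name, bases, dct)
        for attr in cls.attributes:
            # skip attributes the class (or a base) already has, so
            # Child.set_attr4 is not blasted over
            if not hasattr(cls, attr):
                setattr(cls, attr, None)
                setattr(cls, f'get_{attr}', cls._get(attr))
                setattr(cls, f'set_{attr}', cls._set(attr))

class Base(metaclass=Meta):
    attributes = ["attr{}".format(i) for i in range(5)]
    _get = lambda a: lambda self: getattr(self, a)
    _set = lambda a: lambda self, v: setattr(self, a, v)

class Child(Base):
    def set_attr4(self, v):
        if not isinstance(v, bool):
            raise ValueError("attr4 must be a boolean")
        super().set_attr4(v)

c = Child()
c.set_attr3(55)
print(c.get_attr3())   # 55
c.set_attr4(True)
print(c.get_attr4())   # True
```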
The minimal change needed to fix the problem is to check whether the attribute has already been set:
class Base():
    attributes = ["attr{}".format(i) for i in range(100)]

    def __init__(self):
        _get = lambda a: lambda: getattr(self, a)
        _set = lambda a: lambda v: setattr(self, a, v)
        for attr in self.attributes:
            setattr(self, attr, None)
            if not hasattr(self, "get_" + attr):
                setattr(self, "get_" + attr, _get(attr))
            if not hasattr(self, "set_" + attr):
                setattr(self, "set_" + attr, _set(attr))
However, I do not see the point in doing it this way. This creates a new getter and setter for each instance of Base. I would rather create them on the class. That can be done with a class decorator, with a metaclass, in the body of the class itself, or in some other way.
For example, this is ugly, but simple:
class Base():
    attributes = ["attr{}".format(i) for i in range(100)]
    for attr in attributes:
        exec(f"get_{attr} = lambda self: self.{attr}")
        exec(f"set_{attr} = lambda self, value: setattr(self, '{attr}', value)")
    del attr
This is better:
class Base:
    pass

attributes = ["attr{}".format(i) for i in range(100)]
for attr in attributes:
    # bind attr as a default argument; a plain closure would make every
    # accessor use the last value of the loop variable
    setattr(Base, f"get_{attr}", lambda self, attr=attr: getattr(self, attr))
    setattr(Base, f"set_{attr}", lambda self, value, attr=attr: setattr(self, attr, value))
You're right about the problem. The creation of your Base instance happens after the Child class defines set_attr4. Since Base creates its getters/setters dynamically, it just blasts over Child's version upon creation.
One alternative (in addition to the other answers) is to create the Child's getters/setters dynamically too. The idea here is to go for "convention over configuration" and just prefix the methods you want to override with override_. Here's an example:
class Child(Base):
    def __init__(self):
        super().__init__()
        overrides = [override for override in dir(self) if override.startswith("override_")]
        for override in overrides:
            base_name = override.split("override_")[-1]
            setattr(self, base_name, getattr(self, override))

    # Here I want to override one of the setters to perform typechecking
    def override_set_attr4(self, v):
        print("This being printed would probably solve the problem.")
        if type(v) == bool:
            super().set_attr4(v)
        else:
            raise ValueError("attr4 must be a boolean")  # Added "raise" to this, otherwise we just return None...
which outputs:
5
55
This being printed would probably solve the problem.
Traceback (most recent call last):
File ".\stack.py", line 39, in <module>
c.set_attr4("SomeString")
File ".\stack.py", line 29, in override_set_attr4
raise ValueError("attr4 must be a boolean") # Added "raise" to this, overwise we just return None...
ValueError: attr4 must be a boolean
Advantages here are that the Base doesn't have to know about the Child class. In the other answers, there's very subtle Base/Child coupling going on. It also might not be desirable to touch the Base class at all (violation of the Open/Closed principle).
Disadvantages are that "convention over configuration" to avoid a true inheritance mechanism is a bit clunky and unintuitive. The override_ function is also still hanging around on the Child instance (which you may or may not care about).
I think the real problem here is that you're trying to define getters and setters in such a fashion. We usually don't even want getters/setters in Python. This definitely feels like an X/Y problem, but maybe it isn't. You have a lot of rep, so I'm not going to give you some pedantic spiel about it. Even so, maybe take a step back and think about what you're really trying to do and consider alternatives.
The problem here is that you're creating the "methods" on the instance of the Base class (__init__ only runs when an instance is created).
Inheritance happens when the class is defined, before you instantiate it, and won't look into instances.
In other words, when you try to override the method, it hadn't even been created in the first place.
A solution is to create them on the class, and not on the self instance inside __init__:
def _create_getter(attr):
    def _get(self):
        return getattr(self, attr)
    return _get

def _create_setter(attr):
    def _set(self, value):
        return setattr(self, attr, value)
    return _set

class Base():
    attributes = ["attr{}".format(i) for i in range(100)]

for attr in Base.attributes:
    setattr(Base, 'get_' + attr, _create_getter(attr))
    setattr(Base, 'set_' + attr, _create_setter(attr))
Then inheriting will work normally:
class Child(Base):
    def set_attr4(self, v):
        print("This being printed would probably solve the problem.")
        if type(v) == bool:
            super().set_attr4(v)
        else:
            raise ValueError("attr4 must be a boolean")

if __name__ == "__main__":
    b = Base()
    b.attr2 = 5
    print(b.get_attr2())
    b.set_attr3(55)
    print(b.get_attr3())
    c = Child()
    c.set_attr4("SomeString")
    print(c.get_attr4())
You could also just not do it - make your Base class as normal, and create setters only for the attributes you want, in the child class:
class Base:
    pass

class Child(Base):
    @property
    def attr4(self):
        return self._attr4

    @attr4.setter
    def attr4(self, new_v):
        if not isinstance(new_v, bool):
            raise TypeError('Not bool')
        self._attr4 = new_v
Testing:
c = Child()
c.attr3 = 2     # works fine even without any setter
c.attr4 = True  # works fine, runs the setter
c.attr4 = 3     # type error

Python class decorator "self" seems wrong

I am trying to work out how I can change the functionality of __setattr__ of a class using a decorator on the class, but I am running into an issue when trying to access self inside the function that replaces __setattr__. If I change the problematic line to not access self, e.g. replacing it with val = str(val), I get the expected behaviour.
I see similar problems in other questions here, but they use a different approach, where a class is used as a decorator. My approach below feels less complicated, so I'd love to keep it like that if possible.
Why might a not be defined on self/foo where I expect it to be?
# Define the function to be used as decorator
# The decorator function accepts the relevant fieldname as argument
# and returns the function that wraps the class
def field_proxied(field_name):
    # wrapped accepts the class (type) and modifies the functionality of
    # __setattr__ before returning the modified class (type)
    def wrapped(wrapped_class):
        super_setattr = wrapped_class.__setattr__

        # The new __setattr__ implementation makes sure that given an int,
        # the fieldname becomes a string of that int plus the int in the
        # `a` attribute
        def setattr(self, attrname, val):
            if attrname == field_name and isinstance(val, int):
                val = str(self.a + val)  # <-- Crash. No attribute `a`
                super_setattr(self, attrname, val)

        wrapped_class.__setattr__ = setattr
        return wrapped_class
    return wrapped

@field_proxied("b")
class Foo(object):
    def __init__(self):
        self.a = 2
        self.b = None

foo = Foo()
# <-- At this point, `foo` has no attribute `a`
foo.b = 4
assert foo.b == "6"  # Became a string
The problem is simple; you just need a one-line change.
def setattr(self, attrname, val):
    if attrname == field_name and isinstance(val, int):
        val = str(self.a + val)
    super_setattr(self, attrname, val)  # changed line
The reason is, in your original method, you only call super_setattr when attrname == field_name. So self.a = 2 in __init__ doesn't work at all, since "a" != "b".
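With the delegation moved out of the if, the whole decorator works end to end. A runnable check (the inner function is renamed setattr_ here only to avoid shadowing the builtin):

```python
def field_proxied(field_name):
    def wrapped(wrapped_class):
        super_setattr = wrapped_class.__setattr__

        def setattr_(self, attrname, val):
            if attrname == field_name and isinstance(val, int):
                val = str(self.a + val)
            # delegate unconditionally so unrelated attributes
            # (like `a` in __init__) still get stored
            super_setattr(self, attrname, val)

        wrapped_class.__setattr__ = setattr_
        return wrapped_class
    return wrapped

@field_proxied("b")
class Foo(object):
    def __init__(self):
        self.a = 2
        self.b = None

foo = Foo()
foo.b = 4
print(foo.a)  # 2
print(foo.b)  # '6'
```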

python __getattribute__ override and #property decorator

I had to write a class of some sort that overrides __getattribute__.
Basically, my class is a container which saves every user-added attribute to self._meta, which is a dictionary.
class Container(object):
    def __init__(self, **kwargs):
        super(Container, self).__setattr__('_meta', OrderedDict())
        #self._meta = OrderedDict()
        super(Container, self).__setattr__('_hasattr', lambda key: key in self._meta)
        for attr, value in kwargs.iteritems():
            self._meta[attr] = value

    def __getattribute__(self, key):
        try:
            return super(Container, self).__getattribute__(key)
        except:
            if key in self._meta:
                return self._meta[key]
            else:
                raise AttributeError, key

    def __setattr__(self, key, value):
        self._meta[key] = value

#usage:
>>> a = Container()
>>> a
<__main__.Container object at 0x0000000002B2DA58>
>>> a.abc = 1  # set an attribute
>>> a._meta
OrderedDict([('abc', 1)])  # attribute is in ._meta dictionary
I have some classes which inherit the Container base class, and some of their methods use the @property decorator.
class Response(Container):
    @property
    def rawtext(self):
        if self._hasattr("value") and self.value is not None:
            _raw = self.__repr__()
            _raw += "|%s" % (self.value.encode("utf-8"))
            return _raw
The problem is that .rawtext isn't accessible (I get an AttributeError). Every key in ._meta is accessible, and every attribute added by __setattr__ of the object base class is accessible, but properties created with the @property decorator aren't. I think it has to do with my way of overriding __getattribute__ in the Container base class. What should I do to make properties from @property accessible?
I think you should probably look at __getattr__ instead of __getattribute__ here. The difference is this: __getattribute__ is called unconditionally if it exists; __getattr__ is only called if Python can't find the attribute via other means.
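A minimal illustration of that difference:

```python
class Demo:
    def __init__(self):
        self.real = 1

    def __getattr__(self, name):
        # only invoked when normal attribute lookup has already failed
        return "fallback:{}".format(name)

d = Demo()
print(d.real)     # 1 -- found by normal lookup, __getattr__ never runs
print(d.missing)  # fallback:missing
```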
I completely agree with mgilson. If you want a sample code which should be equivalent to your code but work well with properties you can try:
class Container(object):
    def __init__(self, **kwargs):
        self._meta = OrderedDict()
        #self._hasattr = lambda key: key in self._meta  #???
        for attr, value in kwargs.iteritems():
            self._meta[attr] = value

    def __getattr__(self, key):
        try:
            return self._meta[key]
        except KeyError:
            raise AttributeError(key)

    def __setattr__(self, key, value):
        if key in ('_meta', '_hasattr'):
            super(Container, self).__setattr__(key, value)
        else:
            self._meta[key] = value
I really do not understand your _hasattr attribute. You set it as an attribute, but it's actually a function that has access to self... shouldn't it be a method?
Actually, I think you should simply use the built-in function hasattr:
class Response(Container):
    @property
    def rawtext(self):
        if hasattr(self, 'value') and self.value is not None:
            _raw = self.__repr__()
            _raw += "|%s" % (self.value.encode("utf-8"))
            return _raw
Note that hasattr(container, attr) will return True also for _meta.
Another thing that puzzles me is why you use an OrderedDict. You iterate over kwargs, and that iteration has random order since it's a normal dict, then add the items to the OrderedDict. So _meta ends up containing the values in random order anyway.
If you aren't sure whether you need a specific order or not, simply use dict, and eventually swap to OrderedDict later.
By the way: never ever use a try: ... except: without specifying the exception to catch. In your code you actually wanted to catch only AttributeError, so you should have done:
try:
    return super(Container, self).__getattribute__(key)
except AttributeError:
    #stuff
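To see why this matters here, consider a py3 sketch where a getter contains a genuine bug (a deliberate KeyError): the over-broad except silently converts it into an AttributeError, hiding the real problem.

```python
class Container(object):
    def __getattribute__(self, key):
        try:
            return super().__getattribute__(key)
        except Exception:  # too broad: also catches bugs inside getters
            raise AttributeError(key)

class Response(Container):
    @property
    def rawtext(self):
        return {}["oops"]  # deliberate bug: raises KeyError

r = Response()
try:
    r.rawtext
except AttributeError:
    # the KeyError never surfaces; the bug is masked as a missing attribute
    print("rawtext reported as missing")
```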

Why does adding a second attribute to a metaclass-property-closure mix change the first attribute?

I want to understand python metaclasses. For practice I'm implementing a declarative way for writing classes (similar to sqlalchemy.ext.declarative). This looks promising as long as I only have one attribute.
But when I add another attribute, some part of the first attribute is changed and the value of the first attribute is validated against the pattern of the second attribute. This might be caused by the metaclass, by a closure, by the property or a combination of them. I try to give a minimal, complete but readable example.
#! /usr/bin/env python
"""
Something like:

class Artist:
    locale = Pattern('[A-Z]{2}-[A-Z]{2}')

should be equivalent to:

class Artist:
    def __init__(self):
        self._locale = None

    @property
    def locale(self):
        return self._locale

    @locale.setter
    def locale(self, value):
        validate(value, '[A-Z]{2}-[A-Z]{2}')
        self._locale = value

Problem:
The code below works if Artist has only one attribute.
When I add another one with a different pattern, only that last
pattern is used in validation.
"""
import re
import unittest

# this class (and future siblings) are used to describe attributes
class Pattern(object):
    def __init__(self, pattern):
        self.pattern = pattern

    def validate(self, value):
        if value is None:
            return
        if not re.match("^%s$" % self.pattern, value):
            raise ValueError("invalid value: %r" % value)

    def __repr__(self):
        return "%s(pattern=%r)" % (self.__class__.__name__, self.pattern)

# __metaclass__ based class creation
def createClassFromDeclaration(name, bases, dct):
    """ Examine dct, create initialization in __init__ and property. """
    attributes = dict()
    properties = dict()
    for key, value in dct.iteritems():
        if not isinstance(value, Pattern):
            continue
        pattern = value
        pattern.attribute = "_%s" % key
        attributes[key] = pattern

        def fget(self):
            return getattr(self, pattern.attribute)

        def fset(self, value):
            pattern.validate(value)
            return setattr(self, pattern.attribute, value)

        properties[key] = property(fget, fset)

    def __init__(self, **kwargs):
        # set all attributes found in the keyword arguments
        for key, value in kwargs.iteritems():
            if key in self.__attributes__:
                setattr(self, key, value)
        # set all attributes _NOT_ found to None
        for key, declaration in attributes.iteritems():
            if not hasattr(self, declaration.attribute):
                setattr(self, key, None)

    dct = dict(dct)
    dct.update(properties)
    dct['__init__'] = __init__
    dct['__attributes__'] = attributes
    return type(name, bases, dct)

# declarative class
class Artist(object):
    __metaclass__ = createClassFromDeclaration
    # FIXME: adding a second attribute changes the first pattern
    locale = Pattern('[A-Z]{2}-[A-Z]{2}')
    date = Pattern('[0-9]{4}-[0-9]{2}-[0-9]{2}')

# some unit tests
class TestArtist(unittest.TestCase):
    def test_attributes_are_default_initialized(self):
        artist = Artist()
        self.assertIsNone(artist.date)
        self.assertIsNone(artist.locale)

    def test_attributes_are_initialized_from_keywords(self):
        artist = Artist(locale="EN-US", date="2013-02-04")
        self.assertEqual(artist.date, "2013-02-04")
        # FIXME: the following does not work.
        # it validates against the date pattern
        self.assertEqual(artist.locale, "EN-US")

    def test_locale_with_valid_value(self):
        artist = Artist()
        artist.date = "2013-02-04"
        self.assertEqual(artist.date, "2013-02-04")
        # FIXME: the following does not work.
        # it validates against the date pattern
        artist.locale = "EN-US"
        self.assertEqual(artist.locale, "EN-US")

    def test_locale_with_invalid_value_throws(self):
        artist = Artist()
        with self.assertRaises(ValueError):
            artist.locale = ""
        with self.assertRaises(ValueError):
            artist.locale = "EN-USA"

if __name__ == '__main__':
    unittest.main()

# vim: set ft=python sw=4 et sta:
When I comment out the second attribute ('date') the tests succeed, but with the second attribute the tests that try to set the first attribute ('locale') fail. What causes the unittests to fail?
Disclaimer: This code is only for training. There are ways to create the same functionality that do not involve metaclasses, properties and closures (as you and I know). But we don't learn anything new if we only walk the streets we know. Please help me expand my Python knowledge.
The problem doesn't really have anything to do with metaclasses or properties per se. It has to do with how you're defining your get/set functions. Your fget and fset reference the variable pattern from the enclosing function. This creates a closure. The value of pattern will be looked up at the time fget/fset are called, not at the time they're defined. So when you overwrite pattern on the next loop iteration, you cause all fget/fset functions to now reference the new pattern.
Here's a simpler example that shows what's going on:
def doIt(x):
    funs = []
    for key, val in x.iteritems():
        thingy = val + 1
        def func():
            return thingy
        funs.append(func)
    return funs

>>> dct = {'a': 1, 'b': 2, 'c': 3}
>>> funs = doIt(dct)
>>> for f in funs:
...     print f()
3
3
3
Notice that, even though the three functions are defined at times when thingy has different values, when I call them later they all return the same value. This is because they are all looking up thingy when they're called, which is after the loop is done, so thingy just equals the last value it was set to.
The usual way to get around this is to pass in the variable you want to close over as the default value of an additional function argument. Try doing your getter and setter like this:
def fget(self, pattern=pattern):
    return getattr(self, pattern.attribute)

def fset(self, value, pattern=pattern):
    pattern.validate(value)
    return setattr(self, pattern.attribute, value)
Default arguments are evaluated at function definition time, not call time, so this forces each function to "save" the value of pattern it wants to use.
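The same default-argument trick applied to the earlier doIt sketch (ported here to py3 syntax) yields distinct values:

```python
def do_it(x):
    funs = []
    for key, val in x.items():
        thingy = val + 1
        # bind the current value of thingy at definition time
        def func(thingy=thingy):
            return thingy
        funs.append(func)
    return funs

funs = do_it({'a': 1, 'b': 2, 'c': 3})
print(sorted(f() for f in funs))  # [2, 3, 4]
```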

Subclassing ctypes - Python

This is some code I found on the internet. I'm not sure how it is meant to be used. I simply filled _members_ with the enum keys/values and it works, but I'm curious what this metaclass is all about. I am assuming it has something to do with ctypes, but I can't find much information on subclassing ctypes types. I know EnumerationType isn't doing anything the way I'm using Enumeration.
from ctypes import *

class EnumerationType(type(c_uint)):
    def __new__(metacls, name, bases, dict):
        if not "_members_" in dict:
            _members_ = {}
            for key, value in dict.items():
                if not key.startswith("_"):
                    _members_[key] = value
            dict["_members_"] = _members_
        cls = type(c_uint).__new__(metacls, name, bases, dict)
        for key, value in cls._members_.items():
            globals()[key] = value
        return cls

    def __contains__(self, value):
        return value in self._members_.values()

    def __repr__(self):
        return "<Enumeration %s>" % self.__name__

class Enumeration(c_uint):
    __metaclass__ = EnumerationType
    _members_ = {}

    def __init__(self, value):
        for k, v in self._members_.items():
            if v == value:
                self.name = k
                break
        else:
            raise ValueError("No enumeration member with value %r" % value)
        c_uint.__init__(self, value)

    @classmethod
    def from_param(cls, param):
        if isinstance(param, Enumeration):
            if param.__class__ != cls:
                raise ValueError("Cannot mix enumeration members")
            else:
                return param
        else:
            return cls(param)

    def __repr__(self):
        return "<member %s=%d of %r>" % (self.name, self.value, self.__class__)
And an enumeration probably done the wrong way.
class TOKEN(Enumeration):
    _members_ = {'T_UNDEF': 0, 'T_NAME': 1, 'T_NUMBER': 2, 'T_STRING': 3, 'T_OPERATOR': 4, 'T_VARIABLE': 5, 'T_FUNCTION': 6}
A metaclass is a class used to create classes. Think of it this way: all objects have a class, a class is also an object, therefore, it makes sense that a class can have a class.
http://www.ibm.com/developerworks/linux/library/l-pymeta.html
To understand what this is doing, you can look at a few points in the code.
_members_ = {'T_UNDEF':0, 'T_NAME':1, 'T_NUMBER':2, 'T_STRING':3, 'T_OPERATOR':4, 'T_VARIABLE':5, 'T_FUNCTION':6}
globals()[key] = value
Here it takes every defined key in your dictionary: "T_UNDEF" "T_NUMBER" and makes them available in your globals dictionary.
def __init__(self, value):
    for k, v in self._members_.items():
        if v == value:
            self.name = k
            break
Whenever you make an instance of your enum, it checks whether the "value" is among the allowed enum values declared on the class. When the value is found, it stores the corresponding string name in self.name.
c_uint.__init__(self, value)
This is the actual line which sets the "ctypes value" to an actual c unsigned integer.
That is indeed a weird class.
The way you are using it is correct, although another way would be:
class TOKEN(Enumeration):
    T_UNDEF = 0
    T_NAME = 1
    T_NUMBER = 2
    T_STRING = 3
    T_OPERATOR = 4
    T_VARIABLE = 5
    T_FUNCTION = 6
(That's what the first 6 lines in __new__ are for)
Then you can use it like so:
>>> TOKEN
<Enumeration TOKEN>
>>> TOKEN(T_NAME)
<member T_NAME=1 of <Enumeration TOKEN>>
>>> T_NAME in TOKEN
True
>>> TOKEN(1).name
'T_NAME'
The from_param method seems to be for convenience, for writing methods that accept either an int or an Enumeration object. Not really sure if that's really its purpose.
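That is in fact its purpose: ctypes invokes from_param when the class is listed in a foreign function's argtypes. A minimal py3 sketch of the mechanism, where Token is a simplified stand-in for the Enumeration/TOKEN classes above (the original uses py2's __metaclass__):

```python
from ctypes import c_uint

class Token(c_uint):
    """Simplified stand-in for the Enumeration/TOKEN classes above."""
    @classmethod
    def from_param(cls, param):
        # ctypes calls this hook for arguments of functions whose
        # argtypes list this class, so callers may pass a plain int too
        if isinstance(param, Token):
            return param
        return cls(param)

t = Token.from_param(1)          # a plain int gets wrapped
print(t.value)                   # 1
print(Token.from_param(t) is t)  # True -- an existing member passes through
```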
I think this class is meant to be used when working with external APIs that use C-style enums, but it looks like a whole lot of work for very little gain.
