ConfigParser with Custom Dict Type

I need to create a custom ConfigParser that adds a few features, the most relevant here being a default value of None for unset keys. This seems to be supported through custom dict types, so I wrote something like this:
import collections
from configparser import ConfigParser

class SyncDict(collections.UserDict):
    ...
    def __getitem__(self, key):
        if key in self.data:
            return self.data[key]
        return None
    ...

class SyncConfig(ConfigParser):
    ...
    def __init__(self, filename):
        super().__init__(allow_no_value=True, dict_type=SyncDict)
    ...
However, this does not work as it still raises KeyError in SectionProxy. For example,
>>> a = SyncConfig('aaa.cfg')
>>> a.add_section('b')
>>> b = a['b']
>>> b['c']
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Program Files\Python38\lib\configparser.py", line 1254, in __getitem__
    raise KeyError(key)
KeyError: 'c'
Am I missing something, or is this really not supposed to be possible?
PS: Bonus points for a way to make SyncDict return a value of an empty SyncDict when asked for a section and None when asked for an option.
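The KeyError in the traceback comes from SectionProxy.__getitem__, which calls has_option() and raises on its own before the parser's underlying storage is ever consulted, so a custom dict_type cannot change what b['c'] does. Below is a minimal sketch of one possible workaround (my own illustration, not an established recipe; it subclasses SectionProxy, which configparser allows but barely documents):

from configparser import ConfigParser, SectionProxy

class SyncSectionProxy(SectionProxy):
    """SectionProxy that returns None for unset options instead of raising."""
    def __getitem__(self, key):
        # SectionProxy.get already accepts a fallback, so delegate to it.
        return self.get(key, fallback=None)

class SyncConfig(ConfigParser):
    def __init__(self, filename):
        super().__init__(allow_no_value=True)
        self.read(filename)

    def __getitem__(self, key):
        # Mirror ConfigParser.__getitem__, but hand back the lenient proxy.
        if key != self.default_section and not self.has_section(key):
            raise KeyError(key)
        return SyncSectionProxy(self, key)

With this, a['b']['c'] returns None for an unset option while a['missing'] still raises KeyError for an unknown section, which is close to the asymmetry asked for in the PS.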

Related

Restrict possible options of class attributes values

How can I limit the possible options to choose from when instantiating class objects? Say we have a class like the one below, and we'd like to restrict value to the predefined options in the valueOptions list. My attempt at an implementation is below. Is there a more Pythonic way to achieve the same result?
class TestClass:
    valueOptions = ['A value', 'The value', 'Another value']

    def __init__(self, value):
        self.value = value

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, val):
        if val not in self.valueOptions:
            raise ValueError('Value not in available options.')
        self._value = val

    def __str__(self):
        return f'Value: {self.value}'

v1 = TestClass('A value')  # line 20
v2 = TestClass('Value')    # line 21
After running it:
Traceback (most recent call last):
  File "<string>", line 21, in <module>
  File "<string>", line 5, in __init__
  File "<string>", line 14, in value
ValueError: Value not in available options.
Use an Enum instead of str if you need a limited set of values
The core issue is that you (implicitly; your code isn't typed) define value as a string. But you don't need a string: you need a limited list of possible values, and a string is not the right type to model that.
The enum module allows you to define an enumeration type (class) with a restricted number of options/values.
This also solves another issue: the values in your example are hard-coded, creating "magic numbers" (or in this case, "magic strings") in your code. Your code will eventually lead to scattered instances of 'A value', 'The value' and 'Another value' in lots of different places. If you decide at some future point to change 'The value' to 'The best value', you have a code maintenance problem...
Here is a code example using enums (and type hints):
from enum import Enum

class ValueOptions(Enum):
    A_VALUE = 'A value'
    THE_VALUE = 'The value'
    ANOTHER_VALUE = 'Another value'

class TestClass:
    def __init__(self, value: ValueOptions) -> None:
        self.value = value

    def __str__(self) -> str:
        return f'Value: {self.value.name}'

v1 = TestClass(ValueOptions.A_VALUE)
print(v1)
v2 = TestClass(ValueOptions.ILLEGAL_VALUE)
print(v2)
This leads to valid code for v1, and to an error for v2:
Traceback (most recent call last):
  File ".code.tio", line 19, in <module>
    v2 = TestClass(ValueOptions.ILLEGAL_VALUE)
  File "/usr/lib64/python3.7/enum.py", line 349, in __getattr__
    raise AttributeError(name) from None
AttributeError: ILLEGAL_VALUE
This error will not only show at runtime, but (even better!) when writing your code. Any Python code checker (such as mypy) will detect that I used an illegal attribute:
❯ mypy ./enum_test.py
enum_test.py:21: error: "Type[ValueOptions]" has no attribute "ILLEGAL_VALUE"
Found 1 error in 1 file (checked 1 source file)
Best of all: this code is a bit shorter than your version! ;-)

Attrs dependent attributes on initialization

Let's say we have an attrs class:
import attr

@attr.s()
class Foo:
    a: bool = attr.ib(default=False)
    b: int = attr.ib(default=5)

    @b.validator
    def _check_b(self, attribute, value):
        if not self.a:
            raise ValueError("to define 'b', 'a' must be True")
        if value < 0:
            raise ValueError("'b' has to be a positive integer")
So the following behaviour is correct:
>>> Foo(a=True, b=10)
Foo(a=True, b=10)
>>> Foo(b=10)
Traceback (most recent call last):
  File "<input>", line 1, in <module>
  File "<attrs generated init __main__.Foo>", line 5, in __init__
    __attr_validator_b(self, __attr_b, self.b)
  File "<input>", line 9, in _check_b
ValueError: to define 'b', 'a' must be True
But this is not:
>>> Foo()
Traceback (most recent call last):
  File "<input>", line 1, in <module>
  File "<attrs generated init __main__.Foo>", line 5, in __init__
    __attr_validator_b(self, __attr_b, self.b)
  File "<input>", line 9, in _check_b
ValueError: to define 'b', 'a' must be True
This obviously happens because Foo.b is always validated, regardless of whether Foo.a received its value from the default or from Foo.__init__.
Is there any way to accomplish this attribute dependence with any of the initialization hooks?
Following @hynek's recommendation to have default values that result in a valid instance, I've changed the default of the dependent attribute to None, so validation only runs when a value is passed to __init__:
import attr
from typing import Optional

@attr.s()
class Foo:
    a: bool = attr.ib(default=False)
    b: Optional[int] = attr.ib(default=None)

    @b.validator
    def _check_b(self, attribute, value):
        if value is None:
            self.b = 5
            return
        if not self.a:
            raise ValueError("to define 'b', 'a' must be True")
        if value < 0:
            raise ValueError("'b' has to be a positive integer")
I am aware that changing an attribute's value inside a validator is not optimal, but it gets the job done.
Documentation about validators can be found here
Depending on what you want to do, you can check whether value is the default value (available as attribute.default). However, your whole problem is that the default values result in an instance that is, according to your validator, invalid. So there are probably better ways to model this.
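A minimal sketch of that attribute.default check (my illustration, not code from the answer; note it cannot distinguish Foo() from an explicit Foo(b=5)):

import attr

@attr.s()
class Foo:
    a: bool = attr.ib(default=False)
    b: int = attr.ib(default=5)

    @b.validator
    def _check_b(self, attribute, value):
        # attribute is the attrs Attribute object; comparing against its
        # .default lets an untouched b skip validation entirely.
        if value == attribute.default:
            return
        if not self.a:
            raise ValueError("to define 'b', 'a' must be True")
        if value < 0:
            raise ValueError("'b' has to be a positive integer")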

Immutable/Frozen Dictionary subclass with fixed key, value types

Problem
Similar to previous questions, I would like to create a frozen/immutable dictionary. Specifically, after initialization, the user should get an error when trying to use the __delitem__ and __setitem__ methods.
Unlike the previous questions, I specifically want it to be a sub-class where the initialization type is constrained to a specific key and value type.
Attempted Solution
My own attempts at accomplishing this with collections.UserDict failed:
from collections import UserDict
from typing import Mapping

class WorkflowParams(UserDict):
    def __init__(self, __dict: Mapping[str, str]) -> None:
        super().__init__(__dict=__dict)

    def __setitem__(self, key: str, item: str) -> None:
        raise AttributeError("WorkflowParams is immutable.")

    def __delitem__(self, key: str) -> None:
        raise AttributeError("WorkflowParams is immutable.")
When trying to use it:
workflow_parameters = WorkflowParams(
    {
        "s3-bucket": "my-big-bucket",
        "input-val": "1",
    }
)
It fails with
Traceback (most recent call last):
  File "examples/python_step/python_step.py", line 38, in <module>
    workflow_parameters = WorkflowParams(
  File "/home/sean/git/scargo/scargo/core.py", line 14, in __init__
    super().__init__(__dict=__dict)
  File "/home/sean/miniconda3/envs/scargo/lib/python3.8/collections/__init__.py", line 1001, in __init__
    self.update(kwargs)
  File "/home/sean/miniconda3/envs/scargo/lib/python3.8/_collections_abc.py", line 832, in update
    self[key] = other[key]
  File "/home/sean/git/scargo/scargo/core.py", line 17, in __setitem__
    raise AttributeError("WorkflowParams is immutable.")
AttributeError: WorkflowParams is immutable.
This happens because UserDict.__init__() populates the mapping through self[key] = value, which dispatches to the overridden __setitem__.
Disqualified Alternatives
Because of my need for a subclass, the commonly suggested solution of using MappingProxyType doesn't meet my requirements.
Additionally, I'm suspicious of answers which recommend subclassing dict, since this seems to cause some unintended behaviour.
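As an aside, a short illustration of the kind of unintended behaviour usually meant here (my own example, assuming the classic pitfall: built-in dict methods bypass an overridden __setitem__):

class LoudDict(dict):
    def __setitem__(self, key, value):
        print("setting", key)
        super().__setitem__(key, value)

d = LoudDict()
d["a"] = 1       # prints "setting a"
d.update(b=2)    # prints nothing: dict.update() bypasses __setitem__
print(d)         # {'a': 1, 'b': 2}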
This seems to work just fine for me (tested with Python 3.6 and 3.8):
from collections import UserDict
from typing import Mapping

class WorkflowParams(UserDict):
    def __init__(self, __dict: Mapping[str, str]) -> None:
        super().__init__()
        for key, value in __dict.items():
            super().__setitem__(key, value)

    def __setitem__(self, key: str, item: str) -> None:
        raise AttributeError("WorkflowParams is immutable.")

    def __delitem__(self, key: str) -> None:
        raise AttributeError("WorkflowParams is immutable.")

workflow_parameters = WorkflowParams(
    {
        "s3-bucket": "my-big-bucket",
        "input-val": "1",
    }
)

print(workflow_parameters)
# output: {'s3-bucket': 'my-big-bucket', 'input-val': '1'}

workflow_parameters['test'] = 'dummy'
# expected exception: AttributeError: WorkflowParams is immutable.
I would do it with collections.abc, which exists precisely to let you build container classes quickly: implement a couple of methods and you're done.
>>> import collections
>>> class FrozenDict(collections.abc.Mapping):
...     def __init__(self, /, *argv, **karg):
...         self._data = dict(*argv, **karg)
...     def __getitem__(self, key):
...         return self._data[key]
...     def __iter__(self):
...         return iter(self._data)
...     def __len__(self):
...         return len(self._data)
...     def __repr__(self):
...         return f"{type(self).__name__}({self._data!r})"
...
>>> t = FrozenDict({
...     "s3-bucket": "my-big-bucket",
...     "input-val": "1",
... })
>>> t
FrozenDict({'s3-bucket': 'my-big-bucket', 'input-val': '1'})
>>> t["test"] = 23
Traceback (most recent call last):
  File "<pyshell#38>", line 1, in <module>
    t["test"] = 23
TypeError: 'FrozenDict' object does not support item assignment
>>> del t["input-val"]
Traceback (most recent call last):
  File "<pyshell#39>", line 1, in <module>
    del t["input-val"]
TypeError: 'FrozenDict' object does not support item deletion
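Neither answer enforces the "fixed key, value types" part of the question at runtime. A sketch of how that could look (my own, assuming str keys and str values as in the example):

from collections.abc import Mapping

class WorkflowParams(Mapping):
    """Immutable mapping that only accepts str keys and str values."""
    def __init__(self, data):
        for key, value in data.items():
            if not isinstance(key, str) or not isinstance(value, str):
                raise TypeError("WorkflowParams requires str keys and str values")
        self._data = dict(data)

    def __getitem__(self, key):
        return self._data[key]

    def __iter__(self):
        return iter(self._data)

    def __len__(self):
        return len(self._data)

Since Mapping provides no __setitem__ or __delitem__, mutation fails with a TypeError, just as in the FrozenDict answer above.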

pickling dict inherited class is missing internal values

I've played around with the code for a bit, and the reason for the failure is evidently that when setting 'wham', the value is another instance of the class TestDict. That works fine as long as I don't try to pickle and unpickle it, because if I do, self.test is missing.
Traceback:
Traceback (most recent call last):
  File "test.py", line 30, in <module>
    loads_a = loads(dumps_a)
  File "test.py", line 15, in __setitem__
    if self.test == False:
AttributeError: 'TestDict' object has no attribute 'test'
The code:
from pickle import dumps, loads

class TestDict(dict):
    def __init__(self, test=False, data={}):
        super().__init__(data)
        self.test = test

    def __getitem__(self, k):
        if self.test == False:
            pass
        return dict.__getitem__(self, k)

    def __setitem__(self, k, v):
        if self.test == False:
            pass
        if type(v) == dict:
            super().__setitem__(k, TestDict(False, v))
        else:
            super().__setitem__(k, v)

if __name__ == '__main__':
    a = TestDict()
    a['wham'] = {'bam': 1}
    b = TestDict(True)
    b['wham'] = {'bam': 2}

    dumps_a = dumps(a)
    dumps_b = dumps(b)

    loads_a = loads(dumps_a)
    loads_b = loads(dumps_b)

    print(loads_a)
    print(loads_b)
The code works if I don't replace __setitem__ and __getitem__, but I want to add extra functionality to those two specific methods.
I've also tried:

import os

class TestDict(dict):
    __module__ = os.path.splitext(os.path.basename(__file__))[0]

which sort of worked, as long as I didn't nest TestDict within TestDict, meaning I wouldn't get to replace __setitem__ and __getitem__ in sub-parts of the dictionary.
Define a __reduce__ method:

class TestDict(dict):
    ...
    def __reduce__(self):
        return type(self), (self.test, dict(self))

The problem is that pickle doesn't call the object's __init__ method when unpickling, so the self.test attribute does not yet exist at the moment pickle tries to set the items of the dictionary. Apparently, restoring attributes is staged after restoring the items of the dictionary.
One way to solve it is to add a class-level attribute that will be shadowed in the instances:

class TestDict(dict):
    test = False
    ...
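Putting the __reduce__ fix together with a trimmed version of the question's class (my own assembly; data=None also sidesteps the mutable-default-argument pitfall in the original), the round trip now preserves self.test:

from pickle import dumps, loads

class TestDict(dict):
    def __init__(self, test=False, data=None):
        super().__init__(data or {})   # dict.__init__ bypasses __setitem__
        self.test = test

    def __setitem__(self, k, v):
        if type(v) is dict:
            super().__setitem__(k, TestDict(False, v))
        else:
            super().__setitem__(k, v)

    def __reduce__(self):
        # Recreate the instance through __init__ so self.test exists
        # before anything else is restored.
        return type(self), (self.test, dict(self))

b = TestDict(True)
b['wham'] = {'bam': 2}
restored = loads(dumps(b))
print(restored, restored.test)   # {'wham': {'bam': 2}} True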

Setting Property via a String

I'm trying to set a Python class property outside of the class via the setattr(self, item, value) function.
class MyClass:
    def getMyProperty(self):
        return self.__my_property

    def setMyProperty(self, value):
        if value is None:
            value = ''
        self.__my_property = value

    my_property = property(getMyProperty, setMyProperty)
And in another script, I create an instance and want to specify the property and let the property mutator handle the simple validation.
myClass = MyClass()
new_value = None
# notice the property in quotes
setattr(myClass, 'my_property', new_value)
The problem is that it doesn't appear to be calling the setMyProperty(self, value) mutator. For a quick test to verify that it doesn't get called, I change the mutator to:
def setMyProperty(self, value):
    raise ValueError('WTF! Why are you not being called?')
    if value is None:
        value = ''
    self.__my_property = value
I'm fairly new to Python, and perhaps there's another way to do what I'm trying to do, but can someone explain why the mutator isn't being called when setattr(self, item, value) is called?
Is there another way to set a property via a string? I need the validation inside the mutator to be executed when setting the property value.
Works for me:
>>> class MyClass(object):
...     def get(self): return 10
...     def setprop(self, val): raise ValueError("hax%s" % str(val))
...     prop = property(get, setprop)
...
>>> i = MyClass()
>>> i.prop = 4
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 3, in setprop
ValueError: hax4
>>> i.prop
10
>>> setattr(i, 'prop', 12)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 3, in setprop
ValueError: hax12
The code you pasted seems to do the same as mine, except that my class inherits from object; that's because I'm running Python 2.6. Note that even in 2.7, classes do not automatically inherit from object (that only changed in Python 3). Try that, though, and see if it helps.
To make it even clearer: try just doing myClass.my_property = 4. Does that raise an exception? If not then it's an issue with inheriting from object - properties only work for new-style classes, i.e. classes that inherit from object.
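For completeness, a sketch of the suggested fix applied to the question's class (my illustration): once the class is new-style, i.e. inherits from object, the setter fires both for direct assignment and for setattr:

class MyClass(object):  # inheriting from object makes this a new-style class
    def getMyProperty(self):
        return self.__my_property

    def setMyProperty(self, value):
        if value is None:
            value = ''
        self.__my_property = value

    my_property = property(getMyProperty, setMyProperty)

myClass = MyClass()
setattr(myClass, 'my_property', None)  # goes through setMyProperty
print(repr(myClass.my_property))       # '' -- the setter normalized None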
