Python class instance invokes get() without declaration

I am reading the mmdetection project on GitHub, and I'm confused by this code (I stripped out all the irrelevant factors):
class A:
    def __init__(self, a):
        self.a = a
        self._dict_test = {"b": a}

    def __getattr__(self, item):
        print("You call __getattr__ !")
        return getattr(self._dict_test, item)

test = A(2)
judge = test.get("b", False)
print("a is", test.a)
print("judge is ", judge)
print(test.__dict__)
I didn't declare a get() method in the class. I checked the documentation, which says:
Attribute references are translated to lookups in this dictionary, e.g., m.x is equivalent to m.__dict__["x"].
So,
(1) How should my code be interpreted? Is it test.__dict__.get(), or test.__dict__['get']? Has this happened to anyone else?
(2) Why is __getattr__ invoked? I checked the doc, where it says:
__getattr__: Called when the default attribute access fails with an AttributeError
But isn't get() the dict's own method? Why would get() fail with an AttributeError?
I have been searching the net for a long time with no luck. Thanks in advance!

If you remove the __getattr__ method, you will see an exception:
judge = test.get("b", False)
AttributeError: 'A' object has no attribute 'get'
because there is no get defined in the A class.
With the __getattr__ method, test.get is resolved inside __getattr__ as getattr(self._dict_test, item), i.e. getattr(test._dict_test, "get"), which is test._dict_test.get: the usual dict.get method, bound to the internal dict test._dict_test, not to test.__dict__.
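To make the chain concrete, here is a minimal sketch of the same lookup, using the class from the question:

```python
class A:
    def __init__(self, a):
        self.a = a
        self._dict_test = {"b": a}

    def __getattr__(self, item):
        print("You call __getattr__ !")
        return getattr(self._dict_test, item)

test = A(2)

# "get" is neither in test.__dict__ nor defined on class A, so the
# default lookup fails and Python falls back to __getattr__("get").
bound = test.get                      # prints "You call __getattr__ !"
print(bound == test._dict_test.get)   # True: dict.get bound to _dict_test
print(test.get("b", False))           # 2
```

test.a, by contrast, is found in test.__dict__ directly, so __getattr__ never runs for it.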


Python class attribute 'is not defined' when referenced by another class attribute

Using the following, I am able to successfully create a parser and add my arguments to self._parser through the __init__() method.
class Parser:
    _parser_params = {
        'description': 'Generate a version number from the version configuration file.',
        'allow_abbrev': False
    }
    _parser = argparse.ArgumentParser(**_parser_params)
Now I wish to split the arguments into groups so I have updated my module, adding some classes to represent the argument groups (in reality there are several subclasses of the ArgumentGroup class), and updating the Parser class.
class ArgumentGroup:
    _title = None
    _description = None

    def __init__(self, parser) -> ArgumentParser:
        parser.add_argument_group(*self._get_args())

    def _get_args(self) -> list:
        return [self._title, self._description]

class ArgumentGroup_BranchType(ArgumentGroup):
    _title = 'branch type arguments'

class Parser:
    _parser_params = {
        'description': 'Generate a version number from the version configuration file.',
        'allow_abbrev': False
    }
    _parser = argparse.ArgumentParser(**_parser_params)
    _argument_groups = [cls(_parser) for cls in ArgumentGroup.__subclasses__()]
However, I'm now seeing an error.
Traceback (most recent call last):
  ...
  File "version2/args.py", line 62, in <listcomp>
    _argument_groups = [cls(_parser) for cls in ArgumentGroup.__subclasses__()]
NameError: name '_parser' is not defined
What I don't understand is why _parser_params exists when it is referred to by another class attribute, but _parser seemingly does not exist in the same scenario. How can I refactor my code to add the argument groups as required?
This comes from the confluence of two quirks of Python:
A class body does not act as an enclosing scope for code nested inside it (class-level names do not participate in closures).
List comprehensions do create a new local scope of their own.
As a result, inside the comprehension the name _parser is looked up in a local scope whose closest enclosing scope is the global scope, so it cannot refer to the about-to-be class attribute.
A simple workaround would be to replace the list comprehension with a regular for loop.
_argument_groups = []
for cls in ArgumentGroup.__subclasses__():
    _argument_groups.append(cls(_parser))
(A better solution would probably be to stop using class attributes where instance attributes make more sense.)
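A minimal, self-contained sketch of the two scoping rules (the names here are made up for illustration):

```python
class Demo:
    x = 10
    y = x + 1  # works: the class body itself reads x directly

    # The comprehension body runs in its own scope, and a class body is
    # not an enclosing scope, so x is not visible inside it.
    try:
        squares = [x * i for i in range(3)]
    except NameError as exc:
        squares = str(exc)

print(Demo.y)        # 11
print(Demo.squares)  # name 'x' is not defined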

BDD behave Python need to create a World map to hold values

I'm not too familiar with Python, but I have set up a BDD framework using Python behave. I now want to create a World map class that holds data and is retrievable across all scenarios.
For instance I will have a World class where I can use:
World w
w.key.add('key', api.response)
in one scenario, and in another I can then use:
World w
key = w.key.get('key')
Edit:
Alternatively, if there is a built-in way of using context (or similar) in behave where attributes are saved and retrievable throughout all scenarios, that would be good, like world in lettuce: http://lettuce.it/tutorial/simple.html
I've tried this between scenarios, but it doesn't seem to pick the value up:
class World(dict):
    def __setitem__(self, key, item):
        self.__dict__[key] = item
        print(item)

    def __getitem__(self, key):
        return self.__dict__[key]
Setting the item in one step in scenario A: w.__setitem__('key', response)
Getting the item in another step in scenario B: w.__getitem__('key')
This shows me an error though:
Traceback (most recent call last):
  File "C:\Program Files (x86)\Python\lib\site-packages\behave\model.py", line 1456, in run
    match.run(runner.context)
  File "C:\Program Files (x86)\Python\lib\site-packages\behave\model.py", line 1903, in run
    self.func(context, *args, **kwargs)
  File "steps\get_account.py", line 14, in step_impl
    print(w.__getitem__('appToken'))
  File "C:Project\steps\world.py", line 8, in __getitem__
    return self.__dict__[key]
KeyError: 'key'
It appears that the World does not hold values here between steps that are run.
Edit:
I'm unsure how to use environment.py, but I can see it has a way of running code before the steps. How can I have my call to a SOAP client made within environment.py and then pass the result to a particular step?
Edit:
I have made the request in environment.py and hardcoded the values. How can I pass variables to environment.py and back?
It's called "context" in the python-behave jargon. The first argument of your step definition function is an instance of the behave.runner.Context class, in which you can store your world instance. Please see the appropriate part of the tutorial.
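A minimal sketch of the idea, with behave's wiring simulated by hand (behave itself creates the Context object and passes the same instance to every step function):

```python
class Context:
    """Stand-in for behave.runner.Context in this sketch."""
    pass

# In real behave code these would be decorated step implementations.
def step_store_token(context):
    context.app_token = 'abc123'   # e.g. api.response in a real step

def step_use_token(context):
    return context.app_token

ctx = Context()             # behave creates and reuses the context
step_store_token(ctx)       # a step in one scenario stores the value
print(step_use_token(ctx))  # a later step reads it back: abc123
```

Note that the real behave context is layered by scope: attributes set inside a scenario step are cleaned up when the scenario ends, so values meant to survive across scenarios belong in before_all or before_feature hooks in environment.py.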
Have you tried the simple approach, using a global var? For instance:
def before_all(context):
    global response
    response = api.response

def before_scenario(context, scenario):
    global response
    w.key.add('key', response)
I guess the feature can be accessed from the context, for instance:
def before_feature(context, feature):
    feature.response = api.response

def before_scenario(context, scenario):
    w.key.add('key', context.feature.response)
You are looking for:
Class variable: A variable that is shared by all instances of a class.
Your code in the question uses a class instance variable.
Read about: python_classes_objects
For instance:
class World(dict):
    __class_var = {}

    def __setitem__(self, key, item):
        World.__class_var[key] = item

    def __getitem__(self, key):
        return World.__class_var[key]

# Scenario A
A = World()
A['key'] = 'test'
print('A[\'key\']=%s' % A['key'])
del A

# Scenario B
B = World()
print('B[\'key\']=%s' % B['key'])
Output:
A['key']=test
B['key']=test
Tested with Python 3.4.2.
Come back and flag your question as answered if this works for you, or comment why not.
Defining a global var in the before_all hook, as mentioned by @stovfl, did not work for me, though defining the global var within one of my steps did.
Instead, as Szabo Peter mentioned, use the context:
context.your_variable_name = api.response
and just use
context.your_variable_name anywhere the value is to be used.
For this I actually used a config file [config.py]. I then added the variables in there and retrieved them using getattr. See below:
WSDL_URL = 'wsdl'
USERNAME = 'name'
PASSWORD = 'PWD'
Then retrieved them like:
import config
getattr(config, 'USERNAME', 'username not found')
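A self-contained simulation of that config-module pattern (the real code would import config from a config.py file containing those assignments; here the module is faked with types.ModuleType so the snippet runs on its own):

```python
import types

# Stand-in for a real config.py with USERNAME/PASSWORD assignments.
config = types.ModuleType('config')
config.USERNAME = 'name'
config.PASSWORD = 'PWD'

# getattr with a default avoids an AttributeError for missing settings.
print(getattr(config, 'USERNAME', 'username not found'))  # name
print(getattr(config, 'API_KEY', 'api key not found'))    # api key not found
```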

Dynamic attributes or custom dictionary for fixed read-only fields

I am creating a class for retrieving details about a computer such as host_name, kernel_version, bios_version, and so on. Some details are more expensive to collect than others so I have a get_* function to retrieve them, but keep the results cached in the object if they are needed again. I am considering implementing them to look like a dictionary object so the kernel version can be retrieved as so:
system = System()
kver = system['kernel_version']
This will call the instance method get_kernel_version(self) internally to retrieve the data. If the kernel version is retrieved a second time from the above instantiated object, it will return the cached result from the original call to get_kernel_version(self).
Note that all these key/value pairs are read-only: there is a fixed number of them based on the available get_* methods, and no new keys can be added later, so it doesn't feel like a regular dictionary. There also shouldn't be a need to call something like the values() function, which would simply cause all the get_* functions to be needlessly hit. Also, the syntax is a little more verbose than I'd like: using system.kernel_version instead seems more natural for this use case.
I'm considering whether a better approach is to use dynamic attributes on a class instance. However, I need a natural way to retrieve a list of all attributes, but not the internal methods supporting them. I would probably use the __dir__ special method to return a list similar to the keys() list of the dictionary: I would want to see kernel_version and host_name in the list, but not __class__ or get_kernel_version. This seems to go against recommended practice for the definition of __dir__, so I'm not sure if it is the right approach to use.
I could return a proxy class instance whose sole job is to call back to a concrete class with the get_* functions when it doesn't have the appropriate attribute already defined.
Here is an example of a version I'm experimenting with implementing the dictionary approach:
class System(dict):
    def __getitem__(self, key):
        try:
            return getattr(self, 'get_' + key)()
        except AttributeError as ex:
            raise KeyError(ex.message)

    def __setitem__(self, key, value):
        raise Exception('Read-only')

    def __delitem__(self, key, value):
        raise Exception('Read-only')

    def keys(self):
        return [x[4:] for x in dir(self) if x.startswith('get_')]

    def get_host_name(self):
        return 'localhost'

    def get_kernel_version(self):
        return '4.7.0'

system = System()
print repr(system.keys())
for key in system.keys():
    print '{0}: {1}'.format(key, system[key])
try:
    system['bios']
except Exception as ex:
    print str(ex)
try:
    system['kernel_version'] = '5.0'
except Exception as ex:
    print str(ex)
Which produced the following output:
['host_name', 'kernel_version']
host_name: localhost
kernel_version: 4.7.0
"'System' object has no attribute 'get_bios'"
Read-only
The code above does not yet implement the caching of values, but that is easy to add. However, it's feeling more like I should be doing this with attributes. I am just not sure whether, in doing so, I should abuse __dir__ to emulate the same functionality I get above with keys().
Should I stick with emulating a read-only dictionary or present a class instance with dynamic attributes?
I think sticking with the read-only dictionary subclass approach you're using is fine. However, your implementation could be improved somewhat by creating a generic read-only dictionary superclass from which to derive your specific subclass, and by using a metaclass to create the value returned by the keys() method. Doing both is illustrated below.
This means you don't have to "abuse" __dir__ any longer. You can also reuse the generic MetaReadonlyDict and ReadonlyDict classes to create other similar types.
class MetaReadonlyDict(type):
    def __new__(mcls, classname, bases, classdict):
        classobj = type.__new__(mcls, classname, bases, classdict)
        prefix = classdict['prefix']
        _keys = set(name[len(prefix):] for name in classdict
                    if name.startswith(prefix))
        setattr(classobj, 'keys', lambda self: list(_keys))  # define keys()
        return classobj

class ReadonlyDict(dict):
    def __getitem__(self, key):
        try:
            return getattr(self, self.prefix + key)()
        except AttributeError as ex:
            raise Exception(
                "{} object has no {!r} key".format(self.__class__.__name__, key))

    def __setitem__(self, key, value):
        verb = "redefined" if key in self else "defined"
        raise Exception(
            "{} object is read-only: {!r} "
            "key can not be {}".format(self.__class__.__name__, key, verb))

    def __delitem__(self, key):
        raise Exception(
            "{} object is read-only: {!r} "
            "key can not be deleted".format(self.__class__.__name__, key))

    def __contains__(self, key):
        return key in self.keys()

class System(ReadonlyDict):
    __metaclass__ = MetaReadonlyDict
    prefix = '_get_'

    def _get_host_name(self):
        return 'localhost'

    def _get_kernel_version(self):
        return '4.7.0'

system = System()
print('system.keys(): {!r}'.format(system.keys()))
print('values associated with system.keys():')
for key in system.keys():
    print('    {!r}: {!r}'.format(key, system[key]))
try:
    system['bios']
except Exception as ex:
    print(str(ex))
try:
    system['kernel_version'] = '5.0'
except Exception as ex:
    print(str(ex))
try:
    del system['host_name']
except Exception as ex:
    print(str(ex))
Output:
system.keys(): ['kernel_version', 'host_name']
values associated with system.keys():
'kernel_version': '4.7.0'
'host_name': 'localhost'
System object has no 'bios' key
System object is read-only: 'kernel_version' key can not be redefined
System object is read-only: 'host_name' key can not be deleted
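Note that the answer's code is written for Python 2; in Python 3 the __metaclass__ attribute is ignored, and the metaclass is passed as a keyword in the class header instead. A trimmed-down sketch of the same idea under Python 3 (error handling abridged):

```python
class MetaReadonlyDict(type):
    def __new__(mcls, classname, bases, classdict):
        classobj = type.__new__(mcls, classname, bases, classdict)
        prefix = classdict['prefix']
        _keys = set(name[len(prefix):] for name in classdict
                    if name.startswith(prefix))
        setattr(classobj, 'keys', lambda self: list(_keys))
        return classobj

class ReadonlyDict(dict):
    def __getitem__(self, key):
        return getattr(self, self.prefix + key)()

class System(ReadonlyDict, metaclass=MetaReadonlyDict):
    prefix = '_get_'

    def _get_host_name(self):
        return 'localhost'

system = System()
print(system.keys())        # ['host_name']
print(system['host_name'])  # localhost
```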

Default property when none matches - Python

I have browsed the web and pydoc for an answer, without success.
My issue is the following:
I want to define a class with properties, as I habitually would.
class Container(object):
    def __init__(self, content):
        assert isinstance(content, dict), "The container can only contain a dictionary"
        self._content = content

    @property
    def name(self):
        try:
            return self._content["its_name"]
        except KeyError:
            raise AttributeError
Now, to access the content's field "its_name", I can use container.name, allowing a slight difference between the field's name and the attribute's.
I would like a default behavior for when no specific getter property is defined: if I call container.description, I want my class to try returning self._content["description"], and to throw an AttributeError if there is no such key, while still calling the specific property for cases like container.name.
Thanks in advance for your help.
This is what the __getattr__ special method is for:
def __getattr__(self, attrname):
    # Only called if the other ways of accessing the attribute fail.
    try:
        return self._content[attrname]
    except KeyError:
        raise AttributeError
Note that if for some reason you try to retrieve an unknown attribute when the _content attribute doesn't exist, the line
return self._content[attrname]
will recursively invoke __getattr__ in an attempt to get the _content attribute, and that call will invoke __getattr__ again, and so on until the stack overflows.
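One common guard against that recursion (my addition, not part of the answer above) is to fail fast when __getattr__ is asked for _content itself:

```python
class Container(object):
    def __init__(self, content):
        assert isinstance(content, dict), "The container can only contain a dictionary"
        self._content = content

    def __getattr__(self, attrname):
        # If _content itself is missing, bail out instead of recursing.
        if attrname == '_content':
            raise AttributeError(attrname)
        try:
            return self._content[attrname]
        except KeyError:
            raise AttributeError(attrname)

c = Container({'its_name': 'box', 'description': 'a fancy box'})
print(c.description)  # a fancy box
```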

Strange issue when subclassing?

Let's have a class representing a Django controller, with one of its methods called _onSuccess:
class ConfirmController(object):
    ...
    def _onSuccess(self, controller):
        ...
The class is instantiated later with:
def credit_confirm_info(request, payment_module, template='/some/template.html'):
    controller = ConfirmController(request, payment_module)
    controller.confirm()  # this method calls self._onSuccess
    return controller.response

credit_confirm_info = never_cache(credit_confirm_info)
I'm trying to use subclass of ConfirmController:
class ConfirmControllerEx(ConfirmController):
    def _onSuccess(self, controller):
        # shortened to demonstrate even a simple call to super
        # causes a different behaviour
        super(ConfirmControllerEx, self)._onSuccess(controller)
I've probably missed something in learning Python, but can anybody explain why the above subclassed _onSuccess is not equivalent to the original method?
If I do use the above subclass ConfirmControllerEx:
def credit_confirm_info(request, payment_module, template='/some/template.html'):
    controller = ConfirmControllerEx(request, payment_module)
    controller.confirm()  # this method calls self._onSuccess
    return controller.response

credit_confirm_info = never_cache(credit_confirm_info)
I'm getting a "NoneType has no method has_header" error, as if credit_confirm_info were called again with the request parameter equal to None.
I expect that the subclass, whose _onSuccess is a plain call to super, should not differ from the original. Am I missing something here?
Update (traceback of the exception):
Traceback:
File "/home/dunric/Projects/Example.com/satchmo/lib/python2.7/site-packages/django/core/handlers/base.py" in get_response
111. response = callback(request, *callback_args, **callback_kwargs)
File "/home/dunric/Projects/Example.com/satchmo/gastroceny_cz/localsite/views.py" in cod_confirm_info
279. template='shop/checkout/cod/confirm.html')
File "/home/dunric/Projects/Example.com/satchmo/lib/python2.7/site-packages/django/views/decorators/cache.py" in _wrapped_view_func
90. add_never_cache_headers(response)
File "/home/dunric/Projects/Example.com/satchmo/lib/python2.7/site-packages/django/utils/cache.py" in add_never_cache_headers
129. patch_response_headers(response, cache_timeout=-1)
File "/home/dunric/Projects/Example.com/satchmo/lib/python2.7/site-packages/django/utils/cache.py" in patch_response_headers
119. if not response.has_header('Last-Modified'):
Exception Type: AttributeError at /checkout/cod/confirm/
Exception Value: 'NoneType' object has no attribute 'has_header'
I'm not up on the specifics of the Django code involved here, but this method:
def _onSuccess(self, controller):
    # shortened to demonstrate even a simple call to super
    # causes a different behaviour
    super(ConfirmControllerEx, self)._onSuccess(controller)
is not equivalent to the _onSuccess of the parent class. It calls the parent implementation through super, but it ignores whatever that call returns and implicitly returns None (execution simply reaches the end of the method definition). Given that you later get an error indicating a None object (an instance of NoneType) where something else was expected, this would be my guess at the cause. That won't be it if the contract of the _onSuccess method is to always return None, however.
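A self-contained illustration of that diagnosis, with a stub Base standing in for ConfirmController (the real Django classes are not needed to see the effect):

```python
class Base:
    def _onSuccess(self, controller):
        return 'response'   # the parent returns something meaningful

class Broken(Base):
    def _onSuccess(self, controller):
        # result of the super call is silently discarded -> returns None
        super(Broken, self)._onSuccess(controller)

class Fixed(Base):
    def _onSuccess(self, controller):
        # forwarding the return value restores the parent's behaviour
        return super(Fixed, self)._onSuccess(controller)

print(Broken()._onSuccess(None))  # None
print(Fixed()._onSuccess(None))   # response
```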
