I have the base class:
class BaseGameHandler(BaseRequestHandler):
    name = 'Base'

    def get(self):
        self.render(self.name + ".html")
Now I need to define a few subclasses of this, but the thing is, they have to have a decorator. Equivalent code would be:
@route('asteroid')
class AsteroidGameHandler(BaseGameHandler):
    name = 'asteroid'

@route('blah')
class BlahGameHandler(BaseGameHandler):
    name = 'blah'
and maybe a few more.
A little background here: this is a Tornado web app, and the @route decorator maps a URL to a handler class. The code above maps /blah to BlahGameHandler and /asteroid to AsteroidGameHandler.
So I thought I should use metaprogramming in Python and define all these classes on the fly. I tried the following, which doesn't work (and by "doesn't work" I mean the final web app throws 404 on both /asteroid and /blah):
game_names = ['asteroid', 'blah']
games = list([game, type('%sGameHandler' % (game.title()), (BaseGameHandler,), {'name': game})] for game in game_names)

for i in xrange(len(games)):
    games[i][1] = route(games[i][0])(games[i][1])
What am I missing? Aren't these two pieces of code equivalent when run?
The library that you use only looks for global class objects in your module.
Set each class as a global; the globals() function gives you access to your module namespace as a dictionary:
for i in xrange(len(games)):
    globals()[games[i][1].__name__] = route(games[i][0])(games[i][1])
The include() code does not look for your views in lists.
To be specific, include() uses the following loop to detect handlers:
for member in dir(module):
    member = getattr(module, member)
    if isinstance(member, type) and issubclass(member, web.RequestHandler) and hasattr(member, 'routes'):
        # ...
    elif isinstance(member, type) and issubclass(member, web.RequestHandler) and hasattr(member, 'route_path'):
        # ...
    elif isinstance(member, type) and issubclass(member, web.RequestHandler) and hasattr(member, 'rest_route_path'):
        # ...
dir(module) only considers top-level objects.
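Putting the two pieces together, a rough sketch of the dynamic-handler loop with each generated class registered as a module global (still relying on the route decorator and BaseGameHandler from the question):

game_names = ['asteroid', 'blah']

for game in game_names:
    # build the subclass on the fly, apply the decorator, then expose it at module level
    handler = type('%sGameHandler' % game.title(), (BaseGameHandler,), {'name': game})
    handler = route(game)(handler)
    globals()[handler.__name__] = handler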
Related
Using the following, I am able to successfully create a parser and add my arguments to self._parser through the __init__() method.
class Parser:
    _parser_params = {
        'description': 'Generate a version number from the version configuration file.',
        'allow_abbrev': False,
    }
    _parser = argparse.ArgumentParser(**_parser_params)
Now I wish to split the arguments into groups, so I have updated my module, adding some classes to represent the argument groups (in reality there are several subclasses of the ArgumentGroup class) and updating the Parser class.
class ArgumentGroup:
    _title = None
    _description = None

    def __init__(self, parser) -> ArgumentParser:
        parser.add_argument_group(*self._get_args())

    def _get_args(self) -> list:
        return [self._title, self._description]

class ArgumentGroup_BranchType(ArgumentGroup):
    _title = 'branch type arguments'

class Parser:
    _parser_params = {
        'description': 'Generate a version number from the version configuration file.',
        'allow_abbrev': False,
    }
    _parser = argparse.ArgumentParser(**_parser_params)
    _argument_groups = [cls(_parser) for cls in ArgumentGroup.__subclasses__()]
However, I'm now seeing an error.
Traceback (most recent call last):
  ...
  File "version2/args.py", line 62, in <listcomp>
    _argument_groups = [cls(_parser) for cls in ArgumentGroup.__subclasses__()]
NameError: name '_parser' is not defined
What I don't understand is why _parser_params does exist when it is referred to by another class attribute, but _parser seemingly does not exist in the same scenario. How can I refactor my code to add the parser groups as required?
This comes from the confluence of two quirks of Python:
A class body does not create an enclosing scope that nested code can see.
A list comprehension does create a new local scope of its own.
As a result, the comprehension's closest enclosing scope is the global (module) scope, so the name _parser cannot refer to the about-to-be class attribute.
A simple workaround would be to replace the list comprehension with a regular for loop.
_argument_groups = []
for cls in ArgumentGroup.__subclasses__():
    _argument_groups.append(cls(_parser))
(A better solution would probably be to stop using class attributes where instance attributes make more sense.)
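A rough sketch of that refactor, assuming the same ArgumentGroup classes as above (the parser moves into __init__, where ordinary function scoping applies):

import argparse

class Parser:
    def __init__(self):
        self._parser = argparse.ArgumentParser(
            description='Generate a version number from the version configuration file.',
            allow_abbrev=False,
        )
        # inside a method the comprehension can close over self, so this lookup works
        self._argument_groups = [cls(self._parser) for cls in ArgumentGroup.__subclasses__()]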
Let's say I have a (simplified) class as below. I am using it for a program configuration (hyperparameters).
# config.py
class Config(object):  # default configuration
    GPU_COUNT = 1
    IMAGES_PER_GPU = 2
    MAP = {1: 2, 2: 3}

    def display(self):
        pass

# experiment1.py
from config import Config as Default

class Config(Default):  # some over-written configuration
    GPU_COUNT = 2
    NAME = '2'
# run.py
from experiment1 import Config
cfg = Config()
...
cfg.NAME = 'ABC' # possible runtime over-writing
# Now I would like to save `cfg` at this moment
I'd like to save this configuration and restore it later. The member functions are not a concern when restoring.
1. When I tried pickle:
import pickle
with open('cfg.pk', 'rb') as f: cfg = pickle.load(f)
##--> AttributeError: Can't get attribute 'Config' on <module '__main__'>
I saw a solution that uses the class definition of Config, but I wish to restore the configuration without knowing the class definition (e.g., export to a dict and save as JSON).
2. I tried to convert the class to a dict (so that I can export it as JSON):
cfg.__dict__ # {'NAME': 'ABC'}
vars(cfg) # {'NAME': 'ABC'}
In both cases, only the instance attribute shows up and the class-level attributes are missing. Is it possible?
The question's title is "how to convert python class to dict", but I suspect you are really just looking for an easy way to represent (hyper)parameters.
By far the easiest solution is to not use classes for this. I've seen it happen in some machine learning tutorials, but I consider it a pretty ugly hack. It breaks some semantics about classes vs. objects, and the difficulty pickling is a result of that. How about using a simple class like this one:
class Params(dict):
    __getattr__ = dict.__getitem__
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__

    def __getstate__(self):
        return self

    def __setstate__(self, state):
        self.update(state)

    def copy(self, **extra_params):
        return Params(**self, **extra_params)
It can do everything the class approach can. Predefined configs are then just objects you should copy before editing, as follows:
config = Params(
    GPU_COUNT=2,
    NAME='2',
)
other_config = config.copy()
other_config.GPU_COUNT = 4
Or alternatively in one step:
other_config = config.copy(
    GPU_COUNT=4,
)
Works fine with pickle (although you will need to have the Params class somewhere in your source), and you could also easily write load and save methods for the Params class if you want to use JSON.
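For instance, a minimal sketch of such JSON helpers; the save/load method names are my own additions, not part of the original class:

import json

class Params(dict):
    __getattr__ = dict.__getitem__
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__

    def save(self, path):
        # plain dict contents, so it serialises directly (keys must be JSON-friendly)
        with open(path, 'w') as f:
            json.dump(self, f, indent=2)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            return cls(json.load(f))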
In short, do not use a class for something that really is just an object.
Thankfully, @evertheylen's answer was great for me. However, the code raises an error when doing p.__class__ = Params, so I changed it slightly, as below. I think it works in the same way.
class Params(dict):
    __getattr__ = dict.__getitem__
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__

    def __getstate__(self):
        return self

    def __setstate__(self, state):
        self.update(state)

    def copy(self, **extra_params):
        lhs = Params()
        lhs.update(self)
        lhs.update(extra_params)
        return lhs
and you can do
config = Params(
    GPU_COUNT=2,
    NAME='2',
)
other_config = config.copy()
other_config.GPU_COUNT = 4
I have a test class and a setup function that looks like this:
@pytest.fixture(autouse=True, scope='function')
def setup(self, request):
    self.client = MyClass()

    first_patcher = patch('myclass.myclass.function_to_patch')
    first_mock = first_patcher.start()
    first_mock.return_value = 'foo'

    value_to_return = getattr(request, 'value_name', None)
    second_patcher = patch('myclass.myclass.function_two')
    second_mock = second_patcher.start()
    second_mock.return_value = value_to_return

    # could clean up my mocks here, but don't care right now
I see in the pytest documentation that introspection can be done on a module-level value:
val = getattr(request.module, 'val_name', None)
But I want to be able to specify different values to return based on the test I am in, so I am looking for a way to introspect the test function, not the test module.
http://pytest.org/latest/fixture.html#fixtures-can-introspect-the-requesting-test-context
You can use request.function to get to the test function. Just follow the link on the webpage you referenced to see what is available on the test request object :)
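For instance, a minimal sketch of per-test introspection via request.function; the value_name attribute is just an illustration of the pattern, not an official pytest hook:

import pytest

@pytest.fixture
def configured_value(request):
    # look up 'value_name' on the requesting test function, falling back to None
    return getattr(request.function, 'value_name', None)

def test_with_custom_value(configured_value):
    assert configured_value == 'foo'
test_with_custom_value.value_name = 'foo'  # attribute the fixture reads via request.function

def test_with_default(configured_value):
    assert configured_value is None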
Maybe the documentation has changed since the time of the accepted answer.
At least for me it was not clear how to "just follow the link", so I thought I'd update this thread with the link itself:
https://pytest.org/en/6.2.x/reference.html#request
Edit December 2021
Even though the link is correct now, I think this statement from the pytest documentation is just not correct:
Fixture functions can accept the request object to introspect the “requesting” test function ...
While I found some examples of getting attributes of the module, I did not find a single working example of introspecting the test function that requests the fixture. This may be related to collection and runtime order.
What really helped me to get the desired behavior was to use the factory idiom a little further down in the pytest documentation:
https://pytest.org/en/6.2.x/fixture.html#factories-as-fixtures
Set up the fixture factory
@pytest.fixture(scope='function')
def getQueryResult() -> object:
    def _impl(_mrId: int = 7622):
        return QueryResult(_mrId)
    return _impl
Usage
# Concrete value
def test_foo(getQueryResult):
    queryResult = getQueryResult(4711)
    ...

# Default value
def test_bar(getQueryResult):
    queryResult = getQueryResult()
    ...
I've been wondering this for a while now, and it seems like something Python would have: is it possible to call the same method on different objects, selected by name at run time, using code similar to this?
Here's the class I'm using to test this:
class Test():
    def __init__(self, foo):
        self.foo = foo

    def method_a(self):
        print(self.foo)
And here's the running code:
import classTest
object1 = classTest.Test("Unn")
object2 = classTest.Test("Tss")
runThis = input("Type in either object1 or object2:")
runThis.method_a()
I get this:
AttributeError: 'str' object has no attribute 'method_a'
Have I done something wrong? Or does Python lack that functionality?
Put your objects in a dictionary:
objects = {}
objects['object1'] = classTest.Test("Unn")
objects['object2'] = classTest.Test("Tss")
runThis = input("Type in either object1 or object2:")
objects[runThis].method_a()
You can also access the module globals with the globals() function, which gives you a dictionary too, but most of the time you want to use a dedicated dictionary instead.
You can use globals to access your variables by name:
globals()[runThis].method_a()
It might also be helpful in some cases.
import gc

for obj in gc.get_objects():
    if isinstance(obj, Test):
        print(obj)
You could also use this:
import classTest

object1 = classTest.Test("Unn")
object2 = classTest.Test("Tss")

runThis = eval(input("Type in either object1 or object2:"), globals())
runThis.method_a()
I have to get static information from one 'module' to another. I'm trying to write a logger that records information about the place in the code from which we're logging.
For example, in some file:
LogObject.Log('Describe error', STATIC_INFORMATION)
The static information is the class name, file name and function name.
I get it from this:
__file__
self.__class__.__name__
sys._getframe().f_code.co_name
But I don't want to write out these variables at every logging call. Can I create some function and call it instead? For example:
LogObject.Log('Describe error', someFunction())
How can I use it to get this static information?
I don't think "static" is the world you're looking for. If I understand you correctly, you want to write a function that will return the filename, class name and method name of the caller.
Basically, you should use sys._getframe(1) to access the previous frame, and work from there.
Example:
def codeinfo():
    import sys
    f = sys._getframe(1)
    filename = f.f_code.co_filename
    classname = ''
    if 'self' in f.f_locals:
        classname = f.f_locals['self'].__class__.__name__
    funcname = f.f_code.co_name
    return "filename: %s\nclass: %s\nfunc: %s" % (filename, classname, funcname)
Then from a method somewhere you can write
logger.info("Some message \n %s" % codeinfo())
First, please use lower-case names for objects and methods; only use CamelCase names for class definitions.
More importantly, it appears you want a clever introspective function in every class.
class Loggable(object):
    def identification(self):
        # _getframe(1) is the caller's frame, i.e. the method doing the logging
        return self.__class__.__module__, self.__class__.__name__, sys._getframe(1).f_code.co_name

class ARealClass(Loggable):
    def someFunction(self):
        logger.info("Some Message from %r", self.identification())
If all of your classes are subclasses of Loggable, you'll inherit this identification function in all classes.