If I import a module:
import foo
How can I find the names of the classes it contains?
You can use the inspect module to do this. For example:
import inspect
import foo
for name, obj in inspect.getmembers(foo):
    if inspect.isclass(obj):
        print name
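Note that inspect.getmembers also takes a predicate as its second argument, so the filtering can happen in the call itself. A minimal sketch of that variant (foo stands in for whatever module you are inspecting):

import inspect
import foo

# getmembers(module, predicate) returns only the (name, object) pairs
# for which the predicate is true, so this yields just the classes.
for name, obj in inspect.getmembers(foo, inspect.isclass):
    print(name)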
Check dir(foo). By convention, the class names will be the ones in CamelCase.
If foo breaks convention, you could, I guess, get the class names with something like [x for x in dir(foo) if type(getattr(foo, x)) == type], but that's ugly and probably quite fragile.
From the question How to check whether a variable is a class or not?, we know we can test whether something is a class with the inspect module (example code shamelessly lifted from the accepted answer):
>>> import inspect
>>> class X(object):
...     pass
...
>>> inspect.isclass(X)
True
So, to build a list of all the classes in a module, we could use:

import inspect
import foo

classes = [c for c in dir(foo) if inspect.isclass(getattr(foo, c))]

Update: The above example has been edited to use getattr(foo, c) rather than foo.__getattribute__(c), as per @Chris Morgan's comment.
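One caveat worth flagging: both variants also pick up classes that foo merely imports from other modules. A hedged sketch of filtering those out by comparing each class's __module__ to the module's own name (assuming you only want classes actually defined in foo):

import inspect
import foo

# Keep only classes whose __module__ matches foo's name, i.e. classes
# defined in foo rather than imported into it from elsewhere.
own_classes = [obj for name, obj in inspect.getmembers(foo, inspect.isclass)
               if obj.__module__ == foo.__name__]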
You can check the type of elements found via dir(foo): classes will have type type.
import foo

classlist = []
for i in dir(foo):
    if type(foo.__getattribute__(i)) is type:
        classlist.append(i)
You can get a lot of information about a Python module, say foo, by importing it and then using help(module-name) as follows:
>>> import foo
>>> help(foo)
Related
I have the following setup:
test.py
test/
    __init__.py
    abstract_handler.py
    first_handler.py
    second_handler.py
first_handler.py and second_handler.py contain classes with the same names that inherit from abstract_handler.
What I want to do in test.py is: given a string containing "first_handler" or any other handler class, create an object of that class.
Most solutions I found assume that the classes are in the same module (test.py); I don't know how to dynamically import the specific required class.
Use __import__ for importing. Note that if you use submodules, you have to specify the fromlist, otherwise you get the top-level module instead. Thus:

__import__('foo.bar', fromlist=['foo']).__dict__['baz_handler']()

will call foo.bar.baz_handler().
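On Python 2.7+ the importlib module offers a cleaner spelling of the same idea. A sketch for the handler layout in the question -- the class name HandlerClass is an assumption here, substitute whatever name your handler classes actually share:

import importlib

def make_handler(module_name, class_name='HandlerClass'):
    # Import e.g. test.first_handler by its dotted name, then look up
    # the class on the module object and instantiate it.
    module = importlib.import_module('test.' + module_name)
    handler_class = getattr(module, class_name)
    return handler_class()

instance = make_handler('first_handler')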
Use a dictionary for this sort of dispatching.
import first_handler
import second_handler
dispatch_dict = {
    'first': first_handler.FirstHandler,
    'second': second_handler.SecondHandler,
}
Now, assuming your choice is in choice_string:
instance = dispatch_dict[choice_string]()
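If choice_string might not match a registered handler, dict.get gives you a place to fail cleanly. A small illustrative sketch:

handler_class = dispatch_dict.get(choice_string)
if handler_class is None:
    raise ValueError('unknown handler: %r' % choice_string)
instance = handler_class()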
You could probably do something like this:
from first_handler import SameName as handler1
from second_handler import SameName as handler2
handlers = {'UniqueName1': handler1,
            'UniqueName2': handler2}

instance = handlers['UniqueName1']()
This gets you the module by name from the current namespace:

import abstract_handler
import first_handler
import second_handler

output = globals()['first_handler']

Note that globals()['first_handler'] gives you the module object, not the class; to create an instance of the handler class defined in it, you still need something like getattr(output, 'YourHandlerClass')(), substituting the shared class name.
A broad answer to the question: to dynamically import, use __import__(string), and you'll find all of the module's objects in its __dict__. This way you can instantiate a class based on strings, e.g.:

c = __import__('test.first_handler', fromlist=['first_handler']).__dict__['awesomeclassname']()

(The fromlist is needed here for the same reason as above: without it, __import__ returns the top-level test package rather than test.first_handler.)
I'm using the Python execfile() function as a simple-but-flexible way of handling configuration files -- basically, the idea is:
# Evaluate the 'filename' file into the dictionary 'foo'.
foo = {}
execfile(filename, foo)

# Process all 'Bar' items in the dictionary.
for item in foo.values():
    if isinstance(item, Bar):
        pass  # process item
This requires that my configuration file has access to the definition of the Bar class. In this simple example, that's trivial; we can just define foo = {'Bar' : Bar} rather than an empty dict. However, in the real example, I have an entire module I want to load. One obvious syntax for that is:
foo = {}
exec('from BarModule import *', foo)
execfile(filename, foo)

However, I've already imported BarModule in my top-level file, so it seems like I should be able to just define foo directly as the set of things defined by BarModule, without having to go through this chain of exec and import.
Is there a simple idiomatic way to do that?
Maybe you can use the __dict__ defined by the module.
>>> import os
>>> expr = 'getcwd()'
>>> eval(expr, os.__dict__)
Use the builtin vars() function to get the attributes of an object (such as a module) as a dict.
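For the configuration-file setup above, that could look something like the following sketch (assuming BarModule is the module your config files need, as in the question, and Python 2's execfile):

import BarModule

# Seed the execfile namespace with everything BarModule defines,
# copying so the config file can't mutate BarModule's own namespace.
foo = dict(vars(BarModule))
execfile(filename, foo)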
The typical solution is to use getattr:
>>> import os
>>> s = 'getcwd'
>>> getattr(os, s)()
I'm trying to create a doctest with a mock of a function that resides in a separate module and that is imported as below:
from foomodule import foo

def bar():
    """
    >>> from minimock import mock
    >>> mock('foo', nsdicts=(bar.func_globals,), returns=5)
    >>> bar()
    Called foo()
    10
    """
    return foo() * 2

import doctest
doctest.testmod()
foomodule.py:
def foo():
    raise ValueError("Don't call me during testing!")
This fails. If I change the import to import foomodule and use foomodule.foo everywhere, then it works. But is there any solution for mocking a function imported the way above?
You've just met one of the many reasons why it's best to never import objects from "within" modules -- only modules themselves (possibly from within packages). We've made this rule part of our style guidelines at Google (published here) and I heartily recommend it to every Python programmer.
That being said, what you need to do is take the foomodule.foo that you've just replaced with a mock and stick it in the current module. I don't recall enough of doctest's internals to confirm whether
>>> import foomodule
>>> foo = foomodule.foo
will suffice for that -- give it a try, and if it doesn't work, do instead
>>> import foomodule
>>> import sys
>>> sys.modules[__name__].foo = foomodule.foo
Yeah, it's a mess, but the cause of that mess is that innocent-looking from foomodule import foo -- eschew that, and your life will be simpler and more productive ;-).
Finally, it turned out that this was an issue with the trunk version of MiniMock; the old stable release performs as expected.
Say I have the following code:
from foo.bar import Foo
from foo.foo import Bar

__all__ = ["Foo", "Bar"]

def iterate_over_all():
    ...
How can I implement code in the function iterate_over_all() that can dynamically obtain references to whatever is referenced in __all__ of the module where the function is implemented? I.e., in iterate_over_all() I want to work with foo.bar.Foo and foo.foo.Bar.
Would this do?
def iterate_over_all():
    for name in __all__:
        value = globals()[name]
        yield value  # or do whatever with it
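A quick usage sketch, just to show the generator in action (the print is purely illustrative):

for obj in iterate_over_all():
    # Each obj is whatever __all__ names, e.g. foo.bar.Foo and foo.foo.Bar.
    print(obj)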
eval is one way; e.g. eval("Foo") would give you Foo. However, you could also just put Foo and Bar directly in your list, e.g. __all__ = [Foo, Bar]. It would depend on where the contents of __all__ come from in your actual program.
You can easily index into the module's __dict__ with the strings from __all__; inside the module itself, globals() is that same dictionary:

globals()["Foo"] == foo.bar.Foo  # -> True
I have a Python module with a function in it:
== bar.py ==
def foo(): pass
== EOF ==
And then I import it into the global namespace like so:
from bar import *
So now the function foo is available to me. If I print it:
print foo
The interpreter happily tells me:
<function foo at 0xb7eef10c>
Is there a way for me to find out that function foo came from module bar at this point?
foo.__module__ should return 'bar' (the module name as a string).
If you need more info, you can get it from sys.modules['bar']; its __file__ and __package__ attributes may be interesting.
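A short sketch tying the two together, assuming foo was imported as in the question (the printed path is whatever bar.py's location happens to be):

import sys

module_name = foo.__module__        # the string 'bar'
module = sys.modules[module_name]   # the bar module object itself
print(module.__file__)              # e.g. the path to bar.py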
Try this:

help(foo)

The first line of the output ("Help on function foo in module bar:") tells you which module it came from.
Instead of
from bar import *
use
from bar import foo
Using the from ... import * syntax is a bad programming style precisely because it makes it hard to know where elements of your namespace come from.
If you have IPython, you can also use ?:
In[16]: runfile?
Signature: runfile(filename, args=None, wdir=None, is_module=False, global_vars=None)
Docstring:
Run filename
args: command line arguments (string)
wdir: working directory
File: /Applications/PyCharm.app/Contents/helpers/pydev/_pydev_bundle/pydev_umd.py
Type: function
Compare with __module__:
In[17]: runfile.__module__
Out[17]: '_pydev_bundle.pydev_umd'