How to customize module attribute reference behavior in Python [duplicate]

How can I implement the equivalent of a __getattr__ on a class, on a module?
Example
When calling a function that does not exist in a module's statically defined attributes, I wish to create an instance of a class in that module, and invoke the method on it with the same name as failed in the attribute lookup on the module.
class A(object):
    def salutation(self, accusative):
        print "hello", accusative

# note this function is intentionally on the module, and not the class above
def __getattr__(mod, name):
    return getattr(A(), name)

if __name__ == "__main__":
    # i hope here to have my __getattr__ function above invoked, since
    # salutation does not exist in the current namespace
    salutation("world")
Which gives:
matt@stanley:~/Desktop$ python getattrmod.py
Traceback (most recent call last):
  File "getattrmod.py", line 9, in <module>
    salutation("world")
NameError: name 'salutation' is not defined

There are two basic problems you are running into here:
__xxx__ methods are only looked up on the class
TypeError: can't set attributes of built-in/extension type 'module'
(1) means any solution would have to also keep track of which module was being examined, otherwise every module would then have the instance-substitution behavior; and (2) means that (1) isn't even possible... at least not directly.
Fortunately, sys.modules is not picky about what goes there, so a wrapper will work -- but only for module access (i.e. import somemodule; somemodule.salutation('world')). For same-module access you pretty much have to yank the methods from the substitution class and add them to globals(), either with a custom method on the class (I like using .export()) or with a generic function (such as those already listed as answers). One thing to keep in mind: if the wrapper creates a new instance each time and the globals solution does not, you end up with subtly different behavior. Oh, and you don't get to use both at the same time -- it's one or the other.
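To illustrate the globals() route, here is a minimal sketch; the .export() name is just this answer's convention, not a standard API:
import sys

class A(object):
    def salutation(self, accusative):
        print("hello", accusative)

    def export(self, namespace):
        # Copy the public bound methods onto the given namespace so that
        # a bare salutation("world") works for same-module callers.
        for name in dir(self):
            if name.startswith("_") or name == "export":
                continue
            attr = getattr(self, name)
            if callable(attr):
                namespace[name] = attr

A().export(globals())
salutation("world")   # prints: hello world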
Update
From Guido van Rossum:
There is actually a hack that is occasionally used and recommended: a
module can define a class with the desired functionality, and then at
the end, replace itself in sys.modules with an instance of that class
(or with the class, if you insist, but that's generally less useful).
E.g.:
# module foo.py

import sys

class Foo:
    def funct1(self, <args>): <code>
    def funct2(self, <args>): <code>

sys.modules[__name__] = Foo()
This works because the import machinery is actively enabling this
hack, and as its final step pulls the actual module out of
sys.modules, after loading it. (This is no accident. The hack was
proposed long ago and we decided we liked it enough to support it in the
import machinery.)
So the established way to accomplish what you want is to create a single class in your module, and as the last act of the module replace sys.modules[__name__] with an instance of your class -- and now you can play with __getattr__/__setattr__/__getattribute__ as needed.
Note 1: If you use this functionality then anything else in the module, such as globals, other functions, etc., will be lost when the sys.modules assignment is made -- so make sure everything needed is inside the replacement class.
Note 2: To support from module import * you must have __all__ defined in the class; for example:
class Foo:
    def funct1(self, <args>): <code>
    def funct2(self, <args>): <code>
    __all__ = list(set(vars().keys()) - {'__module__', '__qualname__'})
Depending on your Python version, there may be other names to omit from __all__. The set() can be omitted if Python 2 compatibility is not needed.
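Putting the pieces together, a self-contained sketch of the replacement pattern might look like this (the module and attribute names are illustrative):
# getattr_mod.py -- a sketch of the sys.modules replacement pattern
import sys

class _Module:
    def salutation(self, accusative):
        print("hello", accusative)

    def __getattr__(self, name):
        # Hook point: called only for names that normal lookup did not find.
        if name == "answer":
            return 42
        raise AttributeError(name)

    # Slightly stricter than the formula above: export only non-underscore names.
    __all__ = [k for k in vars() if not k.startswith('_')]

# Must be the final statement: anything defined after this line (or left at
# module level) is invisible to importers.
sys.modules[__name__] = _Module()
After import getattr_mod, both getattr_mod.salutation('world') and getattr_mod.answer go through the instance, and from getattr_mod import * picks up only what __all__ lists.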

A while ago, Guido declared that all special method lookups on new-style classes bypass __getattr__ and __getattribute__. Dunder methods had previously worked on modules - you could, for example, use a module as a context manager simply by defining __enter__ and __exit__, before those tricks broke.
Recently some historical features have made a comeback, the module __getattr__ among them, and so the existing hack (a module replacing itself with a class in sys.modules at import time) should no longer be necessary.
In Python 3.7+, you just use the one obvious way. To customize attribute access on a module, define a __getattr__ function at the module level; it should accept one argument (the name of the attribute) and return the computed value or raise an AttributeError:
# my_module.py
from typing import Any

def __getattr__(name: str) -> Any:
    ...
This will also allow hooks into "from" imports, i.e. you can return dynamically generated objects for statements such as from my_module import whatever.
On a related note, along with the module getattr you may also define a __dir__ function at module level to respond to dir(my_module). See PEP 562 for details.
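For instance, a minimal module using both hooks (the deprecated-name scenario here is purely illustrative):
# my_module.py -- a minimal PEP 562 sketch (Python 3.7+)
import warnings

def new_function():
    return "use me instead"

def __getattr__(name):
    # Called only when normal module attribute lookup fails.
    if name == "old_function":
        warnings.warn("old_function is deprecated; use new_function")
        return new_function
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

def __dir__():
    # Let the dynamic name show up in dir(my_module) too.
    return sorted(list(globals()) + ["old_function"])
With this in place, both import my_module; my_module.old_function() and from my_module import old_function go through the hook.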

This is a hack, but you can wrap the module with a class:
import sys

class Wrapper(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __getattr__(self, name):
        # Perform custom logic here
        try:
            return getattr(self.wrapped, name)
        except AttributeError:
            return 'default'  # Some sensible default

sys.modules[__name__] = Wrapper(sys.modules[__name__])

We don't usually do it that way.
What we do is this.
class A(object):
    ....

# The implicit global instance
a = A()

def salutation(*arg, **kw):
    a.salutation(*arg, **kw)
Why? So that the implicit global instance is visible.
For examples, look at the random module, which creates an implicit global instance to slightly simplify the use cases where you want a "simple" random number generator.
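The stdlib module does roughly this (a simplified sketch of the idea, not the actual random source):
# A simplified sketch of the pattern the random module uses.
class Random(object):
    def __init__(self, seed=0):
        self._state = seed

    def random(self):
        # Toy linear-congruential step, just to make the sketch runnable.
        self._state = (self._state * 1103515245 + 12345) % (2 ** 31)
        return self._state / float(2 ** 31)

# The implicit global instance...
_inst = Random()

# ...whose bound methods are exported as module-level functions.
random = _inst.random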

Similar to what @Håvard S proposed, in a case where I needed to implement some magic on a module (like __getattr__), I would define a new class that inherits from types.ModuleType and put that in sys.modules (probably replacing the module where my custom ModuleType was defined).
See the main __init__.py file of Werkzeug for a fairly robust implementation of this.
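A minimal sketch of the idea (this variant assumes Python 3.5+, where you can swap the existing module's __class__ in place instead of constructing a replacement module object):
# At the bottom of the module whose attribute access you want to customize.
import sys
import types

class _MagicModule(types.ModuleType):
    def __getattr__(self, name):
        # Fires only for names that normal module lookup did not find.
        return "computed value for {}".format(name)

# Swap the class of the already-imported module object in place; all
# existing globals stay intact, unlike a wholesale sys.modules swap.
sys.modules[__name__].__class__ = _MagicModule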

This is hackish, but...
# Python 2.7
import types

class A(object):
    def salutation(self, accusative):
        print("hello", accusative)

    def farewell(self, greeting, accusative):
        print(greeting, accusative)

def AddGlobalAttribute(classname, methodname):
    print("Adding " + classname + "." + methodname + "()")
    def genericFunction(*args):
        return globals()[classname]().__getattribute__(methodname)(*args)
    globals()[methodname] = genericFunction

# set up the global namespace

x = 0   # X and Y are here to add them implicitly to globals, so
y = 0   # globals does not change as we iterate over it.

toAdd = []

def isCallableMethod(classname, methodname):
    someclass = globals()[classname]()
    something = someclass.__getattribute__(methodname)
    return callable(something)

for x in globals():
    print("Looking at", x)
    if isinstance(globals()[x], (types.ClassType, type)):
        print("Found Class:", x)
        for y in dir(globals()[x]):
            if y.find("__") == -1:          # hack to ignore default methods
                if isCallableMethod(x, y):
                    if y not in globals():  # don't override existing global names
                        toAdd.append((x, y))

# Returns:
# ('Looking at', 'A')
# ('Found Class:', 'A')
# ('Looking at', 'toAdd')
# ('Looking at', '__builtins__')
# ('Looking at', 'AddGlobalAttribute')
# ('Looking at', 'register')
# ('Looking at', '__package__')
# ('Looking at', 'salutation')
# ('Looking at', 'farewell')
# ('Looking at', 'types')
# ('Looking at', 'x')
# ('Looking at', 'y')
# ('Looking at', '__name__')
# ('Looking at', 'isCallableMethod')
# ('Looking at', '__doc__')
# ('Looking at', 'codecs')

for x in toAdd:
    AddGlobalAttribute(*x)

if __name__ == "__main__":
    salutation("world")
    farewell("goodbye", "world")

# Returns:
# hello world
# goodbye world
This works by iterating over all the objects in the global namespace. If the item is a class, it iterates over the class attributes. If an attribute is callable, it adds it to the global namespace as a function.
It ignores all attributes which contain "__".
I wouldn't use this in production code, but it should get you started.

Here's my own humble contribution -- a slight embellishment of @Håvard S's highly rated answer, but a bit more explicit (so it might be acceptable to @S.Lott, even though probably not good enough for the OP):
import sys

class A(object):
    def salutation(self, accusative):
        print "hello", accusative

class Wrapper(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __getattr__(self, name):
        try:
            return getattr(self.wrapped, name)
        except AttributeError:
            return getattr(A(), name)

_globals = sys.modules[__name__] = Wrapper(sys.modules[__name__])

if __name__ == "__main__":
    _globals.salutation("world")

Create your module file that has your classes. Import the module. Run getattr on the module you just imported. You can do a dynamic import using __import__ and pull the module from sys.modules.
Here's your module some_module.py:
class Foo(object):
    pass

class Bar(object):
    pass
And in another module:
import some_module
Foo = getattr(some_module, 'Foo')
Doing this dynamically:
import sys
__import__('some_module')
mod = sys.modules['some_module']
Foo = getattr(mod, 'Foo')
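On current Python versions the same dynamic import is usually written with importlib, which returns the module object directly:
import importlib

mod = importlib.import_module('some_module')
Foo = getattr(mod, 'Foo')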

Related

How can I lazily import a module in python?

I have classes which require dependencies in order to be instantiated but are otherwise optional. I'd like to lazily import the dependencies and fail to instantiate the class if they aren't available. Note that these dependencies are not required at the package level (otherwise they'd be mandatory via setuptools). I currently have something like this:
class Foo:
    def __init__(self):
        try:
            import module
        except ImportError:
            raise ModuleNotFoundError("...")

    def foo(self):
        import module
Because this try/except pattern is common, I'd like to abstract it into a lazy importer. Ideally if module is available, I won't need to import it again in Foo.foo so I'd like module to be available once it's been imported in __init__. I've tried the following, which populates globals() and fails to instantiate the class if numpy isn't available, but it pollutes the global namespace.
def lazy_import(name, as_=None):
    # Doesn't handle error_msg well yet
    import importlib
    mod = importlib.import_module(name)
    if as_ is not None:
        name = as_
    # yuck...
    globals()[name] = mod

class NeedsNumpyFoo:
    def __init__(self):
        lazy_import("numpy", as_="np")

    def foo(self):
        return np.array([1, 2])
I could instantiate the module outside the class and point to the imported module if import doesn't fail, but that is the same as the globals() approach. Alternatively lazy_import could return the mod and I could call it whenever the module is needed, but this is tantamount to just importing it everywhere as before.
Is there a better way to handle this?
Pandas actually has a function, import_optional_dependency, which may make a good example (see its source on GitHub and its use in SQLAlchemyEngine).
However, this is only used during class __init__, either to raise a meaningful error (ImportError by default!) or to warn about missing or outdated dependencies. The latter is likely the more practical use of it: an older or newer dependency may import fine anywhere it exists, yet still not work, not be explicitly tested against, or even be an accidental local import.
I'd consider doing similarly: either skip the special handling entirely, or do it only in __init__ (and then perhaps only in the few cases where you care about the version, etc.), and otherwise simply import where needed:
class Foo():
    def __init__(self, ...):
        import bar  # only tests for existence

    def usebar(self, value):
        import bar
        bar.baz(value)
Plausibly you could assign to a property of the class, but this may cause some trouble or confusion (as the import should already be available in globals once imported)
class Foo():
    def __init__(self, ...):
        import bar
        self.bar = bar

    def usebar(self, value):
        self.bar.baz(value)
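For reference, a minimal helper in the spirit of the pandas function mentioned above might look like this (a sketch only; not pandas' actual signature or behaviour):
import importlib

def import_optional(name, min_version=None):
    """Import a module by name, failing with a clear error if it is missing."""
    try:
        mod = importlib.import_module(name)
    except ImportError as exc:
        raise ImportError(
            "Missing optional dependency '{}'.".format(name)
        ) from exc
    version = getattr(mod, "__version__", None)
    if min_version is not None and version is not None and version < min_version:
        # Naive string comparison; a real helper would parse the versions.
        raise ImportError(
            "Optional dependency '{}' needs version >= {} (found {}).".format(
                name, min_version, version
            )
        )
    return mod
You would then call something like np = import_optional("numpy") inside __init__ and keep the result on the instance or module.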
Gave it a quick test with a wrapper, seems to work fine:
def requires_math(fn):
    def wrapper(*args, **kwargs):
        global math
        try:
            math
        except NameError:
            import math
        return fn(*args, **kwargs)
    return wrapper

@requires_math
def func():
    return math.ceil(5.5)

print(func())
Edit: here's a more advanced version that works with any module and makes sure the global name actually refers to a module, in case it has been set to something else.
from types import ModuleType

def requires_import(*mods):
    def decorator(fn):
        def wrapper(*args, **kwargs):
            for mod in mods:
                if mod not in globals() or not isinstance(globals()[mod], ModuleType):
                    globals()[mod] = __import__(mod)
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@requires_import('math', 'random')
def func():
    return math.ceil(random.uniform(0, 10))

print(func())

How to add a module's attributes to a class with custom behavior

I have a module in some path. I want to create a class with the same attributes as the module (so that it can be used the same way as the module) but perform some custom actions before accessing the attributes - such as reloading the module.
import imp

def get_method_with_extra(method_name, module):
    def method_with_extra(self, *args):
        imp.reload(module)
        func_to_call = getattr(module, method_name)
        func_to_call(*args)
    return method_with_extra

class tester():
    def __init__(self, module_path):
        self.module = imp.load_source('module', module_path)
        method_list = [func for func in dir(self.module)
                       if callable(getattr(self.module, func))]
        for method_name in method_list:
            method_with_extra = get_method_with_extra(method_name, self.module)
            setattr(type(self), method_name, method_with_extra)
So if for example the module has a method named "Parse", I would like an instance of tester - tess - to have it as well, and for me to be able to call tess.parse() which should reload the inner module and then call the module's parse(). Instead, I get this error:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "<string>", line 4, in __init__
AttributeError: attribute '__call__' of 'namespace#' object is read-only
If you are allowed to change the source of the target module, and it is small enough, I think the cleanest solution here is rewriting it as a class. And then import the class, inherit from it and customize.
Also be aware that reloading modules in Python has lots of caveats; it's more for playing in the Python shell than for production code.
Based on what you said in the comment, I changed your code just a little bit. I used importlib because the imp module is deprecated. Note also that "monkey patching" (that's what this kind of technique is called: making a runtime patch to the code) is always tightly coupled with the target code. If there are changes, your patch code can break easily.
I wrote two files module.py and test_module.py:
#-----------
# module.py

a = 100
b = 200

# returns something
def sum3(x, y):
    return x + y + 3

# does something
def print_a_b():
    global a
    print(a, b)
    a = a + 1  # test module reloads. If ok, "a" remains 100

#----------------
# test_module.py

import module
import importlib as imp

def get_method_with_extra(method_name, module):
    def method_with_extra(self, *args):
        imp.reload(module)  # comment to see that "a" increases
        func_to_call = getattr(module, method_name)
        if args:    # function may have args
            return func_to_call(*args)
        else:       # function may not have args
            return func_to_call()
    return method_with_extra

class tester():
    def __init__(self, module_path):
        self.module = imp.import_module('module', module_path)
        method_list = [func for func in dir(self.module)
                       if callable(getattr(self.module, func))]
        for method_name in method_list:
            # print(method_name)
            method_with_extra = \
                get_method_with_extra(method_name, self.module)
            setattr(type(self), method_name, method_with_extra)

t = tester('.')
print(t.sum3(1, 2))
t.print_a_b()
t.print_a_b()  # checking for the reload, "a" should be 100

Override a function's sub-function from a decorator?

Let's consider this piece of code where I would like to create bar dynamically with a decorator
def foo():
    def bar():
        print "I am bar from foo"
    print bar()

def baz():
    def bar():
        print "I am bar from baz"
    print bar()
I thought I could create bar from the outside with a decorator:
def bar2():
    print "I am super bar from foo"

setattr(foo, 'bar', bar2)
But the result is not what I was expecting (I would like to get I am super bar from foo):
>>> foo()
I am bar from foo
Is it possible to override a sub-function on an existing function with a decorator?
The actual use case
I am writing a wrapper for a library and to avoid boilerplate code I would like to simplify my work.
Each library function has a prefix lib_ and returns an error code. I would like to add the prefix to the current function and treat the error code. This could be as simple as this:
def call(*args):
    fname = __libprefix__ + inspect.stack()[1][3]
    return_code = getattr(__lib__, fname)(*args)
    if return_code < 0:
        raise LibError(fname, return_code)

def foo():
    call()
The problem is that call might act differently in certain cases. Some library functions do not return an error_code, so it would be easier to write it like this:
def foo():
    call(check_status=True)
Or much better in my opinion (this is the point where I started thinking about decorators):
@LibFunc(check_status=True)
def foo():
    call()
In this last example I should declare call inside foo as a sub-function created dynamically by the decorator itself.
The idea was to use something like this:
class LibFunc(object):
    def __init__(self, **kwargs):
        self.kwargs = kwargs

    def __call__(self, original_func):
        decorator_self = self
        def wrappee(*args, **kwargs):
            def call(*args):
                fname = __libprefix__ + original_func.__name__
                return_code = getattr(__lib__, fname)(*args)
                if return_code < 0:
                    raise LibError(fname, return_code)
            print original_func
            print call
            # <<<< The part that does not work
            setattr(original_func, 'call', call)
            # <<<<
            original_func(*args, **kwargs)
        return wrappee
Initially I was tempted to call the call inside the decorator itself to minimize the writing:
@LibFunc()
def foo(): pass
Unfortunately, this is not an option since other things should sometime be done before and after the call:
@LibFunc()
def foo(a, b):
    value = c_float()
    call(a, pointer(value), b)
    return value.value
Another option that I thought about was to use SWIG, but again this is not an option because I will need to rebuild the existing library with the SWIG wrapping functions.
And last but not least, I may get inspiration from SWIG typemaps and declare my wrapper as this:
@LibFunc(check_exit=True, map=('<a', '>c_float', '<c_int(b)'))
def foo(a, b): pass
This looks like the best solution to me, but this is another topic and another question...
Are you married to the idea of a decorator? Because if your goal is a bunch of module-level functions, each of which wraps somelib.lib_somefunctionname, I don't see why you need one.
Those module-level names don't have to be functions, they just have to be callable. They could be a bunch of class instances, as long as they have a __call__ method.
I used two different subclasses to determine how to treat the return value:
#!/usr/bin/env python3
import libtowrap  # Replace with the real library name.

class Wrapper(object):
    '''
    Parent class for all wrapped functions in libtowrap.
    '''
    def __init__(self, name):
        self.__name__ = str(name)
        self.wrapped_name = 'lib_' + self.__name__
        self.wrapped_func = getattr(libtowrap, self.wrapped_name)
        self.__doc__ = self.wrapped_func.__doc__
        return

class CheckedWrapper(Wrapper):
    '''
    Wraps functions in libtowrap that return an error code that must
    be checked. Negative return values indicate an error, and will
    raise a LibError. Successful calls return None.
    '''
    def __call__(self, *args, **kwargs):
        error_code = self.wrapped_func(*args, **kwargs)
        if error_code < 0:
            raise LibError(self.__name__, error_code)
        return

class UncheckedWrapper(Wrapper):
    '''
    Wraps functions in libtowrap that return a useful value, as
    opposed to an error code.
    '''
    def __call__(self, *args, **kwargs):
        return self.wrapped_func(*args, **kwargs)

strict = CheckedWrapper('strict')
negative_means_failure = CheckedWrapper('negative_means_failure')
whatever = UncheckedWrapper('whatever')
negative_is_ok = UncheckedWrapper('negative_is_ok')
Note that the wrapper "functions" are assigned while the module is being imported. They are in the top-level module namespace, and not hidden by any if __name__ == '__main__' test.
They will behave like functions for most purposes, but there will be minor differences. For example, I gave each instance a __name__ that matches the name they're assigned to, not the lib_-prefixed name used in libtowrap... but I copied the original __doc__, which might refer to a prefixed name like lib_some_other_function. Also, testing them with isinstance will probably surprise people.
For more about decorators, and for many more annoying little discrepancies like the ones I mentioned above, see Graham Dumpleton's half-hour lecture "Advanced Methods for Creating Decorators" (PyCon US 2014; slides). He is the author of the wrapt module (Python Package Index; GitHub; Read the Docs), which corrects all(?) of the usual decorator inconsistencies. It might solve your problem entirely (except for the old lib_-style names showing up in __doc__).

Use only class name without namespace in isinstance

This works in a script to recognise if a is of class myproject.aa.RefClass
isinstance(a, myproject.aa.RefClass)
But how could I do it so I do not have to specify the full namespace ? I would like to be able to type:
isinstance(a, RefClass)
How is this done in Python ?
EDIT: let me give more details.
In module aa.referencedatatable.py:
class ReferenceDataTable(object):
    def __init__(self, name):
        self.name = name

    def __call__(self, f):
        self._myfn = f
        return self

def referencedatatable_from_tag(tag):
    import definitions
    defn_lst = [definitions]
    for defn in defn_lst:
        referencedatatable_instance_lst = [
            getattr(defn, a) for a in dir(defn)
            if isinstance(getattr(defn, a), ReferenceDataTable)
        ]
        for referencedatatable_instance in referencedatatable_instance_lst:
            if referencedatatable_instance.name == tag:
                return referencedatatable_instance
    raise("could not find")

def main():
    referencedata_from_tag("Example")
In module aa.definitions.py:
from aa.referencedatatable import ReferenceDataTable

@ReferenceDataTable("Example")
def EXAMPLE():
    raise NotImplementedError("not written")
For some reason calling the main from aa.referencedatatable.py will throw as it will not be able to recognise the instance of the class. But if I copy this main in another module it will work:
import aa.referencedatatable
a = aa.referencedatatable.referencedatatable_from_tag("Example")
print a
This second example works, for some reason calling this function inside the same module where the class is declared does not.
The 'namespace' is just a module object, and the class is simply an attribute of that module. You can always assign the class to a different name:
RefClass = myproject.aa.RefClass
or better yet, import it directly into your own namespace:
from myproject.aa import RefClass
Either way, now you have a global name RefClass that references the class object, so you can do:
isinstance(a, RefClass)

