Share namespace of caller with imported module - python

The short version of the question first:
Assume we have a module called "module" and a python script "caller.py" that imports module.
Is it possible to share the globals() namespace of caller.py with the module?
Such that I could do something like this:
module.py:
def print_handle(fkt_name):
    return globals()[fkt_name]
caller.py:
def function_from_caller():
    return 0

import module
module.print_handle('function_from_caller')
# which would then return something like:
# <function __main__.function_from_caller()>
Long version:
As far as I understand, the scope of an imported module in Python is restricted to that module.
Anything that is not defined in the module or imported into it is unknown to it.
When a module is imported, I can share its namespace with the caller's namespace, either by specifically naming the functions of interest with
from module import function_of_interest
or to share the full namespace
from module import *
However, as far as I know it is not possible to achieve this the other way around, or is it?
Can I pass the namespace from the caller function to the module in any way?
I.e. with something like
pi = 3
import module with pi
or in case I want to pass everything
import module with *
If this is not possible as suspected, why is that?

I do not see a reason why you would need to do that.
If the module has an attribute a in it, you can just do
module.a = 12
If you have many attributes, use setattr(module, attr_name, attr_value) in a for loop.
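A minimal sketch of that loop, using math as a stand-in for the imported module and a hypothetical dict of values the caller wants to share:

```python
import math  # stand-in for the imported module

# hypothetical values the caller wants to push into the module's namespace
shared = {"pi_approx": 3, "greeting": "hello"}

for attr_name, attr_value in shared.items():
    setattr(math, attr_name, attr_value)

# the module can now see these names in its own globals()
print(math.pi_approx)  # 3
print(math.greeting)   # hello
```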

Related

What is the meaning of underscore (_) in Python function imports?

I've inherited some code with imports in each function, using an underscore alias for each imported module, as below:
def my_func():
    from foo import bar as _bar
    from spam import meat as _meat
    # Do some work
What is the point in the _bar? All imports are done like this.
If the actual names are things that exist as part of the built-in commands in Python, this is done as a way to avoid shadowing those built-in functions (for example, from mymodule import open would make the built-in open, which returns file handles, inaccessible). Otherwise, it's simply a convention of the original author.
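A minimal illustration of the shadowing concern (the open defined here is a deliberately trivial stand-in, not a real import):

```python
import builtins

def open(path):
    # shadows the built-in open() in this module's namespace
    return "not a file handle"

print(open("data.txt"))       # calls the shadowing function
print(builtins.open is open)  # False: the real built-in is still reachable
```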
I believe functions whose names start with a single underscore can't be imported using this line:
from module import *
For example, given this module:
def _some_function_1():
    pass

def some_function_2():
    pass
if you import the module this way, you will only be able to access some_function_2().
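This can be checked by writing such a module to disk and running a star import against it (demo_mod is a made-up module name; exec is used because from ... import * is only legal at module level):

```python
import os
import sys
import tempfile

# write the example module to a temporary directory
source = (
    "def _some_function_1():\n"
    "    pass\n"
    "def some_function_2():\n"
    "    pass\n"
)
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "demo_mod.py"), "w") as f:
    f.write(source)
sys.path.insert(0, tmpdir)

ns = {}
exec("from demo_mod import *", ns)
print("some_function_2" in ns)    # True
print("_some_function_1" in ns)   # False: skipped by the star import
```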

Importing modules in python (3 modules)

Let's say I have 3 modules within the same directory (module1, module2, module3).
Suppose the 2nd module imports the 3rd module. If I then import module2 in module1, does that automatically import module3 into module1?
Thanks
No. Imports are only visible inside the module that performs them. You can verify that by creating a test.
Saying,
# module1
import module2
# module2
import module3
# in module1
module3.foo() # NameError: name 'module3' is not defined
This is reasonable if you think about the reverse: if imports chained automatically, it would be hard to tell which function came from which module, causing complex naming conflicts.
No, it will not be imported unless you explicitly tell Python to, like so:
from module2 import *
What importing does conceptually is outlined below.
import some_module
The statement above is equivalent to:
module_variable = import_module("some_module")
All we have done so far is bind some object to a variable name.
When it comes to the implementation of import_module it is also not that hard to grasp.
def import_module(module_name):
    if module_name in sys.modules:
        module = sys.modules[module_name]
    else:
        filename = find_file_for_module(module_name)
        python_code = open(filename).read()
        module = create_module_from_code(python_code)
        sys.modules[module_name] = module
    return module
First, we check if the module has been imported before. If it has, it will be available in the global registry of all modules (sys.modules), and so will simply be reused. If the module is not available, we create it from its code. Once the function returns, the module is assigned to the variable name that you have chosen. As you can see, the process is not inefficient or wasteful: all you are doing is creating an alias for your module.
In most cases transparency is preferred, hence a quick look at the top of a file tells you what resources are available to it. Otherwise, you might end up wondering where a given resource is coming from. That is why modules are not inherently "imported" along the chain.
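The sys.modules caching step is easy to observe with a standard-library module:

```python
import sys
import math

cached = sys.modules["math"]  # present after the first import anywhere
import math as math_again     # not re-executed; the cached module is reused

print(math_again is cached)   # True: both names point at the same object
```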
Resource:
Python doc on importing

Getting a function's module of original definition

Given a class or function, is there a way to find the full path of the module where it is originally defined? (I.e. using def xxx or class xxx.)
I'm aware that there is sys.modules[func.__module__]. However, if func is imported in a package's __init__.py, then sys.modules will simply redirect to that __init__.py, because the function has been brought into that namespace, as far as my understanding goes.
A concrete example:
>>> import numpy as np
>>> import sys
>>> np.broadcast.__module__
'numpy'
>>> sys.modules[np.broadcast.__module__]
<module 'numpy' from '/Users/brad/.../site-packages/numpy/__init__.py'>
Obviously, broadcast is not defined in __init__.py; it is just brought into the namespace with one of these from module import * statements.
It would be nice to see where in the source np.broadcast is defined (regardless of the file extension, be it .c or .py). Is this possible?
Your understanding:
However, if func is imported in a package's __init__.py, then
sys.modules will simply redirect to that __init__.py, because the
function has been brought into that namespace, as far as my
understanding goes.
is wrong. __init__.py importing a thing has no effect on that thing's __module__.
The behavior you're seeing with numpy.broadcast happens because C types don't really have a "defining module" the same way types written in Python do. numpy.broadcast.__module__ == 'numpy' because numpy.broadcast is written in C and declares its name to be "numpy.broadcast", and a C type's __module__ is determined from its name.
As for how to get a class or function's "module of original definition", the best you really have is __module__ and other functions that go through __module__.
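The pure-Python case can be seen in the standard library: json.JSONEncoder is re-exported from json/__init__.py, yet __module__ (and inspect, which relies on it) still points at the defining file, while C-implemented objects simply have no Python source to report:

```python
import inspect
import json

# __module__ records where the class statement actually ran
print(json.JSONEncoder.__module__)              # 'json.encoder'
print(inspect.getsourcefile(json.JSONEncoder))  # .../json/encoder.py

# C-implemented objects raise instead of reporting a source file
try:
    inspect.getsourcefile(dict)
except TypeError:
    print("built-in: no Python source file")
```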

"attach" a python module similar to R?

In Python, when we import something:
import Module
when we later want to use functions created in the module we have to say
Module.foo()
Is there any way to "attach" the module so that if I simply call
foo()
It knows that I mean to use the foo defined in Module, as long as the name does not conflict with any name in the current file?
from Module import *
This imports all symbols in Module unless overridden by __all__.
You can also explicitly import (which is better) only the symbols you actually need.
from Module import foo
It's typically preferable to use the latter. Even better is to use the module as a namespace. There's nothing wrong with Module.foo() vs. foo(), and once your program gets fairly large, this will help you quite a bit with refactoring.
You can just do from module import foo, and then refer to foo() directly.

Python: setting __globals__ for modules and their imports

I'm developing a test engine with Python, but I'm facing some problems related to module loading and global functions.
The main idea of the framework I'm creating is to load a Python file containing functions and annotations ("#thisisatest") that tell which functions are tests. I load this file with imp.load_source, and later I spawn threads that call functions from the loaded module. It's something like this:
module = imp.load_source("test", "testdir/test.py")
function = module.testFunction
thread = threading.Thread(target=function)
thread.start()
Anyway, I want to attach an "assertion function" to this test, doing something like:
module = imp.load_source("test", "testdir/test.py")
module.__globals__.assertAndTerminate = assertionFunction
function = module.testFunction
thread = threading.Thread(target=function)
thread.start()
And that's all right. The problem starts when test.py imports another module that uses the assertAndTerminate function inside it. The module loaded by test.py is completely unaware of the __globals__ of test.py and doesn't know which assertAndTerminate I'm talking about (and that makes sense, since each module has its own __globals__).
Does anyone know a way I could set the same assertAndTerminate function for the test.py module and the modules loaded by it in a thread? I would prefer not searching for imports in a tree, is it possible?
Is there something like Thread(target=function, global_vars={"assertAndTerminate": assertionFunction})?
You need to set the attribute directly on the module; that is the global namespace for that module:
module = imp.load_source("test", "testdir/test.py")
module.assertAndTerminate = assertionFunction
You do have to set globals on a per-module basis. Globals from one module do not propagate to other modules on import.
You can add to the __builtin__ module (builtins in Python 3):
import __builtin__
__builtin__.assertAndTerminate = assertionFunction
These are then visible in all modules:
>>> import __builtin__
>>> __builtin__.foobar = 'barbaz'
>>> foobar
'barbaz'
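For completeness, the Python 3 spelling of the same technique, with assertion_function as a hypothetical stand-in for the question's helper:

```python
import builtins

def assertion_function(condition, message=""):
    # hypothetical stand-in for the question's assertAndTerminate
    if not condition:
        raise AssertionError(message)

builtins.assertAndTerminate = assertion_function

# every module can now call it without importing anything
assertAndTerminate(1 + 1 == 2, "unexpected arithmetic")
print("assertion passed")
```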
Generally speaking, you really want to avoid doing this. Find some other method to solve your problem. Import code instead of relying on globals being set.
