Python: import modules dynamically in a different namespace

import v_framework as framework
framework.loadModules(["Maintenance"])
framework.Maintenance.showPage()
In framework I have:
def loadModules(aModules):
    d_utility = {"Maintenance": "COOl_M_PAGE"}
    for module in aModules:
        exec("import " + d_utility[module] + " as " + module)
When loadModules is executed, it imports the modules in the v_framework namespace. Since I am importing v_framework as framework, I think I should be able to use the imported module using framework.Maintenance. But it does not work that way.
Is there a way to do what I'm trying to do? Alternatively, is there any way to import modules into a namespace other than the one where exec is executed?

There are libraries for importing modules dynamically. You could use importlib (and another one that might be useful is pkgutil). Now, for your case, I guess this would do the job:
import importlib

mods = {}
d_utility = {"Maintenance": "COOl_M_PAGE"}

def loadModules(aModules):
    global mods
    for module in aModules:
        mods[module] = importlib.import_module(d_utility[module])
        # or maybe globals()[module] = ... would also work (exactly as you expect it to)
UPDATE: exec modifies the function's local namespace, not the global one (I think).
Hope it helps. :)

When you import inside a function, the module is imported/executed as normal, but the name you import under is local to the function, just like any other variable assigned inside a function.
>>> def test_import():
...     import os
...     print os
...
>>> test_import()
<module 'os' from '/usr/lib/python2.7/os.pyc'>
>>> os
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'os' is not defined
os has been imported though, and you can still access it through sys.modules:
>>> import sys
>>> sys.modules['os']
<module 'os' from '/usr/lib/python2.7/os.pyc'>
>>> os = sys.modules['os']
>>> os
<module 'os' from '/usr/lib/python2.7/os.pyc'>
A quick and dirty way to do what you want would be something like this; exec takes an optional mapping to be used as the local and global variables. So you could do
def loadModules(aModules):
    d_utility = {"Maintenance": "COOl_M_PAGE"}
    for module in aModules:
        exec ('import %s as %s' % (d_utility[module], module)) in globals()
Though this is ugly and probably has security implications or something. As jadkik94 mentions, there are libraries that provide cleaner ways to deal with this.
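Note that the statement form exec ... in globals() is Python 2 only. Under Python 3, where exec is a function, a sketch of the equivalent would be:
def loadModules(aModules):
    d_utility = {"Maintenance": "COOl_M_PAGE"}
    for module in aModules:
        # Passing globals() as the namespace makes the import bind at module level.
        exec('import %s as %s' % (d_utility[module], module), globals())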

Related

Pass the function as an argument to main function python and docker file [duplicate]

I'm writing a Python application that takes a command as an argument, for example:
$ python myapp.py command1
I want the application to be extensible, that is, to be able to add new modules that implement new commands without having to change the main application source. The tree looks something like:
myapp/
    __init__.py
    commands/
        __init__.py
        command1.py
        command2.py
    foo.py
    bar.py
So I want the application to find the available command modules at runtime and execute the appropriate one.
Python defines an __import__() function, which takes a string for a module name:
__import__(name, globals=None, locals=None, fromlist=(), level=0)
The function imports the module name, potentially using the given globals and locals to determine how to interpret the name in a package context. The fromlist gives the names of objects or submodules that should be imported from the module given by name.
Source: https://docs.python.org/3/library/functions.html#__import__
So currently I have something like:
command = sys.argv[1]

try:
    command_module = __import__("myapp.commands.%s" % command, fromlist=["myapp.commands"])
except ImportError:
    # Display an error message, then bail out
    raise

command_module.run()
This works just fine; I'm just wondering whether there is a more idiomatic way to accomplish what we are doing with this code.
Note that I specifically don't want to get in to using eggs or extension points. This is not an open-source project and I don't expect there to be "plugins". The point is to simplify the main application code and remove the need to modify it each time a new command module is added.
See also: How do I import a module given the full path?
With Python older than 2.7/3.1, that's pretty much how you do it.
For newer versions, see importlib.import_module for Python 2 and Python 3.
Or using __import__ you can import a list of modules by doing this:
>>> moduleNames = ['sys', 'os', 're', 'unittest']
>>> moduleNames
['sys', 'os', 're', 'unittest']
>>> modules = map(__import__, moduleNames)
Ripped straight from Dive Into Python.
The recommended way for Python 2.7 and 3.1 and later is to use the importlib module:
importlib.import_module(name, package=None)
Import a module. The name argument specifies what module to import in absolute or relative terms (e.g. either pkg.mod or ..mod). If the name is specified in relative terms, then the package argument must be set to the name of the package which is to act as the anchor for resolving the package name (e.g. import_module('..mod', 'pkg.subpkg') will import pkg.mod).
e.g.
my_module = importlib.import_module('os.path')
Note: imp is deprecated since Python 3.4 in favor of importlib
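Applied to the command-dispatch layout in the question, a minimal importlib-based sketch might look like the following (it assumes, as in the original code, that each command module defines a run() function):
import importlib
import sys

def run_command(command):
    # Build the dotted module name from the command-line argument.
    module_name = "myapp.commands.%s" % command
    try:
        command_module = importlib.import_module(module_name)
    except ImportError:
        print("Unknown command: %s" % command)
        return
    # Each command module is assumed to expose a run() entry point.
    command_module.run()

if __name__ == "__main__":
    run_command(sys.argv[1])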
As mentioned, the imp module provides loading functions:
imp.load_source(name, path)
imp.load_compiled(name, path)
I've used these before to perform something similar.
In my case I defined a specific class with defined methods that were required.
Once I loaded the module I would check if the class was in the module, and then create an instance of that class, something like this:
import imp
import os

def load_from_file(filepath):
    class_inst = None
    py_mod = None
    expected_class = 'MyClass'
    mod_name, file_ext = os.path.splitext(os.path.split(filepath)[-1])
    if file_ext.lower() == '.py':
        py_mod = imp.load_source(mod_name, filepath)
    elif file_ext.lower() == '.pyc':
        py_mod = imp.load_compiled(mod_name, filepath)
    if py_mod is not None and hasattr(py_mod, expected_class):
        class_inst = getattr(py_mod, expected_class)()
    return class_inst
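Usage would then be along these lines (the file path is a placeholder, and MyClass comes from the snippet above):
instance = load_from_file('/path/to/plugin.py')
if instance is not None:
    print(type(instance).__name__)  # e.g. 'MyClass'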
Using importlib
Importing a source file
Here is a slightly adapted example from the documentation:
import sys
import importlib.util
file_path = 'pluginX.py'
module_name = 'pluginX'
spec = importlib.util.spec_from_file_location(module_name, file_path)
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)
# Verify contents of the module:
print(dir(module))
From here, module will be a module object representing the pluginX module (the same thing that would be assigned to pluginX by doing import pluginX). Thus, to call e.g. a hello function (with no parameters) defined in pluginX, use module.hello().
To get the effect of "importing" functionality from the module instead, store it in the in-memory cache of loaded modules, and then do the corresponding from import:
sys.modules[module_name] = module
from pluginX import hello
hello()
Importing a package
To import a package instead, calling import_module is sufficient. Suppose there is a package folder pluginX in the current working directory; then just do
import importlib
pkg = importlib.import_module('pluginX')
# check if it's all there..
print(dir(pkg))
Use the imp module, or the more direct __import__() function.
You can use exec:
exec("import myapp.commands.%s" % command)
If you want it in your locals:
>>> mod = 'sys'
>>> locals()['my_module'] = __import__(mod)
>>> my_module.version
'2.6.6 (r266:84297, Aug 24 2010, 18:46:32) [MSC v.1500 32 bit (Intel)]'
The same would work with globals().
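For example, a quick interactive sketch of the globals() variant (the getcwd() output is of course machine-dependent):
>>> mod = 'os'
>>> globals()['my_module'] = __import__(mod)
>>> my_module.getcwd()
'/home/user'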
Similar to @monkut's solution, but reusable and error-tolerant, as described here: http://stamat.wordpress.com/dynamic-module-import-in-python/
import os
import imp

def importFromURI(uri, absl):
    mod = None
    if not absl:
        uri = os.path.normpath(os.path.join(os.path.dirname(__file__), uri))
    path, fname = os.path.split(uri)
    mname, ext = os.path.splitext(fname)

    if os.path.exists(os.path.join(path, mname) + '.pyc'):
        try:
            return imp.load_compiled(mname, uri)
        except:
            pass
    if os.path.exists(os.path.join(path, mname) + '.py'):
        try:
            return imp.load_source(mname, uri)
        except:
            pass

    return mod
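A usage sketch (the file names here are hypothetical): pass a path relative to the calling file with absl=False, or an absolute path with absl=True:
# Resolved relative to the importing file's directory:
plugin = importFromURI('plugins/extra.py', absl=False)

# Or an absolute path:
plugin = importFromURI('/opt/myapp/plugins/extra.py', absl=True)

if plugin is not None:
    print(dir(plugin))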
The below piece worked for me:
>>> import imp
>>> fp, pathname, description = imp.find_module("/home/test_module")
>>> test_module = imp.load_module("test_module", fp, pathname, description)
>>> print test_module.print_hello()
If you want to run the import from a shell script:
python -c '<above entire code in one line>'
The following worked for me:
import sys, glob

sys.path.append('/home/marc/python/importtest/modus')
fl = glob.glob('modus/*.py')
modulist = []
adapters = []

for i in range(len(fl)):
    fl[i] = fl[i].split('/')[1]
    fl[i] = fl[i][0:(len(fl[i]) - 3)]
    modulist.append(getattr(__import__(fl[i]), fl[i]))
    adapters.append(modulist[i]())
It loads modules from the folder 'modus'. The modules have a single class with the same name as the module name. E.g. the file modus/modu1.py contains:
class modu1():
    def __init__(self):
        self.x = 1
        print self.x
The result is a list of dynamically loaded classes "adapters".
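For comparison, a similar idea written with importlib and pathlib (a sketch; it assumes the same convention of one class per module, named like the module, and that 'modus' has been added to sys.path as above):
import importlib
from pathlib import Path

adapters = []
for py_file in sorted(Path('modus').glob('*.py')):
    name = py_file.stem                      # e.g. 'modu1'
    mod = importlib.import_module(name)      # 'modus' is on sys.path, so import by bare name
    adapters.append(getattr(mod, name)())    # instantiate the class named after the module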

Why do *import ...* and *from x import y* idioms behave so differently here?

I have a package structure like this:
- src
- src/main.py
- src/package1
- src/package1/__init__.py
- src/package1/module1.py
- src/package1/module2.py
... where module2 defines a subclass of a class in module1, and therefore module1 gets referenced by an absolute import path in module2.py.
That is, in src/package1/module2.py:
from package1.module1 import SomeClassFromModule1
The problem occurs in the main.py script:
## here the imports

def main():
    # create an instance of the child class in Module2
    ...

if __name__ == "__main__":
    main()
Option 1 works. That is, in src/main.py:
from package1.module2 import SomeClassFromModule2
some_name = SomeClassFromModule2()
Option 2 does not work. That is, in src/main.py:
import package1.module2.SomeClassFromModule2
some_name = package1.module2.SomeClassFromModule2()
... causes the following error.
ModuleNotFoundError: No module named 'package1.module2.SomeClassFromModule2'; 'package1.module2' is not a package
So why is there this difference between the import and from ... import idiom?
Would be glad for some clarification.
The import x form brings the whole module x into the file where it is called; everything it defines is then accessed as an attribute (x.y).
The from x import y form brings one specific name ('y' is a method or class) from that .py file ('x' is the file here) into the current namespace, instead of binding the whole module.
In your case, when you import package1.module2, SomeClassFromModule2 has already been imported as an attribute of that module, so you do not need (and cannot) write import package1.module2.SomeClassFromModule2.
Here I guess you want to access a class, so you need to create an object in order to use it.
Hope this helped you.
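In other words, the two working spellings for the class from the question would look like this (a short sketch):
# Option A: bind the whole module, then access the class as an attribute
import package1.module2
some_name = package1.module2.SomeClassFromModule2()

# Option B: bind just the class name into the current namespace
from package1.module2 import SomeClassFromModule2
some_name = SomeClassFromModule2()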
After some testing, I think you cannot import a function or class by using import your_module.your_class.
It's all about package, module, function and class:
# import a module
>>> import os
<module 'os' from ooxx>

# use a module of a module (a little weird)
>>> os.path
<module 'posixpath' from ooxx>

# import a module of a module (a little weird)
>>> import os.path

# use a function
>>> os.path.dirname
<function posixpath.dirname(p)>

# you cannot import a function (or class) by using 'import your.module.func'
# 'import ooxx' always gets a module or package
>>> import os.path.dirname
ModuleNotFoundError
No module named 'os.path.dirname'; 'os.path' is not a package

# instead, use 'from your_module import your_function_or_class'
>>> from os.path import dirname
<function posixpath.dirname(p)>

python create and register module via compile function [duplicate]

I have some code in the form of a string and would like to make a module out of it without writing to disk.
When I try using imp and a StringIO object to do this, I get:
>>> imp.load_source('my_module', '', StringIO('print "hello world"'))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: load_source() argument 3 must be file, not instance
>>> imp.load_module('my_module', StringIO('print "hello world"'), '', ('', '', 0))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ValueError: load_module arg#2 should be a file or None
How can I create the module without having an actual file? Alternatively, how can I wrap a StringIO in a file without writing to disk?
UPDATE:
NOTE: This issue is also a problem in Python 3.
The code I'm trying to load is only partially trusted. I've gone through it with ast and determined that it doesn't import anything or do anything I don't like, but I don't trust it enough to run it when I have local variables running around that could get modified, and I don't trust my own code to stay out of the way of the code I'm trying to import.
I created an empty module that only contains the following:
def load(code):
    # Delete all local variables
    globals()['code'] = code
    del locals()['code']

    # Run the code
    exec(globals()['code'])

    # Delete any global variables we've added
    del globals()['load']
    del globals()['code']

    # Copy k so we can use it
    if 'k' in locals():
        globals()['k'] = locals()['k']
        del locals()['k']

    # Copy the rest of the variables
    for k in locals().keys():
        globals()[k] = locals()[k]
Then you can import mymodule and call mymodule.load(code). This works for me because I've ensured that the code I'm loading does not use globals. Also, the global keyword is only a parser directive and can't refer to anything outside of the exec.
This really is way too much work to import the module without writing to disk, but if you ever want to do this, I believe it's the best way.
Here is how to import a string as a module (Python 2.x):
import sys,imp
my_code = 'a = 5'
mymodule = imp.new_module('mymodule')
exec my_code in mymodule.__dict__
In Python 3, exec is a function, so this should work:
import sys,imp
my_code = 'a = 5'
mymodule = imp.new_module('mymodule')
exec(my_code, mymodule.__dict__)
Now access the module attributes (and functions, classes etc) as:
>>> print(mymodule.a)
5
To make any later import of the same name pick up this module, add it to sys.modules:
sys.modules['mymodule'] = mymodule
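After that registration, an ordinary import statement will simply return the object already cached in sys.modules:
import mymodule       # found in sys.modules, no file needed
print(mymodule.a)     # -> 5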
imp.new_module is deprecated since Python 3.4, but it still works as of Python 3.9.
imp.new_module was replaced with importlib.util.module_from_spec.
importlib.util.module_from_spec
is preferred over using types.ModuleType to create a new module as
spec is used to set as many import-controlled attributes on the module
as possible.
importlib.util.spec_from_loader
uses available loader APIs, such as InspectLoader.is_package(), to
fill in any missing information on the spec.
These module attributes are __builtins__, __doc__, __loader__, __name__, __package__ and __spec__.
import sys, importlib.util

def import_module_from_string(name: str, source: str):
    """
    Import module from source string.
    Example use:
    import_module_from_string("m", "f = lambda: print('hello')")
    m.f()
    """
    spec = importlib.util.spec_from_loader(name, loader=None)
    module = importlib.util.module_from_spec(spec)
    exec(source, module.__dict__)
    sys.modules[name] = module
    globals()[name] = module

# demo
# note: "if True:" allows to indent the source string
import_module_from_string('hello_module', '''if True:
    def hello():
        print('hello')
''')
hello_module.hello()
You could simply create a Module object and stuff it into sys.modules and put your code inside.
Something like:
import sys
from types import ModuleType

mod = ModuleType('mymodule')
sys.modules['mymodule'] = mod
exec(mycode, mod.__dict__)  # mycode is the source string you want to load
If the code for the module is in a string, you can forgo using StringIO and use it directly with exec, as illustrated below with a file named dynmodule.py.
Works in Python 2 & 3.
from __future__ import print_function

class _DynamicModule(object):
    def load(self, code):
        execdict = {'__builtins__': None}  # optional, to increase safety
        exec(code, execdict)
        keys = execdict.get(
            '__all__',  # use __all__ attribute if defined
            # else all non-private attributes
            (key for key in execdict if not key.startswith('_')))
        for key in keys:
            setattr(self, key, execdict[key])

# replace this module object in sys.modules with empty _DynamicModule instance
# see Stack Overflow question:
# https://stackoverflow.com/questions/5365562/why-is-the-value-of-name-changing-after-assignment-to-sys-modules-name
import sys as _sys
_ref, _sys.modules[__name__] = _sys.modules[__name__], _DynamicModule()

if __name__ == '__main__':
    import dynmodule  # name of this module
    import textwrap   # for more readable code formatting in sample string

    # string to be loaded can come from anywhere or be generated on-the-fly
    module_code = textwrap.dedent("""\
        foo, bar, baz = 5, 8, 2

        def func():
            return foo*bar + baz

        __all__ = 'foo', 'bar', 'func'  # 'baz' not included
        """)

    dynmodule.load(module_code)  # defines module's contents

    print('dynmodule.foo:', dynmodule.foo)
    try:
        print('dynmodule.baz:', dynmodule.baz)
    except AttributeError:
        print('no dynmodule.baz attribute was defined')
    else:
        print('Error: there should be no dynmodule.baz module attribute')
    print('dynmodule.func() returned:', dynmodule.func())
Output:
dynmodule.foo: 5
no dynmodule.baz attribute was defined
dynmodule.func() returned: 42
Setting the '__builtins__' entry to None in the execdict dictionary prevents the code from directly executing any built-in functions, like __import__, and so makes running it safer. You can ease that restriction by selectively adding things to it you feel are OK and/or required.
It's also possible to add your own predefined utilities and attributes which you'd like made available to the code thereby creating a custom execution context for it to run in. That sort of thing can be useful for implementing a "plug-in" or other user-extensible architecture.
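For example (a hypothetical sketch, with made-up helper names), the execdict passed to exec can be pre-populated with exactly the names you want the loaded code to see:
def load_with_context(code, target):
    # Only the names listed here are visible to the loaded code.
    execdict = {
        '__builtins__': None,   # still block direct access to built-ins
        'log': print,           # a utility function the plug-in code may call
        'APP_VERSION': '1.0',   # a predefined attribute
    }
    exec(code, execdict)
    # Copy the public names (including the predefined ones) onto the target object, as above.
    for key in (k for k in execdict if not k.startswith('_')):
        setattr(target, key, execdict[key])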
You could use exec or eval to execute Python code given as a string.
The documentation for imp.load_source says (my emphasis):
The file argument is the source file, open for reading as text, from the beginning. It must currently be a real file object, not a user-defined class emulating a file.
... so you may be out of luck with this method, I'm afraid.
Perhaps eval would be enough for you in this case?
This sounds like a rather surprising requirement, though - it might help if you add some more to your question about the problem you're really trying to solve.


