I've got a function here for running an external Python script in another process. m is the multiprocessing module:
def run(app, WindowOffX, WindowOffY, WindowWidth, WindowHeight):
    try:
        exec("import Programs." + app + ".main as Foo")
        Foo.init()
        p = m.Process(target=Foo.main(WindowOffX, WindowOffY, WindowWidth, WindowHeight))
    except ImportError:
        print("That app doesn't exist!!! O.O")
But this generates NameError: global name 'Foo' is not defined. Can someone help?
Ha, the problem is that python doesn't know your exec statement defines Foo, so it tries to look it up as a global. To clue it in, try this:
try:
    Foo = None
    exec("import Programs." + app + ".main as Foo")
    Foo.init()
Incidentally, here's how you can do what you're after without using exec:
Foo = __import__("Programs."+app+".main")
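An aside not in the original answers: with a dotted name like this, bare __import__ returns the top-level Programs package rather than the main submodule, so importlib.import_module is usually more convenient. Here is a minimal sketch, assuming the Programs/<app>/main.py layout from the question; it also passes the target and its arguments to Process separately so main actually runs in the child process:

import importlib
import multiprocessing as m   # the question aliases multiprocessing as m

def run(app, WindowOffX, WindowOffY, WindowWidth, WindowHeight):
    try:
        # import_module returns the named submodule itself, unlike bare __import__
        Foo = importlib.import_module("Programs." + app + ".main")
    except ImportError:
        print("That app doesn't exist!!! O.O")
        return
    Foo.init()
    # Hand Process the callable and its args instead of calling Foo.main() here
    p = m.Process(target=Foo.main,
                  args=(WindowOffX, WindowOffY, WindowWidth, WindowHeight))
    p.start()
    return p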
In a.py)

import multiprocessing
from functools import partial

def pool_init():
    global aa
    aa = "I'm in a.py"

def pool_func(chunk, job_func):
    return job_func(chunk)

def run_pool(mp_iterable, job_func, pool_func=pool_func):
    rst = []
    pool_func = partial(pool_func, job_func=job_func)
    with multiprocessing.Pool(4, initializer=pool_init) as p:
        for ir in p.imap_unordered(pool_func, mp_iterable):
            rst.append(ir)
    return rst
In main.py)

import a

def job_func(chunk):
    print(aa)

a.run_pool(range(5), job_func=job_func)

This raised (on CentOS, in Jupyter):
NameError: name 'aa' is not defined
Questions)

Why did this error occur?
How could I fix this error?
Edit)

Recently I have needed the multiprocessing module more often, so I wanted to write some functions that run multiprocessing.Pool from my own custom module, to avoid writing redundant code.
I want to run multiprocessing.Pool just like:

run_pool(iterable, job_func)
# Actual processing happens in job_func; both arguments are defined in main.py
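A hedged note, not from the original thread: pool_init sets aa as a global of a.py inside each worker process, while job_func is defined in main.py and therefore looks aa up in main.py's globals, which is one way to arrive at exactly this NameError. For the reusable-wrapper goal, a minimal sketch of the pattern (illustrative names, assuming job_func is defined at module level so it can be pickled) might look like:

# a.py -- hypothetical reusable Pool wrapper
import multiprocessing
from functools import partial

def _pool_func(chunk, job_func):
    # Runs inside a worker; simply applies the caller-supplied job_func.
    return job_func(chunk)

def run_pool(mp_iterable, job_func, processes=4):
    # Bind job_func so the worker callable takes one argument and stays picklable.
    worker = partial(_pool_func, job_func=job_func)
    rst = []
    with multiprocessing.Pool(processes) as p:
        for ir in p.imap_unordered(worker, mp_iterable):
            rst.append(ir)
    return rst

# main.py -- any name job_func uses (such as aa) must live here, at module level
import a

aa = "I'm in main.py"

def job_func(chunk):
    print(aa, chunk)
    return chunk

if __name__ == '__main__':   # the guard matters on spawn-based platforms
    a.run_pool(range(5), job_func=job_func)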
I need help with the following situation. I want to implement a debug mode in my script by printing a small completion report in each function, with the name of the executed command and the elapsed time, like this:
def cmd_exec(cmd):
    if isDebug:
        command_start = datetime.datetime.now()
        print command_start
        print cmd
    ...
    ... executing commands
    ...
    if isDebug:
        print datetime.datetime.now() - command_start
    return
def main():
    ...
    if args.debug:
        isDebug = True
    ...
    cmd_exec(cmd1)
    ...
    cmd_exec(cmd2)
    ...
How can the isDebug variable be simply passed to these functions?
Should I use "global isDebug"?
Because
...
cmd_exec(cmd1, isDebug)
...
cmd_exec(cmd2, isDebug)
...
looks pretty bad. Please help me find a more elegant way.
isDebug is state that applies to every application of the function cmd_exec. That sounds like a use case for a class to me.
class CommandExecutor(object):
    def __init__(self, debug):
        self.debug = debug

    def execute(self, cmd):
        if self.debug:
            command_start = datetime.datetime.now()
            print command_start
            print cmd
        ...
        ... executing commands
        ...
        if self.debug:
            print datetime.datetime.now() - command_start

def main(args):
    ce = CommandExecutor(args.debug)
    ce.execute(cmd1)
    ce.execute(cmd2)
Python has a built-in __debug__ variable that could be useful.
if __debug__:
    print 'information...'
When you run your program as python test.py, __debug__ is True. If you run it as python -O test.py, it will be False.
Another option, which I use in my projects, is to set a global DEBUG variable at the beginning of the file, right after the imports:
DEBUG = True
You can then reference this DEBUG var in the scope of the function.
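For illustration, a minimal sketch of that module-level flag (the command execution itself is just a placeholder here):

import datetime

DEBUG = True

def cmd_exec(cmd):
    if DEBUG:
        command_start = datetime.datetime.now()
        print(cmd)
    # ... execute the command ...
    if DEBUG:
        print(datetime.datetime.now() - command_start)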
You can use a module to create variables that are shared. This is better than a global because it only affects code that is specifically looking for the variable, it doesn't pollute the global namespace. It also lets you define something without your main module needing to know about it.
This works because modules are shared objects in Python. Every import gets back a reference to the same object, and modifications to the contents of that module get shared immediately, just like a global would.
my_debug.py:
isDebug = False
main.py:
import my_debug
import my_debug

def cmd_exec(cmd):
    if my_debug.isDebug:
        # ...

def main():
    # ...
    if args.debug:
        my_debug.isDebug = True
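One caveat worth adding (my note, not part of the original answer): the sharing only works if the flag is always read through the module object.

import my_debug
print(my_debug.isDebug)   # always reflects the current value

# By contrast, "from my_debug import isDebug" copies the value at import time,
# so later assignments to my_debug.isDebug would not be visible through that name.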
Specifically for this, I would use partials/currying, basically pre-filling a variable.
import sys
from functools import partial
import datetime

def _cmd_exec(cmd, isDebug=False):
    if isDebug:
        command_start = datetime.datetime.now()
        print command_start
        print cmd
    else:
        print 'isDebug is false' + cmd
    if isDebug:
        print datetime.datetime.now() - command_start
    return

# default, keeping it as is...
cmd_exec = _cmd_exec

# switch to debug
def debug_on():
    global cmd_exec
    # pre-apply the isDebug optional param
    cmd_exec = partial(_cmd_exec, isDebug=True)

def main():
    if "-d" in sys.argv:
        debug_on()
    cmd_exec("cmd1")
    cmd_exec("cmd2")

main()
In this case, I check for -d on the command line to turn on debug mode, and I pre-fill isDebug by creating a new function with isDebug=True.
I think even other modules will see this modified cmd_exec, because I replaced the function at the module level.
output:
jluc#explore$ py test_so64.py
isDebug is falsecmd1
isDebug is falsecmd2
jluc#explore$ py test_so64.py -d
2016-10-13 17:00:33.523016
cmd1
0:00:00.000682
2016-10-13 17:00:33.523715
cmd2
0:00:00.000009
I have the function changecheck(), which is defined in a different module (called check.py). I want to pass changeId as a parameter to this function.
I am calling this function from the file test.py.
I am unable to understand the reason why this parameter is not being passed correctly.
check.py:

returnVal = changecheck(changeInfoItem['changeId'])

In a different module, test.py:

def changecheck(changeId):
    print changeId  # nothing gets printed
This should fix your problem:
In the module test.py:

def changecheck(changeId):
    print changeId

In the module check.py:

import test

returnVal = test.changecheck(changeInfoItem['changeId'])
You have to do this:
import check
returnVal = check.changecheck(changeInfoItem['changeId'])
I've written an IRC bot using Twisted and now I've gotten to the point where I want to be able to dynamically reload functionality.
In my main program, I do from bots.google import GoogleBot and I've looked at how to use reload to reload modules, but I still can't figure out how to do dynamic re-importing of classes.
So, given a Python class, how do I dynamically reload the class definition?
Reload is unreliable and has many corner cases where it may fail. It is suitable for reloading simple, self-contained scripts. If you want to dynamically reload your code without restarting, consider using forkloop instead:
http://opensourcehacker.com/2011/11/08/sauna-reload-the-most-awesomely-named-python-package-ever/
You cannot reload the module using reload(module) when using the from X import Y form. You'd have to do something like reload(sys.modules['module']) in that case.
This might not necessarily be the best way to do what you want, but it works!
import bots.google

class BotClass(irc.IRCClient):
    def __init__(self):
        global plugins
        plugins = [bots.google.GoogleBot()]

    def privmsg(self, user, channel, msg):
        global plugins
        parts = msg.split(' ')
        trigger = parts[0]
        if trigger == '!reload':
            reload(bots.google)
            plugins = [bots.google.GoogleBot()]
            print "Successfully reloaded plugins"
I figured it out, here's the code I use:
def reimport_class(self, cls):
    """
    Reload and reimport class "cls". Return the new definition of the class.
    """
    import re
    from twisted.python import reflect
    # Get the fully qualified name of the class.
    full_path = reflect.qual(cls)
    # Naively parse the module name and class name.
    # Can be done much better...
    match = re.match(r'(.*)\.([^\.]+)', full_path)
    module_name = match.group(1)
    class_name = match.group(2)
    # This is where the good stuff happens.
    mod = __import__(module_name, fromlist=[class_name])
    reload(mod)
    # The (reloaded definition of the) class itself is returned.
    return getattr(mod, class_name)
Better yet, run the plugins in a subprocess and supervise that subprocess; when the files change, reload the plugins process.
Edit: cleaned up.
You can use sys.modules to dynamically reload modules based on user input.
Say that you have a folder with multiple plugins such as:
module/
    cmdtest.py
    urltitle.py
    ...
You can use sys.modules in this way to load/reload modules based on user input:

import sys

if 'module.' + userinput in sys.modules:
    reload(sys.modules['module.' + userinput])
else:
    print 'Module not loaded. Cannot reload.'

try:
    module = __import__("module." + userinput)
    module = sys.modules["module." + userinput]
except ImportError:
    print 'error when trying to load %s' % userinput
When you do a from ... import ..., it binds the object into the local namespace, so all you need to do is re-import it. However, since the module is already loaded, it will just re-import the same version of the class, so you would need to reload the module too. So this should do it:
import bots.google
from bots.google import GoogleBot
...
# do stuff
...
reload(bots.google)
from bots.google import GoogleBot
If for some reason you don't know the module name, you can get it from GoogleBot.__module__.
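For instance, a small sketch of that lookup using the names from this answer (use importlib.reload in Python 3):

import sys

module_name = GoogleBot.__module__                 # e.g. 'bots.google'
reload(sys.modules[module_name])
GoogleBot = getattr(sys.modules[module_name], 'GoogleBot')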
import string
import sys

def reload_class(class_obj):
    module_name = class_obj.__module__
    module = sys.modules[module_name]
    pycfile = module.__file__
    modulepath = string.replace(pycfile, ".pyc", ".py")
    code = open(modulepath, 'rU').read()
    compile(code, module_name, "exec")
    module = reload(module)
    return getattr(module, class_obj.__name__)
There is a lot of error checking you can add to this, and if you're using global variables you will probably have to figure out what happens to them.
I'm very new to Python. Here is the problem I'm having.
I have hooked __builtin__.__import__ with my custom hook, which loads a module from a string.
def import_hook(name, globals=None, locals=None, fromlist=None):
    if name in sys.modules:
        obj = sys.modules[name]
        return obj
    # Make sure we hook only for modules which start with ajay
    if name.startswith("ajay"):
        statement = '''
print 'inside the ajay module'

def ajay_func():
    print 'inside the func'
'''
        mod = imp.new_module(name)
        mod.__file__ = name
        compiled = compile(statement, '<string>', 'exec')
        exec compiled in mod.__dict__
        sys.modules[name] = mod
        return mod
    return original_import(name, globals, locals, fromlist)
Then I use this hook in a function which loads a module and calls one of its functions in an exec statement.
original_import = __builtin__.__import__

def test(request):
    statement = '''
import sys
import ajay

def ILessons_1(request):
    ajay.ajay_func()
'''
    try:
        __builtin__.__import__ = import_hook
        compiled = compile(statement, '<string>', 'exec')
        exec (compiled, globals(), locals())  # in statement_module.__dict__
        ajay.ajay_func()
        return ILessons_1(request);
    finally:
        __builtin__.__import__ = original_import
        pass
When I run this code I get the error "global name 'ajay' is not defined" on the line "return ILessons_1(request);". The interesting thing is that Python is able to resolve ajay on the line just above it. I'm pretty sure I'm making some mistake in the exec statement, but I have not been able to figure it out.
Can someone please help me solve this problem?
Thanks
I noted a few issues here:
1) globals and locals are built-in function names; you should not use them as variable names.
2) Possibly a bug? (Tested with Python 2.7.1 under Ubuntu.)
Consider the following code (note the exec statement):
def outer_function():
    foo = "foobar"
    statement = '''
def inner_function():
    print foo
'''
    # exec statement in globals(), locals()
    exec statement in globals().update(locals())
    inner_function()

outer_function()
Here the commented-out line (exec with two arguments after in) will not work as described in the documentation, and results in:
NameError: global name 'foo' is not defined
But one may manually combine globals and locals and pass them to exec (the line after the comment in my example). Looks like a workaround to me.
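A gentler variant of that workaround (my sketch, not from the original answer) is to build a fresh combined namespace instead of mutating globals():

def outer_function():
    foo = "foobar"
    statement = '''
def inner_function():
    print foo
'''
    namespace = dict(globals())
    namespace.update(locals())        # foo becomes visible to the exec'd code
    exec statement in namespace       # Python 2 exec statement, as above
    namespace['inner_function']()     # fetch the new function from that namespace

outer_function()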