I am designing a module, say mymodule.py, and I write its code as follows:
def charCount(my_string, my_char):
    a = my_string.count(my_char)
    return a

def aCount(my_string):
    a = charCount(my_string, 'a')
    return a
Inside the Python shell, I use the following command:
import mymodule as mm
and then
mString = 'ghghghghgaaaaa'
and then
a = mm.aCount(mString)
An error is raised; apparently, the function is not able to call the other function defined in the same module. How can this be avoided?
You need to put a return statement in both functions, and it will work fine.
Try this:
def charCount(my_string, my_char):
    a = my_string.count(my_char)
    return a

def aCount(my_string):
    a = charCount(my_string, 'a')
    return a
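For reference, a minimal interactive session with that module (assuming the file is saved as mymodule.py somewhere on the import path), using the same names as in the question:
>>> import mymodule as mm
>>> mString = 'ghghghghgaaaaa'
>>> mm.aCount(mString)
5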
I'm trying to create a process that dynamically watches Jupyter notebooks, compiles them on modification, and imports them into my current file; however, I can't seem to execute the updated code. It only executes the first version that was loaded.
There's a file called producer.py that calls this function repeatedly:
import fs.fs_util as fs_util

while True:
    fs_util.update_feature_list()
In fs_util.py I do the following:
from fs.feature import Feature
import inspect
from importlib import reload
import os

def is_subclass_of_feature(o):
    return inspect.isclass(o) and issubclass(o, Feature) and o is not Feature

def get_instances_of_features(name):
    module = __import__(COMPILED_MODULE, fromlist=[name])
    module = reload(module)
    feature_members = getattr(module, name)
    all_features = inspect.getmembers(feature_members, predicate=is_subclass_of_feature)
    return [f[1]() for f in all_features]
This function is called by:
def update_feature_list(name):
    os.system("jupyter nbconvert --to script {}{} --output {}{}"
              .format(PATH + "/" + s3.OUTPUT_PATH, name + JUPYTER_EXTENSION,
                      PATH + "/" + COMPILED_PATH, name))
    features = get_instances_of_features(name)
    for f in features:
        try:
            feature = f.create_feature()
        except Exception as e:
            print(e)
There is other, irrelevant code that checks whether a file has been modified, etc.
I can tell the file is being reloaded correctly, because when I use inspect.getsource(f.create_feature) on the class it displays the updated source code; however, during execution it returns the old values. I've verified this by changing print statements as well as by comparing the return values.
Also, for some more context, this is the file I'm trying to import:
from fs.feature import Feature

class SubFeature(Feature):
    def __init__(self):
        Feature.__init__(self)

    def create_feature(self):
        return "hello"
I was wondering what I was doing incorrectly.
So I found out what I was doing wrong.
When calling reload, I was reloading the module I had newly imported, which was fairly idiotic, I suppose. The correct solution (in my case) was to reload the module from sys.modules, so it would be something like reload(sys.modules[COMPILED_MODULE + "." + name]).
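For context, a minimal sketch of how get_instances_of_features from the question might look with that change, assuming COMPILED_MODULE names a package and name a submodule inside it (as in the question), and reusing is_subclass_of_feature from above:
import sys
import inspect
from importlib import reload

def get_instances_of_features(name):
    # make sure the package and submodule have been imported at least once
    __import__(COMPILED_MODULE, fromlist=[name])
    # reload the cached submodule object from sys.modules, not the freshly imported parent package
    module = reload(sys.modules[COMPILED_MODULE + "." + name])
    all_features = inspect.getmembers(module, predicate=is_subclass_of_feature)
    return [f[1]() for f in all_features]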
I have this code in a Python file:
from dec import my_decorator
import asyncio

@my_decorator
async def simple_method(bar):  # , x, plc_name, var_name):
    print("Henlo from simple_method\npartent:{}".format(parent))
    return

@my_decorator
async def other_simple_meth(bar, value):
    print("Henlo from other_simple_meth:\t Val:{}".format(value))
    return

async def main():
    print("Start Module-Export")
    open('module_functions.py', 'a').close()
    # Write all decorated functions to module_functions.py
    print("Functions in module_functions.py exported")
    while True:
        await asyncio.sleep(2)
        print("z...z...Z...")
My goal is to write all decorated functions (incl. the import dependencies) into a second module file (here "module_functions.py"). My module_functions.py file should look like this:
from dec import my_decorator
import asyncio

@my_decorator
async def simple_method(bar):  # , x, plc_name, var_name):
    print("Henlo from simple_method\npartent:{}".format(parent))
    return

@my_decorator
async def other_simple_meth(bar, value):
    print("Henlo from other_simple_meth:\t Val:{}".format(value))
    return
I know how to get references and names of a function, but not how to "copy/paste" the function code (incl. decorator and all dependencies) into a separate file. Is this even possible?
EDIT: I know that pickle and dill exist, but they may not fulfill the goal. The problem is that someone else may not know the order of the dumped functions, and loading them back may/will cause problems. It also does not seem possible to edit such loaded functions again.
I found a (not ideal, but OK) solution to my problems.
I) Find and write functions, coroutines etc. into a file (works):
As @MisterMiyagi suspected, the inspect module is a good way to go. For the common stuff, it is possible to get the code with inspect.getsource() and write it into a file:
# List of wanted stuff
func_list = [simple_method, meth_with_input, meth_with_input_and_output, func_myself]

with open('module_functions.py', 'a') as module_file:
    for func in func_list:
        try:
            module_file.write(inspect.getsource(func))
            module_file.write("\n")
        except:
            print("Error :( ")
II) But what about decorated stuff (seems to work)?
I) will not work for decorated stuff; it is just ignored without an exception being thrown. What helps is from functools import wraps.
In many examples the @wraps decorator is added inside the decorator itself. This was not possible for me, but there is a good workaround:
@wraps(lambda: simple_method)  # <--- add wraps-decorator here
@my_decorator
async def simple_method(parent):  # , x, plc_name, var_name):
    print("Henlo from simple_method\npartent:{}".format(parent))
    return
wraps can be placed above the original decorated method/class/function, and it seems to behave like I want. Now we can add simple_method into the func_list of I).
III) What about the imports?
Well, it seems to be quite tricky/impossible to actually read the dependencies of a function. My workaround is to drop all wanted imports into a class (sigh). This class can be thrown into the func_list of I) and is written into the file.
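A rough sketch of what I mean; the class name and the imports here are only placeholders, not from the actual project:
class NeededImports:
    # Import statements live in the class body, so that
    # inspect.getsource(NeededImports) reproduces them in module_functions.py.
    # Note: in the generated file they end up as class attributes rather than
    # module-level names, which is why this is only a workaround.
    import asyncio
    from functools import wraps

func_list = [NeededImports, simple_method]  # written out by the loop from I)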
EDIT:
There is a cleaner way, which may work, after some modification, with I) and II) as well. The magic module is ast.
I have overridden the following:
class ImportVisitor(ast.NodeVisitor):
    def __init__(self, target):
        super().__init__()
        self.file_target = target
        "pick these special nodes via overwriting: visit_classname." \
        "classnames are listed in https://docs.python.org/3.6/library/ast.html#abstract-grammar"

    def visit_Import(self, node):
        "Overwrite func!"
        "Write all plain import statements (like - import ast) into file_target"
        stmt = 'import ' + ', '.join(alias.name for alias in node.names)
        self.file_target.write(stmt + "\n")

    def visit_ImportFrom(self, node):
        "Overwrite func!"
        "Write all from ... import statements (like - from os.path import basename) into file_target"
        stmt = 'from ' + node.module + ' import ' + ', '.join(alias.name for alias in node.names)
        self.file_target.write(stmt + "\n")
Now I can parse my own script and fill module_file with the import and from ... import statements it finds while visiting all nodes in the tree:
with open('module_functions.py', 'a') as module_file:
    with open(basename(__file__), "rb") as f:
        tree = ast.parse(f.read(), basename(__file__))
        visitor = ImportVisitor(module_file)
        visitor.visit(tree)
    module_file.write("\n\n")
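For illustration, run against the example file from the question above, the visitor would append lines of this shape to module_functions.py (assuming those are the only imports in the parsed script):
from dec import my_decorator
import asyncio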
I want to replace some built-in functions inside code that I run with exec. This is possible by passing the replacement as a dictionary entry in the second exec argument. But when I try to import a module inside the executed code, the functions called inside the imported module are still the original built-ins.
This is the example of what I'm trying to achieve:
from inspect import cleandoc

def new_print(val):
    print('Hello', val)

code_inner = cleandoc("""
    def bar():
        print('Inner')
""")

with open('inner.py', 'w') as f:
    f.write(code_inner)

code_outer = cleandoc("""
    import inner
    print('Outer')
    inner.bar()
""")

exec(code_outer, {'print': new_print}, {})
This is the output that I receive:
Hello Outer
Inner
And this is what I would like to have:
Hello Outer
Hello Inner
Is there any way to pass new globals, or builtins, or maybe a list of variables, to the module being imported?
I'm not sure if it's quite what you want, but passing a dictionary parameter to the function and updating its module's globals works:
code_inner = cleandoc("""
    def bar(d):
        globals().update(d)
        print('Inner')
""")

code_outer = cleandoc("""
    import inner
    print('Outer')
    inner.bar({'print': print})
""")
Alternatively, without modifying the bar function, you can pass its module a global like so:
code_outer = cleandoc("""
    import inner
    inner.print = print
    print('Outer')
    inner.bar()
""")
I need help with the following situation. I want to implement a debug mode in my script by printing a small completion report in functions, with the name of the executed command and the elapsed time, like:
def cmd_exec(cmd):
    if isDebug:
        command_start = datetime.datetime.now()
        print command_start
        print cmd
    ...
    ... executing commands
    ...
    if isDebug:
        print datetime.datetime.now() - command_start
    return

def main():
    ...
    if args.debug:
        isDebug = True
    ...
    cmd_exec(cmd1)
    ...
    cmd_exec(cmd2)
    ...
How can the isDebug variable be passed to the functions in a simple way?
Should I use "global isDebug"?
Because
...
cmd_exec(cmd1, isDebug)
...
cmd_exec(cmd2, isDebug)
...
looks pretty bad. Please help me find a more elegant way.
isDebug is state that applies to every application of the function cmd_exec. That sounds like a use case for a class to me.
class CommandExecutor(object):
    def __init__(self, debug):
        self.debug = debug

    def execute(self, cmd):
        if self.debug:
            command_start = datetime.datetime.now()
            print command_start
            print cmd
        ...
        ... executing commands
        ...
        if self.debug:
            print datetime.datetime.now() - command_start

def main(args):
    ce = CommandExecutor(args.debug)
    ce.execute(cmd1)
    ce.execute(cmd2)
Python has a built-in __debug__ variable that could be useful.
if __debug__:
    print 'information...'
When you run your program as python test.py, __debug__ is True. If you run it as python -O test.py, it will be False.
Another option, which I use in my projects, is to set a global DEBUG var at the beginning of the file, after the imports:
DEBUG = True
You can then reference this DEBUG var in the scope of the function.
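A minimal sketch of that pattern, reusing the cmd_exec name from the question (the body here is only illustrative):
import datetime

DEBUG = True  # module-level switch, set once right after the imports

def cmd_exec(cmd):
    if DEBUG:
        command_start = datetime.datetime.now()
        print(cmd)
    # ... execute the command ...
    if DEBUG:
        print(datetime.datetime.now() - command_start)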
You can use a module to create variables that are shared. This is better than a global because it only affects code that is specifically looking for the variable; it doesn't pollute the global namespace. It also lets you define something without your main module needing to know about it.
This works because modules are shared objects in Python. Every import gets back a reference to the same object, and modifications to the contents of that module get shared immediately, just like a global would.
my_debug.py:
isDebug = False
main.py:
import my_debug

def cmd_exec(cmd):
    if my_debug.isDebug:
        # ...

def main():
    # ...
    if args.debug:
        my_debug.isDebug = True
Specifically for this, I would use partials/currying, basically pre-filling a variable.
import sys
from functools import partial
import datetime

def _cmd_exec(cmd, isDebug=False):
    if isDebug:
        command_start = datetime.datetime.now()
        print command_start
        print cmd
    else:
        print 'isDebug is false' + cmd
    if isDebug:
        print datetime.datetime.now() - command_start
    return

# default, keeping it as is...
cmd_exec = _cmd_exec

# switch to debug
def debug_on():
    global cmd_exec
    # pre-apply the isDebug optional param
    cmd_exec = partial(_cmd_exec, isDebug=True)

def main():
    if "-d" in sys.argv:
        debug_on()
    cmd_exec("cmd1")
    cmd_exec("cmd2")

main()
In this case, I check for -d on the command line to turn on debug mode, and I pre-populate isDebug on the function call by creating a new function with isDebug=True.
I think even other modules will see this modified cmd_exec, because I replaced the function at the module level.
output:
jluc@explore$ py test_so64.py
isDebug is falsecmd1
isDebug is falsecmd2
jluc@explore$ py test_so64.py -d
2016-10-13 17:00:33.523016
cmd1
0:00:00.000682
2016-10-13 17:00:33.523715
cmd2
0:00:00.000009
I decided to try preprocessing a function's text before its compilation into byte code and subsequent execution. This is merely for training; I can hardly imagine a situation where it would be a satisfactory solution. I did face one problem that I wanted to solve this way, but eventually a better way was found. So this is just for training and to learn something new, not for real usage.
Assume we have a function whose source code we want to modify a bit before compilation:
def f():
    1;a()
    print('Some statements 1')
    1;a()
    print('Some statements 2')
Let us, for example, mark some of its lines with 1;, so that they can sometimes be commented out and sometimes not. I take this just as an example; the modifications to the function may be different.
To comment out these lines I made a decorator. The whole code is below:
from __future__ import print_function

def a():
    print('a()')

def comment_1(s):
    lines = s.split('\n')
    return '\n'.join(line.replace(';', '#;', 1) if line.strip().startswith('1;') else line for line in lines)

def remove_1(f):
    import inspect
    source = inspect.getsource(f)
    new_source = comment_1(source)
    with open('temp.py', 'w') as file:
        file.write(new_source)
    from temp import f as f_new
    return f_new

def f():
    1;a()
    print('Some statements 1')
    1;a()
    print('Some statements 2')

f = remove_1(f)  # If the decorator @remove_1 is used above f, inspect.getsource includes it in the code.
f()
I used inspect.getsource to retrieve the code of f. Then I did some text processing (in this case, commenting out the lines starting with 1;). After that I saved it to the module temp.py, which is then imported. And then the function f is rebound in the main module.
The output when the decorator is applied is this:
Some statements 1
Some statements 2
and when NOT applied, it is this:
a()
Some statements 1
a()
Some statements 2
What I don't like is that I have to use the hard drive to load the compiled function. Can it be done without writing it to the temporary module temp.py and importing from it?
The second question is about placing the decorator above f: @remove_1. When I do this, inspect.getsource returns f's text with this decorator. It could be deleted manually from f's text, but that would be quite dangerous, as there may be more than one decorator applied. So I resorted to the old-style decoration syntax f = remove_1(f), which does the job. But still, is it possible to allow the normal decoration technique with @remove_1?
One can avoid creating a temporary file by invoking the exec statement on the source. (You can also explicitly call compile prior to exec if you want additional control over compilation, but exec will do the compilation for you, so it's not necessary.) Correctly calling exec has the additional benefit that the function will work correctly if it accesses global variables from the namespace of its module.
The problem described in the second question can be resolved by temporarily blocking the decorator while it is running. That way the decorator remains, along with all the other ones, but is a no-op.
Here is the updated source:
from __future__ import print_function
import sys

def a():
    print('a()')

def comment_1(s):
    lines = s.split('\n')
    return '\n'.join(line.replace(';', '#;', 1) if line.strip().startswith('1;') else line for line in lines)

_blocked = False

def remove_1(f):
    global _blocked
    if _blocked:
        return f
    import inspect
    source = inspect.getsource(f)
    new_source = comment_1(source)
    env = sys.modules[f.__module__].__dict__
    _blocked = True
    try:
        exec new_source in env
    finally:
        _blocked = False
    return env[f.__name__]

@remove_1
def f():
    1;a()
    print('Some statements 1')
    1;a()
    print('Some statements 2')

f()
A variant that avoids mutating the module's namespace is to exec the modified source into a copy of it:
def remove_1(f):
    import inspect
    source = inspect.getsource(f)
    new_source = comment_1(source)
    env = sys.modules[f.__module__].__dict__.copy()
    exec new_source in env
    return env[f.__name__]
I'll leave here a modified version of the solution given in the answer by user4815162342. It uses the ast module to delete some parts of f, as was suggested in a comment on the question. To make it, I relied largely on the information in this article.
This implementation deletes all standalone expressions that call a.
from __future__ import print_function
import sys
import ast
import inspect

def a():
    print('a() is called')

_blocked = False

def remove_1(f):
    global _blocked
    if _blocked:
        return f
    source = inspect.getsource(f)
    tree = ast.parse(source)  # get the AST of f

    class Transformer(ast.NodeTransformer):
        '''Will delete all expressions containing 'a' functions at the top level'''
        def visit_Expr(self, node):  # visit all expression statements
            try:
                if node.value.func.id == 'a':  # if the expression is a call to a function named a
                    return None  # delete it
            except AttributeError:  # not a plain function call, leave it unchanged
                pass
            return node  # return the node unchanged

    transformer = Transformer()
    new_tree = transformer.visit(tree)
    f_new_compiled = compile(new_tree, '<string>', 'exec')
    env = sys.modules[f.__module__].__dict__
    _blocked = True
    try:
        exec(f_new_compiled, env)
    finally:
        _blocked = False
    return env[f.__name__]

@remove_1
def f():
    a();a()
    print('Some statements 1')
    a()
    print('Some statements 2')

f()
The output is:
Some statements 1
Some statements 2