Passing arguments to python eval() - python

I'm building a genetic programming framework and I need to be able to execute strings representing complete Python programs. I'm using Python 2.7. I have a config class in which the primitive sets are defined. Let's say
class Foo():
    def a(self, x):
        return x
    def b(self, y):
        return y
I'm extracting the functions with the Python inspect module and I want to create some executable source code with imports and everything. I end up with a string that looks like this:
import sys
def a(x, y):
    return x
def b(y):
    return y
def main(x, y):
    lambda x, y: a(b(y), a(x, y))
main(*sys.argv)
My problem is that I don't know how to pass command line arguments to the string I'm running with eval(). How can I pass command line arguments to a source file I want to run with eval()?
Edit: There are millions of individuals so writing to a file is not a great option.
Edit: I made a mistake. eval() handles only expressions, not statements, so exec() is the correct approach.

eval("function_name")(arg1, arg2)
or if you have a list of arguments:
arguments = [arg1, arg2, arg3, something]
eval("function_name")(*arguments)

You have three options, roughly speaking. You can keep going with exec(), you could write the string to a file and execute it with subprocess.Popen(), or you could name the function something other than main() and call it after defining it with exec().
exec() way:
In the string you want to exec:
main(#REPLACE_THIS#)
Function that substitutes the arguments and executes the string:
def exec_with_args(exec_string, args):
    # Join the arguments into "arg1,arg2,..." and splice them into the template
    arg_string = ','.join(args)
    exec(exec_string.replace("#REPLACE_THIS#", arg_string))
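A quick usage sketch of that helper; the template string below is just an example, and the arguments are quoted with repr() so they are spliced in as string literals:
import sys

template = (
    "def main(x, y):\n"
    "    print(int(x) + int(y))\n"
    "main(#REPLACE_THIS#)\n"
)

quoted = [repr(a) for a in sys.argv[1:]]   # e.g. ["'2'", "'3'"]
exec_with_args(template, quoted)           # runs main('2', '3')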
Subprocess way:
import subprocess

# Write the string to a file
exec_file = open("file_to_execute", "w")
exec_file.write(string_to_execute)
exec_file.close()

# Run the Python file as a separate process, passing the arguments on its command line
output = subprocess.Popen(["python", "file_to_execute"] + argument_list,
                          stdout=subprocess.PIPE)
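If you go this route you probably also want to collect the child's output; a small follow-up sketch using the output handle from above:
stdout_data, _ = output.communicate()   # wait for the child process and read its stdout
print(stdout_data)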
Function definition way:
In the string you want to exec:
def function_name(*args):
    import sys
    def a(x, y):
        return x
    def b(y):
        return y
    def inner_main(x, y):
        lambda x, y: a(b(y), a(x, y))
    inner_main(*args)
Outer code
exec(program_string)
function_name(*args)

Related

Decorator function to wrap a function?

I have to write a dummy function to get my code running on different systems, some of which don't have the needed packages. The function is wrapped and then called like a class method. I am struggling with this problem; any ideas how to do that?
Here is a short snippet: I import a Python script ray.py which should contain this remote() function. The remote function has to accept two arguments, even though they are not used.
Edit: The @ray.remote() decorator wraps the run() function so that it can be executed in parallel. It doesn't change the return value of run(). On some systems ray is not supported, and I want the same script to execute sequentially without changing anything. Therefore I import a ray dummy instead of the real one. Now I want to write ray.remote() so that it wraps the run() function in a way that makes it callable with run.remote().
That may be a very inconvenient way to just execute a function sequentially, but it is necessary for easy integration across different systems.
# here the wrapped function
@ray.remote(arg1, arg2)
def run(x):
    return x**2

# call it
squared = run.remote(2)
I got a working script, located in the ray.py file:
def remote(*args, **kwargs):
    def new_func(func):
        class Wrapper:
            def __init__(self, f):
                self.func = f
            def remote(self, *arg):
                out = self.func(*arg)
                return out
        ret = Wrapper(func)
        return ret
    return new_func
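To illustrate how the dummy behaves, here is a small usage check (the squaring function is just an example):
@remote()              # the dummy decorator defined above in ray.py
def run(x):
    return x ** 2

print(run.remote(3))   # prints 9, executed sequentially through the Wrapper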

Best way to pass function specified in file x as commandline parameter to file y in python

I'm writing a wrapper or pipeline to create a tfrecords dataset to which I would like to supply a function to apply to the dataset.
I would like to make it possible for the user to inject a function defined in another python file which is called in my script to transform the data.
Why? The only thing the user has to do is write the function which brings their data into the right format; then the existing code does the rest.
I'm aware of the fact that I could have the user write the function in the same file and call it, or have an import statement, etc.
So as a minimal example, I would like to have file y.py
def main(argv):
    # Parse args etc.; let's assume it is there.
    dataset = tf.data.TFRecordDataset(args.filename)
    dataset = dataset.map(args.function)
    # Continue with doing stuff that is independent from the actual content
So what I'd like to be able to do is something like this:
python y.py --func x.py my_func
and then use the function my_func defined in x.py in dataset.map(...).
Is there a way to do this in python and if yes, which is the best way to do it?
Pass the name of the file (and the function name) as arguments to your script
Read the file into a string, possibly extracting the given function
Use Python's exec() to execute the code
An example:
file = "def fun(*args): \n return args"
func = "fun(1,2,3)"
def execute(func, file):
program = file + "\nresult = " + func
local = {}
exec(program, local)
return local['result']
r = execute(func, file)
print(r)
This is similar to other exec-based answers, except that we collect the result from our own namespace dict because we are not calling exec in the global scope.
Note: exec is somewhat dangerous; you should be sure the code you execute is safe. If you are the one supplying it, that's fine!
Hope this helps.
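The "possibly extracting the given function" step from the list above is not shown in the example; one way to do it on Python 3.8+ is with the ast module (a sketch, the names are illustrative):
import ast

def extract_function_source(source, name):
    # Return the source of the single top-level def called `name`
    tree = ast.parse(source)
    for node in tree.body:
        if isinstance(node, ast.FunctionDef) and node.name == name:
            return ast.get_source_segment(source, node)
    raise ValueError("function %s not found" % name)

file_source = "def fun(*args):\n    return args\n"
print(extract_function_source(file_source, "fun"))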
OK, I have now composed the answer myself using information from the comments and this answer.
import importlib, inspect, sys, os

# path is the given path to the file, function_name is the name of the function,
# and args are the function arguments

# Create package and module name from the path
package = os.path.dirname(path).replace(os.path.sep, '.')
module_name = os.path.basename(path).split('.')[0]

# Import the module and get its members
module = importlib.import_module(module_name, package)
members = inspect.getmembers(module)

# Find the matching function
function = [t[1] for t in members if t[0] == function_name][0]
function(args)
This solves the question exactly, since I get a callable function object which I can call, pass around, and use like a normal function.
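For what it's worth, on Python 3.5+ the same thing can also be done by loading the module directly from its file path with importlib.util; this is a sketch, not part of the original answer, with names taken from the question:
import importlib.util

def load_function(path, function_name):
    # Load the module directly from its file path, then fetch the function by name
    spec = importlib.util.spec_from_file_location("user_module", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return getattr(module, function_name)

my_func = load_function("x.py", "my_func")   # then pass my_func to dataset.map(...)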

Call a function from the command line with arguments - Python (multiple function choices)

I'm using Python 3.6 and I have a file called file.py with these two functions:
def country(countryName):
    print(countryName)

def capital(capitalName):
    print(capitalName)
I need to call either of these two functions from the command line, but honestly I don't know how to do that, especially with arguments, in this way:
python file.py <method> <argument>
Does someone know how to do that?
Greetings!
To use command-line arguments in a program you can use sys.argv.
import sys

def country(countryName):
    print(countryName)

def capital(capitalName):
    print(capitalName)

method_name = sys.argv[1]
parameter_name = sys.argv[2]
getattr(sys.modules[__name__], method_name)(parameter_name)
To run the program:
python file.py capital delhi
output:
delhi
Your input parameter method_name is a string, so it can't be called directly. Hence we need to fetch the actual function object with getattr.
sys.modules[__name__] fetches the current module, which is the file.py module. Then we use getattr to fetch the function we want to call and call it, passing the parameter as (parameter_name).
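If you want to restrict which functions can be called from the command line, a dispatch dict variant of the same script is a reasonable alternative (a sketch, not part of the original answer):
import sys

def country(countryName):
    print(countryName)

def capital(capitalName):
    print(capitalName)

# Only functions listed here can be invoked from the command line
dispatch = {"country": country, "capital": capital}

if __name__ == "__main__":
    dispatch[sys.argv[1]](sys.argv[2])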
You could also have a module that inspects your file.py (call it executor.py) and adjust your functions in file.py to handle argument lists.
executor.py:
import file
import sys

method = file.__dict__.get(sys.argv[1])   # sys.argv[0] is the script name itself
method(sys.argv[2:])

Parse Python file and evaluate selected functions

I have a file that contains several python functions, each with some statements.
def func1():
    codeX...

def func2():
    codeY...
codeX and codeY can be multiple statements. I want to be able to parse the file, find a function by name, then evaluate the code in that function.
With the ast module, I can parse the file, find the FunctionDef objects, and get the list of Stmt objects, but how do I turn this into bytecode that I can pass to eval? Should I use the compile module, or the parser module instead?
Basically, the function defs are just used to create separate blocks of code. I want to be able to grab any block of code given the name and then execute that code in eval (providing my own local/global scope objects). If there is a better way to do this than what I described that would be helpful too.
Thanks
I want to be able to grab any block of code given the name and then execute that code ... (providing my own local/global scope objects).
A naive solution looks like this. This is based on the assumption that the functions don't all depend on global variables.
from file_that_contains_several_python_functions import *
Direction = some_value
func1()
func2()
func3()
That should do exactly what you want.
However, if all of your functions rely on global variables -- a design that calls to mind 1970's-era FORTRAN -- then you have to do something slightly more complex.
from file_that_contains_several_python_functions import *
Direction = some_value
func1( globals() )
func2( globals() )
func3( globals() )
And you have to rewrite all of your global-using functions like this.
def func1( context ):
    globals().update( context )
    # Now you have access to all kinds of global variables
This seems ugly because it is. Functions which rely entirely on global variables are not really the best idea.
Using Python 2.6.4:
text = """
def fun1():
print 'fun1'
def fun2():
print 'fun2'
"""
import ast
tree = ast.parse(text)
# tree.body[0] contains FunctionDef for fun1, tree.body[1] for fun2
wrapped = ast.Interactive(body=[a.body[1]])
code = compile(wrapped, 'yourfile', 'single')
eval(code)
fun2() # prints 'fun2'
Take a look at the grammar in the ast docs: http://docs.python.org/library/ast.html#abstract-grammar. The top-level node must be either Module, Interactive or Expression, so you need to wrap the function def in one of those.
If you're using Python 2.6 or later, then the compile() function accepts AST objects in addition to source code.
>>> import ast
>>> a = ast.parse("print('hello world')")
>>> x = compile(a, "(none)", "exec")
>>> eval(x)
hello world
These modules have all been rearranged for Python 3.
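For example, on Python 3.8+ the same idea of compiling and executing a single function picked out of a parsed file could look like this (a sketch, not from the original answers):
import ast

source = "def func1():\n    return 'one'\n\ndef func2():\n    return 'two'\n"
tree = ast.parse(source)

# Pick the FunctionDef whose name matches, wrap it in a Module, and compile that
wanted = next(node for node in tree.body
              if isinstance(node, ast.FunctionDef) and node.name == "func2")
module = ast.Module(body=[wanted], type_ignores=[])

namespace = {}
exec(compile(module, "<string>", "exec"), namespace)
print(namespace["func2"]())   # prints 'two'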

Parsing Functions

I'm making a script parser in Python and I'm a little stuck. I am not quite sure how to parse a line for all its functions (or even just one function at a time), then search for a function with that name and, if it exists, execute that function, short of writing a massive if/elif/else block.
EDIT
This is for my own scripting language that I'm making. It's nothing very complex, but I have a standard library of 8 functions or so that need to be able to run. How can I parse a line and run the function named in the line?
Once you get the name of the function, use a dispatch dict to run the function:
def mysum(...): ...
def myotherstuff(...): ...
# create dispatch dict:
myfunctions = {'sum': mysum, 'stuff': myotherstuff}
# run your parser:
function_name, parameters = parse_result(line)
# run the function:
myfunctions[function_name](parameters)
Alternatively create a class with the commands:
class Commands(object):
    def do_sum(self, ...): ...
    def do_stuff(self, ...): ...

    def run(self, funcname, params):
        getattr(self, 'do_' + funcname)(params)

cmd = Commands()
function_name, parameters = parse_result(line)
cmd.run(function_name, parameters)
You could also look at the cmd module in the stdlib to do your class. It can provide you with a command-line interface for your language, with tab command completion, automatically.
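A minimal cmd.Cmd sketch, with made-up command names, to show the shape of such a class:
import cmd

class MyShell(cmd.Cmd):
    prompt = "> "

    def do_sum(self, line):
        """sum A B -- add two numbers"""
        a, b = line.split()
        print(int(a) + int(b))

    def do_quit(self, line):
        """quit -- exit the shell"""
        return True          # returning True stops the command loop

if __name__ == "__main__":
    MyShell().cmdloop()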
Check out PyParsing; it allows you to define the grammar directly in Python code.
Assuming a function call is just somename():
>>> from pyparsing import *
>>> grammar = Word(alphas + "_", alphanums + "_")("func_name") + "()" + StringEnd()
>>> grammar.parseString("ab()\n")["func_name"]
"ab"
Take a look at PLY. It should help you keep your parser specification clean.
It all depends on what code you are parsing.
If you are parsing Python syntax, use the parser module from Python:
http://docs.python.org/library/parser.html
A fairly complete list of parser libraries available for Python can be found at http://nedbatchelder.com/text/python-parsers.html.
