How to use argparse without needing to define arguments? - python

Building on the very interesting answer provided here, I would like to use argparse to execute multiple functions without having to define each function's arguments in the parser. Otherwise the code becomes very cluttered, or one has to split the functions across separate files to run them, which is not very practical either.
In the following example, the functions function1 and function2 have different arguments such as: function1(arg1: int, arg2: bool) and function2(arg3: float, arg4: str)
# file test.py
import argparse
from file1 import function1
from file2 import function2
FUNCTION_MAP = {'function1' : function1, 'function2': function2}
parser = argparse.ArgumentParser()
parser.add_argument('command', choices=FUNCTION_MAP.keys())
# no specific argument to add here.
args = parser.parse_args()
func = FUNCTION_MAP[args.command]
func(**vars(args))
The following commands, with their -- arguments, do not work:
python test.py "function1" --arg1=10 --arg2=True
python test.py "function2" --arg3=2.4 --arg4="a_file.csv"
A command such as python test.py "function1" gets past the parser, but then Python complains about the missing arguments of function1.
Thanks for your help.

If the functions will be shuttling around / sharing lots of the same data, it sounds like you want a class/object.
myobj.fn1() and myobj.fn2() will both have implicit access to the myobj object's data.
Use the argparse input to build the initial instance via your class's __init__(self, x, y, ...) method.
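A minimal sketch of that suggestion, with made-up names (Runner, arg1, arg2, fn1, fn2 are placeholders, not from the original question): argparse fills the shared state once in __init__, and every method reads it from self.

```python
import argparse

class Runner:
    """Holds the shared data once, so fn1/fn2 need no per-call arguments."""
    def __init__(self, arg1, arg2):
        self.arg1 = arg1
        self.arg2 = arg2

    def fn1(self):
        return self.arg1 * 2           # implicit access to the shared state

    def fn2(self):
        return self.arg1 if self.arg2 else 0

def main(argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument('command', choices=['fn1', 'fn2'])
    parser.add_argument('--arg1', type=int, default=10)
    parser.add_argument('--arg2', action='store_true')
    args = parser.parse_args(argv)
    myobj = Runner(args.arg1, args.arg2)
    # Dispatch on the chosen command name, with no per-function parser setup
    return getattr(myobj, args.command)()

if __name__ == "__main__":
    print(main())
```

Only the shared constructor inputs are declared in the parser; the command name alone selects the method.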

Related

How to use a module with arguments?

I have a module, called T, that has a couple of functions and a main part that calls these functions. From another module, I want to use module T. The main scheme is like this:
"""Module T"""
def parse_args():
parser = argparse.ArgumentParser(description='Desc')
parser.add_argument('something')
def foo():
pass
if __name__ == "__main__":
args = parse_args()
foo()
And the other module I want to use:
"""Module M"""
def foo():
pass
def do_something():
"""Where I want to use module T's main"""
I have used module T from the terminal with arguments and it worked fine. The question may be easy, but how can I use its main with parameters?
Add a run function (or main or whatever you like) to your module that accepts your command line, and make sure that parse_args optionally accepts an arbitrary list as well:
import argparse

def parse_args(args=None):
    parser = argparse.ArgumentParser(description='Desc')
    parser.add_argument('something')
    return parser.parse_args(args)

def foo():
    pass

def run(args=None):
    args = parse_args(args)
    foo()

if __name__ == "__main__":
    run()
Basically, instead of trying to simulate the import operation and injecting sys.argv (which might actually be possible), you can just factor out the part that the import runs that's of interest to you, and provide access to it:
import T
T.run() # Uses sys.argv
T.run(['my', 'list', '--of', 'args'])
While totally untested, you can possibly also do something like the following (please don't though, this is just for fun):
import sys
from importlib import reload

sys.argv[1:] = my_args
if 'T' in sys.modules:
    reload(sys.modules['T'])
else:
    import T
But then you'd need to remove the import guard in T.py. Unless you wanted to implement your own module loading sequence, which would let you inject T.__name__, rather than having to modify the import guard: Injecting variables into an import namespace
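As a side note, the standard library's runpy module offers a cleaner version of this "run the module again as a script" trick, although it still works by patching sys.argv (a sketch, untested against the T module above; the helper name is made up):

```python
import runpy
import sys

def run_module_with_args(module_name, args):
    """Run a module as if invoked as `python -m module_name args...`.

    Returns the module's resulting globals dictionary.
    """
    old_argv = sys.argv
    sys.argv = [module_name] + list(args)
    try:
        # run_name="__main__" makes the module's import guard fire,
        # so its `if __name__ == "__main__":` block actually runs
        return runpy.run_module(module_name, run_name="__main__")
    finally:
        sys.argv = old_argv
```

Unlike the reload hack, this does not require touching sys.modules, and the import guard in T.py can stay as it is.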
The if __name__ ... block is executed only if the script is called directly, so the real solution is to call foo in your entry point. The if __name__ ... pattern basically protects lines of code from being executed on import. For example, this is a very convenient pattern for testing: just put your tests in the protected block. The straightforward way to do what you're asking is:
"""Module M"""
def bar():
pass
def do_something(args):
args = parse_args()
module_t.foo()
If you want module M to be completely "fire and forget" then Mad Physicist's answer is for you.

Python: Choose function in command line argument

In a Python program (with more than one user defined functions), I want to specify which function to use through command line arguments. For e.g., I have the following functions, func1(), func2(), func3() in my Python program, and I am using the following technique presently:
python prog.py -func func2
My program is something like this:
from __future__ import division
import numpy as np
import argparse
parser = argparse.ArgumentParser(description='')
parser.add_argument('-func', help='')
args = parser.parse_args()
func_type = globals()[args.func]()
def func1():
    print "something1"

def func2():
    print "something2"

def func3():
    print "something3"
func_type()
I get the following error:
KeyError: 'func2'
I will really appreciate if someone can tell me how I can implement the desired functionality.
There are two simple mistakes, both related to the line func_type = globals()[args.func]():
The functions you are looking for have not been defined yet; move the function definitions above this line.
You are calling the function instead of saving a reference to it in the variable func_type; drop the trailing parentheses.
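Putting both fixes together gives something like the sketch below (rewritten here with return values and an explicit argument list so the behaviour is easy to check; the original used Python 2 print statements):

```python
import argparse

def func1():
    return "something1"

def func2():
    return "something2"

def func3():
    return "something3"

def main(argv=None):
    parser = argparse.ArgumentParser(description='')
    parser.add_argument('-func', help='name of the function to run')
    args = parser.parse_args(argv)
    # The definitions above are visible by now, and we keep a *reference*
    # to the function (no trailing parentheses) before calling it.
    func_type = globals()[args.func]
    return func_type()

if __name__ == "__main__":
    print(main())
```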

Call a function from the command line with arguments - Python (multiple function choices)

I'm using Python 3.6. I have a file called file.py, which has these two functions:
def country(countryName):
    print(countryName)

def capital(capitalName):
    print(capitalName)
I need to call either of these two functions from the command line, passing an argument, in this way:
python file.py <method> <argument>
Does someone knows how to do that?
Greetings!
To use command line arguments in a program you can use sys.argv.
import sys

def country(countryName):
    print(countryName)

def capital(capitalName):
    print(capitalName)

method_name = sys.argv[1]
parameter_name = sys.argv[2]
getattr(sys.modules[__name__], method_name)(parameter_name)
To run the program:
python file.py capital delhi
output:
delhi
Your input parameter method_name is a string, so it can't be called directly; we need to fetch the actual function object using getattr.
The expression sys.modules[__name__] fetches the current module, i.e. the file.py module. We then use getattr to fetch the function we want, call it, and pass the parameter to it as (parameter_name).
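One caveat with looking names up via getattr on the current module: it will resolve any attribute, including ones you never meant to expose on the command line. An alternative sketch using an explicit dispatch dictionary as a whitelist (the DISPATCH name is made up for illustration):

```python
import sys

def country(countryName):
    return countryName

def capital(capitalName):
    return capitalName

# Only the names listed here can be invoked from the command line.
DISPATCH = {'country': country, 'capital': capital}

def main(argv):
    method_name, parameter_name = argv[0], argv[1]
    try:
        method = DISPATCH[method_name]
    except KeyError:
        raise SystemExit("unknown method: %s" % method_name)
    return method(parameter_name)

if __name__ == "__main__":
    print(main(sys.argv[1:]))
```

Run as before with python file.py capital delhi; an unlisted name exits with an error instead of calling an arbitrary attribute.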
You could have a separate module that inspects your file.py; call it executor.py. Adjust your methods in file.py to handle argument lists.
executor.py:
import sys
import file

# sys.argv[0] is the script name, so the method name is argv[1]
# and its arguments are everything after that
method = file.__dict__.get(sys.argv[1])
method(sys.argv[2:])

How to figure out which command line parameters have been set in plac?

My Python script takes configuration values in this order of precedence:
command line argument (can overwrite user-defined values from the configuration file)
configuration file value (user-defined values)
default value in the source code
I need to figure out which options have been set on the command line, in order to determine whether a value equal to the default was set explicitly or not. plac is not very transparent about its internals, and I don't see how this is possible. I would like to avoid parsing sys.argv myself, because writing a command line parser in order to use a command line parser doesn't seem like a good idea.
I'm using plac 0.9.1 on Ubuntu 15.04.
Could you give a simple example of your plac setup? I used to know it well, but now know the underlying argparse better.
Are you using plac to invoke a function(s) directly, e.g.
plac.call(main)
Internally, plac creates an argparse.ArgumentParser (actually its own subclass), populates it with arguments derived from a function's signature, and then calls that function with the values it parsed.
If working with argparse directly, you create the parser, populate it, and invoke it with
args = parser.parse_args()
args is now a Namespace object with attributes named after the arguments. It can also be converted to a dictionary.
If you go the argparse route (possibly still creating the parser with plac), you can easily check the attributes in args, and compare those values with your defaults, or with the config file values.
IPython actually populates an argparse parser with arguments derived from config files (default and custom). This lets you override config defaults at several stages: with custom files or with the command line.
plac is supposed to be easier than argparse, by integrating parser creation and function calling. But in your case it may be better to separate those steps.
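One concrete way to do that comparison with plain argparse is a sentinel default: any attribute still holding the sentinel after parsing was not given on the command line. This is only a sketch; the option names, config dictionary, and resolve helper are invented for illustration:

```python
import argparse

_SENTINEL = object()          # unique marker: "not set on the command line"

# Source-code defaults, the lowest-priority layer
HARD_DEFAULTS = {'host': 'localhost', 'port': 8080}

def resolve(argv, config):
    """Merge CLI > config file > hard-coded defaults, in that order."""
    parser = argparse.ArgumentParser()
    parser.add_argument('--host', default=_SENTINEL)
    parser.add_argument('--port', type=int, default=_SENTINEL)
    ns = vars(parser.parse_args(argv))

    settings = {}
    for key, cli_value in ns.items():
        if cli_value is not _SENTINEL:       # explicitly set on the CLI
            settings[key] = cli_value
        elif key in config:                  # user-defined config value
            settings[key] = config[key]
        else:                                # fall back to the source default
            settings[key] = HARD_DEFAULTS[key]
    return settings
```

Because the sentinel is a unique object, even an option explicitly set to a value equal to its default is distinguishable from one that was never given.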
The code for plac.call is:
def call(obj, arglist=sys.argv[1:], eager=True):
    ...
    cmd, result = parser_from(obj).consume(arglist)
    ...
The first part creates a parser; the arguments of this parser are based on the annotations decorator. main in my test is from the plac documentation.
In [22]: p=plac.parser_from(main)
In [23]: p
Out[23]: ArgumentParser(prog='ipython2.7', usage=None, description=None, version=None, formatter_class=<class 'argparse.RawDescriptionHelpFormatter'>, conflict_handler='error', add_help=True)
p.consume is a longer function that checks for things like subparsers, but at some point it does
ns, extraopts = self.parse_known_args(arglist)
# or self.parse_args
ns is an argparse Namespace, with parameters set by the parser - both defaults and ones from the command line. It then calls 'the function' with values from ns (split into Python positional args and keyword args).
So you can call that parser directly:
In [25]: p.parse_args([]) # or with the default sys.argv[1:]
Out[25]: Namespace(args=[], kw={}, opt=None)
p.print_help() also shows the arguments that the parser expects from the command line, in a neatly formatted form.
And if you want to get further into the argparse guts you can look at p._actions, a list of argparse.Action objects that it uses to parse the commandline.

Passing arguments to python eval()

I'm building a genetic programming framework, and I need to be able to execute strings representing complete Python programs. I'm using Python 2.7. I have a config class in which the primitive sets are defined. Let's say:
class Foo():
    def a(self, x):
        return x

    def b(self, y):
        return y
I'm extracting the functions with the Python inspect module and I want to create some executable source code with imports and everything. I end up with a string that looks like this:
import sys

def a(x, y):
    return x

def b(y):
    return y

def main(x, y):
    lambda x, y: a(b(y), a(x, y))

main(*sys.argv)
My problem is that I don't know how to pass command line arguments to the string I'm running with eval(). How can I pass command line arguments to a source file I want to run with eval()?
Edit: There are millions of individuals so writing to a file is not a great option.
Edit: I made a mistake. eval() works only for expressions, not statements, so exec() is the correct approach.
eval("function_name")(arg1, arg2)
or if you have a list of arguments:
arguments = [arg1, arg2, arg3, something]
eval("function_name")(*arguments)
You have three options, roughly speaking: you can substitute the arguments into the string before running it with exec(), you can write the string to a file and execute it with subprocess.Popen(), or you can wrap the code in a function named something other than main() and call that function after defining it with exec().
exec() way:
In the string you want to exec
main(#REPLACE_THIS#)
The function that substitutes the arguments and runs the code:
def exec_with_args(exec_string, args):
    # join the arguments into "arg1,arg2,..."; note that str.replace
    # returns a new string, so its result must actually be used
    arg_string = ','.join(args)
    exec(exec_string.replace("#REPLACE_THIS#", arg_string))
Subprocess way:
import subprocess

# Write the program string to a file
exec_file = open("file_to_execute", "w")
exec_file.write(string_to_execute)
exec_file.close()

# Run the Python file as a separate process; note that list.extend()
# returns None, so the argument lists must be concatenated with +
output = subprocess.Popen(["python", "file_to_execute"] + argument_list,
                          stdout=subprocess.PIPE)
Function Definition Way
In the string you want to exec
def function_name(*args):
    import sys

    def a(x, y):
        return x

    def b(y):
        return y

    def inner_main(x, y):
        lambda x, y: a(b(y), a(x, y))

    inner_main(*args)
Outer code
exec(program_string)
function_name(*args)
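A fourth variant worth noting: exec() accepts an explicit globals dictionary, so the arguments can be injected as a variable instead of being spliced into the source text. A small self-contained sketch (the program string and variable names are invented for illustration, and main returns its value here so there is something to check):

```python
# A stand-in for a generated program; it reads its "command line"
# from the injected_args variable instead of sys.argv.
program_string = """
def a(x, y):
    return x

def b(y):
    return y

def main(x, y):
    return a(b(y), a(x, y))

result = main(*injected_args)
"""

# The second argument to exec() becomes the program's global namespace,
# so anything placed in it beforehand is visible to the executed code,
# and anything the code defines can be read back out afterwards.
namespace = {'injected_args': (1, 2)}
exec(program_string, namespace)
```

This avoids both string substitution and temporary files, and no quoting or escaping of the argument values is needed.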
