I have an object A which contains parserA - an argparse.ArgumentParser object
There is also object B which contains parserB - another argparse.ArgumentParser
Object A contains an instance of object B; however, object B's arguments now need to be parsed by the parser in object A, since A is the one being called from the command line with the arguments, not B.
Is there a way, in object A, to write something like: parserA += B.parserB?
argparse is built around objects. Other than a few constants and utility functions, it is all class definitions. The documentation focuses on use rather than on that class structure, but it may help to understand a bit of it.
parser = argparse.ArgumentParser(...)
creates a parser object.
arg1 = parser.add_argument(...)
creates an argparse.Action object (a subclass, actually) and adds it to several parser attributes (lists). Normally we ignore the fact that the method returns this Action object, but occasionally I find it helpful. And when I build a parser in an interactive shell, I see this action echoed.
args = parser.parse_args()
runs another method and returns a namespace object (an instance of argparse.Namespace).
The group methods and subparsers methods also create and return objects (groups, actions and/or parsers).
The ArgumentParser constructor takes a parents parameter, whose value is a list of parser objects.
With
parsera = argparse.ArgumentParser(parents=[parserb])
during the creation of parsera, the actions and groups in parserb are copied to parsera. That way, parsera will recognize all the arguments that parserb does. I encourage you to test it.
But there are a few qualifications. The copy is by reference; that is, parsera gets a pointer to each Action defined in parserb. Occasionally that creates problems (I won't get into that now). And one or the other has to have add_help=False: normally a help action is added to a parser at creation, so if parserb also has a help action there will be a conflict (a duplication) that has to be resolved.
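A minimal sketch of the parents mechanism (parser and option names are illustrative):

import argparse

parserb = argparse.ArgumentParser(add_help=False)  # suppress the duplicate -h/--help
parserb.add_argument('--bar')

parsera = argparse.ArgumentParser(parents=[parserb])  # copies parserb's actions by reference
parsera.add_argument('--foo')

print(parsera.parse_args(['--foo', '1', '--bar', '2']))
# Namespace(bar='2', foo='1')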
But parents can't be used if parsera has been created independently of parserb. There's no existing public mechanism for adding Actions from parserb. It might be possible to make a new parser, with both as parents:
parserc = argparse.ArgumentParser(parents=[parsera, parserb])
I could probably write a function that would add arguments from parserb to parsera, borrowing ideas from the method that implements parents. But I'd have to know how conflicts are to be resolved.
Look at argparse._ActionsContainer._add_container_actions to see how arguments (Actions) are copied from a parent to a parser. Something that may be confusing is that each Action is part of a group (user defined, or one of the 2 default groups seen in the help) in addition to being in a parser.
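In fact, parents= is implemented with that same private method, so the following one-liner can graft parserb's actions onto an existing parsera. Treat it as a sketch: _add_container_actions is not public API, it may change between Python versions, and conflicting options (like a duplicate -h) will still raise an error.

parsera._add_container_actions(parserb)  # what parents=[parserb] does internally; private API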
Another possibility is to use
argsA, extrasA = parserA.parse_known_args()
argsB, extrasB = parserB.parse_known_args()  # uses the same sys.argv
# or
args = parserB.parse_args(extrasA, namespace=argsA)
With this each parser handles the arguments it knows about, and returns the rest in the extras list.
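A runnable sketch of that flow (parsers and options are made up for illustration):

import argparse

parserA = argparse.ArgumentParser()
parserA.add_argument('--foo')
parserB = argparse.ArgumentParser()
parserB.add_argument('--bar')

argv = ['--foo', '1', '--bar', '2']
argsA, extrasA = parserA.parse_known_args(argv)      # Namespace(foo='1'), ['--bar', '2']
args = parserB.parse_args(extrasA, namespace=argsA)  # parserB fills in the rest on the same namespace
print(args)  # Namespace(bar='2', foo='1')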
Unless the parsers are designed for it, this kind of integration will have rough edges. It may be easier to deal with those conflicts with Arnial's approach, which is to put the shared argument definitions in your own functions. Others like to put the argument parameters in some sort of database (list, dictionary, etc.) and build the parser from that. You can wrap parser creation in as many layers of boilerplate as you find convenient.
You can't use one ArgumentParser inside another, but there is a way around that. You need to extract the code that adds arguments to a parser into separate functions.
Then you will be able to use those functions to merge arguments into one parser.
It will also be easier to group arguments (by the parsers they came from). But you must be sure that the sets of argument names do not intersect.
Example:
foo.py:
import argparse

def add_foo_params(group):
    group.add_argument('--foo', help='foo help')

if __name__ == "__main__":
    parser = argparse.ArgumentParser(prog='Foo')
    add_foo_params(parser)
    args = parser.parse_args()
boo.py:
import argparse

def add_boo_params(group):
    group.add_argument('--boo', help='boo help')

if __name__ == "__main__":
    parser = argparse.ArgumentParser(prog='Boo')
    add_boo_params(parser)
    args = parser.parse_args()
fooboo.py:
import argparse
from foo import add_foo_params
from boo import add_boo_params

if __name__ == "__main__":
    parser = argparse.ArgumentParser(prog='FooBoo')
    foo_group = parser.add_argument_group(title="foo params")
    boo_group = parser.add_argument_group(title="boo params")
    add_foo_params(foo_group)
    add_boo_params(boo_group)
    args = parser.parse_args()
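A quick check of the combined parser (expected result under the completed example above):

python fooboo.py --foo 1 --boo 2
# args == Namespace(boo='2', foo='1'), with --foo and --boo listed under their own groups in --help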
For your use case, if you can, you could try simply sharing the same parser object between classes via a dedicated method. The code below is based on what your situation seems to be.
import argparse

class B(object):
    def __init__(self, parserB=argparse.ArgumentParser()):
        super(B, self).__init__()
        self.parserB = parserB

    def addArguments(self):
        self.parserB.add_argument("-tb", "--test-b", help="Test B", type=str, metavar="")
        # Add more arguments specific to B

    def parseArgs(self):
        return self.parserB.parse_args()

class A(object):
    def __init__(self, parserA=argparse.ArgumentParser(), b=B()):
        super(A, self).__init__()
        self.parserA = parserA
        self.b = b

    def addArguments(self):
        self.parserA.add_argument("-ta", "--test-a", help="Test A", type=str, metavar="")
        # Add more arguments specific to A

    def parseArgs(self):
        return self.parserA.parse_args()

    def mergeArgs(self):
        self.b.parserB = self.parserA
        self.b.addArguments()
        self.addArguments()
Code Explanation:
As stated in the question, object A and object B contain their own parser objects, and object A also contains an instance of object B.
The code simply separates the anticipated flow into separate methods so that it is possible to keep adding arguments to a single parser before attempting to parse it.
Test Individual
a = A()
a.addArguments()
print(vars(a.parseArgs()))
# CLI Command
python test.py -ta "Testing A"
# CLI Result
{'test_a': 'Testing A'}
Combined Test
aCombined = A()
aCombined.mergeArgs()
print(vars(aCombined.parseArgs()))
# CLI Command
testing -ta "Testing A" -tb "Testing B"
# CLI Result
{'test_b': 'Testing B', 'test_a': 'Testing A'}
Additional
You can also write a general method that takes a variable number of objects and iterates over them, adding each one's arguments to the single parser. The sample below assumes classes C and D with a generic "parser" attribute name, sketched next.
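The answer does not show C and D themselves; a minimal sketch consistent with the pattern (hypothetical classes, generic parser attribute) could be:

import argparse

class C(object):
    def __init__(self, parser=argparse.ArgumentParser()):
        self.parser = parser

    def addArguments(self):
        self.parser.add_argument("-tc", "--test-c", help="Test C", type=str, metavar="")

class D(object):
    def __init__(self, parser=argparse.ArgumentParser()):
        self.parser = parser

    def addArguments(self):
        self.parser.add_argument("-td", "--test-d", help="Test D", type=str, metavar="")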
Multi Test
# Add this method to class A
def mergeMultiArgs(self, *objects):
    parser = self.parserA
    for obj in objects:
        obj.parser = parser
        obj.addArguments()
    self.addArguments()
aCombined = A()
aCombined.mergeMultiArgs(C(), D())
print(vars(aCombined.parseArgs()))
# CLI Command
testing -ta "Testing A" -tc "Testing C" -td "Testing D"
# CLI Result
{'test_d': 'Testing D', 'test_c': 'Testing C', 'test_a': 'Testing A'}
Yes, they can be combined. Here is a function that merges two argparse namespaces but throws an error on colliding keys:
from argparse import Namespace

def merge_args_safe(args1: Namespace, args2: Namespace) -> Namespace:
    """
    Merges two namespaces but throws an error if there are keys that collide.

    ref: https://stackoverflow.com/questions/56136549/how-can-i-merge-two-argparse-namespaces-in-python-2-x
    :param args1:
    :param args2:
    :return:
    """
    # - the merged args
    # The vars() function returns the __dict__ attribute of the given object, e.g. {field: value}.
    args = Namespace(**vars(args1), **vars(args2))
    return args
Test:
def merge_args_test():
    args1 = Namespace(foo="foo", collided_key='from_args1')
    args2 = Namespace(bar="bar", collided_key='from_args2')

    args = merge_args_safe(args1, args2)
    print('-- merged args')
    print(f'{args=}')
output:
Traceback (most recent call last):
  File "/Applications/PyCharm.app/Contents/plugins/python/helpers/pydev/pydevd.py", line 1483, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "/Applications/PyCharm.app/Contents/plugins/python/helpers/pydev/_pydev_imps/_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "/Users/brando/ultimate-utils/ultimate-utils-proj-src/uutils/__init__.py", line 1202, in <module>
    merge_args_test()
  File "/Users/brando/ultimate-utils/ultimate-utils-proj-src/uutils/__init__.py", line 1192, in merge_args_test
    args = merge_args_safe(args1, args2)
  File "/Users/brando/ultimate-utils/ultimate-utils-proj-src/uutils/__init__.py", line 1116, in merge_args_safe
    args = Namespace(**vars(args1), **vars(args2))
TypeError: argparse.Namespace() got multiple values for keyword argument 'collided_key'
python-BaseException
You can find it in this library: https://github.com/brando90/ultimate-utils
If you want collisions resolved, do this:
def merge_two_dicts(starting_dict: dict, updater_dict: dict) -> dict:
    """
    Starts from the base starting dict and then adds the remaining key-value pairs from the updater,
    replacing the values from the starting/base dict with those from the updater dict.

    For later: how does d = {**d1, **d2} resolve collisions?
    :param starting_dict:
    :param updater_dict:
    :return:
    """
    new_dict: dict = starting_dict.copy()  # start with keys and values of starting_dict
    new_dict.update(updater_dict)  # overwrite new_dict with keys and values of updater_dict
    return new_dict
def merge_args(args1: Namespace, args2: Namespace) -> Namespace:
    """
    Merges two namespaces; on collisions, the value from args2 wins.

    ref: https://stackoverflow.com/questions/56136549/how-can-i-merge-two-argparse-namespaces-in-python-2-x
    :param args1:
    :param args2:
    :return:
    """
    # - the merged args
    # The vars() function returns the __dict__ attribute of the given object, e.g. {field: value}.
    merged_key_values_for_namespace: dict = merge_two_dicts(vars(args1), vars(args2))
    args = Namespace(**merged_key_values_for_namespace)
    return args
Test:
def merge_args_test():
    args1 = Namespace(foo="foo", collided_key='from_args1')
    args2 = Namespace(bar="bar", collided_key='from_args2')

    args = merge_args(args1, args2)
    print('-- merged args')
    print(f'{args=}')
    assert args.collided_key == 'from_args2', 'Error in merge dict, expected the second argument to be the one ' \
                                              'used to resolve the collision'
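If the assert holds, the print should show something like args=Namespace(bar='bar', collided_key='from_args2', foo='foo') (argparse.Namespace sorts attribute names in its repr), with the collided key taken from args2.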
Related
I have the following class, which uses a function to load config values from a file; load_config essentially converts the property in question from the YAML into a string.
import sys
from argparse import ArgumentParser
from load_config import load_config  # assumed layout of the config-loading helper

class Example:
    def __init__(self):
        parser = ArgumentParser()
        self.args = parser.parse_known_args(sys.argv[1:])
        self.conf = self.args + {"val_b": "green"}  # imagine this merges CLI args and hard-coded args into a dict
        self.val_a = load_config(self.conf, "val_a")
        self.val_b = load_config(self.conf, "val_b")
        if not self.val_a:
            raise ValueError()
        if not self.val_b:
            raise ValueError()
self.args loads arguments from the CLI. Imagine, for this example, that self.conf is a merge of CLI args and hard-coded args into a Python dict.
I'm trying to test against the if conditions. How can I patch or pass fake values to self.val_a and self.val_b so that I can raise the exceptions and have a test case against it?
Since val_a comes from CLI, I can patch sys.argv as patch("sys.argv", self.cli_arguments), but how can I patch or is it possible to pass a fake value to self.val_b during instantiation?
I don't want to patch the call to load_config because I want to run separate tests that confirm the if statements get triggered.
I am trying to port an argparse command-line interface (CLI) to click. This CLI must maintain the order in which parameters are provided by the user. With the argparse version, I used this StackOverflow answer to maintain order. With click, I am not sure how to do this.
I tried creating a custom callback to store params and values in order, but if a parameter is used multiple times, the callback fires the first time it sees a matching parameter.
import click
import typing

class OrderParams:
    _options: typing.List[typing.Tuple[click.Parameter, typing.Any]] = []

    @classmethod
    def append(cls, ctx: click.Context, param: click.Parameter, value: typing.Any):
        cls._options.append((param, value))

@click.command()
@click.option("--animal", required=True, multiple=True, callback=OrderParams.append)
@click.option("--thing", required=True, multiple=True, callback=OrderParams.append)
def cli(*, animal, thing):
    click.echo("Got this order of parameters:")
    for param, value in OrderParams._options:
        print("  ", param.name, value)

if __name__ == "__main__":
    cli()
Current output:
$ python cli.py --animal cat --thing rock --animal dog
Got this order of parameters:
animal ('cat', 'dog')
thing ('rock',)
Desired output:
$ python cli.py --animal cat --thing rock --animal dog
Got this order of parameters:
animal 'cat'
thing 'rock'
animal 'dog'
The click documentation describes the behavior that a callback is called once the first time a parameter is encountered, even if the parameter is used multiple times.
If an option or argument is split up on the command line into multiple places because it is repeated [...] the callback will fire based on the position of the first option.
This can be done by overriding the click.Command argument-parser invocation using a custom class like:
Custom Class:
class OrderedParamsCommand(click.Command):
    _options = []

    def parse_args(self, ctx, args):
        # run the parser for ourselves to preserve the passed order
        parser = self.make_parser(ctx)
        opts, _, param_order = parser.parse_args(args=list(args))
        for param in param_order:
            type(self)._options.append((param, opts[param.name].pop(0)))

        # return "normal" parse results
        return super().parse_args(ctx, args)
Using Custom Class:
Then to use the custom command, pass it as the cls argument to the command decorator like:
@click.command(cls=OrderedParamsCommand)
@click.option("--animal", required=True, multiple=True)
@click.option("--thing", required=True, multiple=True)
def cli(*, animal, thing):
    ....
How does this work?
This works because click is a well-designed OO framework. The @click.command() decorator usually instantiates a click.Command object but allows this behavior to be overridden with the cls parameter. So it is a relatively easy matter to inherit from click.Command in our own class and override the desired methods.
In this case we override click.Command.parse_args() and run the parser ourselves to preserve the order passed in.
Test Code:
import click

class OrderedParamsCommand(click.Command):
    _options = []

    def parse_args(self, ctx, args):
        parser = self.make_parser(ctx)
        opts, _, param_order = parser.parse_args(args=list(args))
        for param in param_order:
            type(self)._options.append((param, opts[param.name].pop(0)))
        return super().parse_args(ctx, args)

@click.command(cls=OrderedParamsCommand)
@click.option("--animal", required=True, multiple=True)
@click.option("--thing", required=True, multiple=True)
def cli(*, animal, thing):
    click.echo("Got this order of parameters:")
    for param, value in OrderedParamsCommand._options:
        print("  ", param.name, value)

if __name__ == "__main__":
    cli('--animal cat --thing rock --animal dog'.split())
Results:
Got this order of parameters:
animal cat
thing rock
animal dog
@StephenRauch's answer is correct, but it misses one minor detail: calling the cli function, i.e. the script, without any arguments results in an error:
type(self)._options.append((param, opts[param.name].pop(0)))
AttributeError: 'NoneType' object has no attribute 'pop'
The fix is to check whether opts[param.name] is None. The modified OrderedParamsCommand looks like:
class OrderedParamsCommand(click.Command):
    _options = []

    def parse_args(self, ctx, args):
        # run the parser for ourselves to preserve the passed order
        parser = self.make_parser(ctx)
        opts, _, param_order = parser.parse_args(args=list(args))
        for param in param_order:
            # type check
            option = opts[param.name]
            if option is not None:
                type(self)._options.append((param, option.pop(0)))

        # return "normal" parse results
        return super().parse_args(ctx, args)
EDIT: I posted an answer because I could not propose an edit.
EDIT 2: Uhh, this should not be the final solution. I just tried --help with it, and it crashed with this error:
type(self)._options.append((param, option.pop(0)))
AttributeError: 'bool' object has no attribute 'pop'
Too bad, I wish there were a proper solution.
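One possible guard, offered as an untested sketch rather than a definitive fix: only pop values that actually arrive as lists, since flags like --help show up as a bool (and absent options as None):

for param in param_order:
    option = opts.get(param.name)
    if isinstance(option, list) and option:
        type(self)._options.append((param, option.pop(0)))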
I have a number of functions that need to get called from various imported files.
The functions are formatted along the lines of this:
a.foo
b.foo2
a.bar.foo4
a.c.d.foo5
and they are passed in to my script as a raw string.
I'm looking for a clean way to run these, with arguments, and get the return values.
Right now I have a messy system of splitting the strings and then feeding them to the right getattr call, but this feels clumsy and is very un-scalable. Is there a way I can just pass the object portion of getattr as a string? Or some other way of doing this?
import a, b, a.bar, a.c.d

if "." in raw_script:
    split_script = raw_script.split(".")
    if 'a' in raw_script:
        if 'a.bar' in raw_script:
            out = getattr(a.bar, split_script[-1])(args)
        if 'a.c.d' in raw_script:
            out = getattr(a.c.d, split_script[-1])(args)
        else:
            out = getattr(a, split_script[-1])(args)
    elif 'b' in raw_script:
        out = getattr(b, split_script[-1])(args)
It's hard to tell from your question, but it sounds like you have a command line tool you run as my-tool <function> [options]. You could use importlib like this, avoiding most of the getattr calls:
import importlib

def run_function(name, args):
    module, function = name.rsplit('.', 1)
    module = importlib.import_module(module)
    function = getattr(module, function)
    function(*args)

if __name__ == '__main__':
    # Elided: retrieve function name and args from command line
    run_function(name, args)
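For the dotted names from the question, a call would look like this (assuming module a.bar really defines foo4):

run_function('a.bar.foo4', ['some-arg'])  # imports a.bar, then calls a.bar.foo4('some-arg')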
Try this:
def lookup(path):
    obj = globals()
    for element in path.split('.'):
        try:
            obj = obj[element]
        except (KeyError, TypeError):  # TypeError: modules and plain objects aren't subscriptable
            obj = getattr(obj, element)
    return obj
Note that this will handle a path starting with ANY global name, not just your a and b imported modules. If there are any possible concerns with untrusted input being provided to the function, you should start with a dict containing the allowed starting points, not the entire globals dict.
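A sketch of that restricted variant (imports taken from the question):

import a, b, a.bar, a.c.d

ALLOWED_ROOTS = {'a': a, 'b': b}  # explicit starting points instead of globals()

def lookup_restricted(path):
    root, _, rest = path.partition('.')
    obj = ALLOWED_ROOTS[root]  # raises KeyError for anything outside the whitelist
    for element in (rest.split('.') if rest else []):
        obj = getattr(obj, element)
    return obj

out = lookup_restricted('a.bar.foo4')(args)  # only names under a and b are reachable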
I use argparse in Python to parse arguments from the command line:
def main():
    parser = argparse.ArgumentParser(usage=usage)
    parser.add_argument('-v', '--verbose', dest='verbose', action='store_true')
    parser.add_argument(...)
    args = parser.parse_args()
I use the args object in only a few places in the code.
There are three methods, and the call stack looks like this:
def first_level(args):
    second_level()

def second_level():
    third_level()

def third_level():
    ### here I want to add some logging if args.verbose is True
I want to add some logging to third_level().
I don't want to change the signature of the method second_level().
How can I make the args object available in third_level()?
I could store args as a global variable, but I was told not to use global variables in a developer training some years ago ....
What is common way to handle this?
Converting my comment to an answer: I'd suggest not putting the condition in your third_level(..) at all. There are mechanisms to let the logging module take care of that, and those mechanisms can be controlled from outside those three functions.
Something like:
def first_level(args):
    second_level()

def second_level():
    third_level()

def third_level():
    logging.info("log line which will be printed if logging is at INFO level")

def main():
    args = ....

    # Set the logging level, conditionally
    if args.verbose:
        logging.basicConfig(filename='myapp.log', level=logging.INFO)
    else:
        logging.basicConfig(filename='myapp.log', level=logging.WARNING)

    first_level(args)
A common module structure is something like:
imports ...
<constants>
options = {'verbose': 0, ...}
# alt: options = argparse.Namespace(logging=False, ...)

def levelone(args, **kwargs):
    ....

def leveltwo(...):
    ...

def levelthree(...):
    <use constant>
    <use options>

def parser():
    p = argparse.ArgumentParser()
    ....
    args = p.parse_args()  # this uses sys.argv
    return args

if __name__ == '__main__':
    args = parser()
    options.update(vars(args))
    levelone(args)
The body of the module has function definitions and can be imported by another module. If used as a script, then parser() reads the command line. That global options is available for all sorts of state-like parameters; in a sense, they are constants that the user, or an importing module, can tweak. Values imported from a config file can play the same role.
Another common pattern is to make your functions methods of a class, and pass args as object attributes.
class Foo():
    def __init__(self, logging=False):
        self.logging = logging

    def levelone(self):
        ...

    def leveltwo(self):
        <use self.logging>

Foo(args.logging).levelone()
While globals are discouraged, it's more because they get overused and spoil the modularity that functions provide. Python also provides a module-level namespace that can contain more than just functions and classes, and any function defined in the module can access that namespace, unless its own definitions shadow it.
var1 = 'module level variable'
var2 = 'another'

def foo(var3):
    x = var1  # read/use var1
    var2 = 1  # shadows the module level definition
    etc
================
I'm not sure I should recommend this or not, but you could parse sys.argv within third_level.
def third_level():
    import argparse
    p = argparse.ArgumentParser()
    p.add_argument('-v', '--verbose', action='count')
    args, rest = p.parse_known_args()  # parse_known_args returns (namespace, extras)
    verbose = args.verbose
    <logging>
argparse imports sys and uses sys.argv. It can do that regardless of whether it is used at your script level, in your main, or in some nested function. logging does the same sort of thing. You could use your own imported module to covertly pass values into functions; obviously that can be abused. A class with class attributes can also be used this way.
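A tiny sketch of that class-attribute pattern (names invented for illustration):

class Options:
    verbose = False  # set once near startup, read anywhere

def third_level():
    if Options.verbose:
        print('verbose logging here')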
I am working on a quick python script using the cmd module that will allow the user to enter text commands followed by parameters in basic url query string format. The prompts will be answered with something like
commandname foo=bar&baz=brack
Using cmd, I can't seem to find which method to override to affect the way the argument line is handed off to all the do_* methods. I want to run urlparse.parse_qs on these values, and calling it on line in every do_* method seems clumsy.
The precmd method gets the whole line, before the commandname is split off and interpreted, so this will not work for my purposes. I'm also not terribly familiar with how to place a decorator inside a class like this and haven't been able to pull it off without breaking the scope.
Basically, the python docs for cmd say the following
Repeatedly issue a prompt, accept input, parse an initial prefix off
the received input, and dispatch to action methods, passing them the
remainder of the line as argument.
I want to make a method that will do additional processing to that "remainder of the line" and hand that generated dictionary off to the member functions as the line argument, rather than interpreting them in every function.
Thanks!
You could potentially override the onecmd() method, as the following quick example shows. The onecmd() method there is basically a copy of the one from the original cmd.py, but adds a call to urlparse.parse_qs() before passing the arguments to a function.
import cmd
import urlparse

class myCmd(cmd.Cmd):
    def onecmd(self, line):
        """Mostly ripped from Python's cmd.py"""
        cmd, arg, line = self.parseline(line)
        arg = urlparse.parse_qs(arg)  # <- added line
        if not line:
            return self.emptyline()
        if cmd is None:
            return self.default(line)
        self.lastcmd = line
        if cmd == '':
            return self.default(line)
        else:
            try:
                func = getattr(self, 'do_' + cmd)
            except AttributeError:
                return self.default(line)
            return func(arg)

    def do_foo(self, arg):
        print arg

my_cmd = myCmd()
my_cmd.cmdloop()
Sample output:
(Cmd) foo
{}
(Cmd) foo a b c
{}
(Cmd) foo a=b&c=d
{'a': ['b'], 'c': ['d']}
Is this what you are trying to achieve?
Here's another potential solution that uses a class decorator to modify a
cmd.Cmd subclass and basically apply a decorator function to all do_*
methods of that class:
import cmd
import urlparse
import types

# function decorator to add parse_qs to individual functions
def parse_qs_f(f):
    def f2(self, arg):
        return f(self, urlparse.parse_qs(arg))
    return f2

# class decorator to iterate over all attributes of a class and apply
# the parse_qs_f decorator to all do_* methods
def parse_qs(cls):
    for attr_name in dir(cls):
        attr = getattr(cls, attr_name)
        if attr_name.startswith('do_') and type(attr) == types.MethodType:
            setattr(cls, attr_name, parse_qs_f(attr))
    return cls

@parse_qs
class myCmd(cmd.Cmd):
    def do_foo(self, args):
        print args

my_cmd = myCmd()
my_cmd.cmdloop()
I quickly cobbled this together and it appears to work as intended; however, I'm open to suggestions on any pitfalls or how this solution could be improved.
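One pitfall worth flagging: under Python 3, methods looked up on a class are plain functions, so the types.MethodType check matches nothing and no do_* method gets wrapped. A version-tolerant test (a sketch, not verified across versions) would be:

if attr_name.startswith('do_') and callable(attr):
    setattr(cls, attr_name, parse_qs_f(attr))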