Condense Multiple Variables - python

I have tons of variables and argument definitions. Is there a way to make this take up fewer lines, or am I stuck with it?
# Standard input module to absorb commands from CLI
parser = argparse.ArgumentParser(description='User inputs source and destination tables to transfer data.')
parser.add_argument('src_table', help='Source table not supplied.', type=str)
parser.add_argument('dest_table', help='Destination table not supplied.', nargs='?', type=str) # optional arg
parser.add_argument('instance_object', help='New item not supplied.', nargs='?', type=str)
parser.add_argument('instance_id', help='Item ID not supplied.', nargs='?', type=int)
args = parser.parse_args()
src_table = args.src_table
dest_table = args.dest_table

Resist the urge to create tons of variables. Access the values through args instead.
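For example, the parsed values can be read straight off the namespace wherever args is in scope:
print(args.src_table, args.dest_table)  # no intermediate variables needed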
However, it is possible to convert all the attributes of args into global variables:
globals().update(**vars(args))
but this pollutes the global namespace.
A better option is to pass the args to a function:
def main(src_table, dest_table, instance_object, instance_id):
    print(src_table, dest_table, instance_object, instance_id)

if __name__ == '__main__':
    import argparse

    parser = argparse.ArgumentParser(description='User inputs source and destination tables to transfer data.')
    parser.add_argument('src_table', help='Source table not supplied.', type=str)
    parser.add_argument('dest_table', help='Destination table not supplied.', nargs='?', type=str)  # optional arg
    parser.add_argument('instance_object', help='New item not supplied.', nargs='?', type=str)
    parser.add_argument('instance_id', help='Item ID not supplied.', nargs='?', type=int)
    args = parser.parse_args()
    main(**vars(args))
Thus, inside main, all of the attributes of args will be accessible as local variables. Structuring your program this way allows your code to be run as a script and imported as a module. This makes your code more versatile, and is much nicer than "polluting" the global namespace with lots of variables.
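For example, assuming the script above is saved as transfer.py (an illustrative name, not from the question), another module could reuse the same entry point without going through the CLI:
from transfer import main

# Call the entry point with values supplied from code instead of from argparse
main('users', 'users_archive', None, None)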

Related

python argparse default with nargs won't work [duplicate]

Here is my code:
from argparse import ArgumentParser, RawTextHelpFormatter

example_text = "test"

parser = ArgumentParser(description='my script.',
                        epilog=example_text,
                        formatter_class=RawTextHelpFormatter)
parser.add_argument('host', type=str, default="10.10.10.10",
                    help="Device IP address or Hostname.")
parser.add_argument('-j', '--json_output', type=str, default="s", nargs='?', choices=["s", "l"],
                    help="Print GET statement in json form.")

# mutually exclusive required settings supplying the key
settingsgroup = parser.add_mutually_exclusive_group(required=True)
settingsgroup.add_argument('-k', '--key', type=str,
                           help="the api-key to use. WARNING take care when using this, the key specified will be in the user's history.")
settingsgroup.add_argument('--config', type=str,
                           help="yaml config file. All parameters can be placed in the yaml file. Parameters provided from the command line will take priority.")

args = parser.parse_args()
print(args.json_output)
my output:
None
Everything I am reading online says this should work, but it doesn't. Why?
You could use the const= parameter. With nargs='?', default= only applies when the option is absent entirely; when -j is given without a value, argparse stores const=, which is None unless you set it.
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('-j', '--json-output', nargs='?', choices=['s', 'l'], const='s')
args = parser.parse_args()
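To see how default= and const= interact with nargs='?', the parser above can be exercised with explicit argument lists (a quick sketch):
print(parser.parse_args(['-j']))       # Namespace(json_output='s')   <- const used: -j present with no value
print(parser.parse_args(['-j', 'l']))  # Namespace(json_output='l')
print(parser.parse_args([]))           # Namespace(json_output=None)  <- no default= set, option absent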
However, design-wise it might be better to use the following:
import argparse
parser = argparse.ArgumentParser()
parser.add_argument('-o', '--output-type', choices=['json-s', 'json-l', 'normal'], default='normal')
args = parser.parse_args()

Store argparse input to use as variable

I am using argparse to require input from the user for a hardware id that is then used later on. I cannot work out how to get it so the user types
<command> --id <id>
Please help me see where I'm going wrong! Thanks
parser = argparse.ArgumentParser(description='Return a list of useful information after specifying a hardware/asset ID')
parser.add_argument('--id', type=str, required=True, help='A hardware/asset id to provide information on')
args = vars(parser.parse_args())
args = parser.parse_args()

def main():
    hardware_id = hardware_id_input
    host = get_host_information(hardware_id)
    print(host["hostname"])
    print(host["hardware_id"])
Ditch the call to vars. You would have your argument stored as args.id after parsing. You would then call your main with args.id as input.
Edit: added a code sample
def main(hw_id):
    print(hw_id)

if __name__ == "__main__":
    import argparse

    parser = argparse.ArgumentParser(description='Description.')
    parser.add_argument('--id', type=str, help='The hardware id.', required=True)
    args = parser.parse_args()
    main(args.id)
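Run from a shell it behaves as expected (script name and id are illustrative):
$ python get_host_info.py --id ABC123
ABC123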

Store values of mutually exclusive options in same argument in Python argparse

I have a Python script that can deploy some software in three different environments, let's call them development, testing and production. In order to select which environment the script should work with, I want to use mutually exclusive flags, like so:
./deploy.py --development
./deploy.py --testing
./deploy.py --production
So far I have this:
parser = argparse.ArgumentParser(description="Manage deployment")
group = parser.add_mutually_exclusive_group()
group.add_argument("-d", "--development", action='store_true', required=False)
group.add_argument("-t", "--testing", action='store_true', required=False)
group.add_argument("-p", "--production", action='store_true', required=False)
args = parser.parse_args()
Thing is, I want to have the environment in a single variable, so I can easily check it, instead of having to manually check args.development, args.testing and args.production.
Is there a way of having a common variable that the three arguments could write to so I could do something like args.environment?
Instead of using action='store_true', you can use action='store_const', assigning an individual const value to each argument and using the dest option of add_argument so that all the arguments in the mutually exclusive group share the same destination in the object returned by parse_args.
parser = argparse.ArgumentParser(description="Manage deployment")
group = parser.add_mutually_exclusive_group()
group.add_argument("-d", "--development", action='store_const', dest='environment', const='development', required=False)
group.add_argument("-t", "--testing", action='store_const', dest='environment', const='testing', required=False)
group.add_argument("-p", "--production", action='store_const', dest='environment', const='production', required=False)
args = parser.parse_args()
print(args)
The output is:
$ ./test_args.py --production
Namespace(environment='production')
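Downstream code then only needs to look at args.environment. A small sketch of guarding against the no-flag case (the group above is not required, so the value is None when no flag is given):
if args.environment is None:
    parser.error('please pick one of -d, -t or -p')
print('Deploying to', args.environment)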
Instead of 3 arguments, you can just use one with choices:
parser = argparse.ArgumentParser(description="Manage deployment")
parser.add_argument("environment", choices=['development', 'testing', 'production'])
args = parser.parse_args()
print(args)
Examples:
$ ./test_args.py
usage: test_args.py [-h] {development,testing,production}
test_args.py: error: the following arguments are required: environment
$ ./test_args.py testing
Namespace(environment='testing')

argparse - Modify the 'required' state of a subcommand's parent arguments

I'm trying to build a CLI in Python and I have an argument (--arg) that I want to reuse across multiple subcommands (req and opt).
But one subcommand (req) requires --arg and the other (opt) doesn't. How do I resolve this without having to make two versions of the same argument?
import argparse

arg_1 = argparse.ArgumentParser(add_help=False)
arg_1.add_argument('-a', '--arg', required=True,
                   help='reusable argument')

parser = argparse.ArgumentParser()
subp = parser.add_subparsers()
cmd_require = subp.add_parser('req', parents=[arg_1],
                              help='this subcommand requires --arg')
cmd_optional = subp.add_parser('opt', parents=[arg_1],
                               help="this subcommand doesn't require --arg")
I don't know any 'native' argparse feature that does that. However, I thought of 2 different approaches to solve your problem.
Validate args in a separate function -
Sometimes CLI applications get complicated, and by adding a validator function you can 'complete' the missing argparse features you wish for.
import argparse

arg_1 = argparse.ArgumentParser(add_help=False)
arg_1.add_argument('-a', '--arg', required=False,
                   help='reusable argument')

parser = argparse.ArgumentParser()
subp = parser.add_subparsers(dest='sub_parser')
cmd_require = subp.add_parser('req', parents=[arg_1],
                              help='this subcommand requires --arg')
cmd_optional = subp.add_parser('opt', parents=[arg_1],
                               help="this subcommand doesn't require --arg")

def validate_args(args):
    print(args)
    if args.sub_parser == 'req' and not args.arg:
        print("Invalid usage! using 'req' requires 'arg'")
        exit(1)

if __name__ == '__main__':
    args = parser.parse_args()
    validate_args(args)
Note:
I used dest for the subparsers in order to later identify the chosen subparser.
With argparse, if an optional argument is not passed, its value will be None.
"prepared argument" -
Although argparse doesn't support a reusable argument object, you could 'prepare' an argument by unpacking a tuple and a dict (*args, **kwargs):
import argparse

arg_name = ('-a', '--arg')
arg_dict = {'help': 'reusable argument'}

parser = argparse.ArgumentParser()
subp = parser.add_subparsers()
cmd_require = subp.add_parser('req',
                              help='this subcommand requires --arg')
cmd_optional = subp.add_parser('opt',
                               help="this subcommand doesn't require --arg")
cmd_optional.add_argument(*arg_name, **arg_dict, required=False)
cmd_require.add_argument(*arg_name, **arg_dict, required=True)

if __name__ == '__main__':
    args = parser.parse_args()
    print(args)
I like the first approach better.
Hope you find that useful.
import argparse

arg_1 = argparse.ArgumentParser(add_help=False)
foobar = arg_1.add_argument('-a', '--arg', required=True,
                            help='reusable argument')
arg_1 is a parser object. When you call the add_argument method, it creates an Action object and adds it to the arg_1._actions list. I just saved a reference to that Action in the foobar variable.
parser = argparse.ArgumentParser()
subp = parser.add_subparsers()
The parents mechanism adds the arg_1._actions entries to the cmd_require._actions list. So foobar will appear in both subparsers. It is a copy by reference, which is common in Python.
cmd_require = subp.add_parser('req', parents=[arg_1],
                              help='this subcommand requires --arg')
cmd_optional = subp.add_parser('opt', parents=[arg_1],
                               help="this subcommand doesn't require --arg")
foobar.required = False will turn off that attribute, but will do so for both subparsers. I've seen this issue come up when people wanted to assign different default attributes.
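A quick sketch of that consequence, continuing the snippet above: both subparsers reference the same Action object, so flipping the attribute later relaxes the requirement everywhere:
foobar.required = False
print(parser.parse_args(['req']))   # Namespace(arg=None): 'req' no longer insists on --arg either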
The parents mechanism is just a shortcut that is occasionally useful, but not always. It doesn't do anything special; it just saves a bit of typing. There are plenty of other ways of defining an Action with the same flags in both subparsers:
typing it twice (horror of horrors!)
copy-n-paste
writing a utility function to add arguments to subparsers, as sketched below (see Larry Wall's Three Virtues)
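For the third option, a rough sketch of such a helper (the function name is illustrative, not from the original post):
import argparse

def add_arg(subparser, required):
    # One place to define the shared option; only 'required' differs per subcommand
    subparser.add_argument('-a', '--arg', required=required, help='reusable argument')

parser = argparse.ArgumentParser()
subp = parser.add_subparsers(dest='sub_parser')
add_arg(subp.add_parser('req', help='this subcommand requires --arg'), required=True)
add_arg(subp.add_parser('opt', help="this subcommand doesn't require --arg"), required=False)

args = parser.parse_args()
print(args)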

Python calling a module that uses argparse

This is probably a silly question, but I have a Python script that currently takes in a bunch of arguments using argparse, and I would like to load this script as a module in another Python script, which is fine. But I am not sure how to call the module, as no function is defined; can I still call it the same way I would if I were just invoking it from the command line?
Here is the child script:
import argparse as ap
from subprocess import Popen, PIPE

parser = ap.ArgumentParser(description='Gathers parameters.')
parser.add_argument('-f', metavar='--file', type=ap.FileType('r'), action='store', dest='file',
                    required=True, help='Path to json parameter file')
parser.add_argument('-t', metavar='--type', type=str, action='store', dest='type',
                    required=True, help='Type of parameter file.')
parser.add_argument('-g', metavar='--group', type=str, action='store', dest='group',
                    required=False, help='Group to apply parameters to')

# Gather the provided arguments as an array.
args = parser.parse_args()

... Do stuff in the script
and here is the parent script that I want to invoke the child script from; it also uses argparse and does some other logic
from configuration import parameterscript as paramscript
# Can I do something like this?
paramscript('parameters/test.params.json', test)
Inside the configuration directory, I also created an __init__.py file that is empty.
The first argument to parse_args is a list of arguments. By default it's None which means use sys.argv. So you can arrange your script like this:
import argparse as ap

def main(raw_args=None):
    parser = ap.ArgumentParser(description='Gathers parameters.')
    parser.add_argument('-f', metavar='--file', type=ap.FileType('r'), action='store', dest='file',
                        required=True, help='Path to json parameter file')
    parser.add_argument('-t', metavar='--type', type=str, action='store', dest='type',
                        required=True, help='Type of parameter file.')
    parser.add_argument('-g', metavar='--group', type=str, action='store', dest='group',
                        required=False, help='Group to apply parameters to')
    # Gather the provided arguments as an array.
    args = parser.parse_args(raw_args)
    print(vars(args))

# Run with command line arguments precisely when called directly
# (rather than when imported)
if __name__ == '__main__':
    main()
And then elsewhere:
from first_module import main
main(['-f', '/etc/hosts', '-t', 'json'])
Output:
{'group': None, 'file': <_io.TextIOWrapper name='/etc/hosts' mode='r' encoding='UTF-8'>, 'type': 'json'}
There may be a simpler and more pythonic way to do this, but here is one possibility using the subprocess module:
Example:
child_script.py
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-n", "--name", help="your name")
args = parser.parse_args()

print("hello there {}".format(args.name))
Then another Python script can call that script like so:
calling_script.py:
import subprocess
# using Popen may suit better here depending on how you want to deal
# with the output of the child_script.
subprocess.call(["python", "child_script.py", "-n", "Donny"])
Executing the above script would give the following output:
"hello there Donny"
Another option is to call it as a subprocess, like below:
import subprocess

childproc = subprocess.Popen(['python', 'childscript.py', '-file', 'yourjsonfile'],
                             stdout=subprocess.PIPE)
op, oe = childproc.communicate()
print(op)
