I'm trying to access the "resources" folder using ArgumentParser.
This code and the "resources" folder are in the same folder...
Just to try running the code, I've put a print call in the predict function. However, this error occurs:
predict.py: error: the following arguments are required: resources_path
How can I fix it?
from argparse import ArgumentParser

def parse_args():
    parser = ArgumentParser()
    parser.add_argument("resources_path", help='/resources')
    return parser.parse_args()

def predict(resources_path):
    print(resources_path)
    pass

if __name__ == '__main__':
    args = parse_args()
    predict(args.resources_path)
I am guessing from your error message that you are trying to call your program like this:
python predict.py
By default the argument parser takes its arguments from sys.argv, i.e. the command line. You'll have to pass the path yourself, like this:
python predict.py resources
It's possible that you want the resources argument to default to ./resources if you don't pass anything. (And I further assume you want ./resources, not /resources.) For a positional argument you can combine nargs='?' with a default to get that:
...
parser.add_argument('resources_path', nargs='?', default='./resources')
...
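As a side note, parse_args also accepts an explicit list of arguments instead of reading sys.argv, which is handy if you just want to try the parser out from code. A minimal sketch (the 'resources' value here is only an example):

from argparse import ArgumentParser

parser = ArgumentParser()
parser.add_argument('resources_path', nargs='?', default='./resources')

# Parse an explicit list instead of sys.argv (useful for quick experiments)
print(parser.parse_args(['resources']).resources_path)   # -> resources

# With an empty list the default kicks in
print(parser.parse_args([]).resources_path)               # -> ./resources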
I have a use case where I have a main Python script with many command line arguments, and I need to break its functionality into multiple smaller scripts; a few command-line arguments will be common to more than one of the smaller scripts. I want to reduce code duplication. I tried to use decorators to register each argument to one or more scripts, but am not able to get around an error. Another caveat is that I want to set default values for shared arguments according to which script is being run. This is what I have currently:
argument_parser.py
import argparse
import functools
import itertools
from scripts import Scripts
from collections import defaultdict

_args_register = defaultdict(list)

def argument(scope):
    """
    Decorator to add an argument to the argument registry
    :param scope: The module name to register the current argument function to; can also be a list of modules
    :return: The decorated function after adding it to the registry
    """
    def register(func):
        if isinstance(scope, Scripts):
            _args_register[scope].append(func)
        elif isinstance(scope, list) and Scripts.ALL in scope:
            _args_register[Scripts.ALL].append(func)
        else:
            for module in scope:
                _args_register[module].append(func)
        return func
    return register

class ArgumentHandler:
    def __init__(self, script, parser=None):
        self._parser = parser or argparse.ArgumentParser(description=__doc__)
        assert script in Scripts
        self._script = script

    @argument(scope=Scripts.ALL)
    def common_arg(self):
        self._parser.add_argument("--common-arg",
                                  default=self._script,
                                  help="An arg common to all scripts")

    @argument(scope=[Scripts.TRAIN, Scripts.TEST])
    def train_test_arg(self):
        self._parser.add_argument("--train-test-arg",
                                  default=self._script,
                                  help="An arg common to train-test scripts added in argument handler")

    def parse_args(self):
        for argument in itertools.chain(_args_register[Scripts.ALL],
                                        _args_register[self._script]):
            argument()
        _args = self._parser.parse_args()
        return _args
One of the smaller scripts, train.py:
"""
A Train script to abstract away training tasks
"""
import argparse
from argument_parser import ArgumentHandler
from scripts import Scripts
current = Scripts.TRAIN
parser = argparse.ArgumentParser(description=__doc__)
def get_args() -> argparse.Namespace:
parser.add_argument('--train-arg',
default='blah',
help='a train argumrnt set in the train script')
args_handler = ArgumentHandler(parser=parser, script=current)
return args_handler.parse_args()
if __name__ == '__main__':
print(get_args())
When I run train.py I get the following error
File "../argument_parser.py", line 68, in parse_args
argument()
TypeError: common_arg() missing 1 required positional argument: 'self'
Process finished with exit code 1
I think this is because decorators are run at import time, but I'm not sure. Is there any way to work around this, or any better way to reduce code duplication? Any help will be highly appreciated. Thanks!
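Not a definitive fix, but a sketch of one possible workaround, assuming the hypothesis above is right: the decorator runs while the class body is being executed, so it stores the plain function object, which is not bound to any instance when parse_args later calls it. Registering the method names instead and looking them up on the instance with getattr sidesteps that. The sketch below simplifies the Scripts.ALL-inside-a-list handling from the question; anything not in the original code is an assumption:

import argparse
import itertools
from collections import defaultdict

from scripts import Scripts  # the question's enum of script names

_args_register = defaultdict(list)

def argument(scope):
    def register(func):
        scopes = scope if isinstance(scope, list) else [scope]
        for module in scopes:
            # Store the attribute *name*; the function object itself is
            # unbound because the class does not exist yet at this point.
            _args_register[module].append(func.__name__)
        return func
    return register

class ArgumentHandler:
    def __init__(self, script, parser=None):
        self._parser = parser or argparse.ArgumentParser(description=__doc__)
        self._script = script

    @argument(scope=Scripts.ALL)
    def common_arg(self):
        self._parser.add_argument("--common-arg", default=self._script,
                                  help="An arg common to all scripts")

    @argument(scope=[Scripts.TRAIN, Scripts.TEST])
    def train_test_arg(self):
        self._parser.add_argument("--train-test-arg", default=self._script,
                                  help="An arg common to train-test scripts")

    def parse_args(self):
        for name in itertools.chain(_args_register[Scripts.ALL],
                                    _args_register[self._script]):
            # Look the method up on the instance so it is bound to self.
            getattr(self, name)()
        return self._parser.parse_args()

With that change parse_args calls the registered methods as bound methods, so the missing 'self' error should go away.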
Background
I am trying to write a Python script that contains multiple functions like this:
import sys

def util1(x, y):
    assert x is not None
    assert y is not None
    # does something

def util2(x, y):
    assert x is not None
    assert y is not None
    # does something

def util3(x, y):
    assert x is not None
    assert y is not None
    # does something
I need to be able to call any of these functions from the command line:
python3 myscript.py util1 arg1 arg2
or
python3 myscript.py util3 arg1 arg2
Problem
I don't know the proper way to grab the command line args and pass them to the functions. I found a way to grab the first arg... but I would like a way to say "pass all args to function x", if this is possible.
What I've tried so far
So far, at the bottom of my script, I added the following logic:
if __name__ == '__main__':
    globals()[sys.argv[1]]()
and so now, when I try to run my script, I get the following response:
lab-1:/var/www/localhost/htdocs/widgets# python3 myscript.py utils1 1 99999
Traceback (most recent call last):
File "myscript.py", line 62, in <module>
globals()[sys.argv[1]]()
TypeError: util1() missing 2 required positional arguments: 'x' and 'y'
I've also tried the following:
globals()[*sys.argv[1:]]()
globals()[*sys.argv[1]:[2]]()
But that doesn't work. I'm getting errors like "TypeError: unhashable type: 'list'".
If you can point me in the right direction, I'd appreciate it.
Thanks.
EDIT 1
Based on the recommendation here to review a similar post, I changed my logic to include the argparse library. So now I have the following:
parser = argparse.ArgumentParser(description='This is the description of my program')
parser.add_argument('-lc','--lower_create', type=int, help='lower range value for util1')
parser.add_argument('-uc','--upper_create', type=int, help='upper range value for util1')
parser.add_argument('-lr','--lower_reserve', type=int, help='lower range value for util3')
parser.add_argument('-ur','--upper_reserve', type=int, help='upper range value for util3')
args = parser.parse_args()
#if __name__ == '__main__':
# globals()[sys.argv[1]](sys.argv[2], sys.argv[3])
What's not clear is how I "link" these arguments with a specific function.
So let's say I need -lc and -uc for util1. How can I make that association?
And then, for example, how do I associate -lr and -ur with util3?
Thank you
You need to pass the arguments to the function when you call it. The naive way to do this would be like this: globals()[sys.argv[1]](sys.argv[2], sys.argv[3]), although you'll probably want to do some extra checking to make sure the arguments exist, as well as that the function being called exists.
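A rough sketch of that idea with the extra checks filled in (the usage message and function bodies are placeholders, not taken from the question):

import sys

def util1(x, y):
    print('util1 called with', x, y)

def util3(x, y):
    print('util3 called with', x, y)

if __name__ == '__main__':
    if len(sys.argv) < 2:
        sys.exit('usage: myscript.py <function> [args...]')
    name, call_args = sys.argv[1], sys.argv[2:]
    func = globals().get(name)
    if not callable(func):
        sys.exit('Unknown function: ' + name)
    # Unpack whatever was given; a wrong argument count still raises TypeError
    func(*call_args)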
That is a nice question.
Try it like this.
import sys

def util1(x, y):
    print('This is "util1" with the following arguments: "' + x + '" and "' + y + '"')
    # does something

def util2(x, y):
    print('This is "util2" with the following arguments: "' + x + '" and "' + y + '"')
    # does something

def util3(x, y):
    print('This is "util3" with the following arguments: "' + x + '" and "' + y + '"')
    # does something

locals()[sys.argv[1]](sys.argv[2], sys.argv[3])
Then calling it like this works great for me. Just tried it on my test machine.
python file.py util1 arg1 arg2
You could do this quite neatly with click, e.g.
import click

@click.command()
@click.argument('x')
@click.argument('y')
def util1(x, y):
    pass  # does something
You can also use varargs, so you don't have to specify every argument:
@click.command()
@click.argument('args', nargs=-1)
def util2(args):
    pass  # does something, args is a tuple of all remaining arguments
Click also supports different argument types, validation, etc.
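To make those commands callable from one script you would typically attach them to a click group; a minimal sketch of how that wiring might look (the group name cli and the command bodies are assumptions):

import click

@click.group()
def cli():
    """Entry point that dispatches to the sub-commands below."""

@cli.command()
@click.argument('x')
@click.argument('y')
def util1(x, y):
    click.echo('util1 called with {} and {}'.format(x, y))

@cli.command()
@click.argument('args', nargs=-1)
def util2(args):
    click.echo('util2 called with {}'.format(list(args)))

if __name__ == '__main__':
    cli()

This would then be invoked as python myscript.py util1 1 99999 or python myscript.py util2 a b c.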
In the following example, how do I pass args of run_tests() to pytest.main(...) so that I can use args for the test methods of TestFooBar in test_module.py?
my_module.py
import pytest

def run_tests(args):
    # How do I pass parameter 'args' to pytest here?
    pytest.main(['-q', '-s', 'test_module.py::TestFooBar'])
test_module.py
import pytest

class TestFooBar():
    # How do I get 'args' of 'run_tests()' from 'my_module.py' here?

    @pytest.mark.parametrize("args", [args])
    def test_something(args):
        assert 'foo' == args['foo']

    @pytest.mark.parametrize("args", [args])
    def test_something_else(args):
        assert 'bar' == args['bar']
If you execute pytest.main then you are doing the equivalent of calling py.test from the command line, so the only way to pass arguments that I am aware of is via a command line parameter. For this to be possible, your parameters need to be convertible to and from strings.
Basically this means creating a conftest.py with the following:
def pytest_addoption(parser):
    parser.addoption('--additional_arguments', action='store', help='some helptext')

def pytest_configure(config):
    args = config.getoption('additional_arguments')
Now do something with args: deserialize it, make it a global variable, make it a fixture, anything you want. Fixtures from conftest.py are available to all your tests.
Needless to say, your call should now include the new parameter:
pytest.main(['-q', '-s', '--additional_arguments', args_string, 'test_module.py::TestFooBar'])
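Putting the pieces together, a minimal sketch of one way to do it; the fixture name additional_arguments and the use of json as the serialization format are assumptions, not something the question prescribes:

# conftest.py
import json
import pytest

def pytest_addoption(parser):
    parser.addoption('--additional_arguments', action='store', default='{}',
                     help='serialized arguments forwarded from pytest.main')

@pytest.fixture
def additional_arguments(request):
    # Deserialize the command line string back into a dict
    return json.loads(request.config.getoption('additional_arguments'))

# my_module.py
import json
import pytest

def run_tests(args):
    pytest.main(['-q', '-s',
                 '--additional_arguments', json.dumps(args),
                 'test_module.py::TestFooBar'])

# test_module.py
class TestFooBar:
    def test_something(self, additional_arguments):
        assert additional_arguments['foo'] == 'foo'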
I am importing a Python module that is already using argparse; however, I would like to use argparse in my script as well. How should I go about doing this?
I'm receiving an unrecognized arguments error when using the following code and invoking the script with a -t flag:
Snippet:
#!/usr/bin/env python
...
import conflicting_module
import argparse
...
#################################
# Step 0: Configure settings... #
#################################
parser = argparse.ArgumentParser(description='Process command line options.')
parser.add_argument('--test', '-t')
Error:
unrecognized arguments: -t foobar
You need to guard the code in your imported module with
if __name__ == '__main__':
    ...
so that it doesn't run initialization code such as argument parsing on import. See What does if __name__ == "__main__": do?.
So, in your conflicting_module do
if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Process command line options in conflicting_module.py.')
    parser.add_argument('--conflicting', '-c')
    ...
instead of just creating the parser globally.
If the parsing in conflicting_module is a mandatory part of application configuration, consider using
args, rest = parser.parse_known_args()
in your main module and passing rest to conflicting_module, where you'd pass either None or rest to parse_args:
args = parser.parse_args(rest)
That is still a bit of an anti-pattern; ideally, the classes and functions in conflicting_module would receive already-parsed configuration arguments from your main module, which would be responsible for parsing them.
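A rough sketch of that parse_known_args arrangement; the configure function name is an assumption, the rest mirrors the snippets above:

# main module
import argparse
import conflicting_module

parser = argparse.ArgumentParser(description='Process command line options.')
parser.add_argument('--test', '-t')
args, rest = parser.parse_known_args()

# Hand only the leftover argv tokens to the imported module
conflicting_module.configure(rest)

# conflicting_module.py
import argparse

def configure(rest=None):
    parser = argparse.ArgumentParser(description='Process command line options in conflicting_module.py.')
    parser.add_argument('--conflicting', '-c')
    # parse_args(None) falls back to sys.argv; passing rest uses only the leftovers
    return parser.parse_args(rest)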
When I run this code I get
AttributeError: 'ArgumentParser' object has no attribute 'max_seed'
Here's the code
import argparse
import ConfigParser
CFG_FILE='/my.cfg'
# Get command line arguments
args = argparse.ArgumentParser()
args.add_argument('verb', choices=['new'])
args.add_argument('--max_seed', type=int, default=1000)
args.add_argument('--cmdline')
args.parse_args()
if args.max_seed:
    pass

if args.cmdline:
    pass
My source file is called "fuzz.py"
You should first initialize the parser and arguments and only then get the actual arguments from parse_args() (see example from the docs):
import argparse
import ConfigParser
CFG_FILE='/my.cfg'
# Get command line arguments
parser = argparse.ArgumentParser()
parser.add_argument('verb', choices=['new'])
parser.add_argument('--max_seed', type=int, default=1000)
parser.add_argument('--cmdline')
args = parser.parse_args()
if args.max_seed:
    pass

if args.cmdline:
    pass
Hope that helps.
If you use argparse parsed arguments inside another class (somewhere you do self.args = parser.parse_args()), you might need to explicitly tell your lint parser to ignore Namespace type checking, as told by @frans at Avoid Pylint warning E1101: 'Instance of .. has no .. member' for class with dynamic attributes:
Just to provide the answer that works for me now: as [The Compiler][1] suggested, you can add a rule for the problematic class in your project's .pylintrc:
[TYPECHECK]
ignored-classes=Namespace
[1]: https://stackoverflow.com/users/2085149/the-compiler