how to pass argparse arguments to a class - python

I'm very new to coding in general and Python in particular. I'm trying to learn how to pass the argparse arguments I have created into a class for use, the right/recommended way. In addition to learning Python, I'm trying to learn how to do things in an OOP manner, so that learning other OOP-type languages comes a bit easier.
So here's a sample of what I am trying to do:
import argparse

class passyourcliargstome():
    def __init__(self, whatdoiputheretogetmycliargs):
        # how do I get my cli args here?
        pass

    def otherfunctionsthatdothings():
        pass

if __name__ == '__main__':
    # grab the arguments when the script is run
    parser = argparse.ArgumentParser(
        description='Make things happen.')
    parser.add_argument('-f', '--foo', action='store_true', default=False, help='here be dragons')
    parser.add_argument('-b', '--bar', action='store_true', default=False, help='here be more dragons')
    passyourcliargstome.otherfunctionsthatdothings()
So, I'm defining argparse arguments outside of the main class and want to know how to get them inside the class. Is this even the right way to do it? Should I just make argparse a function under my class?
Thank you in advance for any assistance, references, etc.
Edit: 11/16 2:18 EST
Note: Since I don't have enough rep to answer my own question, this is my only recourse for posting a proper answer.
Okay, it took me some doing, but I managed to piece this together. RyPeck's answer helped me in getting my arguments (something my code was missing), but afterwards I was getting unbound method errors when I was trying to test the code. I had no idea what that meant. Did I mention that I live up to my screen name?
It didn't really click until I found and read this. Here is my working code. If anyone has anything to add to this, up to and including "You're still doing it wrong, do it this way, the right way." I'm all ears. In the meantime, thanks for your help.
import argparse

class Passyourcliargstome(object):
    def __init__(self):
        # here's how I got my args here
        self.foo = args.foo
        self.bar = args.bar

    def otherfunctionsthatdothings(self):
        print "args inside of the main class:"
        print self.foo
        print self.bar

if __name__ == '__main__':
    # grab the arguments when the script is run
    parser = argparse.ArgumentParser(description='Make things happen.')
    parser.add_argument('-f', '--foo', action='store_true', default=False, help='here be dragons')
    parser.add_argument('-b', '--bar', action='store_true', default=False, help='here be more dragons')
    args = parser.parse_args()
    print "args outside of main:"
    print args.foo
    print args.bar
    # this was the part that I wasn't doing: creating an instance of my class.
    shell = Passyourcliargstome()
    shell.otherfunctionsthatdothings()
Running this code with no arguments prints False four times: twice outside of the class instance and twice within it.

Use parser.parse_args and wrap it with vars to convert the special argparse Namespace type to a regular Python dict. In general, you want this pattern:
def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('foo')
    parser.add_argument('bar')
    args = parser.parse_args()
    args_dict = vars(args)
After that, you can pass the arguments explicitly or all at once to whatever class or function will take them. For a class, it is best to name the required arguments explicitly, like so:
class MyClass(object):
    def __init__(self, foo, bar):
        self.foo = foo
        self.bar = bar

    def Print(self):
        print self.foo
        print self.bar
Now you can put those two together like this:
import argparse

class MyClass(object):
    def __init__(self, foo, bar):
        self.foo = foo
        self.bar = bar

    def Print(self):
        print self.foo
        print self.bar

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('foo')
    parser.add_argument('bar')
    args = parser.parse_args()

    c1 = MyClass(args.foo, args.bar)

    args_dict = vars(args)
    c2 = MyClass(**args_dict)
Both c1 and c2 will be created. The best approach, though, is to pass the arguments explicitly, as is done for c1.

A simple for loop can pass the arguments along (that is, set them as attributes on your class):
args_dict = vars(self.parser.parse_args())
# using argparse arguments as attributes of this (self) class
for item in args_dict:
    setattr(self, item, args_dict[item])
But maybe the more elegant way is to build the parser inside your class and let argparse set the attributes directly on the instance via the namespace argument:
from argparse import ArgumentParser

class Foo:
    def __init__(self):
        self.parser = ArgumentParser()
        self.parser.add_argument('-f', '--foo', default=False, action='store_true', help='foo or not?')
        self.parser.add_argument('-b', '--bar', default=0, action='store', help='set the bar')
        self.parser.parse_args(namespace=self)
An empty command line is then equivalent to:
class Foo:
    def __init__(self):
        self.foo = False
        self.bar = 0
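For completeness, a small hedged usage sketch of the namespace=self pattern above; the argv parameter is an addition (not part of the answer) that lets you feed the parser an explicit list, which is handy for testing:
from argparse import ArgumentParser

class Foo:
    def __init__(self, argv=None):
        self.parser = ArgumentParser()
        self.parser.add_argument('-f', '--foo', default=False, action='store_true', help='foo or not?')
        self.parser.add_argument('-b', '--bar', default=0, action='store', help='set the bar')
        # with namespace=self, the parsed options land directly on this instance
        self.parser.parse_args(argv, namespace=self)

if __name__ == '__main__':
    f = Foo()                          # parses the real command line
    g = Foo(['--foo', '--bar', '7'])   # parses an explicit list instead
    print(f.foo, f.bar)                # False 0 when run with no arguments
    print(g.foo, g.bar)                # True 7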

You have to do the following after you add your arguments.
args = parser.parse_args()
If you do a print on args, you'll see that you have all the arguments in a Namespace object.
You can then access them like so -
print args.foo
print args.bar
From there, you can treat them like normal variables. See the argparse documentation for more detail.
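Tying this back to the original question, here is a minimal hedged sketch of one reasonable pattern (not the only one): parse in the __main__ block and hand the values to the class through its constructor. The class name is just a cleaned-up version of the one in the question.
import argparse

class PassYourCliArgsToMe(object):
    def __init__(self, foo, bar):
        # store only the values the class actually needs
        self.foo = foo
        self.bar = bar

    def otherfunctionsthatdothings(self):
        print(self.foo, self.bar)

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Make things happen.')
    parser.add_argument('-f', '--foo', action='store_true', help='here be dragons')
    parser.add_argument('-b', '--bar', action='store_true', help='here be more dragons')
    args = parser.parse_args()
    shell = PassYourCliArgsToMe(args.foo, args.bar)
    shell.otherfunctionsthatdothings()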

Related

Python: The proper way to declare class instance parameters when initialising argparse.ArgumentParser object inside a class

I'm trying to write a Python script to create a CLI command. To make it more readable, I'm creating the argparse.ArgumentParser object directly in the class's __init__() (not in the '__main__' part of the script, as is usually the case in tutorials).
The approach included in the sample code below works, but it looks messy.
I wonder whether this is the correct way to do it or whether I should do something else. (Sorry for the newbie question.)
import argparse

class Command:
    def __init__(self, *args):
        self.parser = argparse.ArgumentParser(usage='command [--first] [--second]', description='This is a sample command')
        self.parser.add_argument('--first', type=str, required=True, help='first argument', dest='first_argument')
        self.parser.add_argument('--second', type=str, required=True, help='second argument', dest='second_argument')
        args = self.parser.parse_args()
        self.first_argument = args.first_argument
        self.second_argument = args.second_argument

    def some_operation(self):
        concatenated_str = self.first_argument + self.second_argument
        return concatenated_str

if __name__ == '__main__':
    command = Command()  # creating class instance
    print(command.some_operation())
Putting it in a separate method in this way doesn't work and produces an AttributeError:
# Code above
    def set_params(self):
        args = self.parser.parse_args()
        self.first_argument = args.first_argument
        self.second_argument = args.second_argument
        return self.first_argument, self.second_argument
# Code below
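No answer to this related question is reproduced here, so purely as a hedged sketch (an assumption, not the poster's own solution): the AttributeError goes away if set_params runs only after the parser exists and is called before the attributes are used, for example from __init__ itself:
import argparse

class Command:
    def __init__(self):
        self.parser = argparse.ArgumentParser(description='This is a sample command')
        self.parser.add_argument('--first', required=True, dest='first_argument', help='first argument')
        self.parser.add_argument('--second', required=True, dest='second_argument', help='second argument')
        self.set_params()  # the parser exists by now, so the attributes get set

    def set_params(self):
        args = self.parser.parse_args()
        self.first_argument = args.first_argument
        self.second_argument = args.second_argument

    def some_operation(self):
        return self.first_argument + self.second_argument

if __name__ == '__main__':
    print(Command().some_operation())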

How do I wrap argparse options into the python class?

I am pretty new to Python OOP, so I am a bit confused.
Currently I have:
parser = argparse.ArgumentParser(description='script 1.0')
parser.add_argument('-a', '--foo', help='specify foo')
parser.add_argument('-b', '--bar', type=int, help='specify bar')
parser.add_argument('-c', '--baz', help='specify baz')
parser.add_argument('-d', '--bar2', help='bar2')
args = parser.parse_args()
foo = args.foo
bar = args.bar
baz = args.baz
bar2 = args.bar2
which works pretty well, but I want to create a class for the whole of my script and make argparse a method of that class (is it possible at all?).
So I tried:
import argparse
....

class Program:
    def __init__(self, foo, bar, baz, bar2):
        self.foo = foo
        self.bar = bar
        # ...(so on for each arg)

    def main():
        parser = argparse.ArgumentParser(description='script 1.0')
        parser.add_argument('-a', '--foo', help='specify foo')
        parser.add_argument('-b', '--bar', type=int, help='specify bar')
        parser.add_argument('-c', '--baz', help='specify baz')
        parser.add_argument('-d', '--bar2', help='bar2')
        args = parser.parse_args()
        foo = self.foo
        bar = self.bar
        baz = self.baz
        bar2 = self.bar2
I do not think I am doing it right, though. I have not found much info about it apart from one post on SO, which did not clarify the situation for me, so I would like opinions on my specific case.
argparse already makes use of classes. argparse.ArgumentParser(...) creates a parser object. parser.add_argument(...) creates an Action object and adds it to the parser's list of actions; it also returns the Action to your code, which can be handy in some advanced uses. parse_args() returns an argparse.Namespace object.
You can subclass the argparse classes to customize their behavior.
But usually you don't need to create your own parser class - unless you need a lot of special behavior.
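As a small hedged sketch of that kind of subclassing (the class name is made up for illustration): overriding error() so the parser raises an exception instead of printing usage and exiting, which is convenient in tests:
import argparse

class RaisingArgumentParser(argparse.ArgumentParser):
    """An ArgumentParser that raises instead of exiting on bad arguments."""
    def error(self, message):
        # the stock error() prints usage and calls sys.exit(2)
        raise ValueError(message)

parser = RaisingArgumentParser(description='script 1.0')
parser.add_argument('-b', '--bar', type=int, help='specify bar')

try:
    parser.parse_args(['--bar', 'not-a-number'])
except ValueError as exc:
    print('bad arguments:', exc)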
Here's what I'd consider to be a clean use of argparse:
import argparse

# your code

def main(args):
    foo = args.foo
    # other uses of args

def other(foo, bar):
    # code using values
    pass

def make_parser():
    parser = argparse.ArgumentParser(description='script 1.0')
    parser.add_argument('-a', '--foo', help='specify foo')
    parser.add_argument('-b', '--bar', type=int, help='specify bar')
    parser.add_argument('-c', '--baz', help='specify baz')
    parser.add_argument('-d', '--bar2', help='bar2')
    return parser

if __name__ == '__main__':
    parser = make_parser()
    args = parser.parse_args()
Wrapping the parser definition in a function is a good idea. In more complex cases it could be a class, but normally you only need one parser per script, so a function is enough. That function can be in the main body of the script, but actual use should be in an if __name__ block, so the script can be imported without using the parser.
This puts the args namespace in the global environment, where it is accessible to all your code. It can be passed whole to your functions, or selected values can be passed:
main(args)
other(args.foo, args.bar)
foo = args.foo
do_things3(foo)
d = vars(args) # args in dictionary form
It's a good idea to write your code so it works as imported classes and/or functions, and when run as a script. The argparse part sets values when run as a script. When imported, the importing script sets the necessary control values, possibly with its own parser.
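A hedged sketch of that idea (the argv parameter is an addition, not part of the answer above): parse_args() accepts an explicit list, so an importing module or a test can bypass the real command line entirely:
import argparse

def make_parser():
    parser = argparse.ArgumentParser(description='script 1.0')
    parser.add_argument('-a', '--foo', help='specify foo')
    parser.add_argument('-b', '--bar', type=int, help='specify bar')
    return parser

def main(argv=None):
    # argv=None means "read sys.argv"; a list means "use these values instead"
    args = make_parser().parse_args(argv)
    print(args.foo, args.bar)

if __name__ == '__main__':
    main()  # normal command-line use
# an importing module could instead call: main(['--foo', 'x', '--bar', '3'])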
I would do it this way:
import argparse

class Program:
    def __init__(self, foo, bar, baz, bar2):
        self.foo = foo
        self.bar = bar
        # ...(so on for each arg)

    def do_things(self):
        pass

def get_args():
    parser = argparse.ArgumentParser(description='script 1.0')
    parser.add_argument('-a', '--foo', help='specify foo')
    parser.add_argument('-b', '--bar', type=int, help='specify bar')
    parser.add_argument('-c', '--baz', help='specify baz')
    parser.add_argument('-d', '--bar2', help='bar2')
    return parser.parse_args()

def main(args):
    instance = Program(args.foo,
                       args.bar,
                       args.baz,
                       args.bar2)
    instance.do_things()

if __name__ == '__main__':
    args = get_args()
    main(args)
Explanation:
Try to separate things...
Parsing the arguments is done in the get_args function, and the resulting args are then passed to main.
The main function is in charge of initializing the object from the parameters passed in args and performing some work on the object.

Is there a way that I can translate the argparse module into some type of object?

I made a python runner program and package like so:
The runner program takes in arguments from the command line and uses argparse to parse them, like:
parser = argparse.ArgumentParser()
parser.add_argument(...
parser.add_argument(...
args = parser.parse_args()
Then it sends them into my package module like this:
packageObject = PackageModule(params=args)
Now I'm making this into a program that does not take in command-line arguments, but I want to keep initialising the PackageObject with that same line.
How can I make something like,
args = ()
args.arguments_1 = 'user_name'
?
You can provide a list to parser.parse_args(), which does not need to come from the command line:
args_list = ['-n', '10', 'hello']
args = parser.parse_args(args_list)
Then you should be able to run the rest of your script as before.
You could define a new class to store the states of the arguments:
class Args(object):
    def __init__(self, arg1, arg2):
        self.arg1 = arg1
        self.arg2 = arg2
args = Args('user_name', 'something else')
packageObject = PackageModule(params=args)
If you're creating multiple args instances, you may consider using __slots__ in the class definition, since you won't be assigning new attributes on the instance; it also makes the instances lightweight:
class Args(object):
    __slots__ = ['arg1', 'arg2']

    def __init__(self, arg1, arg2):
        self.arg1 = arg1
        self.arg2 = arg2
There are several solutions:
- Make the PackageModule object accept arguments, and unpack the args object: PackageModule(**vars(args))
- Just make a lightweight namedtuple (see the sketch below)
- Make a lightweight class
- Pass args as a dict, params=vars(args), and substitute a normal dict
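A hedged sketch of the namedtuple option (the field names arguments_1 and arguments_2 are illustrative, matching the style of the question; PackageModule is the question's own class):
from collections import namedtuple

# a lightweight, immutable stand-in for the argparse Namespace
Args = namedtuple('Args', ['arguments_1', 'arguments_2'])

args = Args(arguments_1='user_name', arguments_2='something else')
print(args.arguments_1)  # attribute access works just like an argparse Namespace
# packageObject = PackageModule(params=args)  # same call as before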

How to avoid using a global variable as a parameter to a decorator?

I am writing a Python script where I am using a decorator (retry is the one I am using) that takes a parameter (tries). I want the parameter to be configurable from a command-line argument. The only way I can figure out how to set the parameter for the decorator is by reading my arguments into a global variable. I hate this from a design perspective. It makes writing unit tests, and anything else that wants to import functions from my script, reliant on the command-line arguments all being the same.
Here is a dumbed down example of the problem I am having:
import argparse
from functools import wraps

def get_args():
    parser = argparse.ArgumentParser()
    parser.add_argument('-t', '--test_value', dest='test_value', required=True, default="sample value")
    args = parser.parse_args()
    return args

args = get_args()

def decorator_example(test_value):
    def deco_example(f):
        @wraps(f)
        def f_example(*args, **kwargs):
            print "The value I need is", test_value
            return f(*args, **kwargs)
        return f_example
    return deco_example

@decorator_example(args.test_value)
def test():
    print "running test"

if __name__ == '__main__':
    test()
If anyone can think of a better way to do this without having args be a global, please share! I have exhausted the internet searching for a better way... I want to call get_args() in main and pass the arguments around as needed.
I don't think a decorator is appropriate here. A class seems more suitable precisely because of the problem you're facing. Something like this:
class Test(object):
    def __init__(self, test_value):
        self.test_value = test_value

    def test(self):
        print "The value I need is", self.test_value
        self._inner_test()

    def _inner_test(self):
        print "running test"

if __name__ == '__main__':
    args = get_args()
    t = Test(args.test_value)
    t.test()
How exactly to structure the class is not clear from the example you have given and would depend on what you're actually doing, but I think something in this vein will provide you with a more robust solution than trying to shoehorn this into decorators.
Classes are designed to maintain state and provide modified behavior based on that state. That is what you're doing. Your application has state that modifies its behavior. Decorators are designed to wrap extra, static functionality around an existing method.
However, if that is unsuitable, another alternative is to simply allow your arguments to be global. Something along the lines of this:
config.py
import argparse

test_value = None

def parse_args():
    parser = argparse.ArgumentParser()
    parser.add_argument('-t', '--test_value', dest='test_value', required=True, default="sample value")
    args = parser.parse_args()
    return args

def configure():
    global test_value
    args = parse_args()
    test_value = args.test_value
main.py
from functools import wraps
import config

def decorator_example(f):
    @wraps(f)
    def f_example(*args, **kwargs):
        print "The value I need is", config.test_value
        return f(*args, **kwargs)
    return f_example

@decorator_example
def test():
    print "running test"

if __name__ == '__main__':
    config.configure()
    test()
One nice side effect of this solution is that it gives you an obvious way to also supplement your arguments with a configuration file. Note that this should actually work, since config.test_value is not read until test is called.
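A hedged sketch of what that could look like (Python 3 syntax; the settings.ini filename and section name are made up, and --test_value is made optional so the file can supply it, unlike the required option above):
# config.py (variant)
import argparse
import configparser

test_value = None

def configure(argv=None):
    global test_value
    parser = argparse.ArgumentParser()
    parser.add_argument('-t', '--test_value', dest='test_value', default=None)
    args = parser.parse_args(argv)
    if args.test_value is not None:
        test_value = args.test_value
    else:
        # fall back to a config file when the option was not given on the command line
        cfg = configparser.ConfigParser()
        cfg.read('settings.ini')
        test_value = cfg.get('defaults', 'test_value', fallback='sample value')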
Separate those things which are useful upon import from those things which are only relevant when run as a script:
from functools import wraps

def decorator_example(test_value):
    def deco_example(f):
        @wraps(f)
        def f_example(*args, **kwargs):
            print "The value I need is", test_value
            return f(*args, **kwargs)
        return f_example
    return deco_example

def base_test():
    print "running test"

if __name__ == '__main__':
    import argparse

    def get_args():
        parser = argparse.ArgumentParser()
        parser.add_argument('-t', '--test_value', dest='test_value',
                            required=True, default="sample value")
        args = parser.parse_args()
        return args

    args = get_args()
    test = decorator_example(args.test_value)(base_test)
    test()
I think the issue of global variables is a red herring here. There is nothing wrong with constant global variables. Every time you import a module, the module name is a global variable. Every time you define a function (at the module level), the function name becomes a global variable.
Problems arise only when functions modify global variables. When that happens, understanding the behavior of functions that depend on the global can become much more complex. If a chain of functions each modify the same global, then you can no longer understand each function as an isolated unit. You have to grok all the functions and how they interact with that global at the same time. This can get complicated quickly and it is why this path often leads to spaghetti code.
This is why modifying global variables should be avoided. But you aren't modifying any global variables here, so I think this is a non-issue. My beef with args.test_value is not that it is global, but rather that there was not sufficient separation of module code versus script code. args.test_value belongs with the script code.
Parse your args in the if __name__ section and pass them to your function as an argument. That way other scripts can specify a value for args instead of relying on command-line arguments.
The problem is that you can't build the decorator (it needs the parsed value) until after the functions you want to apply it to have already been defined. One workaround is to postpone decorating the functions until the value is defined, which in turn requires that they be stored somewhere until that happens.
It also means a global variable will be required temporarily, with its use isolated from the rest of the program. Here's how that could be done with your sample code:
from functools import wraps

class decorator_example(object):
    def __init__(self, f):
        global _funcs
        self.f = f
        try:
            _funcs.append(self)  # remember decorator_example instances
        except NameError:
            _funcs = [self]  # first one

    def __call__(self, *args, **kwargs):
        print 'running decorated {}() function'.format(self.f.__name__)
        return self.f(*args, **kwargs)

def apply_decorator(deco, test_value):
    global _funcs
    for d in _funcs:
        print 'decorating function {}()'.format(d.f.__name__)
        d.f = deco(d.f, test_value)
    del _funcs  # no longer needed

@decorator_example
def test():
    print "running test"

def deco_example(f, test_value):
    @wraps(f)
    def f_example(*args, **kwargs):
        print "The value I need is", test_value
        return f(*args, **kwargs)
    return f_example

if __name__ == '__main__':
    import argparse

    def get_args():
        parser = argparse.ArgumentParser()
        parser.add_argument('-t', '--test_value', dest='test_value',
                            required=True, default="sample value")
        args = parser.parse_args()
        return args

    args = get_args()
    apply_decorator(deco_example, args.test_value)
    test()
If I understand it correctly, the use of global variables can be alleviated with classes and member variables. But in Python, unless you design carefully, you cannot avoid global variables.

Python: How do I usefully unittest a method invoking OptionParser?

In my Python unit tests (via standard nosetests / unittest.TestCase) I have a class where I've created a method that does option parsing. I would like to write a unit test routine that exercises it properly.
More than that, with the questions below, I'm looking for the OPTIMAL way to do this.
import unittest
from optparse import OptionParser

class Something(object):
    def getOptionParser(self):
        return OptionParser()

    def parseMyOptions(self):
        parser = self.getOptionParser()
        parser.add_option('--debug', action='store_true', help='Run debug mode.')
        parser.add_option('--logfile', default='log/logfile.log', help='what filename')
        parser.add_option('--url', type='string', help='desired url')
        (options, args) = parser.parse_args()
        self.debug = options.debug
        self.logfile = options.logfile
        self.url = options.url

    def main(self):
        self.parseMyOptions()

class MockOptionParser(object):
    def __init__(self):
        self.url = None
        self.debug = None
        self.logfile = None

    def add_option(self, *args, action=None, help=None, default=None):
        pass

    def parse_args(self):
        pass

    def error(self):
        pass

class TestSomething(unittest.TestCase):
    def test_something(self):
        s = Something()
        s.getOptionParser = MockOptionParser()
        s.parseMyOptions()
        assert True, "assert something useful?"
So the problems here are:
1. What useful thing can I assert?
2. How do I auto-generate MockOptionParser without reproducing the skeleton of OptionParser?
3. Notice that in MockOptionParser's add_option, I DON'T handle the 'type' keyword, but I should. I'm sure I won't predict all the methods and keyword args that OptionParser uses and implement them in MockOptionParser. Is there a simpler way?
4. Initializing MockOptionParser with self vars for each option is UGLY. Is there a way to do this nicely?
5. Is there an easy way to verify all the add_option() vars (options.debug, for instance) are used to set something like self.debug?
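One common approach, offered here only as a hedged sketch (assuming Python 3's unittest.mock, or the backported mock package): instead of hand-rolling MockOptionParser, let the real OptionParser run and control its input by patching sys.argv, then assert on the attributes parseMyOptions() is supposed to set:
import sys
import unittest
from unittest import mock
from optparse import OptionParser

class Something(object):
    def getOptionParser(self):
        return OptionParser()

    def parseMyOptions(self):
        parser = self.getOptionParser()
        parser.add_option('--debug', action='store_true', help='Run debug mode.')
        parser.add_option('--logfile', default='log/logfile.log', help='what filename')
        parser.add_option('--url', type='string', help='desired url')
        options, args = parser.parse_args()
        self.debug = options.debug
        self.logfile = options.logfile
        self.url = options.url

class TestSomething(unittest.TestCase):
    def test_parse_my_options(self):
        s = Something()
        # pretend the script was invoked as: prog --debug --url http://example.com
        with mock.patch.object(sys, 'argv', ['prog', '--debug', '--url', 'http://example.com']):
            s.parseMyOptions()
        self.assertTrue(s.debug)
        self.assertEqual(s.url, 'http://example.com')
        self.assertEqual(s.logfile, 'log/logfile.log')

if __name__ == '__main__':
    unittest.main()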
