Python: How do I usefully unittest a method invoking OptionParser?

In my Python unit tests (via standard nosetests / unittest.TestCase) I have a class with a method that does option parsing. I would like to write a unit-testing routine that exercises it properly.
More than that, I'm looking for the optimal way to do this; see the questions below.
import unittest
from optparse import OptionParser

class Something(object):
    def getOptionParser(self):
        return OptionParser()

    def parseMyOptions(self):
        parser = self.getOptionParser()
        parser.add_option('--debug', action='store_true', help='Run debug mode.')
        parser.add_option('--logfile', default='log/logfile.log', help='what filename')
        parser.add_option('--url', type='string', help='desired url')
        (options, args) = parser.parse_args()
        self.debug = options.debug
        self.logfile = options.logfile
        self.url = options.url

    def main(self):
        self.parseMyOptions()

class MockOptionParser(object):
    def __init__(self):
        self.url = None
        self.debug = None
        self.logfile = None

    def add_option(self, *args, action=None, help=None, default=None):
        pass

    def parse_args(self):
        # Return an (options, args) pair so the caller can unpack it.
        return (self, [])

    def error(self):
        pass

class TestSomething(unittest.TestCase):
    def test_something(self):
        s = Something()
        # getOptionParser must remain callable, so install a factory, not an instance.
        s.getOptionParser = lambda: MockOptionParser()
        s.parseMyOptions()
        assert True, "assert something useful?"
So the problems here are:
1. What useful thing can I assert?
2. How do I auto-generate MockOptionParser without reproducing the skeleton of OptionParser?
3. Notice that in MockOptionParser's add_option, I DON'T handle the 'type' keyword, but I should. I'm sure I won't predict all the methods and keyword args that OptionParser uses and implement them in MockOptionParser. Is there a simpler way?
4. Initializing MockOptionParser with self vars for each option is UGLY. Is there a way to do this nicely?
5. Is there an easy way to verify that all the add_option() vars (options.debug, for instance) are used to set something like self.debug?
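For contrast with the hand-rolled mock, here is one way this is commonly tested without mocking at all: let parse_args receive an explicit argument list (optparse supports this) and assert on the attributes the method sets. This restructured parseMyOptions is a sketch I am adding, not the code from the question:

import unittest
from optparse import OptionParser

class Something(object):
    def parseMyOptions(self, argv=None):
        parser = OptionParser()
        parser.add_option('--debug', action='store_true', help='Run debug mode.')
        parser.add_option('--logfile', default='log/logfile.log', help='what filename')
        parser.add_option('--url', type='string', help='desired url')
        # optparse falls back to sys.argv[1:] when argv is None
        (options, args) = parser.parse_args(argv)
        self.debug = options.debug
        self.logfile = options.logfile
        self.url = options.url

class TestSomething(unittest.TestCase):
    def test_parses_all_options(self):
        s = Something()
        s.parseMyOptions(['--debug', '--url', 'http://example.com'])
        self.assertTrue(s.debug)
        self.assertEqual(s.url, 'http://example.com')
        self.assertEqual(s.logfile, 'log/logfile.log')  # default is preserved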

Related

Python: The proper way to declare class instance parameters when initialising argparse.ArgumentParser object inside a class

I'm trying to write a Python script to create a CLI command. To make it more readable, I'm creating the argparse.ArgumentParser object directly in the class's __init__() (not in the '__main__' part of the script, as is usually the case in tutorials).
The approach included in the sample code below works but it looks messy.
I wonder whether this would be the correct way to do it or I should do something else. (Sorry for newbie question.)
import argparse

class Command:
    def __init__(self, *args):
        self.parser = argparse.ArgumentParser(usage='command [--first] [--second]',
                                              description='This is a sample command')
        self.parser.add_argument('--first', type=str, required=True,
                                 help='first argument', dest='first_argument')
        self.parser.add_argument('--second', type=str, required=True,
                                 help='second argument', dest='second_argument')
        args = self.parser.parse_args()
        self.first_argument = args.first_argument
        self.second_argument = args.second_argument

    def some_operation(self):
        concatenated_str = self.first_argument + self.second_argument
        return concatenated_str

if __name__ == '__main__':
    command = Command()  # creating class instance
    print(command.some_operation())
Putting it in a separate class method in this way doesn't work and produces an AttributeError (presumably because some_operation() runs before set_params() has set the attributes):
    # Code above
    def set_params(self):
        args = self.parser.parse_args()
        self.first_argument = args.first_argument
        self.second_argument = args.second_argument
        return self.first_argument, self.second_argument
    # Code below
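One common refactor (a sketch under my own assumptions, not part of the original question): keep the parser inside the class but let the argument list be injected, so both __main__ and tests can drive it:

import argparse

class Command:
    def __init__(self, argv=None):
        parser = argparse.ArgumentParser(description='This is a sample command')
        parser.add_argument('--first', required=True, dest='first_argument')
        parser.add_argument('--second', required=True, dest='second_argument')
        # parse_args(None) falls back to sys.argv[1:]; tests can pass a list
        args = parser.parse_args(argv)
        self.first_argument = args.first_argument
        self.second_argument = args.second_argument

    def some_operation(self):
        return self.first_argument + self.second_argument

if __name__ == '__main__':
    print(Command().some_operation())
    # e.g. in a test: Command(['--first', 'a', '--second', 'b'])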

Python coding technique: class methods that use another method as interface to do system calls

Is there a way to better implement my code using Python techniques, given that I have a few methods that all wrap one specific method and add a bit of pre/post processing to it?
I'm looking to leverage Python coding techniques (I'm thinking decorators might help clean this up a bit) to implement the class below.
I have a class with one method that interfaces with the outside world; the other methods in the class use it to execute actions and do pre/post processing of some data.
import subprocess as sp

class MyClass():
    def _system_interface(self, command):
        hello = ["echo", "'hello'"]
        start = ["echo", "'start'"]
        stop = ["echo", "'stop'"]
        reset = ["echo", "'reset'"]
        # look up the command's argv list by local-variable name
        cmd = sp.Popen(locals()[command], stdout=sp.PIPE)
        output = cmd.stdout.readlines()
        print(output)
        # return the captured output; reading the pipe twice would yield []
        return output

    def call_one(self):
        # Do some processing
        self._system_interface("hello")

    def call_two(self):
        # Do some processing
        self._system_interface("start")

    def call_three(self):
        # Do some processing
        self._system_interface("stop")

if __name__ == "__main__":
    c = MyClass()
    c.call_one()
    c.call_two()
    c.call_three()
You can use a class that takes a command when instantiated and, when applied to a function, returns a wrapper that calls Popen with the command given at instantiation:
import subprocess as sp

class system_interface:
    def __init__(self, command):
        self.command = command

    def __call__(self, func):
        def wrapper(*args, **kwargs):
            func(*args, **kwargs)
            cmd = sp.Popen(['echo', self.command], stdout=sp.PIPE)
            output = cmd.stdout.readlines()
            print(output)
            # return the captured lines; the pipe can only be read once
            return output
        return wrapper

class MyClass():
    @system_interface('hello')
    def call_one(self):
        print('one: do some processing')

    @system_interface('start')
    def call_two(self):
        print('two: do some processing')

    @system_interface('stop')
    def call_three(self):
        print('three: do some processing')

if __name__ == "__main__":
    c = MyClass()
    c.call_one()
    c.call_two()
    c.call_three()
This outputs:
one: do some processing
[b'hello\n']
two: do some processing
[b'start\n']
three: do some processing
[b'stop\n']
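One small refinement worth mentioning, not in the original answer: applying functools.wraps inside __call__ keeps the decorated method's __name__ and docstring intact, which helps with introspection and debugging. A sketch under the same assumptions as the answer above:

import functools
import subprocess as sp

class system_interface:
    def __init__(self, command):
        self.command = command

    def __call__(self, func):
        @functools.wraps(func)  # wrapper now reports func's __name__ and __doc__
        def wrapper(*args, **kwargs):
            func(*args, **kwargs)
            cmd = sp.Popen(['echo', self.command], stdout=sp.PIPE)
            return cmd.stdout.readlines()
        return wrapper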

How to avoid using a global variable as a parameter to a decorator?

I am writing a Python script where I am using a decorator (retry is the one I am using) that takes a parameter (tries). I want the parameter to be configurable from a command line argument. The only way I can figure out how to set the parameter for the decorator is by reading my arguments into a global variable. I hate this from a design perspective: it makes unit tests, and anything else that imports functions from my script, reliant on the command line arguments being the same.
Here is a dumbed down example of the problem I am having:
import argparse
from functools import wraps

def get_args():
    parser = argparse.ArgumentParser()
    parser.add_argument('-t', '--test_value', dest='test_value',
                        required=True, default="sample value")
    args = parser.parse_args()
    return args

args = get_args()

def decorator_example(test_value):
    def deco_example(f):
        @wraps(f)
        def f_example(*args, **kwargs):
            print "The value I need is", test_value
            return f(*args, **kwargs)
        return f_example
    return deco_example

@decorator_example(args.test_value)
def test():
    print "running test"

if __name__ == '__main__':
    test()
If anyone can think of a better way to do this without having args be a global, please share! I have exhausted the internet searching for a better way... I want to call get_args() in main and pass the arguments around as needed.
I don't think a decorator is appropriate here. A class seems more suitable precisely because of the problem you're facing. Something like this:
class Test(object):
    def __init__(self, test_value):
        self.test_value = test_value

    def test(self):
        print "The value I need is", self.test_value
        self._inner_test()

    def _inner_test(self):
        print "running test"

if __name__ == '__main__':
    args = get_args()
    t = Test(args.test_value)
    t.test()
How exactly to structure the class is not clear from the example you have given and would depend on what you're actually doing, but I think something in this vein will provide you with a more robust solution than trying to shoehorn this into decorators.
Classes are designed to maintain state and provide modified behavior based on that state. That is what you're doing. Your application has state that modifies its behavior. Decorators are designed to wrap extra, static functionality around an existing method.
However, if that is unsuitable, another alternative is to simply allow your arguments to be global. Something along the lines of this:
config.py
import argparse

test_value = None

def parse_args():
    parser = argparse.ArgumentParser()
    parser.add_argument('-t', '--test_value', dest='test_value',
                        required=True, default="sample value")
    args = parser.parse_args()
    return args

def configure():
    global test_value
    args = parse_args()
    test_value = args.test_value
main.py
from functools import wraps
import config

def decorator_example(f):
    @wraps(f)
    def f_example(*args, **kwargs):
        print "The value I need is", config.test_value
        return f(*args, **kwargs)
    return f_example

@decorator_example
def test():
    print "running test"

if __name__ == '__main__':
    config.configure()
    test()
One nice aspect of this solution is that it gives you an obvious way to also supplement your arguments with a configuration file. Note that this should actually work, since config.test_value is not read until test is called.
Separate those things which are useful upon import from those things which are only relevant when run as a script:
from functools import wraps

def decorator_example(test_value):
    def deco_example(f):
        @wraps(f)
        def f_example(*args, **kwargs):
            print "The value I need is", test_value
            return f(*args, **kwargs)
        return f_example
    return deco_example

def base_test():
    print "running test"

if __name__ == '__main__':
    import argparse

    def get_args():
        parser = argparse.ArgumentParser()
        parser.add_argument('-t', '--test_value', dest='test_value',
                            required=True, default="sample value")
        args = parser.parse_args()
        return args

    args = get_args()
    test = decorator_example(args.test_value)(base_test)
    test()
I think the issue of global variables is a red herring here. There is nothing wrong with constant global variables. Every time you import a module, the module name is a global variable. Every time you define a function (at the module level), the function name becomes a global variable.
Problems arise only when functions modify global variables. When that happens, understanding the behavior of functions that depend on the global can become much more complex. If a chain of functions each modify the same global, then you can no longer understand each function as an isolated unit. You have to grok all the functions and how they interact with that global at the same time. This can get complicated quickly and it is why this path often leads to spaghetti code.
This is why modifying global variables should be avoided. But you aren't modifying any global variables here, so I think this is a non-issue. My beef with args.test_value is not that it is global, but rather that there was not sufficient separation of module code versus script code. args.test_value belongs with the script code.
Parse your args in the if __name__ == '__main__' block and pass them to your function as an argument, as sketched below. That way other scripts can specify a value instead of relying on command line arguments.
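A minimal sketch of that suggestion (the function name run_test is mine, not from the original answer):

import argparse

def run_test(test_value):
    # receives the value explicitly, so importers never touch sys.argv
    print "The value I need is", test_value

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('-t', '--test_value', dest='test_value', required=True)
    args = parser.parse_args()
    run_test(args.test_value)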
The problem is you can't fully define the decorator until the command line value is known. One workaround would be to postpone decorating the functions until the value is defined, which in turn requires that they be stored somewhere until that happens.
It also means a global variable will be required temporarily and its use isolated from the rest of the program. Here's how that could be done with your sample code:
from functools import wraps

class decorator_example(object):
    def __init__(self, f):
        global _funcs
        self.f = f
        try:
            _funcs.append(self)  # remember decorator_example instances
        except NameError:
            _funcs = [self]  # first one

    def __call__(self, *args, **kwargs):
        print 'running decorated {}() function'.format(self.f.__name__)
        return self.f(*args, **kwargs)

def apply_decorator(deco, test_value):
    global _funcs
    for d in _funcs:
        print 'decorating function {}()'.format(d.f.__name__)
        d.f = deco(d.f, test_value)
    del _funcs  # no longer needed

@decorator_example
def test():
    print "running test"

def deco_example(f, test_value):
    @wraps(f)
    def f_example(*args, **kwargs):
        print "The value I need is", test_value
        return f(*args, **kwargs)
    return f_example

if __name__ == '__main__':
    import argparse

    def get_args():
        parser = argparse.ArgumentParser()
        parser.add_argument('-t', '--test_value', dest='test_value',
                            required=True, default="sample value")
        args = parser.parse_args()
        return args

    args = get_args()
    apply_decorator(deco_example, args.test_value)
    test()
If I understand correctly, the use of global variables can be alleviated with classes and member variables. But in Python, unless you design carefully, you cannot avoid global variables entirely.

how to pass argparse arguments to a class

I'm very new to coding in general and Python in particular. I'm trying to learn the right/recommended way to pass argparse arguments I have created into a class. In addition to learning Python, I'm trying to learn how to do things in an OOP manner so that learning other OOP-type languages comes a bit easier.
So here's a sample of what I am trying to do:
import argparse

class passyourcliargstome():
    def __init__(self, whatdoiputheretogetmycliargs):
        # how do I get my cli args here?
        pass

    def otherfunctionsthatdothings():
        pass

if __name__ == '__main__':
    # grab the arguments when the script is run
    parser = argparse.ArgumentParser(
        description='Make things happen.')
    parser.add_argument('-f', '--foo', action='store_true',
                        default=False, help='here be dragons')
    parser.add_argument('-b', '--bar', action='store_true',
                        default=False, help='here be more dragons')
    passyourcliargstome.otherfunctionsthatdothings()
So, I'm defining argparse arguments outside of the main class, and want to know how to get them inside the class. Is this even the right way to do it? Should I just make argparse a function under my class?
Thank you in advance for any assistance, references, etc.
Edit: 11/16 2:18 EST
Note: Since I don't have enough rep to answer my own question, this is my only recourse for posting a proper answer.
Okay, it took me some doing, but I managed to piece this together. RyPeck's answer helped me get my arguments (something my code was missing), but afterwards I was getting unbound method errors when I was trying to test the code. I had no idea what that meant. Did I mention that I live up to my screen name?
It didn't really click until I found and read this. Here is my working code. If anyone has anything to add, up to and including "You're still doing it wrong; do it this way, the right way," I'm all ears. In the meantime, thanks for your help.
import argparse

class Passyourcliargstome(object):
    def __init__(self):
        # here's how I got my args here
        self.foo = args.foo
        self.bar = args.bar

    def otherfunctionsthatdothings(self):
        print "args inside of the main class:"
        print self.foo
        print self.bar

if __name__ == '__main__':
    # grab the arguments when the script is run
    parser = argparse.ArgumentParser(description='Make things happen.')
    parser.add_argument('-f', '--foo', action='store_true',
                        default=False, help='here be dragons')
    parser.add_argument('-b', '--bar', action='store_true',
                        default=False, help='here be more dragons')
    args = parser.parse_args()
    print "args outside of main:"
    print args.foo
    print args.bar
    # this was the part that I wasn't doing: creating an instance of my class
    shell = Passyourcliargstome()
    shell.otherfunctionsthatdothings()
Running this code with no arguments prints False four times: twice outside of the class instance and twice within it.
Use parser.parse_args and wrap it with vars to convert the special argparse Namespace type to a regular Python dict. In general, you want this pattern:
def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('foo')
    parser.add_argument('bar')
    args = parser.parse_args()
    args_dict = vars(args)
After that, you can pass arguments explicitly or all at once to whatever class or function will take it. For a class, it is best to write the required arguments explicitly. Like so:
class MyClass(object):
    def __init__(self, foo, bar):
        self.foo = foo
        self.bar = bar

    def Print(self):
        print self.foo
        print self.bar
Now you can put those two together like this:
import argparse

class MyClass(object):
    def __init__(self, foo, bar):
        self.foo = foo
        self.bar = bar

    def Print(self):
        print self.foo
        print self.bar

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('foo')
    parser.add_argument('bar')
    args = parser.parse_args()
    c1 = MyClass(args.foo, args.bar)
    args_dict = vars(args)
    c2 = MyClass(**args_dict)
Both c1 and c2 will be created. Best approach, though, is to create classes explicitly, as is done with c1.
A simple for loop can pass the arguments along (or rather, set them as attributes of the class):
args_dict = vars(self.parser.parse_args())
# use argparse arguments as attributes of this (self) class
for item in args_dict:
    setattr(self, item, args_dict[item])
but... maybe the more elegant way would be to initialize your class with argparse and set the arguments directly on the instance via the namespace keyword:
from argparse import ArgumentParser

class Foo:
    def __init__(self):
        self.parser = ArgumentParser()
        self.parser.add_argument('-f', '--foo', default=False,
                                 action='store_true', help='foo or not?')
        self.parser.add_argument('-b', '--bar', default=0,
                                 action='store', help='set the bar')
        self.parser.parse_args(namespace=self)
an empty input is equivalent to:
class Foo:
    def __init__(self):
        self.foo = False
        self.bar = 0
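A quick usage sketch of the namespace=self approach (the demo values are mine, not from the original answer); parse_args(args, namespace=...) is standard argparse:

from argparse import ArgumentParser

class Foo:
    def __init__(self, argv=None):
        parser = ArgumentParser()
        parser.add_argument('-f', '--foo', default=False, action='store_true')
        parser.add_argument('-b', '--bar', default=0, action='store')
        # namespace=self stores each parsed value as an attribute of this instance
        parser.parse_args(argv, namespace=self)

f = Foo(['--foo', '--bar', '7'])
print(f.foo)  # True
print(f.bar)  # '7' (action='store' without type= keeps strings)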
You have to do the following after you add your arguments.
args = parser.parse_args()
If you do a print on args, you'll see that you have all the arguments in a Namespace object.
You can then access them like so -
print args.foo
print args.bar
From there, you can treat them like normal variables. See the argparse documentation for greater detail and more info.

Using 'super' when subclassing a Python class that is not derived from `object` (old-style?)

I'm playing with subclassing OptionParser from the standard library module optparse. (Python 2.5.2.) When I attempt it I get the exception:
TypeError: super() argument 1 must be type, not classobj
Looking at OptionParser, it is not derived from object. So I added object as a parent class (shown below), and super works properly.
from optparse import OptionParser, Option

class MyOptionParser(OptionParser, object):
    """Class to change
    """
    def __init__(self,
                 usage=None,
                 option_list=None,
                 option_class=Option,
                 version=None,
                 conflict_handler="error",
                 description=None,
                 formatter=None,
                 add_help_option=True,
                 prog=None,
                 epilog=None,
                 ):
        super(MyOptionParser, self).__init__(usage, option_list, option_class,
                                             version, conflict_handler, description,
                                             formatter, add_help_option, prog, epilog)

if __name__ == '__main__':
    """Run a quick test
    """
    parser = MyOptionParser()
    parser.add_option("-t", "--test", type="string", dest="test")
    (options, args) = parser.parse_args()
    print "The test option is: %s" % options.test
Is this the correct way to go about it?
Yes, I don't see why it wouldn't work. You just need to add a couple of spaces right before that super call: as it's written right now, it is not part of your custom __init__ method. Also, a shortcut you might want to use is **kwargs; you can check the kwargs keys in your method if that's what you want to do:
class MyOptionParser(OptionParser, object):
    """Class to change
    """
    def __init__(self, **kwargs):
        # You can limit kwargs keys here
        super(MyOptionParser, self).__init__(**kwargs)
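And a brief usage sketch of the **kwargs variant (the arguments passed are illustrative, not from the original answer):

from optparse import OptionParser, Option

class MyOptionParser(OptionParser, object):
    def __init__(self, **kwargs):
        super(MyOptionParser, self).__init__(**kwargs)

parser = MyOptionParser(usage='%prog [options]', description='a quick test')
parser.add_option('-t', '--test', type='string', dest='test')
(options, args) = parser.parse_args(['-t', 'hello'])
print "The test option is: %s" % options.test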
