I cannot figure out how to actually pass arguments to a fabric custom task.
I have a bunch of tasks that all need to do the same setup, so I was hoping to subclass the task and have the base class do the setup and then run the specific subtasks. Both the setup code and the subtasks need access to some arguments that are passed in from the command-line to the task. I also need to be able to set default values for the arguments.
Original Attempt
My original attempt shows what I am trying to do without any sub classes.
This code works correctly.
The code below is in file tmp1.py:
from fabric.api import task

def do_setup(myarg):
    ''' common setup for all tasks '''
    print "in do_setup(myarg=%s)" % myarg
    # do setup using myarg for something important

@task
def actual_task1(myarg='default_value', alias='at'):
    print "In actual_task1(myarg=%s)" % myarg
    do_setup(myarg)
    # do rest of work ...

@task
def actual_task2(myarg='default_value', alias='at'):
    print "In actual_task2(myarg=%s)" % myarg
    do_setup(myarg)
    # do rest of work ...
I run it from the command-line without any args and correctly see the default for myarg, 'default_value':
fab -f ./tmp1.py actual_task1
Prints:
In actual_task1(myarg=default_value)
in do_setup(myarg=default_value)
Done.
Then I call it with myarg='hello' and see that 'hello' is passed through correctly:
fab -f ./tmp1.py actual_task1:myarg='hello'
It outputs:
In actual_task1(myarg=hello)
in do_setup(myarg=hello)
Done.
Attempt with a custom task
My next attempt is to make a common task to encapsulate the setup part.
This is copied from http://docs.fabfile.org/en/1.5/usage/tasks.html
The code below is in the file tmp2.py:
from fabric.api import task
from fabric.tasks import Task

def do_setup(myarg):
    ''' common setup for all tasks '''
    print "in do_setup(myarg=%s)" % myarg
    # do setup using myarg for something important

'''
Attempt to make a common task to encapsulate the setup part
copied from http://docs.fabfile.org/en/1.5/usage/tasks.html
'''
class CustomTask(Task):
    def __init__(self, func, myarg, *args, **kwargs):
        super(CustomTask, self).__init__(*args, **kwargs)
        print("=> __init__(myarg=%s, args=%s, kwargs=%s" % (myarg, args, kwargs))
        self.func = func
        self.myarg = myarg
        print "in __init__: self.func=", self.func, "self.myarg=", self.myarg

    def run(self, *args, **kwargs):
        return self.func(self.myarg, *args, **kwargs)

@task(task_class=CustomTask, myarg='default_value', alias='at')
def actual_task1():
    print "In actual_task1(myarg=%s)" % myarg
    # do rest of work ...
When run, there are two problems:
1) __init__ gets "default_value" instead of "Hello"
2) It complains that actual_task1() expects 0 arguments
I run it this way:
fab -f ./tmp2.py actual_task1:myarg="Hello"
Prints:
=> __init__(myarg=default_value, args=(), kwargs={'alias': 'at'}
in __init__: self.func= <function actual_task1 at 0x...> self.myarg= default_value
Traceback (most recent call last):
File "/home/xxx/Documents/pyenvs/xxx/local/lib/python2.7/site-packages/fabric/main.py", line 743, in main args, *kwargs
File "/home/xxx/Documents/pyenvs/xxx/local/lib/python2.7/site-packages/fabric/tasks.py", line 405, in execute results[''] = task.run(args, *new_kwargs)
File "/home/xxx/test_fab/tmp2.py", line 21, in run
return self.func(self.myarg, args, *kwargs)
TypeError: actual_task1() takes no arguments (1 given)
I spent quite a bit of time trying to make this work, but I cannot seem to solve the default_value issue. I must be missing something.
I would appreciate some help figuring out how to make this sample program run. The second version with the custom task needs to behave just like the original version I showed.
Thank you for any help with this issue.
Fixed example with setup:
from fabric.api import task
from fabric.tasks import Task

def do_setup(foo, verbose):
    ''' common setup for all tasks '''
    print "IN do_setup(foo=%s, verbose=%s)" % (foo, verbose)
    # do setup using foo and verbose...

class CustomTask(Task):
    def __init__(self, func, *args, **kwargs):
        '''
        The special args like hosts and roles do not show up in
        args and kwargs; they have already been stripped.
        args and kwargs may contain task-specific special arguments
        (e.g. aliases, alias, default, and name) that customize the
        task. They are set in the @task decorator and cannot be passed
        on the command-line. Note also that these special task
        arguments are not passed to the run method.
        Non-special arguments (there are none in this example) are
        also set in the @task decorator. They are not passed to the
        run method and cannot be overridden from the command-line.
        Note that if you pass any "task-specific special arguments" or
        "non-special arguments declared in the task decorator" from
        the command-line, they are treated as different arguments: the
        command-line values are passed to the run method but not to
        this method.
        '''
        super(CustomTask, self).__init__(*args, **kwargs)
        print "IN __init__(args=%s, kwargs=%s)" % (args, kwargs)
        self.func = func

    def run(self, foo='foo_default_val', verbose='verbose_default_val',
            *args, **kwargs):
        '''
        The arguments to this method will be:
        1) arguments from the actual task (e.g. foo and verbose). This
           method is where you set default values for the actual_task
           arguments, not on actual_task itself.
        2) task-specific arguments from the command-line
           (e.g. actual_task:bar='xxx'). This example is not expecting
           any, so it strips them and does not pass them on (i.e. it
           calls self.func with only foo and verbose and does not pass
           args and kwargs).
        '''
        print "IN run(foo=%s, verbose=%s, args=%s, kwargs=%s)" % \
            (foo, verbose, args, kwargs)
        do_setup(foo, verbose)
        return self.func(foo, verbose)

@task(task_class=CustomTask, alias="RUNME")
def actual_task(foo, verbose):
    print 'IN task actual_task(foo=%s, verbose=%s)' % (foo, verbose)
Run with only host specified on the command-line:
fab -f ./example_with_setup.py actual_task:host='hhh'
IN __init__(args=(), kwargs={'alias': 'RUNME'})
[hhh] Executing task 'actual_task'
IN run(foo=foo_default_val, verbose=verbose_default_val, args=(), kwargs={})
IN do_setup(foo=foo_default_val, verbose=verbose_default_val)
IN task actual_task(foo=foo_default_val, verbose=verbose_default_val)
Run specifying foo on the command-line:
fab -f ./example_with_setup.py actual_task:host='hhh',foo='bar'
IN __init__(args=(), kwargs={'alias': 'RUNME'})
[hhh] Executing task 'actual_task'
IN run(foo=bar, verbose=verbose_default_val, args=(), kwargs={})
IN do_setup(foo=bar, verbose=verbose_default_val)
IN task actual_task(foo=bar, verbose=verbose_default_val)
Run specifying both foo and verbose on the command-line:
fab -f ./example_with_setup.py actual_task:host='hhh',foo='bar',verbose=True
IN __init__(args=(), kwargs={'alias': 'RUNME'})
[hhh] Executing task 'actual_task'
IN run(foo=bar, verbose=True, args=(), kwargs={})
IN do_setup(foo=bar, verbose=True)
IN task actual_task(foo=bar, verbose=True)
In the custom class section, the function actual_task1 doesn't actually take arguments, so the only valid way to invoke your fabric file is:
fab -f ./tmp2.py actual_task1
Furthermore, I don't think you're actually calling do_setup in either CustomTask or actual_task1.
This is the fixed example.
# fixed the example from http://docs.fabfile.org/en/1.8/usage/tasks.html
from fabric.api import task
from fabric.tasks import Task

class CustomTask(Task):
    def __init__(self, func, myarg1, *args, **kwargs):
        '''
        The special args like hosts and roles do not show up in
        args and kwargs; they have already been stripped.
        args and kwargs may contain task-specific special arguments
        (e.g. aliases, alias, default, and name) that customize the
        task. They are set in the @task decorator and cannot be passed
        on the command-line. Note also that these special task
        arguments are not passed to the run method.
        Non-special arguments (in this example myarg1) are also set in
        the @task decorator. They are not passed to the run method and
        cannot be overridden from the command-line.
        Note that if you pass any "task-specific special arguments" or
        "non-special arguments declared in the task decorator" from
        the command-line, they are treated as different arguments: the
        command-line values are passed to the run method but not to
        this method.
        '''
        super(CustomTask, self).__init__(*args, **kwargs)
        print "IN __init__(myarg1=%s, args=%s, kwargs=%s)" % \
            (myarg1, args, kwargs)
        self.func = func
        self.myarg1 = myarg1

    def run(self, myarg2='default_value2', *args, **kwargs):
        '''
        The arguments to this method will be:
        1) arguments from the actual task (e.g. myarg2). This method
           is where you set a default value for the actual_task
           arguments, not on actual_task itself.
        2) task-specific arguments from the command-line
           (e.g. actual_task:foo='foo'). This example is not expecting
           any, so it strips them and does not pass them on (i.e. it
           calls self.func with only myarg2 and does not pass args and
           kwargs).
        '''
        print "IN run(myarg2=%s, args=%s, kwargs=%s)" % \
            (myarg2, args, kwargs)
        return self.func(myarg2)

@task(task_class=CustomTask, myarg1='special_value', alias='RUNME')
def actual_task(myarg2):
    print "IN actual_task(myarg2=%s)" % myarg2
Run with only hosts specified on the command-line:
fab -f ./fixed_example actual_task:hosts="hhh"
IN __init__(myarg1=special_value, args=(), kwargs={'alias': 'RUNME'})
[hhh] Executing task 'actual_task'
IN run(myarg2=default_value2, args=(), kwargs={})
IN actual_task(myarg2=default_value2)
Run specifying myarg2 on the command-line:
fab -f ./fixed_example actual_task:hosts="hhh",myarg2="good_value"
IN __init__(myarg1=special_value, args=(), kwargs={'alias': 'RUNME'})
[hhh] Executing task 'actual_task'
IN run(myarg2=good_value, args=(), kwargs={})
IN actual_task(myarg2=good_value)
Bad run, specifying myarg1 and alias on the command-line. Notice that __init__ still gets the values specified in the task decorator, not the values from the command-line, and that run now receives myarg1 and alias as arguments.
fab -f ./fixed_example actual_task:hosts="hhh",myarg1="myarg1_from_commandline",alias="alias_from_commandline"
IN __init__(myarg1=special_value, args=(), kwargs={'alias': 'RUNME'})
[hhh] Executing task 'actual_task'
IN run(myarg2=default_value2, args=(), kwargs={'alias': 'alias_from_commandline', 'myarg1': 'myarg1_from_commandline'})
IN actual_task(myarg2=default_value2)
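If you also want the common setup from the original question, one option (a sketch, not part of the docs example) is to call do_setup from tmp2.py at the top of run(), before invoking the wrapped function:

class CustomTask(Task):
    # __init__ exactly as in the fixed example above ...
    def run(self, myarg2='default_value2', *args, **kwargs):
        do_setup(myarg2)  # common setup runs before every task body
        return self.func(myarg2)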
Related
I understand the purpose of functools.wraps - to carry over attributes like __doc__, __name__, etc. But I'm struggling to explain why the below is happening. I have this decorator function in Python:
from typing import Callable
from functools import update_wrapper

class Environment:
    ...

def requires_environment(function: Callable) -> Callable:
    def wrapped_function(*args, environment, **kwargs):
        if not isinstance(environment, Environment):
            raise TypeError("The environment keyword argument must be of type Environment.")
        kwargs["environment"] = environment
        return function(*args, **kwargs)
    update_wrapper(wrapped_function, function)
    return wrapped_function
Then I decorate a function with it:
@requires_environment
def function_with_args(arg1, arg2, environment):
    pass
When I do not provide an environment keyword argument to function_with_args, I expect the error message to say TypeError: function_with_args() missing 1 required keyword-only argument: 'environment'. So when I call:
function_with_args(1, 2, Environment.PROD)
I get
TypeError: wrapped_function() missing 1 required keyword-only argument: 'environment'
Why is wrapped_function showing instead of function_with_args? If I get rid of the syntactic sugar of the @ decorator and print the function name after decoration, it properly reports that the wrapped function's name is function_with_args:
function_with_args = requires_environment(function_with_args)
print("After", function_with_args.__name__) # After function_with_args
function_with_args(1, 2, Environment.PROD)
Yet I still see the wrapped_function name in the TypeError. Why is this occurring?
Below is my complete reproducible example:
from time import perf_counter
from typing import Callable
from enum import Enum
from functools import update_wrapper

class Environment(Enum):
    PROD = "PROD"
    STAGING = "STAGING"
    DEV = "DEV"

def requires_environment(function: Callable) -> Callable:
    def wrapped_function(*args, **kwargs):
        if "environment" not in kwargs:
            raise TypeError(f"{function.__name__}() missing 1 required keyword-only argument: 'environment'")
        if not isinstance(kwargs['environment'], Environment):
            raise TypeError("The environment keyword argument must be of type Environment.")
        return function(*args, **kwargs)
    update_wrapper(wrapped_function, function)
    return wrapped_function

@requires_environment
def function_with_args(arg1, arg2, environment):
    print(f"Inside function {function_with_args.__name__}")
    print(f"Docstrings: {function_with_args.__doc__}")

function_with_args(1, 2, Environment.PROD)
This error message uses the code object's name, not the function object's name. Code objects can't be renamed; even after functools.update_wrapper sets the function's __name__, the function's code object's co_name is still wrapped_function. You can see this by examining function_with_args.__code__.co_name.
This behavior is scheduled to change in Python 3.10 to use the function's __qualname__, to better disambiguate identically-named methods of different classes.
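For instance, with the decorated function_with_args from above, the two names can be compared directly:

print(function_with_args.__name__)          # function_with_args (set by update_wrapper)
print(function_with_args.__code__.co_name)  # wrapped_function (code object name, unchanged)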
Let me try:
When you pass your function arguments like below:
function_with_args(1, 2, Environment.PROD)
the wrapped function takes all of those inputs as positional args only!
def wrapped_function(*args, environment, **kwargs):
As you can see, the wrapped function unpacks its inputs around the keyword-only argument environment: everything positional goes into *args, and any keyword arguments other than environment go into **kwargs.
The wrapped function looks for environment=<something>, so you always have to call it like:
function_with_args(1, 2, environment=Environment.PROD)
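Here is a minimal standalone illustration of that keyword-only behavior, independent of the decorator:

def f(*args, environment, **kwargs):
    return args, environment, kwargs

f(1, 2, environment="PROD")  # OK: ((1, 2), 'PROD', {})
f(1, 2, "PROD")              # TypeError: f() missing 1 required keyword-only argument: 'environment'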
Suggestion: I would try something like this:

def wrapped_function(*args, **kwargs):  # this is good! don't change it
    ...
    if "environment" in kwargs:  # check whether an environment keyword was passed
        print(kwargs["environment"])  # do your stuff here
I saw the function below provided for mocking a multiprocessing call in Python:
class MockPoolApplyResult:
    def __init__(self, func, args):
        self._func = func
        self._args = args

    def get(self, timeout=0):
        return self._func(*self._args)

monkeypatch.setattr("multiprocessing.pool.Pool.starmap",
                    lambda self, func, args=(), kwds={}, callback=None, error_callback=None:
                        MockPoolApplyResult(func, args))
What does the below lambda function do and how can I check the number of times it is called?
monkeypatch.setattr("multiprocessing.pool.Pool.starmap",
lambda self, func, args=(), kwds={}, callback=None, error_callback=None:
MockPoolApplyResult(func, args))
The lambda is equivalent to this:
def mock_starmap(self, func, args=(), kwds={}, callback=None, error_callback=None):
    return MockPoolApplyResult(func, args)

monkeypatch.setattr("multiprocessing.pool.Pool.starmap", mock_starmap)
It replaces the starmap method on the Pool class with a function that takes the exact same arguments.
The intended behavior is probably that when you have my_pool = Pool(3) and run result = my_pool.starmap(myfunc, [1, 2, 3]); result.get(), it executes myfunc in the current process, as opposed to handing the calls to the pool's worker processes. But note that it will actually call myfunc(1, 2, 3), because of how MockPoolApplyResult.get() is written.
Edit: I forgot to answer part 2 of your question. You can't count the calls to a function swapped in with monkeypatch.setattr. For that you'll need to replace starmap with a mock instead, and the mock object will track the number of calls. You can do this either with the built-in unittest.mock module directly or with the pytest-mock wrapper package.
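For example, here is a sketch that swaps monkeypatch.setattr for unittest.mock so the call count is tracked (run_code_under_test is a hypothetical stand-in for whatever exercises the pool; MockPoolApplyResult is the class from the question):

from unittest import mock

def test_starmap_call_count():
    with mock.patch("multiprocessing.pool.Pool.starmap") as fake_starmap:
        # keep the same fake behavior as the original lambda
        fake_starmap.side_effect = lambda func, args=(), **kw: MockPoolApplyResult(func, args)
        run_code_under_test()  # hypothetical: something that calls pool.starmap(...)
        assert fake_starmap.call_count == 1  # the mock records how many times starmap ran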
I am wondering if someone can help me figure out how to pass an argument to the job function when scheduling it with the schedule library. I see a couple of examples of this, but none that use threading and a run_threaded helper.
In the code snippet below I am trying to pass 'sample_input' as an argument and am confused about how to define this parameter.
def run_threaded(job_func):
    job_thread = threading.Thread(target=job_func)
    job_thread.start()

@with_logging
def job(input_name):
    print("I'm running on thread %s" % threading.current_thread())
    main(input_name)

schedule.every(10).seconds.do(run_threaded, job('sample_input'))
You can get there by altering the method definitions and invocation signatures to something like the following.
# run_threaded now accepts the arguments for job_func and forwards them
def run_threaded(job_func, *args, **kwargs):
    print("======", args, kwargs)
    job_thread = threading.Thread(target=job_func, args=args, kwargs=kwargs)
    job_thread.start()

# Pass the function and its arguments separately when scheduling.
schedule.every(10).seconds.do(run_threaded, job, "sample_input")
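Keyword arguments should work the same way, since schedule's do() and run_threaded both forward **kwargs (a small untested variation on the above):

# equivalent call passing input_name by keyword
schedule.every(10).seconds.do(run_threaded, job, input_name="sample_input")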
I use a simple command to run my tests with nose:
@manager.command
def test():
    """Run unit tests."""
    import nose
    nose.main(argv=[''])
However, nose supports many useful arguments that I currently cannot pass.
Is there a way to run nose via a manager command (similar to the call above) and still be able to pass arguments to nose? For example:
python manage.py test --nose-arg1 --nose-arg2
Right now I'd get an error from Manager that --nose-arg1 --nose-arg2 are not recognised as valid arguments. I want to pass those args as nose.main(argv= <<< args coming after python manage.py test ... >>>).
Flask-Script has an undocumented capture_all_args attribute, which will pass the remaining args to the Command.run method. This is demonstrated in the tests.
@manager.add_command
class NoseCommand(Command):
    name = 'test'
    capture_all_args = True

    def run(self, remaining):
        nose.main(argv=remaining)
python manage.py test --nose-arg1 --nose-arg2
# will call nose.main(argv=['--nose-arg1', '--nose-arg2'])
In the sources of flask_script you can see that the "too many arguments" error is prevented when the executed Command has the attribute capture_all_args set to True, which isn't documented anywhere.
You can set that attribute on the class just before you run the manager:
if __name__ == "__main__":
    from flask.ext.script import Command
    Command.capture_all_args = True
    manager.run()
This way, additional arguments to the manager are always accepted.
The downside of this quick fix is that you cannot register options or arguments to the commands the normal way anymore.
If you still need that feature, you could subclass the Manager and override the command decorator like this:
class MyManager(Manager):
    def command(self, capture_all=False):
        def decorator(func):
            command = Command(func)
            command.capture_all_args = capture_all
            self.add_command(func.__name__, command)
            return func
        return decorator
Then you can use the command decorator like this
@manager.command(True)  # capture all arguments
def use_all(*args):
    print("args:", args[0])

@manager.command()  # normal way of registering arguments
def normal(name):
    print("name", name)
Note that for some reason flask_script requires use_all to accept varargs but stores the list of arguments in args[0], which is a bit strange. def use_all(args): does not work; it fails with TypeError: "got multiple values for argument 'args'".
Ran into an issue with davidism's solution where only some of the args were being received.
Looking through the docs a bit more, nose.main is documented to pick up its arguments from sys.argv automatically:
http://nose.readthedocs.io/en/latest/api/core.html
So we are now just using:
@manager.add_command
class NoseCommand(Command):
    name = 'nose'
    capture_all_args = True

    def run(self, remaining):
        nose.main()
I use celery in my application to run periodic tasks. Let's look at the simple example below:
from datetime import timedelta

from celery import group
from celery.task import periodic_task

from myqueue import Queue

@periodic_task(run_every=timedelta(minutes=1))
def process_queue():
    queue = Queue()
    uid, questions = queue.pop()
    if uid is None:
        return
    job = group(do_stuff(q) for q in questions)
    job.apply_async()

def do_stuff(question):
    try:
        ...
    except:
        ...
        raise
As you can see in the example above, I use celery to run an async task, but (since it's a queue) I need to call queue.fail(uid) if an exception occurs in do_stuff, or queue.ack(uid) otherwise. In this situation it would be very clean and useful to have callbacks from my task in both cases: on_failure and on_success.
I have seen some documentation, but never any examples of using callbacks with apply_async. Is it possible to do that?
Subclass the Task class and override the on_success and on_failure methods:
from celery import Task

class CallbackTask(Task):
    def on_success(self, retval, task_id, args, kwargs):
        '''
        retval – The return value of the task.
        task_id – Unique id of the executed task.
        args – Original arguments for the executed task.
        kwargs – Original keyword arguments for the executed task.
        '''
        pass

    def on_failure(self, exc, task_id, args, kwargs, einfo):
        '''
        exc – The exception raised by the task.
        task_id – Unique id of the failed task.
        args – Original arguments for the task that failed.
        kwargs – Original keyword arguments for the task that failed.
        '''
        pass
Use:
@celery.task(base=CallbackTask)  # this does the trick
def add(x, y):
    return x + y
You can specify success and error callbacks via the link and link_error kwargs when you call apply_async. The celery docs include a clear example: http://docs.celeryproject.org/en/latest/userguide/calling.html#linking-callbacks-errbacks
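For completeness, here is a short sketch in the spirit of that docs page (the errback signature mirrors the docs' error_handler example; the exact arguments are celery-version-dependent, so treat this as an assumption):

@celery.task
def log_result(result):
    # link callback: receives the return value of the successful task
    print("Task succeeded with: %r" % (result,))

@celery.task
def log_error(request, exc, traceback):
    # link_error callback: receives the failed task's request, exception, and traceback
    print("Task %s raised: %r" % (request.id, exc))

add.apply_async((2, 2), link=log_result.s(), link_error=log_error.s())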