Listing Expected Parameters for Fabric Tasks - python

I have looked at the documentation and haven't found this. fab -l or fab -d do not display the expected parameters. I also played with fab -l <task> and the like to see if there was some undocumented support for this. Anyone know how or have suggestions?

I haven't found any automated way. What I do is put it into the docstring, as in:
@task
def sometask(param='Foo'):
    """Does some common, tedious task.
    sometask:param=Foo
    """
So when you execute fab -d sometask you get:
Displaying detailed information for task 'sometask':
Does some common, tedious task.
sometask:param=Foo

I have Fabric version 1.8.2, and when I run fab -d <task> I get the following:
⇒ fab -d build
Displaying detailed information for task 'build':
Build task description
Arguments: version='dev'
And I have not added anything about the arguments in the docstring, so I suppose Fabric's developers have added that feature.
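For reference, a minimal sketch of a task that would produce output like the above under Fabric 1.8+ (the body of the build task is an illustrative assumption, not from the original post):

from fabric.api import task, local

@task
def build(version='dev'):
    """Build task description"""
    # fab -d build reports both the docstring above and the argument spec
    # (version='dev') automatically, with no extra annotation needed.
    local('make build VERSION=%s' % version)  # placeholder build command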

This doesn't seem to work with any tasks that are decorated with anything other than the default decorators like @task.
fab -d hostName
Displaying detailed information for task 'hostName':
No docstring provided
Arguments: arg='test'
And this is the Fabric task.
@task
def hostName(arg='test'):
    run("hostname -f")
I guess it's probably something to do with the bubbling up of arguments.

You can make it work with a bit of fiddling. I am running 1.11.1. It appears to have a lot to do with the order of the decorators. For example, here are various combinations and the results of running fab -d <task> for these code blocks:
Here it works as expected with JUST the @task decorator:
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''
[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments: search, nameserver, interface
Add a @runs_once decorator BELOW @task and no args are shown:
@task
@runs_once
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''
[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments:
Add a @runs_once decorator ABOVE @task and args are shown:
@runs_once
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''
[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments: search, nameserver, interface
Add a @parallel decorator BELOW @task and args are not shown (just like @runs_once). However, adding @parallel ABOVE @task makes Fabric think this is not even a task anymore. If you replace @parallel with @serial (above @task) it DOES show the args.
@parallel
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''
[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Fatal error: Task 'setupDNS' does not appear to exist. Valid task names:
addNode
auditHost
But there is a workaround for the @parallel decorator: you MUST provide an argument to the decorator. I just use the default of None so that, internally, no behavior changes occur during execution, but Fabric is happy and displays the args for me. You still do need to stack @parallel ABOVE @task.
@parallel(pool_size=None)
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''
[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments: search, nameserver, interface

Related

@task(hosts=[...]) yields Context but -H yields Connection?

Python 3.8.2, Fabric 2.5.0, Paramiko 2.7.2, Invoke 1.4.1
Hello,
I have a fabfile which needs to handle hosts passed at the command-line (using -H) and hosts defined in the fabfile if -H was not passed. Here's an example of the issue I'm facing:
target_group = None

@task
def prod(c):
    _env_handler(c, "prod")

def _env_handler(c, env_name):
    global target_group
    if not hasattr(c, 'host'):
        target_group = Group("somehost1.tld", "somehost2.tld")

@task(hosts=target_group)
def test(c):
    print(c)
If I run fab prod test:
<Context: <Config: {'run': {'asynch ...
If I run fab -H 1,2 test:
<Connection host=1>
<Connection host=2>
So, passing hosts using the @task(hosts=[...]) decorator gives c as a Context object, while using -H gives c as a Connection object.
I know using a task (prod(c)) to wrap environment logic may be questionable...but is there a way to ensure the task (test(c)) always receives a Connection object...or am I fundamentally misunderstanding something?
Thanks.
Edit: I've also tried directly passing a hosts list (e.g. @task(hosts=["somehost1.tld", "somehost2.tld"])) with the same result.
Edit: Here's the current workaround, but it's obviously not ideal if you have a lot of tasks:
@task
def test(c):
    if not hasattr(c, 'host'):
        for c in target_group:
            test(c)
    else:
        logging.info(f"Targeting {c.host}")
Workaround using a custom task decorator:
def _task_handler(func):
    @task
    @functools.wraps(func)
    def wrapper(c):
        if not hasattr(c, 'host'):
            for c in target_group:
                func(c)
        else:
            func(c)
    return wrapper
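For illustration, a hedged sketch of applying that custom decorator; the task name, command, and log message are my own assumptions rather than part of the original post, target_group is assumed to have been populated (e.g. by the prod task above), and the imports (task, functools, logging) are those from the snippets above:

@_task_handler
def uptime(c):
    # c is always a Connection here: either the one -H produced, or one of
    # the Connections iterated out of target_group by the wrapper.
    logging.info(f"Targeting {c.host}")
    c.run("uptime")

Running fab uptime then fans out over target_group, while fab -H somehost1.tld uptime targets only that host.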

In Fabric, how to access the name of the requested task?

I have a Fabric script with many tasks. I'd like to add a simple "yes/no" confirmation at the very beginning that includes the hosts to be run on and the task requested.
hosts = env.hosts
task_name = ?
if not confirm('Run "%s" on %s?' % (task_name, hosts)):
    abort('Aborting per user request.')
So, when I run fab -H user@1.2.3.4 deploy, the confirmation will be: Run "deploy" on user@1.2.3.4?
Unlike the well-documented env.hosts, I cannot find an env.task_name variable to achieve this.
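One possibility, based on my recollection of Fabric 1.x rather than on the original thread: fab records the task names given on the command line in env.tasks, so a helper along these lines could build the prompt (worth verifying against your Fabric version):

from fabric.api import env, abort
from fabric.contrib.console import confirm

def _confirm_run():
    # env.tasks is set by fab to the list of tasks requested on the command line.
    task_names = ', '.join(env.tasks)
    if not confirm('Run "%s" on %s?' % (task_names, env.hosts)):
        abort('Aborting per user request.')

Calling _confirm_run() at the top of each task (or from a shared wrapper) would then produce a prompt like: Run "deploy" on ['user@1.2.3.4']?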

Fabric - possible to have local and remote task execution in one method?

Is it possible to have local and remote tasks execute from within the same task method?
e.g., I want to do something like the following:
@fabric.api.task
def Deploy():
    PrepareDeploy()
    PushDeploy()
    execute(Extract())
    execute(Start())
Where PrepareDeploy and PushDeploy are local tasks (executing only locally, via the fabric.api.local() method):
@fabric.api.task
@fabric.decorators.runs_once
def PrepareDeploy():
    ...

@fabric.api.task
@fabric.decorators.runs_once
def PushDeploy():
    ...
And Extract/Start are methods that should be run on the remote hosts themselves:
@fabric.api.task
def Extract():
    ...

@fabric.api.task
def Start():
    ...
However, when I try to do fab Deploy, I get something like:
[remote1.serv.com] Executing task 'Deploy'
[localhost] local: find . -name "*.java" > sources.txt
...
The first line seems wrong to me (and in fact, causes errors).
You can spawn a new task and define which hosts it should run on. For example, the gist below shows how to set up RabbitMQ once all hosts are provisioned with Puppet using the same Erlang cookie.
See around line 114 of the gist - there the tasks are executed on specific hosts.
https://gist.github.com/nvtkaszpir/17d2e2180771abd93c46
I hope this helps.
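As a hedged sketch of the general pattern (the host names and commands below are illustrative assumptions, not taken from the gist): local-only work can live in @runs_once tasks that call local(), while the remote part is dispatched with execute() and an explicit host list.

from fabric.api import task, local, run, execute
from fabric.decorators import runs_once

@task
@runs_once
def PrepareDeploy():
    # Runs exactly once, on the local machine only.
    local('find . -name "*.java" > sources.txt')

@task
def Extract():
    # Runs on each host that execute() hands it.
    run('tar xzf /tmp/app.tar.gz -C /opt/app')  # placeholder command

@task
@runs_once
def Deploy():
    PrepareDeploy()
    # Pass the callable itself (not a call) and an explicit host list, so the
    # remote step is not driven by whatever hosts Deploy was invoked with.
    execute(Extract, hosts=['remote1.serv.com', 'remote2.serv.com'])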

Execute fabric task for specific host only once

I'm trying to execute a main task that needs to execute tasks differently for each host. In the following setup, task 'sometask' will get executed twice for each host. What is the best way to prevent that?
@task
@hosts('host1', 'host2')
def test():
    execute(do_something_everywhere) # execute on both hosts
    execute(sometask, 'arg1', host='host1') # execute on host1 only
    execute(sometask, 'arg2', host='host2') # execute on host2 only
You can use the @runs_once decorator to remedy this, but I find that can cause extra work making wrapper functions to get the execution order you want, so here's a quick fix using the env.host_string value to evaluate which server you are deploying to and adjust your script accordingly:
@hosts('host1', 'host2')
@task
def checkout(branch='master'):
    execute(_test_task_w_arg, 'all-servers')
    execute(_test_task_w_arg, 'arg1' if env.host_string == 'host1' else 'arg2')

def _test_task_w_arg(arg1):
    local("touch test-file-" + arg1)
This results in the following output, which seems to achieve what you want:
[host1] Executing task 'checkout'
[localhost] local: touch test-file-all-servers
[localhost] local: touch test-file-arg1
[host2] Executing task 'checkout'
[localhost] local: touch test-file-all-servers
[localhost] local: touch test-file-arg2

Python/Fabric misunderstanding [duplicate]

This question already has answers here:
How can I properly set the `env.hosts` in a function in my Python Fabric `fabfile.py`?
I'm cutting my teeth on Python as I work with Fabric. It looks like I have a basic misunderstanding of how Python and/or Fabric works. Take a look at my 2 scripts:
AppDeploy.py
from fabric.api import *

class AppDeploy:

    # Environment configuration, all in a dictionary
    environments = {
        'dev' : {
            'hosts' : ['localhost'],
        },
    }

    # Fabric environment
    env = None

    # Take the fabric environment as a constructor argument
    def __init__(self, env):
        self.env = env

    # Configure the fabric environment
    def configure_env(self, environment):
        self.env.hosts.extend(self.environments[environment]['hosts'])
fabfile.py
from fabric.api import *
from AppDeploy import AppDeploy

# Instantiate the backend class with
# all the real configuration and logic
deployer = AppDeploy(env)

# Wrapper functions to select an environment
@task
def env_dev():
    deployer.configure_env('dev')

@task
def hello():
    run('echo hello')

@task
def dev_hello():
    deployer.configure_env('dev')
    run('echo hello')
Chaining the first 2 tasks works:
$ fab env_dev hello
[localhost] Executing task 'hello'
[localhost] run: echo hello
[localhost] out: hello
Done.
Disconnecting from localhost... done.
However, when running the last task, which aims to configure the environment and do something in a single task, it appears Fabric does not have the environment configured:
$ fab dev_hello
No hosts found. Please specify (single) host string for connection:
I'm pretty lost though, because if I tweak that method like so:
@task
def dev_hello():
    deployer.configure_env('dev')
    print(env.hosts)
    run('echo hello')
it looks like env.hosts is set, but still, Fabric is acting like it isn't:
$ fab dev_hello
['localhost']
No hosts found. Please specify (single) host string for connection:
What's going on here?
I'm not sure what you're trying to do, but...
If you're losing info on the shell/environment -- Fabric runs each command in a separate shell statement, so you need to either manually chain the commands or use the prefix context manager.
See http://docs.fabfile.org/en/1.8/faq.html#my-cd-workon-export-etc-calls-don-t-seem-to-work
If you're losing info within Python, it might be tied into this bug/behavior that I ran into recently [ https://github.com/fabric/fabric/issues/1004 ] where the shell I entered Fabric with seems to be obliterated.
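Beyond that answer, and as an assumption on my part rather than something stated in the thread: in Fabric 1.x the host list for a task is fixed before the task body runs, so changing env.hosts inside dev_hello cannot give dev_hello itself any hosts. A sketch of one way around it, reusing the hello task and deployer object from the question:

from fabric.api import env, run, task, execute

@task
def dev_hello():
    deployer.configure_env('dev')
    # execute() computes its host list now, so it picks up the hosts that
    # configure_env('dev') just added to env.hosts.
    execute(hello)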
