Python 3.8.2, Fabric 2.5.0, Paramiko 2.7.2, Invoke 1.4.1
Hello,
I have a fabfile which needs to handle hosts passed at the command line (using -H) as well as hosts defined in the fabfile when -H is not passed. Here's an example of the issue I'm facing:
from fabric import task, SerialGroup as Group

target_group = None

@task
def prod(c):
    _env_handler(c, "prod")

def _env_handler(c, env_name):
    global target_group
    if not hasattr(c, 'host'):
        target_group = Group("somehost1.tld", "somehost2.tld")

@task(hosts=target_group)
def test(c):
    print(c)
If I run fab prod test:
<Context: <Config: {'run': {'asynch ...
If I run fab -H 1,2 test:
<Connection host=1>
<Connection host=2>
So, passing hosts via the @task(hosts=[...]) decorator produces a Context object for c, while using -H produces a Connection object.
I know using a task (prod(c)) to wrap environment logic may be questionable... but is there a way to ensure the task (test(c)) always receives a Connection object, or am I fundamentally misunderstanding something?
Thanks.
Edit: I've also tried directly passing a hosts list (e.g. @task(hosts=["somehost1.tld", "somehost2.tld"])) with the same result.
Edit: Here's the current workaround, but it's obviously not ideal if you have a lot of tasks:
import logging

@task
def test(c):
    if not hasattr(c, 'host'):
        for conn in target_group:
            test(conn)
    else:
        logging.info(f"Targeting {c.host}")
Workaround using a custom task decorator:
import functools

def _task_handler(func):
    @task
    @functools.wraps(func)
    def wrapper(c):
        if not hasattr(c, 'host'):
            for conn in target_group:
                func(conn)
        else:
            func(c)
    return wrapper
Related
In Fabric 2's task definition, how do I get the command-line argument -H/--hosts? I need to create the connection myself.
eg.
fab -H web1,db1 task1
@task
def task1(c):
    # HOW TO GET 'web1,db1' HERE???
    with Connection(host=???, connect_kwargs={}) as conn:
        conn.put('a', 'b')
    do_my_stuff()
After a short while I figured it out.
@task
def task1(ctx):
    with Connection(host=ctx.host, connect_kwargs={}) as c:
        c.put('a', '/tmp/a')
Is it possible to have local and remote tasks execute from within the same task method?
e.g., I want to do something like the following:
@fabric.api.task
def Deploy():
    PrepareDeploy()
    PushDeploy()
    execute(Extract)
    execute(Start)
Where PrepareDeploy and PushDeploy are local tasks (executing only locally, via the fabric.api.local() method):
@fabric.api.task
@fabric.decorators.runs_once
def PrepareDeploy():
    ...

@fabric.api.task
@fabric.decorators.runs_once
def PushDeploy():
    ...
And Extract/Start are methods that should be run on the remote hosts themselves:
@fabric.api.task
def Extract():
    ...

@fabric.api.task
def Start():
    ...
However, when I try to do fab Deploy, I get something like:
[remote1.serv.com] Executing task 'Deploy'
[localhost] local: find . -name "*.java" > sources.txt
...
The first line seems wrong to me (and in fact, causes errors).
You can spawn a new task and define which hosts it should run on. For example, this gist shows how to set up RabbitMQ once all hosts have been provisioned by Puppet with the same Erlang cookie:
https://gist.github.com/nvtkaszpir/17d2e2180771abd93c46
See around line 114 of the gist, where tasks are executed on specific hosts.
I hope this helps.
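The gist's approach boils down to: run the local steps once, then hand the remote step to execute() with an explicit host list, so that only the remote body fans out per host. Those semantics can be simulated without Fabric (local, run, and execute below are simplified stand-ins, not the real APIs):

```python
# Simulation of Fabric 1's execute(task, hosts=...) semantics.
# local/run/execute are simplified stand-ins, not Fabric functions.
calls = []

def local(cmd):
    calls.append(('local', cmd))

def run(host, cmd):
    calls.append(('run', host, cmd))

def execute(task_fn, hosts):
    # execute() runs the given task once per host in `hosts`.
    for host in hosts:
        task_fn(host)

def extract(host):
    run(host, 'tar xzf deploy.tar.gz')

def deploy():
    local('find . -name "*.java" > sources.txt')   # local step: runs once
    execute(extract, hosts=['remote1.serv.com', 'remote2.serv.com'])

deploy()
# calls now holds one local entry, then one run entry per remote host.
```

The key design point is that deploy() itself is never bound to a host list; only the inner task handed to execute() is.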
This question already has answers here:
How can I properly set the `env.hosts` in a function in my Python Fabric `fabfile.py`?
(5 answers)
Closed 9 years ago.
I'm cutting my teeth on Python as I work with Fabric. It looks like I have a basic misunderstanding of how Python and/or Fabric works. Take a look at my two scripts:
AppDeploy.py
from fabric.api import *

class AppDeploy:

    # Environment configuration, all in a dictionary
    environments = {
        'dev' : {
            'hosts' : ['localhost'],
        },
    }

    # Fabric environment
    env = None

    # Take the fabric environment as a constructor argument
    def __init__(self, env):
        self.env = env

    # Configure the fabric environment
    def configure_env(self, environment):
        self.env.hosts.extend(self.environments[environment]['hosts'])
fabfile.py
from fabric.api import *
from AppDeploy import AppDeploy

# Instantiate the backend class with
# all the real configuration and logic
deployer = AppDeploy(env)

# Wrapper functions to select an environment
@task
def env_dev():
    deployer.configure_env('dev')

@task
def hello():
    run('echo hello')

@task
def dev_hello():
    deployer.configure_env('dev')
    run('echo hello')
Chaining the first 2 tasks works
$ fab env_dev hello
[localhost] Executing task 'hello'
[localhost] run: echo hello
[localhost] out: hello
Done.
Disconnecting from localhost... done.
However, when I run the last task, which aims to configure the environment and do something in a single step, it appears Fabric does not have the environment configured:
$ fab dev_hello
No hosts found. Please specify (single) host string for connection:
I'm pretty lost though, because if I tweak that method like so
@task
def dev_hello():
    deployer.configure_env('dev')
    print(env.hosts)
    run('echo hello')
it looks like env.hosts is set, but still, fabric is acting like it isn't:
$ fab dev_hello
['localhost']
No hosts found. Please specify (single) host string for connection:
What's going on here?
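A plausible explanation (an assumption about Fabric 1's scheduling, not confirmed internals): fab builds the host list for a task before invoking the task body, so env.hosts changes made inside dev_hello arrive too late for dev_hello itself, while a preceding task (env_dev) gets to modify the list in time. A pure-Python simulation of that ordering (run_task is a stand-in, not Fabric code):

```python
# Simulation (not Fabric code): the host list for a task is snapshotted
# *before* the task body runs, so env.hosts changes made inside the task
# arrive too late for that same task.
class Env:
    def __init__(self):
        self.hosts = []

env = Env()
executed_on = []

def run_task(task):
    hosts = list(env.hosts) or [None]   # snapshot taken up front
    for host in hosts:
        executed_on.append(host)
        task()

def dev_hello():
    env.hosts.extend(['localhost'])     # set here, but the snapshot is done

run_task(dev_hello)
print(executed_on)   # [None] -- no host, despite the extend inside the task
print(env.hosts)     # ['localhost'] -- set, but only after the snapshot
```

This matches the observed behavior: `fab env_dev hello` works because env_dev runs first, while `fab dev_hello` prompts for a host even though env.hosts prints as ['localhost'] inside the task.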
I'm not sure what you're trying to do, but...
If you're losing info on the shell/environment -- Fabric runs each command in a separate shell statement, so you need to either manually chain the commands or use the prefix context manager.
See http://docs.fabfile.org/en/1.8/faq.html#my-cd-workon-export-etc-calls-don-t-seem-to-work
If you're losing info within Python, it might be tied to this bug/behavior that I ran into recently, where the shell I entered Fabric with seems to be obliterated: https://github.com/fabric/fabric/issues/1004
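The separate-shell behavior is easy to reproduce with plain sh: each invocation is its own process, so directory changes (or exports) do not carry over unless you chain the commands in one statement, which is what Fabric's cd/prefix context managers do for you:

```shell
# Each Fabric run() gets its own shell, illustrated here with plain `sh -c`:
sh -c 'cd /'          # the cd happens, but this shell exits immediately
sh -c 'pwd'           # a fresh shell: prints the original directory, not /
# Chain the commands into a single statement instead:
sh -c 'cd / && pwd'   # prints /
```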
How do I NOT parallelize local commands inside a task with the @parallel decorator:
@parallel
def myTask():
    local('initialize localhost')
    run('command on remote host')
    local('clean up localhost')
I want commands on local host only to execute once, and commands for remote hosts run in parallel. Right now I'm seeing local host commands running for each remote host instance. What is the cleanest way to do this?
Thanks
Group your parallel commands into a decorated function. Then use execute() to call it in between the local calls.
Does the following work for you?
def local_init():
    local('initialize')

@parallel
def myTask():
    run('remote command')

def local_cleanup():
    local('clean up')
And then:
fab local_init myTask local_cleanup
I have looked at the documentation and haven't found this. fab -l and fab -d do not display the expected parameters. I also played with fab -l <task> and the like to see if there was some undocumented support for this. Anyone know how, or have suggestions?
I haven't found any automated way. What I do is put it into the docstring as in:
@task
def sometask(parma='Foo'):
    """Does some common, tedious task.

    sometask:parma=Foo
    """
So when you execute fab -d sometask you get:
Displaying detailed information for task 'sometask':
Does some common, tedious task.
sometask:parma=Foo
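With this approach, fab -d simply prints the task's docstring verbatim, so the invocation hint is plain text the author wrote; nothing is parsed or validated. A minimal Python illustration (sometask here is a bare function, not a registered Fabric task):

```python
# fab -d prints the task's docstring, so an invocation hint placed there
# shows up verbatim in the detailed help output.
def sometask(parma='Foo'):
    """Does some common, tedious task.

    sometask:parma=Foo
    """

print(sometask.__doc__)
```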
I have fabric version 1.8.2 and when I run fab -d sometask I get the following
⇒ fab -d build
Displaying detailed information for task 'build':
Build task description
Arguments: version='dev'
And I have not added anything about the arguments in the docstring, so I suppose Fabric's developers have added that feature.
This doesn't seem to work with any tasks that are decorated with anything other than the default decorators like @task.
fab -d hostName
Displaying detailed information for task 'hostName':
No docstring provided
Arguments: arg='test'
And this is the Fabric task.
@task
def hostName(arg='test'):
    run("hostname -f")
I guess it probably has something to do with how the arguments bubble up.
You can make it work with a bit of fiddling. I am running 1.11.1. It appears to have a lot to do with the order of the decorators. For example, here are various combinations and the results of running fab -d <task> for these code blocks:
Here it works as expected with JUST the @task decorator:
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments: search, nameserver, interface
Add a @runs_once decorator BELOW @task and no args are shown:
@task
@runs_once
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments:
Add a @runs_once decorator ABOVE @task and args are shown:
@runs_once
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments: search, nameserver, interface
Add a @parallel decorator BELOW @task and args are not shown (just like @runs_once). However, trying to add @parallel ABOVE @task makes Fabric think it is not even a task anymore. If you replace @parallel with @serial (above @task), it DOES show the args.
@parallel
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Fatal error: Task 'setupDNS' does not appear to exist. Valid task names:
addNode
auditHost
But there is a workaround for the @parallel decorator: you MUST provide an argument to the decorator. I just use the default of None so that, internally, no behavior changes occur during execution, but Fabric is happy and displays the args for me. You still need to stack @parallel ABOVE @task.
@parallel(pool_size=None)
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments: search, nameserver, interface
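A plausible mechanism behind this ordering sensitivity (an assumption, not confirmed Fabric internals): fab -d introspects the callable it registered as the task, and a decorator whose inner wrapper does not carry the wrapped function's metadata hides the original argument list. Plain Python shows the same effect with inspect.signature():

```python
import functools
import inspect

# A wrapper that drops metadata: introspection sees only (*args, **kwargs).
def bare_decorator(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

# A wrapper that preserves metadata via functools.wraps: the original
# signature stays visible to inspect.signature().
def wraps_decorator(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@bare_decorator
def setup_a(search, nameserver, interface): ...

@wraps_decorator
def setup_b(search, nameserver, interface): ...

print(inspect.signature(setup_a))   # (*args, **kwargs)
print(inspect.signature(setup_b))   # (search, nameserver, interface)
```

Whether the arguments survive thus depends on which wrapper ends up outermost, which is consistent with the decorator-order behavior observed above.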