I'm trying to execute a main task that needs to run tasks differently for each host. In the following setup, the task 'sometask' will get executed twice for each host. What is the best way to prevent that?
@task
@hosts('host1', 'host2')
def test():
    execute(do_something_everywhere)         # execute on both hosts
    execute(sometask, 'arg1', host='host1')  # execute on host1 only
    execute(sometask, 'arg2', host='host2')  # execute on host2 only
You can use the @runs_once decorator to remedy this, but I find it can cause extra work making wrapper functions to get the execution order you want. Here's a quick fix that uses the env.host_string value to check which server you are deploying to and adjust your script accordingly:
@hosts('host1', 'host2')
@task
def checkout(branch='master'):
    execute(_test_task_w_arg, 'all-servers')
    execute(_test_task_w_arg, 'arg1' if env.host_string == 'host1' else 'arg2')

def _test_task_w_arg(arg1):
    local("touch test-file-" + arg1)
This results in the following output, which seems to achieve what you want:
[host1] Executing task 'checkout'
[localhost] local: touch test-file-all-servers
[localhost] local: touch test-file-arg1
[host2] Executing task 'checkout'
[localhost] local: touch test-file-all-servers
[localhost] local: touch test-file-arg2
Related
I have a Fabric script with many tasks. I'd like to add a simple "yes/no" confirmation at the very beginning that includes the hosts to be run on and the task requested.
hosts = env.hosts
task_name = ?
if not confirm('Run "%s" on %s?' % (task_name, hosts)):
    abort('Aborting per user request.')
So, when I run fab -H user@1.2.3.4 deploy, the confirmation will be Run "deploy" on user@1.2.3.4?
Unlike the well-documented env.hosts, I cannot find an env.task_name variable to achieve this.
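For what it's worth, Fabric 1.x documents an env.command value that fab sets to the name of the currently executing task (and env.tasks, the full list of tasks given on the command line). A minimal sketch using it, assuming the confirmation runs at the top of the task itself rather than at module level:

from fabric.api import abort, env, task
from fabric.contrib.console import confirm

@task
def deploy():
    # env.command is set by fab to the currently executing task name,
    # so it is only available once the task has started running
    if not confirm('Run "%s" on %s?' % (env.command, env.hosts)):
        abort('Aborting per user request.')
    # ... actual deploy work ...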
Is it possible to have local and remote tasks execute from within the same task method?
e.g., I want to do something like the following:
@fabric.api.task
def Deploy():
    PrepareDeploy()
    PushDeploy()
    execute(Extract())
    execute(Start())
Where PrepareDeploy and PushDeploy are local tasks (executing only locally, via the fabric.api.local() method):
@fabric.api.task
@fabric.decorators.runs_once
def PrepareDeploy():
    ...

@fabric.api.task
@fabric.decorators.runs_once
def PushDeploy():
    ...
And Extract/Start are methods that should be run on the remote hosts themselves:
@fabric.api.task
def Extract():
    ...

@fabric.api.task
def Start():
    ...
However, when I try to do fab Deploy, I get something like:
[remote1.serv.com] Executing task 'Deploy'
[localhost] local: find . -name "*.java" > sources.txt
...
The first line seems wrong to me (and in fact, causes errors).
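For what it's worth, a sketch of one restructuring that avoids the per-host re-execution (assuming Fabric 1.x semantics, where a top-level task body runs once per host unless marked otherwise): make Deploy itself local-only with @runs_once, and pass the task objects to execute() instead of calling them:

@fabric.api.task
@fabric.decorators.runs_once
def Deploy():
    PrepareDeploy()   # local-only helpers, run once
    PushDeploy()
    execute(Extract)  # note: pass the task itself, don't call it
    execute(Start)    # execute() fans these out to the remote hosts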
You can spawn a new task and define which hosts it should run on, for example, how to set up RabbitMQ when all hosts are provisioned with Puppet using the same Erlang cookie.
See around line 114 - there is an execution of the tasks on specific hosts.
https://gist.github.com/nvtkaszpir/17d2e2180771abd93c46
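The core idea is that execute() accepts a hosts keyword, so a parent task can fan a subtask out to a specific subset of machines. A minimal sketch (the task and host names are made up here, not taken from the gist):

from fabric.api import execute, run, task

def setup_rabbitmq():
    # runs only on the hosts passed to execute() below
    run('rabbitmqctl status')

@task
def provision():
    execute(setup_rabbitmq, hosts=['rabbit1.example.com', 'rabbit2.example.com'])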
I hope this helps.
This question already has answers here:
How can I properly set the `env.hosts` in a function in my Python Fabric `fabfile.py`?
(5 answers)
Closed 9 years ago.
I'm cutting my teeth on Python as I work with Fabric. It looks like I have a basic misunderstanding of how Python and/or Fabric works. Take a look at my two scripts:
AppDeploy.py
from fabric.api import *

class AppDeploy:

    # Environment configuration, all in a dictionary
    environments = {
        'dev': {
            'hosts': ['localhost'],
        },
    }

    # Fabric environment
    env = None

    # Take the fabric environment as a constructor argument
    def __init__(self, env):
        self.env = env

    # Configure the fabric environment
    def configure_env(self, environment):
        self.env.hosts.extend(self.environments[environment]['hosts'])
fabfile.py
from fabric.api import *
from AppDeploy import AppDeploy

# Instantiate the backend class with
# all the real configuration and logic
deployer = AppDeploy(env)

# Wrapper functions to select an environment
@task
def env_dev():
    deployer.configure_env('dev')

@task
def hello():
    run('echo hello')

@task
def dev_hello():
    deployer.configure_env('dev')
    run('echo hello')
Chaining the first two tasks works:
$ fab env_dev hello
[localhost] Executing task 'hello'
[localhost] run: echo hello
[localhost] out: hello
Done.
Disconnecting from localhost... done.
However, running the last task, which aims to configure the environment and do something in a single task, it appears Fabric does not have the environment configured:
$ fab dev_hello
No hosts found. Please specify (single) host string for connection:
I'm pretty lost, though, because if I tweak that task like so:
@task
def dev_hello():
    deployer.configure_env('dev')
    print(env.hosts)
    run('echo hello')
it looks like env.hosts is set, but Fabric still acts like it isn't:
$ fab dev_hello
['localhost']
No hosts found. Please specify (single) host string for connection:
What's going on here?
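For what it's worth, one pattern that sidesteps this in Fabric 1.x (a sketch based on the code above, assuming the host list for a task is fixed before its body starts executing) is to re-dispatch the real work through execute(), which re-reads env.hosts at call time:

@task
def dev_hello():
    deployer.configure_env('dev')
    # changing env.hosts mid-task doesn't affect the current task's own
    # host loop, but execute() consults env.hosts when dispatching hello
    execute(hello)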
I'm not sure what you're trying to do, but...
If you're losing info on the shell/environment -- Fabric runs each command in a separate shell statement, so you need to either manually chain the commands or use the prefix context manager.
See http://docs.fabfile.org/en/1.8/faq.html#my-cd-workon-export-etc-calls-don-t-seem-to-work
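For example, a minimal sketch of the prefix approach (the virtualenv path and commands are hypothetical):

from fabric.api import prefix, run

def install_requirements():
    # each run() gets a fresh shell, so a bare `source` wouldn't persist;
    # prefix() prepends the activation to every run() inside the block
    with prefix('source /home/deploy/venv/bin/activate'):
        run('pip install -r requirements.txt')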
If you're losing info within "python", it might be tied into this bug/behavior that I ran into recently [ https://github.com/fabric/fabric/issues/1004 ], where the shell I entered Fabric with seems to be obliterated.
Given a file myapp.py
from celery import Celery

celery = Celery("myapp")
celery.config_from_object("celeryconfig")

@celery.task(default_retry_delay=5 * 60, max_retries=12)
def add(a, b):
    with open("try.txt", "a") as f:
        f.write("A trial = {}!\n".format(a + b))
    raise add.retry([a, b])
Configured with a celeryconfig.py
CELERY_IMPORTS = ["myapp"]
BROKER_URL = "amqp://"
CELERY_RESULT_BACKEND = "amqp"
In the directory that has both files, I call:
$ celeryd -E
And then
$ python -c "import myapp; myapp.add.delay(2, 5)"
or
$ celery call myapp.add --args="[2, 5]"
So the try.txt is created with
A trial = 7!
only once.
That means the retry was ignored.
I tried many other things:
Using MongoDB as broker and backend and inspecting the database (strangely enough, I can't see anything in my broker "messages" collection even in a "countdown"-scheduled job)
The PING example in here, both with RabbitMQ and MongoDB
Printing on the screen with both print (like the PING example) and logging
Making the retry call in an except block after an enforced exception is raised; raising or returning the retry(); changing the "throw" parameter to True/False/not specified
Seeing what's happening with celery flower (in which the "broker" link shows nothing)
But none of these worked =/
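For reference, a sketch of the retry pattern as the Celery docs present it, with retry() called inside an except block and the caught exception forwarded (this restates the documented usage against the code above; it is not a confirmed diagnosis):

@celery.task(default_retry_delay=5 * 60, max_retries=12)
def add(a, b):
    try:
        with open("try.txt", "a") as f:
            f.write("A trial = {}!\n".format(a + b))
        raise ValueError("force a retry")  # stand-in for a real failure
    except ValueError as exc:
        # retry() re-raises by default (throw=True), so the raise here
        # simply makes the control flow explicit
        raise add.retry(exc=exc)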
My celery report output:
software -> celery:3.0.19 (Chiastic Slide) kombu:2.5.10 py:2.7.3
billiard:2.7.3.28 py-amqp:N/A
platform -> system:Linux arch:64bit, ELF imp:CPython
loader -> celery.loaders.default.Loader
settings -> transport:amqp results:amqp
Is there anything wrong above? What do I need to do to make the retry() method work?
I have looked at the documentation and haven't found this. fab -l or fab -d do not display the expected parameters. I also played with fab -l <task> and the like to see if there was some undocumented support for this. Does anyone know how, or have suggestions?
I haven't found any automated way. What I do is put it into the docstring as in:
@task
def sometask(parma='Foo'):
    """Does some common, tedious task.

    sometask:parma=Foo
    """
So when you execute fab -d sometask you get:
Displaying detailed information for task 'sometask':
Does some common, tedious task.
sometask:parma=Foo
I have Fabric version 1.8.2, and when I run fab -d sometask I get the following:
⇒ fab -d build
Displaying detailed information for task 'build':
Build task description
Arguments: version='dev'
And I have not added anything about the arguments in the docstring, so I suppose Fabric's devs have added that feature.
This doesn't seem to work with any tasks that are decorated with anything other than the default decorators like @task.
fab -d hostName
Displaying detailed information for task 'hostName':
No docstring provided
Arguments: arg='test'
And this is the Fabric task.
@task
def hostName(arg='test'):
    run("hostname -f")
I guess it's probably something to do with the bubbling up of arguments.
You can make it work with a bit of fiddling. I am running 1.11.1. It appears to have a lot to do with the order of the decorators. For example, here are various combinations and the results of running fab -d <task> for these code blocks:
Here it works as expected with JUST the @task decorator:
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments: search, nameserver, interface
Add a @runs_once decorator BELOW @task and no args are shown:
@task
@runs_once
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments:
Add a @runs_once decorator ABOVE @task and args are shown:
@runs_once
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments: search, nameserver, interface
Add a @parallel decorator BELOW @task and args are not shown (just like @runs_once). However, trying to add @parallel ABOVE @task makes Fabric think this is not even a task anymore. If you replace @parallel with @serial (above @task), it DOES show the args.
@parallel
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Fatal error: Task 'setupDNS' does not appear to exist. Valid task names:
addNode
auditHost
But there is a workaround for the @parallel decorator: you MUST provide an argument to the decorator. I just use the default of None so that, internally, no behavior changes occur during execution, but Fabric is happy and displays the args for me. You still do need to stack @parallel ABOVE @task.
@parallel(pool_size=None)
@task
def setupDNS(search, nameserver, interface):
    '''
    Configure search and nameserver in /etc/resolv.conf
    '''

[mpetronic@mpws.ws fabric]$ fab -d setupDNS
Displaying detailed information for task 'setupDNS':
Configure search and nameserver in /etc/resolv.conf
Arguments: search, nameserver, interface