I want to give some input (an initial configuration) to a fabric file.
Something like:
fab deploy MyProject
where MyProject is the string I want to input. Is it possible to do so?
You will have to specify "MyProject" thus:
fab deploy:MyProject
And your function (in your fabfile.py) will look like this:
def deploy(project):
...
where project equals "MyProject".
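Putting it together, a minimal fabfile might look like this (a sketch, assuming Fabric 1.x; the print is a placeholder for your real deploy logic):

# fabfile.py -- minimal sketch (Fabric 1.x assumed)
from fabric.api import task

@task
def deploy(project):
    # placeholder for the actual deployment steps
    print("Deploying %s" % project)

You would then invoke it as fab deploy:MyProject, or equivalently fab deploy:project=MyProject.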
In your fabfile:
# fabfile.py
from fabric.api import env

# ...
if not hasattr(env, 'my_setting'):
    env.my_setting = 'default'
# ...
Run the fab script:
fab --set=my_setting=my_value task
More info:
fab --help
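For example, a task that reads the setting might look like this (a sketch; show_setting is a hypothetical task name):

# fabfile.py -- continued sketch (Fabric 1.x assumed)
from fabric.api import env, task

@task
def show_setting():
    # prints 'my_value' when run as: fab --set=my_setting=my_value show_setting
    print(env.my_setting)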
I have a Python script that takes two paths, one as the input folder and the other as the output folder, via sys.argv. For example:
python script.py from to
If no path is provided (say, just python script.py), it uses the default folders, which are from and to.
I have created a Docker image, and I am mounting my local folder this way:
docker run -v "$(pwd):/folder" myimage
As I am not providing the folder-name arguments in this case, it takes them by default and puts them in the /folder directory of the container. This is working.
But if I want to pass custom paths, how can I do that?
EDIT:
Let's say here is the code:
argl = len(sys.argv)
if argl == 1:
    dir_from = 'from'
    dir_to = 'to'
elif argl == 3:
    dir_from = sys.argv[1]
    dir_to = sys.argv[2]
So if I pass python script.py, the first condition applies, and if I pass arguments like python script.py abc/from abc/to, the elif condition applies.
docker run -v "$(pwd):/folder" myimage picks up the first condition, but how do I pass custom paths to it?
For example, something like this:
docker run -v "abc/from abc/to:/folder" myimage
Here's how to pass in default values for your from and to path parameters at the time that you launch your container.
Define the two env vars as part of launching your container:
FROM_PATH=<get the from-path default from wherever is appropriate>
TO_PATH=<get the to-path default from wherever is appropriate>
Launch your container:
docker run -e FROM_PATH -e TO_PATH ... python script.py
You could specify the values for the env vars in the run command itself via something like -e FROM_PATH=/a/b/c. If you don't provide values, the values are assumed to be already defined in local env vars with the same name, as I did above.
Then inside your container code:
import os
import sys

argl = len(sys.argv)
if argl == 1:
    dir_from = os.getenv('FROM_PATH')
    dir_to = os.getenv('TO_PATH')
elif argl == 3:
    dir_from = sys.argv[1]
    dir_to = sys.argv[2]
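And launching the container with explicit values would look something like this (the paths are hypothetical, and must be visible inside the container, e.g. under the mounted /folder):

docker run -v "$(pwd):/folder" -e FROM_PATH=/folder/abc/from -e TO_PATH=/folder/abc/to myimage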
I want to reuse the fabfile for multiple projects.
config.ini
[project1]
git_repo = git@github/project1
project_path = /path/project1

[project2]
git_repo = git@github/project2
project_path = /path/project2
fabfile.py
import configparser
from fabric import task

config = configparser.ConfigParser()
config.read("config.ini")

@task
def getcode(connection, project, git_repo):
    args = config[project]
    connection.run("git clone {}".format(git_repo))

@task
def pushcode(connection, project, git_repo):
    args = config[project]
    connection.run("git push {}".format(git_repo))
How can I avoid repeating args = config[project] in every task? Can I pass custom args with the fab command, e.g. fab -H web1 --project=project1 pushcode?
Sure, you can pass arguments to fab tasks, which under the hood call task from invoke.
I will give an example how you can do it:
fabfile.py
from fabric import task
@task
def sampleTask(connection, name, last_name, age):
    print("The firstname is", name)
    print("The lastname is", last_name)
    print("The age is", age)
and then you call it from the command line like this:
Command-line
fab sampleTask -n peshmerge -l Mo -a 28
The output should be like this:
[vagrant@localhost fabric]$ fab sampleTask -n peshmerge -l Mo -a 28
The firstname is peshmerge
The lastname is Mo
The age is 28
Note: giving your task a name that contains an underscore (_) will result in an error:
No idea what 'sample_task' is!
The same goes for naming the task arguments.
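Applied to the config.ini question above, a minimal sketch (Fabric 2.x assumed; the project argument selects the config section, so each task does one lookup and nothing else needs to be passed in):

# fabfile.py -- sketch, assuming Fabric 2.x and the config.ini from the question
import configparser
from fabric import task

config = configparser.ConfigParser()
config.read("config.ini")

@task
def pushcode(connection, project):
    args = config[project]  # one lookup, driven by the CLI argument
    connection.run("git push {}".format(args['git_repo']))

which you would invoke as fab -H web1 pushcode --project project1.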
Yes.
Indeed, the fab CLI tool has the same options as Invoke's inv CLI tool.
And having a look at that part of Invoke's docs, you can see that it is the same syntax that you proposed :)
I am trying to use the Fabric API, to avoid the very inconvenient and kludgey fabric command-line call. I set the env.hosts variable in one module/class and then call another class's instance method to run a fabric command. In the called class instance I can print out the env.hosts list, yet when I try to run a command it tells me it can't find a host.
If I loop through the env.hosts array and manually set the env.host_string variable for each host in the array, I can get the run command to work. What is odd is that I also set the env.user variable in the calling class, and it is picked up.
e.g. this works:
def upTest(self):
    print('env.hosts = ' + str(env.hosts))
    for host in env.hosts:
        env.host_string = host
        print('env.host_string = ' + env.host_string)
        run("uptime")
output from this:
env.hosts = ['ec2-....amazonaws.com']
env.host_string = ec2-....amazonaws.com
[ec2-....amazonaws.com] run: uptime
[ec2-....amazonaws.com] out: 18:21:15 up 2 days, 2:13, 1 user, load average: 0.00, 0.01, 0.05
[ec2-....amazonaws.com] out:
This doesn't work... but it does work if you run it from a "fab" file... makes no sense to me.
def upTest(self):
    print('env.hosts = ' + str(env.hosts))
    run("uptime")
This is the output:
No hosts found. Please specify (single) host string for connection:
I did try putting an @task decorator on the method (and removing the 'self' reference, since the decorator didn't like that), but to no avail.
Is there any way to get this to work with env.hosts? As opposed to having to loop manually whenever I have multiple hosts to run this on?
Finally, I fixed this problem by using execute() and getattr().
main.py
#!/usr/bin/env python
from demo import FabricSupport
hosts = ['localhost']
myfab = FabricSupport()
myfab.execute("hostname", hosts)
demo.py
#!/usr/bin/env python
from fabric.api import env, run, execute
class FabricSupport:
    def __init__(self):
        pass

    def hostname(self):
        run("hostname")

    def df(self):
        run("df -h")

    def execute(self, task, hosts):
        # look up the bound method by name instead of building an exec string
        task_func = getattr(self, task)
        execute(task_func, hosts=hosts)
python main.py
[localhost] Executing task 'hostname'
[localhost] run: hostname
[localhost] out: heydevops-workspace
I've found that it's best not to set env.hosts in code but instead to define roles based on your config file and use the fab tool to specify a role. This worked for me.
my_roles.json
{
    "web": [ "user@web1.example.com", "user@web2.example.com" ],
    "db": [ "user@db1.example.com", "user@db2.example.com" ]
}
fabfile.py
from fabric.api import env, run, task
import json

def load_roles():
    with open('my_roles.json') as f:
        env.roledefs = json.load(f)

load_roles()

@task
def my_task():
    run("hostname")
CLI
fab -R web my_task
This produces output from running my_task on each of web1 and web2.
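Multiple roles can also be given as a comma-separated list, e.g. fab -R web,db my_task.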
I can't seem to figure this one out, but when I do a very simple test against localhost and have Fabric execute run('history'), the resulting output on the command line is blank.
Nor does this work: run('history > history_dump.log')
Here is the complete fabfile script below; obviously I'm missing something here.
-- FabFile.py
from fabric.api import run, env, hosts, roles, parallel, cd, task, settings, execute
from fabric.operations import local,put
deploymentType = "LOCAL"
if (deploymentType == "LOCAL"):
    env.roledefs = {
        'initial': ['127.0.0.1'],
        'webservers': ['127.0.0.1'],
        'dbservers': ['127.0.0.1']
    }
env.use_ssh_config = False
# Get History
# -------------------------------------------------------------------------------------
@task
@roles('initial')
def showHistoryCommands():
    print("Logging into %s and accessing the command history" % env.host_string)
    run('history')  # does not display anything
    run('history > history_dump.log')  # does not write anything out
    print("Completed displaying the command history")
Any suggestions/solutions would be most welcome.
History is a shell builtin, so it doesn't work like a normal command. I think your best bet would be to try and read the history file from the filesystem.
local('cat ~/.bash_history')
or
run('cat ~/.bash_history')
Substitute the appropriate history file path.
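For example, on a host whose login shell is zsh, the equivalent would be (history file path assumed):

run('cat ~/.zsh_history')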
To expand a bit after some research: the command succeeds when run, but for some reason Fabric neither captures nor prints the output (or it is something about the way history prints its output). Other builtin commands like env work fine, so for now I don't know exactly what is going on.
I would like to pass a few values from fabric into the remote environment, and I'm not seeing a great way to do it. The best I've come up with so far is:
with prefix('export FOO=BAR'):
    run('env | grep BAR')
This does seem to work, but it seems like a bit of a hack.
I looked in the GIT repository and it looks like this is issue #263.
As of fabric 1.5 (released), fabric.context_managers.shell_env does what you want.
from fabric.context_managers import shell_env

with shell_env(FOO1='BAR1', FOO2='BAR2', FOO3='BAR3'):
    local("echo FOO1 is $FOO1")
I think your prefix-based solution is perfectly valid. However, if you want a shell_env context manager like the one proposed in issue #263, you can use the following alternative implementation in your fabfiles:
from fabric.api import run, env, prefix
from contextlib import contextmanager

@contextmanager
def shell_env(**env_vars):
    orig_shell = env['shell']
    env_vars_str = ' '.join('{0}={1}'.format(key, value)
                            for key, value in env_vars.items())
    env['shell'] = '{0} {1}'.format(env_vars_str, orig_shell)
    yield
    env['shell'] = orig_shell

def my_task():
    with prefix('echo FOO1=$FOO1, FOO2=$FOO2, FOO3=$FOO3'):
        with shell_env(FOO1='BAR1', FOO2='BAR2', FOO3='BAR3'):
            run('env | grep BAR')
Note that this context manager modifies env['shell'] instead of env['command_prefixes'] (as the prefix context manager does), so you:
can still use prefix (see the example output below) without the interaction problems mentioned in issue #263;
have to apply any changes to env['shell'] before using shell_env. Otherwise, shell_env's changes will be overwritten and the environment variables won't be available to your commands.
When executing the fab file above, you get the following output:
$ fab -H localhost my_task
[localhost] Executing task 'my_task'
[localhost] run: env | grep BAR
[localhost] out: FOO1=BAR1, FOO2=BAR2, FOO3=BAR3
[localhost] out: FOO1=BAR1
[localhost] out: FOO2=BAR2
[localhost] out: FOO3=BAR3
[localhost] out:
Done.
Disconnecting from localhost... done.
Fabric 1.5.0 (currently in Git) accepts shell as a named argument to local().
If you pass '/bin/bash' there, it is passed on to the executable argument of Popen.
It won't execute your .bashrc, though, because .bashrc is sourced only on interactive invocation of bash. You can source any file you want inside local():
local('. /usr/local/bin/virtualenvwrapper.sh && workon focus_tests && bunch local output', shell='/bin/bash')
Another way is to pass a value through the command line with --set:
--set=domain=stackoverflow.com
Then you can access it in the script as env.domain.
see http://docs.fabfile.org/en/1.11/usage/fab.html#cmdoption--set
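For instance, a sketch of a task that consumes the value (Fabric 1.x assumed; check_domain is a hypothetical task name):

from fabric.api import env, run, task

@task
def check_domain():
    # env.domain was supplied on the command line via --set
    run("ping -c 1 {0}".format(env.domain))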
Try using a decorator:

from fabric.api import env, run, task
from fabric.context_managers import shell_env
from functools import wraps

def set_env():
    def decorator(func):
        @wraps(func)
        def inner(*args, **kwargs):
            with shell_env(DJANGO_CONFIGURATION=env.config):
                run("echo $DJANGO_CONFIGURATION")
                return func(*args, **kwargs)  # run the task with the env applied
        return inner
    return decorator

@task
@set_env()
def testme():
    pass
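A hypothetical invocation, assuming env.config is supplied via --set as described above:

fab --set=config=production testme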