When I run my Python code it asks for a host:
No hosts found. Please specify (single) host string for connection:
I have the following code:
from fabric.api import *
from fabric.contrib.console import confirm

env.hosts = ['ipaddress']

def remoteRun():
    print "ENV %s" % (env.hosts)
    out = run('uname -r')
    print "Output %s" % (out)

remoteRun();
I even tried running fab with the -H option and I am getting the same message. I'm using Ubuntu 10.10; any help is appreciated. By the way, I am a newbie in Python.
In order to get hosts to work in a script outside of the fab command-line tool and fabfile.py, you'll have to use execute():
from fabric.api import run
from fabric.tasks import execute
def mytask():
    run('uname -a')
results = execute(mytask)
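If you'd rather not touch env at all, execute() also accepts a hosts keyword argument; a minimal sketch (the host string is a placeholder):

from fabric.api import run
from fabric.tasks import execute

def mytask():
    run('uname -a')

# run the task against an explicit host list instead of env.hosts
results = execute(mytask, hosts=['user@ipaddress'])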
If it's only one host, you can use env.host_string = 'somehost or ipaddress'.
You also don’t need the ; at the end of your remoteRun.
from __future__ import with_statement
from fabric.api import *
from fabric.contrib.console import confirm

env.host_string = 'ipaddress'

def remoteRun():
    print "ENV %s" % (env.host_string)
    out = run('uname -r')
    print "Output %s" % (out)

remoteRun()
I am not exactly sure what remoteRun(); is supposed to do in your example.
Is it part of your fabfile or is this your terminal command to invoke the script?
The correct way would be a command like this in your shell:
fab remoteRun
Generally it's better to specify the concrete hosts your command is supposed to run on like this:
def localhost():
    env.hosts = ['127.0.0.1']

def remoteRun():
    print "ENV %s" % (env.hosts)
    out = run('uname -r')
    print "Output %s" % (out)
You can run it like this from a terminal (assuming you are in the directory that contains your fabfile):
fab localhost remoteRun
As an alternative you could specify the host with the -H parameter:
fab -H 127.0.0.1 remoteRun
If you have a list of hosts you want to invoke the command for, do it as described here:
http://readthedocs.org/docs/fabric/latest/usage/execution.html
Adjusted to your example:
env.hosts = ['localhost', '127.0.0.1']

def remoteRun():
    print "ENV %s" % (env.hosts)
    out = run('uname -r')
    print "Output %s" % (out)
And call it via: fab remoteRun
This way remoteRun is performed on all hosts in env.hosts.
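Fabric also lets you pin hosts to a single task with the hosts decorator, which avoids the separate localhost task; a minimal sketch:

from fabric.api import run, hosts

@hosts('127.0.0.1')
def remoteRun():
    out = run('uname -r')
    print "Output %s" % (out)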
@Nerdatastic is right: for the simple case, don't use env.hosts; use env.host_string instead, e.g.

def setup_db_server():
    env.host_string = 'db01.yoursite.com'  # or the IP address
    run("mysqladmin ...")
and running $ fab setup_db_server will execute the script on the target server.
Nerdatastic is right, you need to set the env.host_string variable for Fabric to know what host string to use. I came across this problem trying to use a subclass of Task and call the run() method. It seemed to ignore env.hosts except when using execute from fabric.tasks in version 1.3.
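For reference, this is roughly the Task-subclass situation I mean; a minimal sketch assuming Fabric 1.3+ (the host string is a placeholder):

from fabric.api import run
from fabric.tasks import Task, execute

class Uname(Task):
    name = "uname"

    def run(self):
        run("uname -r")

# calling Uname().run() directly ignores env.hosts;
# execute() is what expands the task over a host list
execute(Uname(), hosts=["ipaddress"])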
I have the same issue.
I think this is a bug, because everything worked before today.
I store my env in .fabricrc.
Now I get the same message as yours; I don't know why.
I'm trying to copy a file from my local Fabric system to multiple remote hosts using the Fabric put command. When I run it, it doesn't complain about anything, but it does not copy the file.
Secondly, my remote server already has the file; could that be an issue here? Below is the code.
import sys
from fabric.api import env
from fabric.operations import run, put

env.skip_bad_hosts = True
env.command_timeout = 160
env.user = 'jaggle'
env.shell = "/bin/sh -c"
env.warn_only = True
env.password = '!#heyNaSaFreak'
use_sudo = True

def readhost():
    env.hosts = [line.strip() for line in sys.stdin.readlines()]

def copyLDAP():
    put("/Karn/ldap.conf", "/etc/ldap.conf", use_sudo=True)
Below is the output of the run:
$ echo "tt-server01" | fab readhost -f OpenDgCopy.py copyLDAP
[tt-server0] Executing task 'copyLDAP'
[tt-server0] put: /Karn/ldap.conf -> /etc/ldap.conf
Done.
Disconnecting from tt-server0... done.
Instead of the readhost task, you can directly use the -H option with a comma-separated list of hosts:
fab -f OpenDgCopy.py copyLDAP -H tt-server0,tt-server1
According to the documentation of put:
As with the OpenSSH sftp program, put will overwrite pre-existing remote files without requesting confirmation.
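If the transfer silently does nothing, it can also help to inspect the value put() returns; in Fabric 1.x it behaves like a list of remote paths and exposes .failed and .succeeded attributes. A minimal sketch:

from fabric.api import env
from fabric.operations import put

def copyLDAP():
    result = put("/Karn/ldap.conf", "/etc/ldap.conf", use_sudo=True)
    if result.failed:
        print "put failed on %s" % env.host_string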
Hope you can help. I'm totally new to Fabric and know a little Python. I'm trying to iterate through an external file of IPs to update 40-odd remote servers.
This isn't working; it stops after the first IP.
Terminal command:
fab -p Password hosts update
from fabric.api import env, run, sudo
def hosts():
    env.hosts = open('sat_ip_list', 'r').readlines()

def update():
    sudo('apt-get update -y')
I tried the following list of IPs + Fabric script and had no problems running fab -p <password> hosts uname:
# ip_list.txt
192.168.xxx.x
127.0.0.1:xxxx
174.xxx.xxx.xxx:xxxx
# fabfile.py
from fabric.api import env, run, sudo
def hosts():
    # Read the IP list from ip_list.txt
    env.hosts = open('ip_list.txt', 'r').readlines()

def uname():
    sudo('uname -a')
What does your sat_ip_list file look like - is it one IP address per line?
Have you tried your script with just a very small number of hosts, like 2-3 IP addresses? There's definitely no reason you shouldn't be able to do what you're trying to accomplish; your script basically works for me just as it is.
As a sanity check, you might want to print out the value of env.hosts, like so:
def hosts():
    env.hosts = open('sat_ip_list', 'r').readlines()
    print('Hosts:', env.hosts)
In my case, that results in the following output:
me@machine:~$ fab hosts
('Hosts:', ['192.168.xxx.x\n', '127.0.0.1:xxxx\n', '174.xxx.xxx.xxx:xxxx\n'])
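Note the trailing \n on each entry; the run above worked for me regardless, but if they cause trouble on your side you can strip them while reading the file. A minimal sketch:

def hosts():
    # strip newlines and skip blank lines in the host file
    with open('sat_ip_list') as f:
        env.hosts = [line.strip() for line in f if line.strip()]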
In my fabfile, I have set env.use_ssh_config to True. Whenever I run the fabfile, it will get the correct hostname and user from the ssh config, but not the correct key. It will go through my keys (all stored in ~/.ssh/) at random, requiring me to enter the passphrase for all of them, until it gets to the correct key.
It's only fabric that gives me this problem. Running scp as a local command in the fabfile uses the correct key.
Entries in my ssh config look like this:

Host example
    HostName example.com
    User elssar
    IdentityFile ~/.ssh/id_example
    PreferredAuthentications publickey
I'm using Fabric 1.10.1, Paramiko 1.14.1, Python 2.7.3 and Ubuntu 12.04.
Edit - There is a related open issue in the fabric repository - https://github.com/fabric/fabric/issues/1282
Edit - basic structure of my fabfile, and how I run it
from fabric.api import env, run

def do_something():
    run("echo test")

def server(host):
    env.hosts = [host]

# command
fab server:hostname do_something
I tried to check on my setup; here is what I did to debug:
>>> from fabric.network import key_filenames
>>> key_filenames()
[]
>>> from fabric.state import env
>>> env.use_ssh_config = True
>>> env.host_string = 'salt-states'
>>> key_filenames()
['/Users/damien/.ssh/salt.rsa.priv']
Update: you could update your fabfile to instrument your task:

from fabric.api import env, run
from fabric.network import key_filenames

def do_something_debug():
    env.use_ssh_config = True
    print key_filenames()
    run("echo test")
def server(host):
    env.hosts = [host]

Then run the command:
fab server:hostname do_something_debug
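As a stopgap until the issue is resolved, you can also bypass the ssh-config key lookup entirely and point env.key_filename at the right key yourself; a minimal sketch (the path is an example):

from fabric.api import env

def server(host):
    env.hosts = [host]
    # an explicit key takes precedence over the keys paramiko would try
    env.key_filename = ['~/.ssh/id_example']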
Fabric has a hosts setting to specify which computers to SSH into.
Amazon Web Services has more of a dynamic inventory that can be queried in python using tools like boto.
Is there a way to combine these two services? Ideally, I wanted something as simple as Ansible's approach with an inventory file, using an external file like ec2.py.
More specifically, is there a prebaked solution for this use case? Ideally, I would like to run something straightforward like this:
from fabric.api import env, run, task
import ec2

env.roledefs = ec2.Inventory()

@task
def command():
    run("lsb_release -a")
And run it like so, assuming env.roledefs['nginx'] exists:
$ fab -R nginx command
You can use Fabric and boto concurrently.
First you need to export AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and the default region from your console. The Fabric file must be named fabfile.py, not ec2.py or anything else.
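For example, the exports could look like this in your shell (the region and key values are placeholders):

$ export AWS_ACCESS_KEY_ID=<your-access-key-id>
$ export AWS_SECRET_ACCESS_KEY=<your-secret-access-key>
$ export AWS_EC2_REGION=us-west-2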
import boto, urllib2
from boto.ec2 import connect_to_region
from fabric.api import env, run, cd, settings, sudo, task
from fabric.api import parallel
import os
import sys

REGION = os.environ.get("AWS_EC2_REGION")

env.user = "ec2-user"
env.key_filename = ["/home/user/uswest.pem"]

@task
def command():
    run("lsb_release -a")

def _create_connection(region):
    print "Connecting to ", region
    conn = connect_to_region(
        region_name=region,
        aws_access_key_id=os.environ.get("AWS_ACCESS_KEY_ID"),
        aws_secret_access_key=os.environ.get("AWS_SECRET_ACCESS_KEY")
    )
    print "Connection with AWS established"
    return conn
Finally, this program can be executed using the command below:
$ fab command
From http://docs.python-guide.org/en/latest/scenarios/admin/
You can see that if you set
env.hosts = ['my_server1', 'my_server2']
You'll then be able to target those hosts.
With boto, if you just have a function that does

ec2_connection.get_only_instances(filters={'tag:<key>': '<whatever>'})

and returns a list of their DNS names, you'll be able to then set

env.hosts = [<list of dns names from ec2>]
Piece of cake!
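A minimal sketch of that idea, assuming boto 2 and an example tag named Role (adapt the filter to your own tagging scheme; credentials come from the environment variables above):

from boto.ec2 import connect_to_region
from fabric.api import env

def set_hosts(role, region='us-west-2'):
    # build env.hosts from instances carrying the given Role tag
    conn = connect_to_region(region)
    instances = conn.get_only_instances(filters={'tag:Role': role})
    env.hosts = [i.public_dns_name for i in instances if i.public_dns_name]

Then run it like: fab set_hosts:nginx command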
This is related to my previous question, but a different one.
I have the following fabfile:
from fabric.api import *
host1 = '192.168.200.181'
offline_host2 = '192.168.200.199'
host3 = '192.168.200.183'
env.hosts = [host1, offline_host2, host3]
env.warn_only = True
def df_h():
    with settings(warn_only=True):
        run("df -h | grep sda3")
And the output is:
[192.168.200.199] run: df -h | grep sda3
Fatal error: Low level socket error connecting to host 192.168.200.199: No route to host
Aborting.
After the execution hits the offline server, it aborts immediately, regardless of the other servers in the env.hosts list.
I have used the env setting "warn_only=True", but maybe I'm using it improperly.
How can I modify this behavior so that it only prints the error and continues executing?
As of version 1.4 Fabric has a --skip-bad-hosts option that can be set from the command line, or by setting the variable in your fab file.
env.skip_bad_hosts = True
Documentation for the option is here:
http://docs.fabfile.org/en/latest/usage/fab.html#cmdoption--skip-bad-hosts
Don't forget to explicitly set the timeout value also.
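In a fabfile that could look like this (the values are examples):

from fabric.api import env

env.skip_bad_hosts = True  # warn about unreachable hosts and keep going
env.timeout = 5            # seconds to wait per connection attempt

Or equivalently on the command line: fab --skip-bad-hosts --timeout=5 df_h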
According to the Fabric documentation on warn_only,
env.warn_only "specifies whether or not to warn, instead of abort, when run/sudo/local encounter error conditions.
This will not help in the case of a server being down, since the failure occurs during the SSH attempt before executing run/sudo/local.
One solution would be to create a function to check if each server is up prior to executing your tasks. Below is the code that I used.
from __future__ import print_function
from fabric.api import run, sudo, local, env
import paramiko
import socket

host1 = '192.168.200.181'
offline_host2 = '192.168.200.199'
host3 = '192.168.200.183'

env.hosts = [host1, offline_host2, host3]

def df_h():
    if _is_host_up(env.host, int(env.port)) is True:
        run("df -h | grep sda1")

def _is_host_up(host, port):
    # Set the timeout
    original_timeout = socket.getdefaulttimeout()
    new_timeout = 3
    socket.setdefaulttimeout(new_timeout)
    host_status = False
    try:
        transport = paramiko.Transport((host, port))
        host_status = True
    except:
        print('***Warning*** Host {host} on port {port} is down.'.format(
            host=host, port=port)
        )
    socket.setdefaulttimeout(original_timeout)
    return host_status
You're not using it improperly. You can even just pass --warn-only on the command line; it's the documented method suggested by the development team.
Based on Matthew's answer, I came up with a decorator that accomplishes just that:
from __future__ import with_statement
from paramiko import Transport
from socket import getdefaulttimeout, setdefaulttimeout
from fabric.api import run, cd, env, roles
roledefs = {
    'greece': [
        'alpha',
        'beta'
    ],
    'arabia': [
        'kha',
        'saad'
    ]
}
env.roledefs = roledefs
def if_host_offline_ignore(fn):
    def wrapped():
        original_timeout = getdefaulttimeout()
        setdefaulttimeout(3)
        try:
            Transport((env.host, int(env.port)))
            host_up = True
        except:
            print "The following host appears to be offline: " + env.host
            host_up = False
        finally:
            # restore the global timeout even when the host check succeeds
            setdefaulttimeout(original_timeout)
        if host_up:
            return fn()
    return wrapped

@roles('greece')
@if_host_offline_ignore
def hello_greece():
    with cd("/tmp"):
        run("touch hello_greece")

@roles('arabia')
@if_host_offline_ignore
def hello_arabia():
    with cd("/tmp"):
        run("touch hello_arabia")
It is especially useful when you have multiple hosts and roles.
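One caveat with the decorator as written: wrapped() takes no arguments and hides the task's name and docstring from fab --list. If your tasks take arguments, a sketch of the same decorator with functools.wraps and argument passthrough:

from functools import wraps
from socket import getdefaulttimeout, setdefaulttimeout
from paramiko import Transport
from fabric.api import env

def if_host_offline_ignore(fn):
    @wraps(fn)  # preserve the task's __name__ and docstring
    def wrapped(*args, **kwargs):
        original_timeout = getdefaulttimeout()
        setdefaulttimeout(3)
        try:
            Transport((env.host, int(env.port)))
            host_up = True
        except:
            print "The following host appears to be offline: " + env.host
            host_up = False
        finally:
            setdefaulttimeout(original_timeout)
        if host_up:
            return fn(*args, **kwargs)  # forward any task arguments
    return wrapped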