In my fabfile, I have set env.use_ssh_config to True. Whenever I run the fabfile, it gets the correct hostname and user from the ssh config, but not the correct key. It goes through my keys (all stored in ~/.ssh/) at random, requiring me to enter the passphrase for all of them, until it gets to the correct key.
It's only fabric that gives me this problem. Running scp as a local command in the fabfile uses the correct key.
Entries in my ssh config look like this:

Host example
    HostName example.com
    User elssar
    IdentityFile ~/.ssh/id_example
    PreferredAuthentications publickey
I'm using Fabric 1.10.1, Paramiko 1.14.1, Python 2.7.3, and Ubuntu 12.04.
Edit - There is a related open issue in the fabric repository - https://github.com/fabric/fabric/issues/1282
Edit - basic structure of my fabfile, and how I run it
from fabric.api import env, run

def do_something():
    run("echo test")

def server(host):
    env.hosts = [host]

# command
fab server:hostname do_something
I tried to check on my setup; here is what I did to debug:
>>> from fabric.network import key_filenames
>>> key_filenames()
[]
>>> from fabric.state import env
>>> env.use_ssh_config = True
>>> env.host_string = 'salt-states'
>>> key_filenames()
['/Users/damien/.ssh/salt.rsa.priv']
Update: you could instrument your task in your fabfile:
from fabric.api import env, run
from fabric.network import key_filenames

def do_something_debug():
    env.use_ssh_config = True
    print key_filenames()
    run("echo test")

def server(host):
    env.hosts = [host]
then run the command
fab server:hostname do_something_debug
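If key_filenames() prints an empty list for your host, one fallback worth trying (my own suggestion, not part of the answer above) is to point Fabric at the key explicitly instead of relying on the ssh config lookup:

def server(host):
    env.hosts = [host]
    # assumed path, copied from the ssh config entry in the question
    env.key_filename = '~/.ssh/id_example'

This sidesteps the ssh-config parsing entirely, so it only confirms the key itself works; the lookup problem tracked in the linked fabric issue would still be there.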
Related
I'm trying to copy a file from my local machine to multiple remote hosts using Fabric's put command. When I run it, it doesn't complain about anything, but it does not copy the file.
Secondly, my remote server already has the file; could that be an issue here? Below is the code.
import sys
from fabric.api import env
from fabric.operations import run, put

env.skip_bad_hosts = True
env.command_timeout = 160
env.user = 'jaggle'
env.shell = "/bin/sh -c"
env.warn_only = True
env.password = '!#heyNaSaFreak'
use_sudo = True

def readhost():
    env.hosts = [line.strip() for line in sys.stdin.readlines()]

def copyLDAP():
    put("/Karn/ldap.conf", "/etc/ldap.conf", use_sudo=True)
Below is the output of the run:
$ echo "tt-server01" | fab readhost -f OpenDgCopy.py copyLDAP
[tt-server0] Executing task 'copyLDAP'
[tt-server0] put: /Karn/ldap.conf -> /etc/ldap.conf
Done.
Disconnecting from tt-server0... done.
Instead of the readhost task, you can directly use the -H option with a comma-separated list of hosts:
fab -f OpenDgCopy.py copyLDAP -H tt-server0,tt-server1
According to the documentation of put:
As with the OpenSSH sftp program, put will overwrite
pre-existing remote files without requesting confirmation.
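To confirm the upload actually lands, one thing you could add (my suggestion, not something from the put documentation) is a quick check right after the transfer; this assumes md5sum is available on the remote host:

def copyLDAP():
    put("/Karn/ldap.conf", "/etc/ldap.conf", use_sudo=True)
    # compare this against the md5sum of the local /Karn/ldap.conf to confirm the copy happened
    run("md5sum /etc/ldap.conf")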
I need to ssh to a remote Ubuntu server to do some routine jobs, in the following steps:
ssh in as userA
sudo su - userB
run the daliy_python.py script, which uses psycopg2 to read some info from the database (via a local, non-TCP/IP connection)
scp the readings to my local machine
The question is: How to do that automatically?
I've tried to use Fabric, but I ran into a problem with psycopg2: after I run the Fabric script below, I receive this error from my daliy_python.py:
psycopg2.OperationalError: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/xxx/.s.xxxx"?
My fabfile.py code is as below:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False

def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with cd('/home/%s/script/script_folder' % user):
            sudo('whoami')
            sudo('pwd', user=user)
            sudo('ls', user=user)
            sudo('python daliy_python.py', user=user)
Any suggestions? My database can only be accessed by userB locally, but only userA can ssh to the server; that might be a limitation. Both the local and remote machines are running Ubuntu 14.04.
This is what I do to read my root-accessible logfiles without an extra login:
ssh usera@12.34.56.78 "echo hunter2 | sudo -S tail -f /var/log/nginx/access.log"
That is: ssh usera@12.34.56.78 "..run this code on the remote.."
Then on the remote, you pipe the sudo password into sudo -S: echo hunter2 | sudo -S ...
Add -u userb to sudo to switch to a particular user; I am using root in my case. Then, as the sudo'ed user, run your script (in my case tail -f /var/log/nginx/access.log).
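Put together for your setup, it would look roughly like this (a sketch reusing the names from your post; echoing the sudo password on the command line is convenient but not great for anything sensitive):

ssh userA@server.hostname.com "echo yourSudoPassword | sudo -S -u userB python /home/userB/script/script_folder/daliy_python.py"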
But, reading your post, I would probably simply set up a cronjob on the remote, so it runs automatically. I actually do that for all my databases. A cronjob dumps them once a day to a certain directory, with the date as filename. Then I download them to my local PC with rsync an hour later.
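A minimal sketch of that cron setup, with placeholder paths and a hypothetical pg_dump call (adjust to whatever your script actually produces):

# remote, in userB's crontab: dump once a day at 02:00 (% must be escaped in crontab)
0 2 * * * pg_dump mydb > /home/userB/dumps/$(date +\%F).sql
# local crontab, an hour later: pull the dumps down
0 3 * * * rsync -az userA@server.hostname.com:/home/userB/dumps/ ~/backups/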
I finally found out where my problem is.
Thanks @chishake and @C14L, I looked at the problem in another way.
Inspired by these posts (link1, link2), I started to think this problem is related to environment variables.
So I added a with statement to alter $HOME and it worked.
fabfile.py is as below:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False

def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with shell_env(HOME='/home/%s' % user):
            sudo('echo $HOME', user=user)
            with cd('/home/%s/script/script_folder' % user):
                sudo('whoami')
                sudo('pwd', user=user)
                sudo('ls', user=user)
                sudo('python daliy_python.py', user=user)
Fabric has a hosts setting to specify which computers to SSH into.
Amazon Web Services has more of a dynamic inventory that can be queried in python using tools like boto.
Is there a way to combine these two services? Ideally, I wanted something as simple as ansible's approach with an inventory file and using an external file like ec2.py.
More specifically, is there a prebaked solution for this use case? Ideally, I would like to run something straightforward like this:
from fabric.api import env, run, task
import ec2

env.roledefs = ec2.Inventory()

@task
def command():
    run("lsb_release -a")
And run it like so, assuming env.roledefs['nginx'] exists:
$ fab -R nginx command
You can use fabric and boto concurrently.
First you need to export AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_EC2_REGION in your console environment. The Fabric file should be named fabfile.py, not ec2.py or anything else.
import boto, urllib2
from boto.ec2 import connect_to_region
from fabric.api import env, run, cd, settings, sudo, task
from fabric.api import parallel
import os
import sys

REGION = os.environ.get("AWS_EC2_REGION")
env.user = "ec2-user"
env.key_filename = ["/home/user/uswest.pem"]

@task
def command():
    run("lsb_release -a")

def _create_connection(region):
    print "Connecting to ", region
    conn = connect_to_region(
        region_name=region,
        aws_access_key_id=os.environ.get("AWS_ACCESS_KEY_ID"),
        aws_secret_access_key=os.environ.get("AWS_SECRET_ACCESS_KEY")
    )
    print "Connection with AWS established"
    return conn
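To actually feed those instances into Fabric the way the question asks, a minimal sketch (my addition; the "fab_role" tag name and the use of public DNS names are assumptions) could look like this:

def load_roledefs():
    conn = _create_connection(REGION)
    # group running instances by a hypothetical "fab_role" tag
    roledefs = {}
    for instance in conn.get_only_instances(filters={"instance-state-name": "running"}):
        role = instance.tags.get("fab_role")
        if role:
            roledefs.setdefault(role, []).append(instance.public_dns_name)
    return roledefs

env.roledefs = load_roledefs()

With that in place, fab -R nginx command would target every running instance tagged fab_role=nginx.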
Finally, this program can be executed using the command below.
$ fab command
From http://docs.python-guide.org/en/latest/scenarios/admin/
You can see that if you set
env.hosts = ['my_server1', 'my_server2']
You'll then be able to target those hosts.
With boto, if you just have a function that does
ec2_connection.get_only_instances(filters={'tag': <whatever>})
and returns a list of their dns names, you'll be able to then set
env.hosts = [<list of dns names from ec2>]
Piece of cake!
I would like to deploy an application using Fabric to a server behind a proxy. Normally we ssh to the proxy server and then ssh to the production server; however, Fabric doesn't seem to allow for this directly.
An example of the setup would be local --> server A (Proxy) --> Server B (App server)
The destination is server B.
I have tried using the fab file below to test.
import os.path
from fabric.api import env, run, sudo, cd, local, put, settings
from fabric.contrib.files import sed, exists
from datetime import datetime

def proxy():
    env.user = "root"
    env.hosts = ['proxy']
    env.key_filename = "/home/root/monitorserver.pem"

def production():
    """Defines production environment."""
    env.is_debuggable = False
    env.user = "root"
    env.hosts = ['appserver']
    env.key_filename = "/home/root/appserver.pem"

def createfile():
    """Execute test commands."""
    sudo("touch /tmp/test_%s" % datetime.now().strftime('%H:%M:%S'))
but running the commands
fab proxy createfile production createfile
only seems to work as
fab proxy createfile
fab production createfile
Is there a way I can run fabric locally and deploy to server B with the proxy in place?
I think this can be done by creating two fabfiles: one on the local machine, and one on the proxy server.
from fabric.api import env, run, sudo, cd
from datetime import datetime

def proxy():
    env.user = "root"
    env.hosts = ['proxy']
    env.key_filename = "/home/root/monitorserver.pem"
    with cd('/home/root/'):
        createfile()
        run("fab production")

def production():
    """Defines production environment."""
    env.is_debuggable = False
    env.user = "root"
    env.hosts = ['appserver']
    env.key_filename = "/home/root/appserver.pem"
    createfile()

def createfile():
    """Execute test commands."""
    sudo("touch /tmp/test_%s" % datetime.now().strftime('%H:%M:%S'))
Run fab proxy.
(Haven't tested the code, but something like this should work.)
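As an alternative (my suggestion, not part of the answer above), Fabric 1.5+ has built-in gateway support, so a single fabfile run from your local machine can hop through server A; a minimal sketch reusing the names from the question:

from fabric.api import env, sudo
from datetime import datetime

def production():
    """Production environment, reached through the proxy."""
    env.user = "root"
    env.gateway = "root@proxy"   # SSH hop via server A
    env.hosts = ['appserver']    # final destination, server B
    env.key_filename = "/home/root/appserver.pem"

def createfile():
    sudo("touch /tmp/test_%s" % datetime.now().strftime('%H:%M:%S'))

Run it with fab production createfile.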
When I run my Python code, it asks for a host:
No hosts found. Please specify (single) host string for connection:
I have the following code:
from fabric.api import *
from fabric.contrib.console import confirm

env.hosts = ['ipaddress']

def remoteRun():
    print "ENV %s" % (env.hosts)
    out = run('uname -r')
    print "Output %s" % (out)

remoteRun();
I even tried running fab with the -H option and I get the same message. I'm using Ubuntu 10.10; any help is appreciated. Btw, I am a newbie in Python.
In order to get hosts to work in a script outside of the fab command-line tool and fabfile.py, you'll have to use execute():
from fabric.api import run
from fabric.tasks import execute

def mytask():
    run('uname -a')

results = execute(mytask)
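For this to actually reach a machine you still need hosts defined somewhere; execute() also accepts them directly, e.g. (with a placeholder address):

results = execute(mytask, hosts=['ipaddress'])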
If it's only one host, you can use env.host_string = 'somehost or ipaddress'.
You also don’t need the ; at the end of your remoteRun.
from __future__ import with_statement
from fabric.api import *
from fabric.contrib.console import confirm
from fabric.api import env, run

env.host_string = 'ipaddress'

def remoteRun():
    print "ENV %s" % (env.hosts)
    out = run('uname -r')
    print "Output %s" % (out)

remoteRun()
I am not exactly sure what remoteRun(); is supposed to do in your example.
Is it part of your fabfile or is this your terminal command to invoke the script?
The correct way would be a command like this in your shell:
fab remoteRun
Generally it's better to specify the concrete hosts your command is supposed to run on like this:
def localhost():
    env.hosts = ['127.0.0.1']

def remoteRun():
    print "ENV %s" % (env.hosts)
    out = run('uname -r')
    print "Output %s" % (out)
You can run it like this from a terminal (assuming you are in the directory that contains your fabfile):
fab localhost remoteRun
As an alternative you could specify the host with the -H parameter:
fab -H 127.0.0.1 remoteRun
If you have a list of hosts you want to invoke the command for, do it like this:
http://readthedocs.org/docs/fabric/latest/usage/execution.html
Adjusted to your example:
env.hosts = ['localhost', '127.0.0.1']

def remoteRun():
    print "ENV %s" % (env.hosts)
    out = run('uname -r')
    print "Output %s" % (out)
And called via: fab remoteRun
This way the remoteRun is performed on all hosts in env.hosts.
@Nerdatastic is right: to keep it simple, don't use env.hosts, use env.host_string instead, e.g.

def setup_db_server():
    env.host_string = 'db01.yoursite.com'  # or the ip address
    run("mysqladmin ...")
and running $ fab setup_db_server will execute the script on the target server.
Nerdatastic is right, you need to specify the env.host_string variable for Fabric to know what host string to use. I came across this problem trying to use a subclass of Task and call the run() method. It seemed to ignore env.hosts except when using execute from fabric.tasks in version 1.3.
I have the same issue.
I think this is a bug, because everything worked before today.
I store my env settings in .fabricrc.
Now I get the same message as yours. I don't know why.