Fabric to copy a file from the local host to multiple remote hosts - python

I'm trying to copy a file from my local system to multiple remote hosts using Fabric's put command. When I run it, it doesn't complain about anything, but it does not copy the file.
Secondly, my remote server already has the file; could that be an issue here? Below is the code.
import sys
from fabric.api import env
from fabric.operations import run, put

env.skip_bad_hosts = True
env.command_timeout = 160
env.user = 'jaggle'
env.shell = "/bin/sh -c"
env.warn_only = True
env.password = '!#heyNaSaFreak'
use_sudo = True

def readhost():
    env.hosts = [line.strip() for line in sys.stdin.readlines()]

def copyLDAP():
    put("/Karn/ldap.conf", "/etc/ldap.conf", use_sudo=True)
Below is the output of the run:
$ echo "tt-server01" | fab readhost -f OpenDgCopy.py copyLDAP
[tt-server0] Executing task 'copyLDAP'
[tt-server0] put: /Karn/ldap.conf -> /etc/ldap.conf
Done.
Disconnecting from tt-server0... done.

Instead of the readhost task, you can directly use the -H option with a comma-separated list of hosts:
fab -f OpenDgCopy.py copyLDAP -H tt-server0,tt-server1
According to the documentation of put:
As with the OpenSSH sftp program, put will overwrite
pre-existing remote files without requesting confirmation.

Related

How to make changes/edit on a file present on remote server using python fabric?

I have a .yml file on a remote server, and I want to make changes to it using Python Fabric. If it can be done with other Python libraries, feel free to share.
Thank you
You are trying to edit a line in the middle of a file, which IMO is not possible directly.
What you can do is make a copy of the remote file on your local machine with the desired values changed, and then send it back to the remote server.
from fabric import Connection as connection, task

@task
def executeTask(ctx):
    with connection(host=dev_server, user=myuser) as c:
        c.put('PATH_TO_YOUR_YML_FILE_LOCALLY', 'PATH_TO_YOUR_REMOTE_YML_FILE')
Don't forget to:
Replace dev_server and myuser with the remote server's IP and your username on it.
Put the code above in a file called fabfile.py, then run fab executeTask from your command line.
The code above is fabric 2.4 compatible
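Since the edit itself happens locally before c.put, here is a minimal stdlib sketch of that local step (the .yml content, the port values, and the temp-file handling are all hypothetical; real YAML edits would go through a YAML parser rather than string replacement):

```python
import tempfile

# Hypothetical stand-in for the .yml content you keep (or downloaded) locally.
content = "port: 8080\nhost: localhost\n"

# Plain-text substitution of the one value you want to change.
updated = content.replace("port: 8080", "port: 9090")

# Write the edited copy to a temp file; c.put(local_path, REMOTE_PATH)
# would then upload it to the server.
with tempfile.NamedTemporaryFile("w", suffix=".yml", delete=False) as tmp:
    tmp.write(updated)
    local_path = tmp.name

print(updated)
```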
EDIT:
Because of permission issues you can do the following:
@task
def executeTask(ctx):
    with connection(host=dev_server, user=myuser) as c:
        c.put("PATH_TO_YOUR_YML_FILE_LOCALLY")  # implicitly uploads to the remote $HOME
        c.sudo("mv YOUR_FILE_NAME YOUR_DESIRED_LOCATION")  # again implicitly with a CWD of $HOME
        c.sudo("chown root:root YOUR_REMOTE_FILE")
Reference:
https://github.com/fabric/fabric/issues/1750#issuecomment-406043571
If you just need to change a port number, you can use sed like this:
from fabric.api import cd, run

def change_port(filename):
    with cd('/location'):
        run('sed -i "s/old_port_number/new_port_number/g" ' + filename)
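For reference, the substitution that sed performs can be sketched locally with the stdlib (the config text and port numbers here are hypothetical):

```python
import re

# Stand-in config text; `sed -i` does this same substitution
# in place on the remote file.
config = "server_port = 8080\nbind_address = 0.0.0.0\n"
updated = re.sub(r"8080", "9090", config)
print(updated)  # server_port = 9090 ...
```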

python fabric, Iterate through IP list to update servers

Hope you can help. I'm totally new to Fabric and know a little Python. I'm trying to iterate through an external file of IPs to update 40-odd remote servers.
This isn't working; it stops after the first IP.
Terminal command:
fab -p Password hosts update
from fabric.api import env, run, sudo

def hosts():
    env.hosts = open('sat_ip_list', 'r').readlines()

def update():
    sudo('apt-get update -y')
I tried the following list of IPs + Fabric script and had no problems running fab -p <password> hosts uname:
# ip_list.txt
192.168.xxx.x
127.0.0.1:xxxx
174.xxx.xxx.xxx:xxxx

# fabfile.py
from fabric.api import env, run, sudo

def hosts():
    # Read ip list from ip_list.txt
    env.hosts = open('ip_list.txt', 'r').readlines()

def uname():
    sudo('uname -a')
What does your sat_ip_list file look like? Is it one IP address per line?
Have you tried your script with just a very small number of hosts, like 2-3 IP addresses? There's definitely no reason you shouldn't be able to do what you're trying to accomplish; your script basically works for me just as it is.
As a sanity check, you might want to print out the value of env.hosts, like so:
def hosts():
    env.hosts = open('sat_ip_list', 'r').readlines()
    print('Hosts:', env.hosts)
In my case, that results in the following output:
me@machine:~$ fab hosts
('Hosts:', ['192.168.xxx.x\n', '127.0.0.1:xxxx\n', '174.xxx.xxx.xxx:xxxx\n'])
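Note that readlines() keeps the trailing newline on every host string, as the output above shows. If that ever causes trouble, a small sketch that strips whitespace and drops blank lines before assigning env.hosts (the sample addresses are hypothetical):

```python
# Stand-in for open('sat_ip_list').readlines(); note the trailing newlines.
raw_lines = ["192.168.1.10\n", "127.0.0.1:2222\n", "\n", "10.0.0.5\n"]

# env.hosts would then get clean host strings, one per non-blank line.
hosts = [line.strip() for line in raw_lines if line.strip()]
print(hosts)  # ['192.168.1.10', '127.0.0.1:2222', '10.0.0.5']
```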

Python Fabric perform remote database task

I need to ssh to a remote Ubuntu server to do some routine jobs, in the following steps:
ssh in as userA
sudo su - userB
run daliy_python.py, a script which uses psycopg2 to read some info from the database (via a local, non-TCP/IP connection)
scp the readings to my local machine
The question is: how can I do that automatically?
I've tried to use Fabric, but I ran into a problem with psycopg2. After running the Fabric script below, I received this error from daliy_python.py:
psycopg2.OperationalError: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/xxx/.s.xxxx"?
My fabfile.py code is below:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False

def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with cd('/home/%s/script/script_folder' % user):
            sudo('whoami')
            sudo('pwd', user=user)
            sudo('ls', user=user)
            sudo('python daliy_python.py', user=user)
Any suggestions? My database can only be accessed by userB locally, but only userA can ssh to the server. That might be a limitation. Both the local and remote machines are running Ubuntu 14.04.
This is what I do to read my root-accessible logfiles without an extra login:
ssh usera@12.34.56.78 "echo hunter2 | sudo -S tail -f /var/log/nginx/access.log"
That is: ssh usera@12.34.56.78 "..run this code on the remote.."
Then on the remote, you pipe the sudo password into sudo -S: echo hunter2 | sudo -S ...
Add -u userb to sudo to switch to a particular user; I am using root in my case. Then, as the sudo'ed user, run your script. In my case that's tail -f /var/log/nginx/access.log.
But, reading your post, I would probably simply set up a cronjob on the remote so it runs automatically. I actually do that for all my databases: a cronjob dumps them once a day to a certain directory, with the date as the filename. Then I download them to my local PC with rsync an hour later.
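The dated-filename part of that cronjob idea can be sketched in Python (the dbdump prefix and .sql suffix are hypothetical; the dump command itself would depend on your database):

```python
from datetime import date

# Build a dated dump filename such as dbdump-2024-01-31.sql,
# mirroring what a once-a-day cronjob dump script would produce.
dump_name = "dbdump-%s.sql" % date.today().isoformat()
print(dump_name)
```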
I finally found out where my problem was.
Thanks @chishake and @C14L, I looked at the problem in another way.
Inspired by these posts link1 link2, I started to think the problem was related to environment variables.
Thus I added a with statement to alter $HOME, and it worked.
fabfile.py is as below:
from fabric.api import *
import os
import socket
import pwd

# Target machine setting
srv = 'server.hostname.com'
env.hosts = [srv]
env.user = 'userA'
env.key_filename = '/location/to/my/key'
env.timeout = 2
# Force fabric abort at timeout
env.skip_bad_hosts = False

def run_remote():
    user = 'userB'
    with settings(warn_only=True):
        run('whoami')
        with shell_env(HOME='/home/%s' % user):
            sudo('echo $HOME', user=user)
            with cd('/home/%s/script/script_folder' % user):
                sudo('whoami')
                sudo('pwd', user=user)
                sudo('ls', user=user)
                sudo('python daliy_python.py', user=user)

Fabric not using the correct key

In my fabfile, I have set env.use_ssh_config to True. Whenever I run the fabfile, it gets the correct hostname and user from the ssh config, but not the correct key. It goes through my keys (all stored in ~/.ssh/) at random, requiring me to enter the passphrase for all of them, until it gets to the correct key.
It's only Fabric that gives me this problem. Running scp as a local command in the fabfile uses the correct key.
Entries in my ssh config look like this:
Host example
    HostName example.com
    User elssar
    IdentityFile ~/.ssh/id_example
    PreferredAuthentications publickey
I'm using Fabric 1.10.1 and Paramiko 1.14.1, Python 2.7.3 and Ubuntu 12.04.
Edit - There is a related open issue in the fabric repository - https://github.com/fabric/fabric/issues/1282
Edit - the basic structure of my fabfile, and how I run it:
from fabric.api import env, run

def do_something():
    run("echo test")

def server(host):
    env.hosts = [host]

# command
fab server:hostname do_something
I tried to check on my setup; here is what I did to debug:
>>> from fabric.network import key_filenames
>>> key_filenames()
[]
>>> from fabric.state import env
>>> env.use_ssh_config = True
>>> env.host_string = 'salt-states'
>>> key_filenames()
['/Users/damien/.ssh/salt.rsa.priv']
Update: you could update your fabfile to instrument your task:
from fabric.api import env, run
from fabric.network import key_filenames

def do_something_debug():
    env.use_ssh_config = True
    print key_filenames()
    run("echo test")

def server(host):
    env.hosts = [host]
then run the command:
fab server:hostname do_something_debug
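To see which key your config actually maps to a host alias, the lookup Paramiko performs when use_ssh_config is on can be sketched with the stdlib. This toy parser handles only the simple single-alias, single-IdentityFile case (no wildcards or multiple Host patterns); the config text is the example entry from the question:

```python
# Example ssh config entry, as in the question.
config_text = """Host example
    HostName example.com
    User elssar
    IdentityFile ~/.ssh/id_example
    PreferredAuthentications publickey
"""

def identity_file(alias, text):
    """Return the IdentityFile listed under the matching Host block, if any."""
    current = None
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "Host":
            current = parts[1]
        elif parts[0] == "IdentityFile" and current == alias:
            return parts[1]
    return None

print(identity_file("example", config_text))  # ~/.ssh/id_example
```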

how to edit hostname file using fabric

I have changed my hosts file; now how do I change the hostname? My system is Ubuntu.
e.g. my hosts file:
192.168.0.100 host1.mydomain.com
192.168.0.101 host2.mydomain.com
I want the hostname file under /etc/hostname on host1 to contain host1.mydomain.com, and the hostname file on host2 to contain host2.mydomain.com.
How do I do that using Fabric? I would otherwise have to ssh into every host and edit the hostname file; can Fabric do this?
I didn't mean using the hostname command, but editing the /etc/hostname file.
I mean, how do I use Fabric to do that? Such as:
such as:
def update_hostname():
    get("/etc/hosts", "hosts")
    hosts_content = file("hosts")
    # hostname = get the hostname corresponding to the ip
    get("/etc/hostname", "hostname")
    # update the hostname file
    put("hostname", "/etc/hostname")
How do I get the IP? Because Fabric does the job on every host, and the hostname corresponds to each host, I need to know which host the job is working on, get its IP, look up the hostname corresponding to that IP, and finally update the hostname file.
Fabric is just an SSH wrapper, so what you're looking at is Linux specific, not Fabric or Python specific.
from fabric.api import run
run('hostname your-new-name')
run('echo your-new-hostname > /etc/hostname')
And just do a run(..edit..) according to your Linux dist?
Or just do:
from subprocess import Popen, PIPE

hosts = open('/etc/networking/hosts', 'rb')
for hostline in hosts.readlines():
    ip, name = hostline.split(' ')
    command = ['ssh', '-t', 'root@' + ip.strip('\r\n ,;'),
               'echo ' + name.strip('\r\n ,;') + ' > /etc/hostname']
    stdout, stderr = Popen(command, stdout=PIPE, stderr=PIPE).communicate()
hosts.close()
Note: /etc/networking/hosts might be placed somewhere else for you.
The important part here is that you loop through the /hosts file, and ssh to each machine echoing the given hostname to that machine.
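The per-host lookup the question asks about (IP to hostname) can be sketched as a small dict built from the hosts file. In Fabric 1 the current target is available as env.host_string, which you could then use as the lookup key; the sample entries below are the ones from the question:

```python
# Sample /etc/hosts-style entries from the question.
hosts_text = """192.168.0.100 host1.mydomain.com
192.168.0.101 host2.mydomain.com"""

# Map each IP to its hostname; skip malformed lines.
ip_to_name = {}
for line in hosts_text.splitlines():
    parts = line.split()
    if len(parts) >= 2:
        ip_to_name[parts[0]] = parts[1]

print(ip_to_name["192.168.0.100"])  # host1.mydomain.com
```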
from fabric.api import prompt, sudo
from fabric.contrib.files import sed

def hostname():
    '''
    Change the hostname of the Ubuntu server.
    '''
    server_hostname = prompt("The hostname for the server is: ")
    sed("/etc/hostname", before='current hostname', after='%s' % server_hostname, use_sudo=True, backup='')
    sudo("init 6")
This will change the hostname according to your choice.
In your fabric script you'll need to ssh into the machine as a user permitted to edit the hosts file (via permissions or groups). If you need to sudo into a user, search Stack Overflow for issues regarding sudo and Fabric; you'll need to tweak your fabfile to not prompt for a password.
Fabric can have an awkward way of dealing with reading/writing/opening files. You may be best off cd'ing into the right directory, something like:
with cd('/etc/'):
    run('echo new_hostname > hostname')
