Python script for a Linux ssh command

My Linux shell script below SSHes to each host and searches for patch updates that include a kernel update:
for host in `cat patch.csv`
do
echo "Host $host" >> /tmp/patching
ssh -o "UserKnownHostsFile=/dev/null" -o "StrictHostKeyChecking=no" $host 'sudo yum check-update | grep "kernel.x86"'>>/tmp/patching
done
Now I am trying to write an equivalent Python script, and it is showing an error (not able to connect over ssh). I tried to use the subprocess module, which is not working: it is not able to pick up the hostname, and I get a public key error.
import subprocess

def read_file():
    # Read the host list line by line
    with open('patch.csv', 'r') as reader:
        with open('server_names.txt', 'w') as writer:
            for host in reader:
                writer.write(host)
                p = subprocess.Popen(["ssh -o 'UserKnownHostsFile=/dev/null' -o 'StrictHostKeyChecking=no' host 'sudo yum check-update | grep kernel.x86'"], shell=True, stdout=subprocess.PIPE)
                output, err = p.communicate()
                print(host)
    print("file read done")

read_file()

You can use paramiko to SSH to another host from Python; we use it on an internal project and it works very well.
http://www.paramiko.org/
$ pip install paramiko
You can pass a password or passphrase as input if the host requires one, so authentication can be automated.
http://docs.paramiko.org/en/stable/api/client.html#paramiko.client.SSHClient.connect
connect(hostname, port=22, username=None, password=None, pkey=None, key_filename=None, timeout=None, allow_agent=True, look_for_keys=True, compress=False, sock=None, gss_auth=False, gss_kex=False, gss_deleg_creds=True, gss_host=None, banner_timeout=None, auth_timeout=None, gss_trust_dns=True, passphrase=None, disabled_algorithms=None)
Connect to an SSH server and authenticate to it. The server’s host key is checked against the system host keys (see load_system_host_keys) and any local host keys (load_host_keys). If the server’s hostname is not found in either set of host keys, the missing host key policy is used (see set_missing_host_key_policy). The default policy is to reject the key and raise an SSHException.
Code sample adapted from https://gist.github.com/mlafeldt/841944
import paramiko

hostname = 'host.example.com'   # placeholder: the host to patch
password = 'pass123'
command = 'sudo yum check-update | grep kernel.x86'
username = "admin"
port = 22

try:
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.WarningPolicy())
    client.connect(hostname, port=port, username=username, password=password)
    stdin, stdout, stderr = client.exec_command(command)
    print(stdout.read())
finally:
    client.close()

I got some ideas from the code @lucasgvarela posted, and with help from some other Stack Overflow posts came up with the following, which actually worked:
import subprocess
import sys

# Ports are handled in ~/.ssh/config since we use OpenSSH
COMMAND = "uname -a"
OPTION1 = "UserKnownHostsFile=/dev/null"
OPTION2 = "StrictHostKeyChecking=no"

with open('patch.csv', 'r') as reader:
    with open('server_names.txt', 'w') as writer:
        for HOST in reader:
            print(HOST)
            HOST = HOST.strip()
            ssh = subprocess.Popen(["ssh", "-o", OPTION1, "-o", OPTION2, HOST, COMMAND],
                                   shell=False,
                                   stdout=subprocess.PIPE,
                                   stderr=subprocess.PIPE)
            result = ssh.stdout.readlines()
            if result == []:
                error = ssh.stderr.readlines()
                writer.write(f'The failed host is {HOST} & the error is {error}')
                print(error)
            else:
                print(result)
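The argv construction above can be factored into a small helper so the ssh options live in one place. This is just a sketch; the option strings mirror the ones used in the answer:

```python
def build_ssh_argv(host, command,
                   options=("UserKnownHostsFile=/dev/null",
                            "StrictHostKeyChecking=no")):
    # Build the list form of the ssh command for subprocess.Popen(shell=False);
    # the list form avoids shell-quoting problems with the remote command.
    argv = ["ssh"]
    for opt in options:
        argv += ["-o", opt]
    argv += [host, command]
    return argv
```

With shell=False each list element is passed to ssh verbatim, so the remote command can contain pipes and quotes without extra escaping on the local side.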

You can use "sshscript" package. For the above shell script, it might be able to convert to python script of sshscript-syntax like this:
# filename: test.spy
with open('patch.csv') as fd:
    for host in fd:
        host = host.strip()
        account = f'user@{host}'
        $$echo "Host @{host}" >> /tmp/patching
        $$ssh -o "UserKnownHostsFile=/dev/null" -o "StrictHostKeyChecking=no" @{account} 'sudo yum check-update | grep "kernel.x86"'>>/tmp/patching
Execute the script from the console:
$ sshscript test.spy
Documentation for the sshscript package:
https://iapyeh.github.io/sshscript/index


How is starting a service with paramiko on Linux different from executing other commands such as pkill?

Following this post (Running Sudo Command with paramiko) I was able to run commands as sudo remotely.
I can execute sudo pkill -2 pure-ftpd successfully, but when I try to execute sudo service pure-ftpd start I can't see any effect on the server although I see that the output in stdout and stderr is correct.
Here is my code:
class RemoteCmdSender(object):

    def __init__(self, host, usr=None, passwd=None):
        self.host = host
        self.usr = usr
        self.passwd = str(passwd)

    def send_cmd_as_bash(self, cmd):
        client = SSHClient()
        client.set_missing_host_key_policy(AutoAddPolicy())
        client.connect(hostname=self.host, username=self.usr,
                       password=self.passwd)
        transport = client.get_transport()
        session = transport.open_session()
        session.get_pty('bash')
        session.exec_command(cmd)
        stdin = session.makefile('wb', -1)
        stdin.write(self.passwd.strip() + '\n')
        stdin.flush()
        stdout = session.makefile('rb', -1).read()
        stderr = session.makefile_stderr('rb', -1).read()
        client.close()
        return stdout, stderr
and the execution:
print cmd_sender.send_cmd_as_bash("sudo service pure-ftpd")
output:
Starting ftp server: Running: /usr/sbin/pure-ftpd -l pam -l puredb:/etc/pure-ftpd/pureftpd.pdb -E -O clf:/var/log/pure-ftpd/transfer.log -8 UTF-8 -u 1000 -B\r\n
Which is consistent with the output that I get if I log to the server using ssh and write sudo service pure-ftpd start in the bash.
PS: I want to make clear that both commands works correctly when run from an ssh session using bash
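Not part of the question above, but a commonly used alternative to the pty-plus-stdin dance is sudo's -S flag, which makes sudo read the password from stdin. A sketch of the command wrapping:

```python
def sudo_stdin_cmd(cmd):
    # -S makes sudo read the password from stdin; -p '' suppresses the
    # password prompt so it doesn't get mixed into the captured output.
    return "sudo -S -p '' " + cmd
```

With exec_command you would then run the wrapped command and write the password (plus a newline) to the returned stdin stream; no pty is required.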

Pipe the output of a Hive query run on a remote machine to the local machine using Python

All,
I'm trying to run a Hive command (a query on a table) on a remote machine (xx.xxx.xxx.51) over SSH and then pipe the resulting CSV file to the local machine, using Python and pxssh (pexpect). I also have versions using subprocess and paramiko.
1) Code1 using pxssh
from pexpect import pxssh
import getpass

try:
    s = pxssh.pxssh()
    hostname, username, password = ('xx.xxx.xxx.51', 'username', 'passwd')
    s.login(hostname, username, password)
    cmd = "hive -e 'use dbname; set hive.resultset.use.unique.column.names=false ; select * from tablename limit 50' > ./myfile.csv"
    print(cmd)
    s.sendline(cmd)
    s.prompt()        # match the prompt
    print(s.before)   # print everything before the prompt
    s.logout()
except pxssh.ExceptionPxssh as e:
    print("pxssh failed on login.")
    print(e)
Error: it outputs myfile.csv on the remote server, not on the local machine. I would like the CSV file saved on my local machine, where the Python script is being run.
The actual command is:
hive -e 'use dbname; set hive.resultset.use.unique.column.names=false ; select * from tablename limit 50' | sed 's/[\t]/,/g' > ./myfile.csv
But the | sed 's/[\t]/,/g' part gives an error.
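One way to sidestep the sed quoting problem entirely (a sketch, not from the original post) is to capture the tab-separated Hive output as-is and do the tab-to-comma conversion locally in Python:

```python
def tabs_to_csv(text):
    # Local equivalent of sed 's/[\t]/,/g' applied to the captured output.
    return text.replace("\t", ",")
```

This keeps the remote command simple (no nested quoting inside the ssh/pxssh command string), at the cost of transferring tab-separated text instead of CSV.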
2) Solution using subprocess
import subprocess

command = "ssh username@xx.xxx.xxx.51 ls -tal "   # an example using ls -tal
print("submitting command", command)
result = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
result.stdin.write('passwd\n')
result.stdin.flush()
print("got response")
response, err = result.communicate()
print(response)
3) Solution using paramiko
import os
import paramiko

# Configuration for remote machine
server, username, password = ('xx.xxx.xxx.51', 'username', 'passwd')
ssh1 = paramiko.SSHClient()
ssh1.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# Load the user's local known-hosts file.
ssh1.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh1.connect(server, username=username, password=password)

# Configuration for local machine
server2, username2, password2 = ('yy.yyy.yyy.112', 'username', 'passwd2')
ssh2 = paramiko.SSHClient()
ssh2.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh2.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh2.connect(server2, username=username2, password=password2)
_, stdout2, _ = ssh2.exec_command('hostname')
hostname2 = stdout2.read()

command = " hive -e 'use dbname; set hive.resultset.use.unique.column.names=false ; \
select * from tablename limit 20' | ssh username@yy.yyy.yyy.112 'cat > myfile.csv' "
ssh_stdin, ssh_stdout, ssh_stderr = ssh1.exec_command(command)
print("output", ssh_stdout.read())   # read the output of the executed command
error = ssh_stderr.read()            # read the error stream of the executed command
print("err", error, len(error))
This gives the following error output:
output
err Permission denied, please try again.
Permission denied, please try again.
Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
How do I pipe the output of the Hive query run on the remote server to my local machine?

Execute a command on Remote Machine in Python

I am writing a program in Python on Ubuntu to execute the command ls -l on a Raspberry Pi connected over the network.
Can anybody guide me on how to do that?
Sure, there are several ways to do it!
Let's say you've got a Raspberry Pi on a raspberry.lan host and your username is irfan.
subprocess
It's the default Python library that runs commands.
You can make it run ssh and do whatever you need on a remote server.
scrat has it covered in his answer. You definitely should do this if you don't want to use any third-party libraries.
You can also automate the password/passphrase entering using pexpect.
paramiko
paramiko is a third-party library that adds SSH-protocol support, so it can work like an SSH-client.
The example code that connects to the server, executes ls -l, and grabs its output looks like this:
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('raspberry.lan', username='irfan', password='my_strong_password')

stdin, stdout, stderr = client.exec_command('ls -l')
for line in stdout:
    print(line.strip('\n'))
client.close()
fabric
You can also achieve it using fabric.
Fabric is a deployment tool which executes various commands on remote servers.
It's often used to run stuff on a remote server, so you could easily put your latest version of the web application, restart a web-server and whatnot with a single command. Actually, you can run the same command on multiple servers, which is awesome!
Though it was made as a deploying and remote management tool, you still can use it to execute basic commands.
# fabfile.py
from fabric.api import *

def list_files():
    with cd('/'):              # change the directory to '/'
        result = run('ls -l')  # run a 'ls -l' command
        # you can do something with the result here,
        # though it will still be displayed in fabric itself.
It's like typing cd / and ls -l in the remote server, so you'll get the list of directories in your root folder.
Then run in the shell:
fab list_files
It will prompt for a server address:
No hosts found. Please specify (single) host string for connection: irfan@raspberry.lan
A quick note: You can also assign a username and a host right in a fab command:
fab list_files -U irfan -H raspberry.lan
Or you could put a host into the env.hosts variable in your fabfile. Here's how to do it.
Then you'll be prompted for an SSH password:
[irfan@raspberry.lan] run: ls -l
[irfan@raspberry.lan] Login password for 'irfan':
And then the command will run successfully.
[irfan@raspberry.lan] out: total 84
[irfan@raspberry.lan] out: drwxr-xr-x 2 root root 4096 Feb 9 05:54 bin
[irfan@raspberry.lan] out: drwxr-xr-x 3 root root 4096 Dec 19 08:19 boot
...
Simple example from here:
import subprocess
import sys

HOST = "www.example.org"
# Ports are handled in ~/.ssh/config since we use OpenSSH
COMMAND = "uname -a"

ssh = subprocess.Popen(["ssh", HOST, COMMAND],
                       shell=False,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print("ERROR: %s" % error, file=sys.stderr)
else:
    print(result)
It does exactly what you want: connects over ssh, executes command, returns output. No third party library needed.
You may also use the method below with the Linux/Unix built-in ssh command.
import os
os.system('ssh username@ip bash < local_script.sh >> /local/path/output.txt 2>&1')
os.system('ssh username@ip python < local_program.py >> /local/path/output.txt 2>&1')
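If the paths or arguments might contain spaces or shell metacharacters, building that shell line with shlex.quote is safer. A sketch (the user, host, and path names are placeholders):

```python
import shlex

def remote_script_cmd(user, host, local_script, output_path):
    # Build the same ssh shell line as above, quoting each path so that
    # spaces or metacharacters in filenames don't break the command.
    target = shlex.quote(user + "@" + host)
    return ("ssh %s bash < %s >> %s 2>&1"
            % (target, shlex.quote(local_script), shlex.quote(output_path)))
```

For plain alphanumeric paths shlex.quote is a no-op, so the generated line matches the hand-written one exactly.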
The paramiko module can be used to run multiple commands by invoking a shell. Here I created a class that drives an interactive ssh shell:
import logging
import select
import time

import paramiko

logger = logging.getLogger(__name__)

class ShellHandler:

    def __init__(self, host, user, psw):
        logger.debug("Initialising instance of ShellHandler host:{0}".format(host))
        try:
            self.ssh = paramiko.SSHClient()
            self.ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            self.ssh.connect(host, username=user, password=psw, port=22)
            self.channel = self.ssh.invoke_shell()
        except Exception:
            logger.error("Error creating ssh connection to {0}".format(host))
            logger.error("Exiting ShellHandler")
            return
        self.psw = psw
        self.stdin = self.channel.makefile('wb')
        self.stdout = self.channel.makefile('r')
        self.host = host
        time.sleep(2)
        while not self.channel.recv_ready():
            time.sleep(2)
        self.initialprompt = ""
        while self.channel.recv_ready():
            rl, wl, xl = select.select([self.stdout.channel], [], [], 0.0)
            if len(rl) > 0:
                tmp = self.stdout.channel.recv(24)
                self.initialprompt = self.initialprompt + str(tmp.decode())

    def __del__(self):
        self.ssh.close()
        logger.info("closed connection to {0}".format(self.host))

    def execute(self, cmd):
        cmd = cmd.strip('\n')
        self.stdin.write(cmd + '\n')
        self.stdin.flush()
        time.sleep(1)
        while not self.stdout.channel.recv_ready():
            time.sleep(2)
            logger.debug("Waiting for recv_ready")
        output = ""
        while self.channel.recv_ready():
            rl, wl, xl = select.select([self.stdout.channel], [], [], 0.0)
            if len(rl) > 0:
                tmp = self.stdout.channel.recv(24)
                output = output + str(tmp.decode())
        return output
If creating a different shell each time does not matter to you, then you can use a method like the one below.
    def run_cmd(self, cmd):
        try:
            cmd = cmd + '\n'
            stdin, stdout, stderr = self.ssh.exec_command(cmd)
            while not stdout.channel.eof_received:
                time.sleep(3)
                logger.debug("Waiting for eof_received")
            out = ""
            while stdout.channel.recv_ready():
                err = stderr.read()
                if err:
                    print("Error: ", self.host, str(err))
                    return False
                out = out + stdout.read().decode()
            if out:
                return out
        except Exception:
            error = sys.exc_info()
            logger.error(error)
            return False
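Under Python 3, paramiko channel reads return bytes, so accumulating them onto a str is a common pitfall. A small helper (a sketch) keeps the decoding in one place:

```python
def decode_stream(chunks):
    # Join the byte chunks received from a channel and decode once at the
    # end, so multi-byte characters split across reads survive intact.
    return b"".join(chunks).decode("utf-8", errors="replace")
```

Decoding each 24-byte chunk separately can split a multi-byte UTF-8 character across two reads and corrupt it; joining first avoids that.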

Creating and logging into a Linux virtual machine automatically with Python

I currently have a working Python script that SSHes into a remote Linux machine and executes commands on that machine. I'm using paramiko to handle ssh connectivity. Here is the code in action, executing a hostname -s command:
blade = '192.168.1.15'
username = 'root'
password = ''

# now, connect
try:
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.WarningPolicy())
    print '*** Connecting...'
    client.connect(blade, 22, username, password)

    # print hostname for verification
    stdin, stdout, stderr = client.exec_command('hostname --short')
    print stdout.readlines()
except Exception, e:
    print '*** Caught exception: %s: %s' % (e.__class__, e)
    traceback.print_exc()
    try:
        client.close()
    except:
        pass
    sys.exit(1)
This works fine, but what I would actually like to do is more complicated: SSH into that same Linux machine, as above, but then create a temporary virtual machine on it and execute a command on that virtual machine. Here is my (nonworking) attempt:
blade = '192.168.1.15'
username = 'root'
password = ''

# now, connect
try:
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.WarningPolicy())
    print '*** Connecting...'
    client.connect(blade, 22, username, password)

    # create VM, log in, and print hostname for verification
    stdin, stdout, stderr = client.exec_command('sudo kvm -m 1024 -drive file=/var/lib/libvirt/images/oa4-vm$
    time.sleep(60)                # delay to allow VM to initialize
    stdin.write(username + '\n')  # log into VM
    stdin.write(password + '\n')  # log into VM
    stdin, stdout, stderr = client.exec_command('hostname --short')
    print stdout.readlines()
except Exception, e:
    print '*** Caught exception: %s: %s' % (e.__class__, e)
    traceback.print_exc()
    try:
        client.close()
    except:
        pass
    sys.exit(1)
When I run this, I get the following:
joe#computer:~$ python automata.py
*** Connecting...
/home/joe/.local/lib/python2.7/site-packages/paramiko/client.py:95: UserWarning: Unknown ssh-rsa host key for 192.168.1.15: 25f6a84613a635f6bcb5cceae2c2b435
(key.get_name(), hostname, hexlify(key.get_fingerprint())))
*** Caught exception: <class 'socket.error'>: Socket is closed
Traceback (most recent call last):
File "automata.py", line 32, in function1
stdin.write(username + '\n') #log into VM
File "/home/joe/.local/lib/python2.7/site-packages/paramiko/file.py", line 314, in write
self._write_all(data)
File "/home/joe/.local/lib/python2.7/site-packages/paramiko/file.py", line 439, in _write_all
count = self._write(data)
File "/home/joe/.local/lib/python2.7/site-packages/paramiko/channel.py", line 1263, in _write
self.channel.sendall(data)
File "/home/joe/.local/lib/python2.7/site-packages/paramiko/channel.py", line 796, in sendall
raise socket.error('Socket is closed')
error: Socket is closed
I'm not sure how to interpret this error: "socket is closed" makes me think the SSH connection is terminating once I try to create the VM. Does anyone have any pointers?
update
I'm attempting to use the pexpect wrapper and having trouble getting it to interact with the username/password prompt. I'm testing the process by SSHing into a remote machine and running a test.py script, which prompts me for a username and then saves the username in a text file. Here is my fabfile:
env.hosts = ['hostname']
env.user = 'username'
env.password = 'password'

def vm_create():
    run("python test.py")
And the contents of test.py on the remote machine are:
#! /usr/bin/env python
uname = raw_input("Enter Username: ")
f = open('output.txt', 'w')
f.write(uname + "\n")
f.close()
So, I can execute "fab vm_create" on the local machine and it successfully establishes the SSH connection and prompts me for the username, as defined by test.py. However, if I execute a third python file on my local machine with the pexpect wrapper, like this:
import pexpect

child = pexpect.spawn('fab vm_create')
child.expect('Enter Username: ')
child.sendline('password')
Nothing seems to happen. I get no errors, and no output.txt is created on the remote machine. Am I using pexpect incorrectly?
As much as I love paramiko, this may be better suited to using Fabric.
Here's a sample fabfile.py:
from fabric.api import run
from fabric.api import sudo
from fabric.api import env

env.user = 'root'
env.password = ''
env.hosts = ['192.168.1.15']

def vm_up():
    sudo("kvm -m 1024 -drive file=/var/lib/libvirt/images/oa4-vm$...")
    run("hostname --short")
To then run this, use
$ fab vm_up
If you don't set the host and password in the fabfile itself (rightly so), then you can set these at the command line:
$ fab -H 192.168.1.15 -p PASSWORD vm_up
However, your kvm line is still expecting input. To send input (and wait for the expected prompts), write another script that uses pexpect to call fab:
import pexpect

child = pexpect.spawn('fab vm_up')
child.expect('username:')  # put this in the format you're expecting
child.send('root')
Use fabric: http://docs.fabfile.org/en/1.8/
Fabric is a Python (2.5 or higher) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks
from fabric.api import run

def host_name():
    run('hostname -s')
