Paramiko change IP of remote machine - python

I have to change the IP address of a remote machine running Gentoo. For this I use code that looks like:

import time
import paramiko
from paramiko import SSHException

guest = paramiko.SSHClient()
guest.set_missing_host_key_policy(paramiko.AutoAddPolicy())
try:
    guest.connect("10.22.254.200", username='root', password='root')
except SSHException as detail:
    session.flash = detail.message  # 'session' is defined elsewhere in my app
else:
    sftp = guest.open_sftp()
    sftp.put('./scripts/change-ip', '/root/change-ip')
    sftp.close()
    guest.exec_command('chmod +x /root/change-ip')
    time.sleep(5)
    stdin, stdout, stderr = guest.exec_command('/root/change-ip 10.22.254.200 &')
My change-ip script looks like:

#!/bin/sh
set -x
cp /etc/conf.d/net /etc/conf.d/net.bak
sed "s/10.22.254.200/$1/g" /etc/conf.d/net.bak > /etc/conf.d/net
/etc/init.d/net.eth0 restart
I am able to SSH in directly and run the script by hand, which changes the IP successfully, but for some reason it does not work from the code.
Any help would be greatly appreciated.

Restarting the network over the network is a dangerous thing. What happens here is that after the command /etc/init.d/net.eth0 restart stops the network interface, the script receives a HANGUP signal telling it that the connection has been closed, and it dies before it can bring the interface back up.
To avoid this problem, you can use nohup, which runs a command immune to hangups.
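A minimal sketch of that fix, reusing the guest client and change-ip script from the question (the new-IP argument 10.22.254.201 is just an illustrative value):

guest.exec_command('nohup /root/change-ip 10.22.254.201 > /root/change-ip.log 2>&1 &')

Redirecting the output and backgrounding the command let the remote shell return immediately, while nohup keeps the script alive when the SSH channel is torn down during the interface restart.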

Related

Gracefully abort remote Windows command executed over SSH from Windows Python Paramiko script when Ctrl+C is pressed

I have a follow-up question that builds off the question I asked here: Run multiple commands in different SSH servers in parallel using Python Paramiko, which was already answered.
Thanks to the answer on the link above, my Python script is as follows:
# SSH.py
import paramiko
import argparse
import os
path = "path"
python_script = "worker.py"
# definitions for ssh connection and cluster
ip_list = ['XXX.XXX.XXX.XXX', 'XXX.XXX.XXX.XXX', 'XXX.XXX.XXX.XXX']
port_list = [':XXXX', ':XXXX', ':XXXX']
user_list = ['user', 'user', 'user']
password_list = ['pass', 'pass', 'pass']
node_list = list(map(lambda x: f'-node{x + 1} ', list(range(len(ip_list)))))
cluster = ' '.join([node + ip + port for node, ip, port in zip(node_list, ip_list, port_list)])
# run script on command line of local machine
os.system(f"cd {path} && python {python_script} {cluster} -type worker -index 0 -batch 64 > {path}/logs/'command output'/{ip_list[0]}.log 2>&1")
# loop for IP and password
stdouts = []
clients = []
for i, (ip, user, password) in enumerate(zip(ip_list[1:], user_list[1:], password_list[1:]), 1):
    try:
        print("Open session in: " + ip + "...")
        client = paramiko.SSHClient()
        client.connect(ip, username=user, password=password)
    except paramiko.SSHException:
        print("Connection Failed")
        quit()
    try:
        path = f"C:/Users/{user}/Desktop/temp-ines"
        stdin, stdout, stderr = client.exec_command(
            f"cd {path} && python {python_script} {cluster} -type worker -index {i} -batch 64 > "
            f"C:/Users/{user}/Desktop/{ip}.log 2>&1 &"
        )
        clients.append(client)
        stdouts.append(stdout)
    except paramiko.SSHException:
        print("Cannot run file. Continue with other IPs in list...")
        client.close()
        continue

# Wait for commands to complete
for i in range(len(stdouts)):
    print("hello")
    stdouts[i].read()
    print("hello1")
    clients[i].close()
    print("hello2")

print("\n\n***********************End execution***********************\n\n")
This script, which is run locally, is able to SSH into the servers and run the command (i.e., run a Python script called worker.py and log the command output to a log file). That is, it gets through the first for loop with no issues.
My issue is related to the second for loop. Please see the print statements I added in the second for loop to be clear. When I run SSH.py locally, this is what I observe:
As you can see, I SSH into each of the servers and then get stuck reading the command output of the first server I SSH into. The worker.py script can take 30 minutes or so to complete, and the command output is the same on each server, so it will take 30 minutes to read the command output of the first server, then close that server's SSH connection, take a couple of seconds to read the command output of the second server (as it is the same as the first one and would already be entirely printed), close its SSH connection, and so on.
Now, my question is: what if I don't want to wait those entire 30 minutes until the worker.py script finishes? I cannot/do not know how to raise a KeyboardInterrupt exception. What I have tried is quitting the local SSH.py script. However, as you can see from the print statements, this will not close the SSH connections, although the training, and thus the log files, will stop logging info. In addition, after I quit the local SSH.py script, if I try to delete any of the log files, I get an error saying "cannot delete file because it is being used in cmd.exe". This only happens sometimes, and I believe it is because the SSH connections are not closed.
First run in the Python console:
It hangs: the local Python process and log file are running and saving, but there are no print statements, and no Python process or log file is being run/saved on the servers.
I run it again so a second process starts:
Now the first process no longer hangs (Python is running and log files are being saved on the servers), and I can close this second run/process. It is as if the second run/process resolves the hang of the first one.
If I were to run python SSH.py in the terminal, it would just hang.
This was not happening before.
If you know that SSHClient.close cleanly closes the connection and aborts the remote command, call it in response to KeyboardInterrupt.
For this you cannot use the simple solution with stdout.read, as it blocks and prevents handling of Ctrl+C on Windows.
Use the waiting code from my answer to Run multiple commands in different SSH servers in parallel using Python Paramiko (the while any(x is not None for x in stdouts): snippet).
And wrap it in try: ... except KeyboardInterrupt:.
import time  # needed for the sleep in the polling loop

try:
    while any(x is not None for x in stdouts):
        for i in range(len(stdouts)):
            stdout = stdouts[i]
            if stdout is not None:
                channel = stdout.channel
                # To prevent losing output at the end, first test for exit,
                # then for output
                exited = channel.exit_status_ready()
                while channel.recv_ready():
                    s = channel.recv(1024).decode('utf8')
                    print(f"#{i} stdout: {s}")
                while channel.recv_stderr_ready():
                    s = channel.recv_stderr(1024).decode('utf8')
                    print(f"#{i} stderr: {s}")
                if exited:
                    print(f"#{i} done")
                    clients[i].close()
                    stdouts[i] = None
        time.sleep(0.1)
except KeyboardInterrupt:
    print("Aborting")
    for i in range(len(clients)):
        print(f"#{i} closing")
        clients[i].close()
If you do not need to separate the stdout and stderr, you can greatly simplify the code by using Channel.set_combine_stderr. See Paramiko ssh die/hang with big output.
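A hedged sketch of that simplification, assuming an already-connected SSHClient named client and a command string command; a raw channel is opened so stderr can be merged before the command starts:

transport = client.get_transport()
chan = transport.open_session()
chan.set_combine_stderr(True)  # interleave stderr into the stdout stream
chan.exec_command(command)
output = chan.makefile('r')    # single combined stream to read or poll

With only one stream to watch, the polling loop above shrinks to just the recv_ready/recv pair.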

How to keep a paramiko ssh session open after logging in using python?

I am trying to SSH to a test Cisco router in a test environment using Python Paramiko, and run Cisco commands on that test router.
Everything works great except for one small detail.
After running the script, I want the SSH session to remain open (so I can run other commands manually).
I want to keep the SSH session open until I type "exit".
I found another link with a similar issue but I can't understand the solution.
(See here: Python ssh - keep connection open after script terminates)
I would appreciate it if someone could help me out here.
My code:

import paramiko
import time

def ssh_session(ip):
    try:
        session = paramiko.SSHClient()  # Open the session
        session.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        session.connect(ip, username="ciscouser1", password="password")
        connection = session.invoke_shell()
        #### Running Cisco IOS commands ###
        connection.send("enable\n")
        connection.send("password1")  # sending the enable password
        connection.send("\n")
        connection.send("configure terminal\n\n")
        time.sleep(1)
        connection.send("do show ip int brief\n")
        time.sleep(1)
    except paramiko.AuthenticationException:
        print "wrong credentials"

ssh_session("10.10.10.1")
The session timeout would be controlled by the SSH server. To the best of my knowledge, the only way to keep your session alive on the client side is to not be inactive, which can be accomplished by sending null packets. As to how to do this specifically with paramiko I am not certain. Perhaps you could send some kind of dummy command (or maybe even an empty string?) every so often?
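For what it is worth, Paramiko's Transport exposes a client-side keepalive that sends those packets for you; a minimal sketch, assuming the session client from the question is connected:

transport = session.get_transport()
transport.set_keepalive(30)  # send a keepalive packet every 30 seconds

This keeps the connection from looking idle without typing dummy commands, though the server's own timeout policy still has the final say.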

Python ssh - keep connection open after script terminates

I'm trying to write a script that will SSH into a box for me. I'm using Python and leveraging the paramiko library. I can successfully SSH onto the box, but as soon as the script terminates, the SSH connection also terminates. I want to keep the connection open after the script has completed running.
Python:
self.ssh = paramiko.SSHClient()
self.ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
self.ssh.connect(host, username=self.username, password=self.password)
stdout = execute(self.ssh, 'pwd') # test command for now to verify i'm on box
print stdout
sys.exit()
Console:
$ ssh.py
[u'/home/myuser\n']
myuser@xxxx ~
$
I haven't been able to find similar examples online, so any help would be appreciated.
Try this:
import subprocess
subprocess.call(["ssh", "myuser@myserver"])

Executing reboot command over SSH using Paramiko

I use Paramiko for establishing an SSH connection with some target device, and I want to execute a reboot command.
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(zip_hostname, username=username, password=password, timeout=1)
try:
    stdin, stdout, stderr = ssh.exec_command("/sbin/reboot -f")
    # .........
    # some code
    # .........
except AuthenticationException, e:
    print ''
finally:
    ssh.close()
But after executing ssh.exec_command("/sbin/reboot -f"), the "some code" part never executes, because the program gets stuck in exec_command (the reboot causes a disconnection). What should I do to solve my problem?
Try this:
ssh.exec_command("/sbin/reboot -f > /dev/null 2>&1 &")
All the output of reboot is redirected to /dev/null to make it produce no output, and it is started in the background thanks to the '&' sign at the end. Hopefully the program won't hang on that line this way, because the remote shell gives the prompt back.
Get the transport from the ssh client and set the keepalive using:
transport = ssh.get_transport()
transport.set_keepalive(5)
This sets the keepalive to 5 seconds; mind you, I would have expected timeout=1 to have achieved the same thing.
All you need to do is call channel.exec_command() instead of the high-level interface client.exec_command():

import socket

# exec: fire and forget
timeout = 0.5
transport = ssh.get_transport()
chan = transport.open_session(timeout=timeout)
chan.settimeout(timeout)
try:
    chan.exec_command(command)
except socket.timeout:
    pass
I was having this issue and managed to avoid it by switching to this command:
/sbin/shutdown -r now
Note that this command does not produce any STDOUT or STDERR output.
In case you or anyone else gets stuck trying to reboot a host with sudo while using forwarding agents (SSH keys), or in my case a YubiKey:
If you look at this as bash, you would reboot a host as a non-root user like this:
ssh -t -A user@hostname sudo /sbin/reboot
For the -A flag, from the ssh man page:
Enables forwarding of the authentication agent connection. This can also be specified on a per-host basis in a configuration file.
Agent forwarding should be enabled with caution. Users with the ability to bypass file permissions on the remote host (for the agent's Unix-domain socket) can access the local agent through the forwarded connection. An attacker cannot obtain key material from the agent; however, they can perform operations on the keys that enable them to authenticate using the identities loaded into the agent.
For the -t flag, from the ssh man page:
Force pseudo-tty allocation. This can be used to execute arbitrary screen-based programs on a remote machine, which can be very useful, e.g. when implementing menu services. Multiple -t options force tty allocation, even if ssh has no local tty.
So let's break this down into how you would do this in Paramiko:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=host, username=username)
s = ssh.get_transport().open_session()
paramiko.agent.AgentRequestHandler(s)
ssh.exec_command("sudo /sbin/reboot", get_pty=True)
For authentication forwarding (the -A flag in the bash ssh command) in Paramiko:
ssh = paramiko.SSHClient()  # 'ssh' is the client variable
s = ssh.get_transport().open_session()  # get the 'ssh' transport and open a session, assigned to 's'
paramiko.agent.AgentRequestHandler(s)  # attach the forwarding agent to the current ssh session
Now for forcing pseudo-tty allocation (the -t flag in the bash ssh command) in Paramiko:
ssh.exec_command("sudo /sbin/reboot", get_pty=True)
Adding get_pty=True to exec_command allows you to execute sudo /sbin/reboot.
Hope this helps; everyone's environment is different, but this should work, as it is the exact same thing as running the command via bash.

Python run multiple ssh commands in the same session

My goal is to connect over SSH with Python and authenticate, which I can do with Paramiko or Fabric. But I would like to keep the session open after each execution and read the input/output. With Paramiko I can only run one command before the session is closed and I am asked to authenticate again, and the session hangs. And since Fabric uses the Paramiko library, it gives me the same issue. For example, if my directory structure is like this
-home
--myfolder1
--myfolder2
I would like to execute the below commands without having to re-authenticate because the session closes.
(make connection)
run cmd: 'pwd'
output: /home
run cmd: 'cd myfolder2'
run cmd: 'pwd'
output: /home/myfolder2
Is this possible with any module that is out there right now? Could it be made from scratch with native Python? Or is this just not possible?
Edit: Added code. Without the new open_session it closes and I cannot run any command. After running the first command with this, I will be prompted to authenticate again and it creates an infinite loop.
Edit 2: If it closes after each command, then there is no way this will work at all, correct?
Edit 3: If I run this on a different server and exec_command with paramiko.SSHClient, it won't ask me to re-authenticate, but if I 'cd somedir' and then 'pwd' it will output that I am back in the root directory where I started.
class connect:
    newconnection = ''

    def __init__(self, username, password):
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        try:
            ssh.connect('someserver', username=username, password=password, port=22, timeout=5)
        except:
            print "Could not connect"
            sys.exit()
        self.newconnection = ssh

    def con(self):
        return self.newconnection

# This will create the connection
sshconnection = connect('someuser', 'somepassword').con()

while True:
    cmd = raw_input("Command to run: ")
    if cmd == "":
        break
    try:
        transport = sshconnection.get_transport()
        transport.set_keepalive(999999)
        chan = transport.open_session()
        chan.settimeout(3)
        chan.setblocking(0)
    except:
        print "Failed to open a channel"
        chan.get_exception()
        sys.exit()
    print "running '%s'" % cmd
    stdout_data = []
    stderr_data = []
    pprint.pprint(chan)
    nbytes = 4096
    chan.settimeout(5)
    chan.get_pty()
    chan.exec_command(cmd)
    while True:
        print "Inside loop ", chan.exit_status_ready()
        time.sleep(1.2)
        if chan.recv_ready():
            print "First if"
            stdout_data.append(chan.recv(nbytes))
        if chan.recv_stderr_ready():
            print "Recv Ready"
            stderr_data.append(chan.recv_stderr(nbytes))
        if chan.exit_status_ready():
            print "Breaking"
            break
    print 'exit status: ', chan.recv_exit_status()
    print ''.join(stdout_data)
This is possible using the normal modules when you can concatenate the commands into one. Try
pwd ; cd myfolder2 ; pwd
as the command. This should work, but it quickly becomes tedious when you have more complex commands which need arguments, and horrible when the arguments contain spaces. The next step then is to copy a script with all the commands to the remote side and tell SSH to execute said script.
Another problem of this approach is that SSH doesn't return until all commands have executed.
Alternatively, you could build a "command server", i.e. a simple TCP server that listens for incoming connections and executes commands sent to it. It's pretty simple to write but also pretty insecure. Again, the solution is to turn the server into a (Python) script which reads commands from stdin, start that script remotely via SSH, and then send commands to it.
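A minimal sketch of the concatenation approach, assuming the sshconnection client from the question is already connected:

stdin, stdout, stderr = sshconnection.exec_command('pwd ; cd myfolder2 ; pwd')
output = stdout.read()  # both pwd results arrive together; the second reflects the cd

Each exec_command runs in a fresh shell, which is exactly why the cd in the question's loop did not persist; chaining the commands into one invocation sidesteps that.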
