How to turn off local echo when using Paramiko? - python

I'm using Paramiko to connect to remote Cisco routers and switches. When connecting to these devices, I'd like to be able to turn off echo when entering "configuration" mode. That way, I can issue commands to the remote system and avoid seeing them come back (and thereby concentrate only on looking for error messages).
I'm performing the following commands to get a shell with the Cisco device:
self.chan = self.transport.open_session()
self.chan.get_pty()
self.chan.invoke_shell()
Now, I'd like to be able to take paramiko's file descriptor for the pty and issue something like the following:
fd = self.chan.fileno()
old = termios.tcgetattr(fd)
old[3] = old[3] & ~termios.ECHO  # clear the ECHO bit to disable echo
termios.tcsetattr(fd, termios.TCSADRAIN, old)
However, termios chokes on the file descriptor returned by chan.fileno(), because that descriptor is not a real terminal device.
Most suggestions for turning off echo that I have seen require issuing a bash command like "stty -echo" on the remote box, but a Cisco router is not running bash.

After spending a lot of time with this, I ended up going back to the pxssh library. This library explicitly has a way to turn echo off:
connection.setecho(False)
...which was exactly what I needed. It also (via the parent module, pexpect) has a way to deal with telnet using the exact same library infrastructure (which unfortunately is still necessary often in the Cisco world), so you can have a connection object that uses telnet or ssh and it works in exactly the same way.
While Paramiko seems like a much cleaner, better-maintained library, the consensus in the Paramiko community seems to be that if you want to stop echo, you need to tell the remote system not to echo. But when the remote system is not a Linux/bash process, that becomes difficult or impossible. pxssh is the library you need for more fine-grained control of your SSH session.
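For reference, a minimal pxssh sketch of that approach might look like the following (the host name, credentials, and the Cisco prompt pattern are illustrative assumptions, not tested values):
from pexpect import pxssh

connection = pxssh.pxssh()
# auto_prompt_reset=False because a Cisco CLI has no shell prompt variable to rewrite
connection.login('remotehost', 'myuser', 'mypass', auto_prompt_reset=False)
connection.setecho(False)  # stop echoing what we send back into the session
connection.sendline('configure terminal')
connection.expect(r'\(config\)#')  # assumed Cisco configuration-mode prompt
print(connection.before.decode())
connection.logout()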

How to log each SSH session packet with Paramiko?

I am working with Paramiko 2.7.1, using a simple client implementation for running commands on remote SSH servers.
On most of my hosts, it works great: input commands go out, and output (if any) comes back.
One specific type of host (an IBM VIOS partition to be precise) is giving me headaches in that the commands execute, but the output is always empty.
I have used PuTTY in an interactive session to log all SSH packets and check for any differences and, at least during an interactive session, no differences are present between a working and a non-working host.
I have enabled Paramiko logging with:
import logging
from paramiko.util import log_to_file

logging.basicConfig(level=logging.DEBUG)
logging.getLogger("paramiko").setLevel(logging.DEBUG)
log_to_file('ssh.log')
But the log doesn't dump each packet. I have searched for any parameter or method that would dump those packets, but I've come up empty.
Wireshark is not an option since we are talking about an encrypted connection.
I would prefer to keep using exec_command instead of having to refactor everything and adapt to using an SSH shell.
So, in the end: is there any way to dump the entire SSH session with Paramiko? I can handle either SSH packets or raw data.
Edit 1: I remembered that PuTTY's plink.exe does SSH exec commands, so I used it to compare both SSH servers' output and stumbled onto the solution to my base problem: https://www.ibm.com/support/pages/unable-execute-commands-remotely-vio-server-padmin-user-ssh
Still, I'd rather have captured the session with Paramiko, since I will not always be able to simulate with other tools...
In addition to enabling logging, call Transport.set_hexdump():
client.get_transport().set_hexdump(True)
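Put together, a hedged sketch might look like this (the host, credentials, and command are placeholder assumptions):
import logging
import paramiko

logging.basicConfig(level=logging.DEBUG)
paramiko.util.log_to_file('ssh.log', level=logging.DEBUG)

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('host', username='myuser', password='mypass')
# Dump every packet sent or received on the transport, in hex, to the log
client.get_transport().set_hexdump(True)

stdin, stdout, stderr = client.exec_command('uname -a')
print(stdout.read().decode())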
Regarding your original problem, see also:
Command executed with Paramiko does not produce any output

Identical command sent over paramiko and ssh works over ssh but not paramiko [duplicate]

I connect over SSH from a terminal (on Mac) and also run a Paramiko Python script, and for some reason the two sessions seem to behave differently: the PATH environment variable differs between them.
This is the code I run:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('host', username='myuser', password='mypass')
stdin, stdout, stderr = ssh.exec_command('echo $PATH')
print(stdout.readlines())
Any idea why the environment variables are different?
And how can I fix it?
SSHClient.exec_command by default does not allocate a pseudo terminal for the session. As a consequence, a different set of startup scripts may be sourced (in particular, .bash_profile is not sourced for non-interactive sessions), and/or different branches in the scripts are taken based on the absence or presence of the TERM environment variable.
To emulate the default Paramiko behavior with ssh, use the -T switch:
ssh -T myuser@host
See the ssh man page:
-T Disable pseudo-tty allocation.
Conversely, to emulate the default ssh behavior with Paramiko, set the get_pty parameter of exec_command to True:
def exec_command(self, command, bufsize=-1, timeout=None, get_pty=False):
Though, rather than working around the issue by allocating a pseudo terminal in Paramiko, it is better to fix your startup scripts to set the same PATH for all sessions.
For that, see Some Unix commands fail with "<command> not found", when executed using Python Paramiko exec_command.
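If you do opt for the get_pty workaround, a minimal sketch (host and credentials are placeholder assumptions) looks like this:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('host', username='myuser', password='mypass')
# get_pty=True makes the server allocate a pseudo terminal, so the remote
# shell sources the same startup scripts as an interactive ssh session
stdin, stdout, stderr = ssh.exec_command('echo $PATH', get_pty=True)
print(stdout.readlines())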
Working with the Channel object instead of the SSHClient object solved my problem.
chan = ssh.invoke_shell()
chan.send('echo $PATH\n')
print(chan.recv(1024))
For more details, see the documentation.

How to easily create a local SSH tunnel using pure Python

I have a pretty simple task: using ssh, I want to create a tunnel that forwards traffic from my local machine to a specific port on a remote machine. I can do this from the command line:
ssh -N -L 123:127.0.0.1:456 user#remotehost
Then if I run:
telnet localhost 123
it's the equivalent of logging into remotehost and running
telnet 127.0.0.1 456
I've managed to do this with something along the lines of:
subprocess.Popen(['ssh', '-N', '-L', '%i:127.0.0.1:%i' % (new_port, old_port), ssh_user + '@' + ip_addr])
But now I want to move away from that and use only Python - no external processes.
I've tried using fabric.context_managers.remote_tunnel, but unless I've misunderstood, it is meant for creating a tunnel that starts at a remote location, not from the local machine. That is, it is the equivalent of SSHing into a remote machine and creating an SSH tunnel from there, which is silly for my purpose. I suppose I could set the remote host to actually be the local machine, but this seems inefficient and honestly I don't even understand how to do that.
I've also tried Paramiko's forward.py demo, and it doesn't work because my private key is encrypted. I'd like to modify the script to handle that, and also just simplify it for my needs, but both the script and the Paramiko library are daunting and I don't know how to begin.
Surely there's an easy way to do this? I seem to be so close yet so far.
What do you mean by "pure Python"? The subprocess module is bundled with the standard Python installation.
subprocess and Fabric are designed for such tasks; why would you want to move away from them?
If you have minimal tasks to perform remotely, e.g. checking memory or the hostname, you can go ahead with subprocess. However, if you have bigger requirements, I would suggest going with Fabric.
For your purpose, where you have to work from the same machine, why not use subprocess with check_call or Popen? As an alternative, you can restructure your code to implement at a lower level what the ssh command does for you.
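That said, if you do want a subprocess-free tunnel, here is a hedged sketch built on Paramiko's direct-tcpip channels (the host, credentials, key path, and ports are placeholder assumptions; recent Paramiko versions accept a passphrase argument to connect, which addresses the encrypted private key):
import os
import socket
import threading
import paramiko

def pipe(src, dst):
    # Shuttle bytes one way until the source side closes
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)

def handle(local_sock, transport):
    # Ask the SSH server to connect to 127.0.0.1:456 on its side
    chan = transport.open_channel(
        'direct-tcpip', ('127.0.0.1', 456), local_sock.getpeername())
    threading.Thread(target=pipe, args=(chan, local_sock), daemon=True).start()
    pipe(local_sock, chan)
    chan.close()
    local_sock.close()

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('remotehost', username='user',
               key_filename=os.path.expanduser('~/.ssh/id_rsa'),
               passphrase='key-passphrase')
transport = client.get_transport()

# Accept local connections on port 123 and forward each through the tunnel
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(('127.0.0.1', 123))
server.listen(5)
while True:
    sock, _ = server.accept()
    threading.Thread(target=handle, args=(sock, transport), daemon=True).start()
With this running, telnet localhost 123 should behave like running telnet 127.0.0.1 456 on the remote host, as in the question.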

Using Python commands within Paramiko

I've been using Paramiko today to work with a Python SSH connection, and it is useful.
However, one thing I'd really like to be able to do over the SSH connection is to utilise some Pythonic sugar. As far as I can tell, I can only use the built-in Paramiko functions; if I want to do anything with Python on the remote side, I would need to use a script which I have placed there, and call it.
Is there a way I can send Python commands over the SSH connection rather than having to make do only with the limitations of the Paramiko SSH connection? Since I am running the SSH connection through Paramiko within a Python script, it would only seem right that I could, but I can't see a way to do so.
RPyC could be what you're looking for. It gives you access to another machine's Python environment directly from the local script.
>>> import rpyc
>>> conn = rpyc.classic.connect("someremotehost.com")
>>> conn.modules.sys.path
['D:\\projects\\rpyc\\servers', 'd:\\projects', .....]
To establish a connection over SSL or SSH, see:
http://rpyc.sourceforge.net/docs/secure-connection.html#ssl
Well, that is what SSH was created for: to be a secure shell, with the commands executed on the remote machine (you can think of it as if you were sitting at the remote computer itself, and even then you could not execute Python commands in a shell directly).
You can't send Python commands simply because Python does not have commands; it executes Python scripts.
So the best you can do is a wrapper that performs the following steps:
Wrap a piece of Python code into a file.
scp it to the remote machine.
Execute it there.
Remove the script (or cache it for further execution).
Basically, shell commands are the remote machine's own programs, so you can think of those scripts as shell extensions (e.g., Python programs taking command-line parameters).
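A hedged sketch of those four steps, using Paramiko's own SFTP support for the copy (the host, credentials, and remote paths are placeholder assumptions):
import paramiko

code = 'import sys\nprint(sys.version)\n'

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('remotehost', username='myuser', password='mypass')

# Steps 1-2: wrap the code into a file and copy it to the remote machine
sftp = ssh.open_sftp()
remote_file = sftp.open('/tmp/snippet.py', 'w')
remote_file.write(code)
remote_file.close()

# Step 3: execute it there
stdin, stdout, stderr = ssh.exec_command('python /tmp/snippet.py')
print(stdout.read().decode())

# Step 4: remove the script (or leave it cached for later runs)
sftp.remove('/tmp/snippet.py')
sftp.close()
ssh.close()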
