I'm using a single paramiko.SSHClient() object to execute commands on a remote machine. When I call ssh.exec_command(cmd) and the connection to the remote host has been lost, ssh.exec_command hangs.
Is there a way to check that the connection still exists before calling ssh.exec_command()?
If you have a long-running SSH connection, you may want to enable keep-alives via Transport.set_keepalive.
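A minimal sketch (host, user, and password are placeholders; the is_active() check is an addition to illustrate testing the connection before exec_command):

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("host", username="user", password="password")

# Send an SSH keep-alive packet every 30 seconds so a dead connection
# is detected instead of exec_command hanging indefinitely.
ssh.get_transport().set_keepalive(30)

# Optional sanity check before executing a command.
transport = ssh.get_transport()
if transport is not None and transport.is_active():
    stdin, stdout, stderr = ssh.exec_command("uptime")
    print(stdout.read().decode())

Note that is_active() only tells you the transport has not been closed locally; it's the keep-alives that actually surface a silently dropped connection.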
As an alternate possibility, maybe execnet would work. It wraps the command-line ssh client instead, so it's definitely not the Paramiko approach... just a thought.
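A minimal execnet sketch (assumes execnet is installed; user@host is a placeholder):

import execnet

# execnet spawns the remote Python through the command-line ssh client.
gw = execnet.makegateway("ssh=user@host")
channel = gw.remote_exec("""
    import subprocess
    # 'channel' is injected by execnet into the remote namespace.
    channel.send(subprocess.check_output(["uname", "-a"]).decode())
""")
print(channel.receive())
gw.exit()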
I want to run a client and a daemon application where the daemon responds to the client at the same time.
The connection is established over SSH using Paramiko, but I could not run both the daemon and the client at the same time.
How can I do this with Paramiko?
The expectation is that the client provides inputs 1, 2, 3 and the daemon responds to each input.
Both run in the same SSH session.
Could anyone help me with this?
I assume you can use simple shell syntax to achieve what you need. You do not need any fancy code in Python/Paramiko.
Assuming *nix server, see How do I run multiple background commands in bash in a single line?
To run (any) command in Paramiko, see Execute command and wait for it to finish with Python Paramiko
So probably like this:
stdin, stdout, stderr = ssh_client.exec_command("daemon & client")
stdout.channel.set_combine_stderr(True)
output = stdout.readlines()
If you need to run the two commands (daemon and client) independently for better control, you can start here:
Run multiple commands in different SSH servers in parallel using Python Paramiko
Except that you do not need to open multiple connections (SSHClient instances). Just call SSHClient.exec_command twice on the same SSHClient instance, as in the sketch below.
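A sketch of that pattern (host, credentials, and the daemon/client commands are placeholders):

import paramiko

ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect("host", username="user", password="password")

# Each exec_command call opens its own channel
# on the same underlying SSH connection.
d_stdin, d_stdout, d_stderr = ssh_client.exec_command("daemon")
c_stdin, c_stdout, c_stderr = ssh_client.exec_command("client")

# Read the client's output; the daemon keeps running on its own channel.
print(c_stdout.read().decode())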
I'm currently trying to write an Airflow job that will allow me to SSH into an EC2 instance and then start an SFTP session with another host from within that EC2 box. My current code is as follows:
def run_ssh():
    hook = SSHHook(ssh_conn_id='xyz').get_conn()  # returns an SSH client
    stdin, stdout, stderr = hook.exec_command('sftp user@host.com;')
    # This next step prompts me for a password, so I provide it
    stdin.write('password')
    logging.info(stdout.readlines())
    stdin, stdout, stderr = hook.exec_command('ls')
    logging.info(stdout.readlines())
When I print the final line, I should be seeing some folders, but instead I just see ['a\n']... so it seems I'm not actually able to SFTP. Are there better ways to SFTP from a remote host through a Python script running locally?
Any help with this is appreciated. The answer can be geared towards a simple Python script as opposed to Airflow.
For your literal question, see:
Pass input/variables to command/script over SSH using Python Paramiko
Though implementing SFTP over a jump host this way is not a good solution.
Use port forwarding instead:
Nested SSH using Python Paramiko
Port forwarding and the open SFTP using Python Paramiko
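A sketch of the port-forwarding approach (all hostnames and credentials are placeholders):

import paramiko

# First connection: the jump host (the EC2 box).
jump = paramiko.SSHClient()
jump.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jump.connect("jump.example.com", username="jumpuser", password="jumppassword")

# Open a direct-tcpip channel from the jump host to the target's SSH port.
channel = jump.get_transport().open_channel(
    "direct-tcpip", dest_addr=("target.example.com", 22), src_addr=("127.0.0.1", 0))

# Second connection: the target host, tunneled through that channel.
target = paramiko.SSHClient()
target.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target.connect("target.example.com", username="user", password="password",
               sock=channel)

sftp = target.open_sftp()
print(sftp.listdir("."))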
I am working with Paramiko 2.7.1, using a simple client implementation for running commands on remote SSH servers.
On most of my hosts, it works great. Input commands go out; output (if any) comes back.
One specific type of host (an IBM VIOS partition to be precise) is giving me headaches in that the commands execute, but the output is always empty.
I have used PuTTY in an interactive session to log all SSH packets and check for any differences and, at least during an interactive session, no differences are present between a working and a non-working host.
I have enabled Paramiko logging with:
import logging
from paramiko.util import log_to_file

logging.basicConfig(level=logging.DEBUG)
logging.getLogger("paramiko").setLevel(logging.DEBUG)
log_to_file('ssh.log')
But the output doesn't dump each packet. I have searched for any parameter or method that would dump those packets, but I've come up empty.
Wireshark is not an option since we are talking about an encrypted connection.
I would prefer to keep using exec_command instead of having to refactor everything and adapt to using an SSH shell.
So, in the end: is there any way to dump the entire SSH session with Paramiko? I can handle either SSH packets or raw data.
Edit 1: I remembered that PuTTY's plink.exe does SSH "exec" commands, so I used it to compare both SSH servers' output and stumbled onto the solution to my base problem: https://www.ibm.com/support/pages/unable-execute-commands-remotely-vio-server-padmin-user-ssh
Still, I'd rather have captured the session with Paramiko, since I will not always be able to simulate the problem with other tools...
In addition to enabling logging, call Transport.set_hexdump():
client.get_transport().set_hexdump(True)
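A fuller sketch combining the logging from the question with the hex dump (host and credentials are placeholders):

import logging
import paramiko
from paramiko.util import log_to_file

logging.basicConfig(level=logging.DEBUG)
log_to_file("ssh.log")

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("host", username="user", password="password")

# Dump the raw packet payloads of the session (as hex) into the log.
client.get_transport().set_hexdump(True)

stdin, stdout, stderr = client.exec_command("ls")
print(stdout.read().decode())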
Regarding your original problem, see also:
Command executed with Paramiko does not produce any output
I am trying to connect to a machine using RPyC, but it always says the connection is refused.
In the Python shell, I did:
import rpyc
rpyc.connect("hostname", port)
But it says the connection is refused. I checked the firewall for the port; the firewall allows this port.
Try using the exact same versions of both Python and RPyC on the client and the server!
This means that you are not running the server side of RPyC.
You need to download the source code for RPyC from here:
https://github.com/tomerfiliba-org/rpyc/releases
Then run:
python bin/rpyc_classic.py
(where bin is in the source code folder).
Once you have that running, you should be able to run the python code without any issues
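For instance, a quick sanity check from the client side (a sketch, assuming the classic server is running on its default port, 18812):

import rpyc

# rpyc.classic.connect targets the classic server's default port.
conn = rpyc.classic.connect("localhost")
print(conn.modules.sys.version)  # executed on the server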
I hope it works
The default server binds to localhost, but the client needs hostname None to connect to it correctly:
rpyc.connect(None, port)
I'm new to Python and Fabric, and I've modified a script that pings hosts on our LAN (to determine which machines are alive; we have a lot) to log into the hosts and list running processes back to the client. While this works on servers, there are other devices in the subnets that don't permit SSH logins, and the refused connection causes Fabric to exit with a fatal error. Is there any way to make Fabric skip any host that refuses a connection?
Using
with settings(warn_only=True)
doesn't seem to help.
Thanks.
You can set the skip_bad_hosts env setting (env.skip_bad_hosts = True) or use the --skip-bad-hosts command-line flag; see the sketch below. Searching the docs when you can't find something in a heading is best.
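A minimal sketch assuming Fabric 1.x (the task body and settings are illustrative):

from fabric.api import env, run, task

# Skip hosts that refuse the connection instead of aborting the whole run.
env.skip_bad_hosts = True
env.timeout = 5  # seconds to wait for a connection before giving up

@task
def list_processes():
    run("ps aux")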