How to check if Paramiko SFTP client is still open - python

How do I check whether the SFTP client I opened earlier via Paramiko is still active throughout the lifetime of my application?
self.sftp = ssh.open_sftp()
My application logic at times keeps the SFTP connection idle for anywhere between 0.5 seconds and 5 minutes.

While you can try testing ssh.get_transport().is_active(), the only sure way is to try to use the connection.
But you can make Paramiko keep the connection alive:
Python Paramiko SSH session not active after being idle for many hours
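For illustration, a minimal sketch that combines both suggestions: check is_active() first, then fall back to an actual SFTP request, with keepalives enabled via Transport.set_keepalive. The hostname, credentials and the 30-second interval are placeholders, not values from the question:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('example.com', username='user', password='secret')

# Ask Paramiko to send a keepalive packet after 30 seconds of inactivity.
ssh.get_transport().set_keepalive(30)

sftp = ssh.open_sftp()

def sftp_is_alive(ssh, sftp):
    """Best-effort check: transport state first, then a real SFTP request."""
    transport = ssh.get_transport()
    if transport is None or not transport.is_active():
        return False
    try:
        sftp.listdir('.')  # any cheap SFTP operation will do
        return True
    except (OSError, EOFError, paramiko.SSHException):
        return False

print(sftp_is_alive(ssh, sftp))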

Related

ERRNO 111 Connection refused when starting server over SSH

I would like to start an rpyc server on a machine that I'm connected to over SSH. The general SSH connection works, but when I start the server I receive ERRNO 111 Connection refused. When I start the server manually, by logging in via SSH on the machine and running the .py file, I can connect. I tried:
ssh.exec_command("python3 /tmp/recalibration/rpc_service_pictures.py")
ssh.exec_command("python3 /tmp/recalibration/rpc_service_pictures.py &")
ssh.exec_command("nohup python3 /tmp/recalibration/rpc_service_pictures.py &")
ssh.exec_command("nohup python3 /tmp/recalibration/rpc_service_pictures.py > /dev/null 2>&1 &")
but none of these change the connection problem. Any ideas?
Thanks in advance!
Turns out you can't be connected via SSH at the same time.
I had an open SSH session to the machine at the same time to debug, and because of that I couldn't connect. It seems obvious once you know it, but if you don't, you are completely lost :D

Python script to SSH into a jumphost and sftp from within that box

I'm currently trying to write an airflow job that will allow me to ssh into an EC2 instance and then start an sftp session with another host from within this EC2 box. My current code that I have is as follows:
def run_ssh():
    hook = SSHHook(ssh_conn_id='xyz').get_conn()  # returns an ssh client
    stdin, stdout, stderr = hook.exec_command('sftp user#host.com;')
    # This next step prompts me for password so i provide it
    stdin.write('password')
    logging.info(stdout.readlines())
    stdin, stdout, stderr = hook.exec_command('ls')
    logging.info(stdout.readlines())
When I print the final line I should be seeing some folders, but instead I just see ['a\n']... so it seems I'm not actually able to sftp. Are there better ways to sftp from a remote host through a Python script running locally?
Any help with this is appreciated. The answer can be geared towards a simple python script as opposed to airflow.
For your literal question, see:
Pass input/variables to command/script over SSH using Python Paramiko
Though implementing SFTP over a jump host this way is not a good solution.
Use port forwarding instead:
Nested SSH using Python Paramiko
Port forwarding and the open SFTP using Python Paramiko
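For reference, a minimal sketch of the port-forwarding approach: open a direct-tcpip channel through the jump host and hand it to a second SSHClient as its sock. All hostnames and credentials below are placeholders:
import paramiko

# Connect to the jump host (the EC2 box) first.
jump = paramiko.SSHClient()
jump.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jump.connect('jump.example.com', username='jumpuser', password='jumppass')

# Open a tunnel from the jump host to the target's SSH port.
channel = jump.get_transport().open_channel(
    'direct-tcpip',
    dest_addr=('target.example.com', 22),
    src_addr=('127.0.0.1', 0),
)

# Connect to the target through the tunnel and start SFTP.
target = paramiko.SSHClient()
target.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target.connect('target.example.com', username='targetuser',
               password='targetpass', sock=channel)

sftp = target.open_sftp()
print(sftp.listdir('.'))

sftp.close()
target.close()
jump.close()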

Python: Unable to connect SSH with paramiko

This is my first time using Paramiko. I'm trying to establish an SSH session to a test Amazon Linux 2 instance where I've enabled password authentication (it doesn't come enabled by default) and restarted the SSH daemon on the box. I also made sure that I could connect with SSH via the normal ssh program using the username/password I put in the Python program.
When I run the Python code below, everything looks good and it waits for input and keeps the program running, but when I'm logged into the Amazon instance, I don't see the paramiko user logged in (I did a "w" and a "who" command). In fact, I have no evidence server-side that Paramiko ever connects successfully to begin with.
#!/usr/bin/env python3
import pprint
import boto3
import os
import paramiko
os.system('clear')
pp = pprint.PrettyPrinter(indent=4)
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('X.X.X.X',username='the_username',password='the_password',port=22)
get_input = input("Preventing program from closing and keeping SSH connection alive...")
who shows interactive shell sessions only.
Your code only connects. It does not start a shell, let alone an interactive shell.
See List all connected SSH sessions?
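If the goal is a session that shows up in who/w, one option is to request an interactive shell explicitly, since that allocates a PTY on the server. A minimal sketch, reusing the placeholder host and credentials from the question:
import time
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('X.X.X.X', username='the_username', password='the_password', port=22)

shell = client.invoke_shell()  # requests a PTY and starts a shell on the server
shell.send('uptime\n')
time.sleep(1)                  # crude wait for the output to arrive
print(shell.recv(4096).decode())

shell.close()
client.close()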

Python Paramiko SSH session not active after being idle for many hours

I am using Python Paramiko's ssh.exec_command to send commands from a master machine to a slave machine (both CentOS). The master sends a command to the slave and waits (sleeps or does something meaningful) for the slave to complete it. The slave takes approximately 10 hours to complete the command. If, after those 10 hours, the master sends the next command to the slave using ssh.exec_command, I get an error message saying the SSH session is not active.
I tried setting ServerAliveInterval and TCPKeepAlive in the ssh_config file on both the master and slave side, but nothing worked. What else do I need to check to keep the SSH session active indefinitely?
ServerAliveInterval has no effect on Paramiko.
Use Transport.set_keepalive:
transport = client.get_transport()
transport.set_keepalive(60)
If that does not help (typically because the server ignores keepalives), you have to keep the session alive yourself, e.g. by making the command produce output continuously.
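Put together, a minimal sketch of what that looks like for a long-running command (the hostname, credentials and command are placeholders; the 60-second interval matches the snippet above):
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('slave.example.com', username='user', password='secret')

# Enable keepalives right after connecting, before issuing the long command.
client.get_transport().set_keepalive(60)

stdin, stdout, stderr = client.exec_command('/path/to/long_job.sh')
for line in stdout:  # blocks until the command finishes; keeps data flowing
    print(line, end='')

print('exit status:', stdout.channel.recv_exit_status())
client.close()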

Check whether a connection exists to a remote host using paramiko

I'm using a single paramiko.SSHClient() object to execute a command on a remote machine. When I call ssh.exec_command(cmd) and the connection to the remote host has been lost, ssh.exec_command hangs.
Is there a way to check that the connection still exists before calling ssh.exec_command()?
If you have a long-running SSH connection, you may want to enable keepalives via Transport.set_keepalive.
As an alternate possibility, maybe execnet would work. It wraps the command-line ssh program instead, so it's definitely not the Paramiko approach... just a thought.
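For completeness, a minimal sketch of a pre-flight check under these assumptions: is_active() catches a locally detected disconnect, and Transport.send_ignore() tries to push a packet so that a dead TCP connection surfaces as an exception. It still cannot guarantee that the very next exec_command will succeed:
import paramiko

def is_connected(ssh):
    """Best-effort check that the SSH transport still looks usable."""
    transport = ssh.get_transport()
    if transport is None or not transport.is_active():
        return False
    try:
        transport.send_ignore()  # harmless SSH_MSG_IGNORE packet
        return True
    except (EOFError, OSError, paramiko.SSHException):
        return False

# Usage: only run the command if the transport still looks alive.
# if is_connected(ssh):
#     stdin, stdout, stderr = ssh.exec_command(cmd)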
