Any substitutes for pexpect? - python

I am writing a script using Python pexpect to execute another script on a remote machine. It works fine in normal cases, but if there is a time.sleep in the remote script, it fails.
I want to get to the remote machine, start the script in the background and get out. Is this possible?
Can someone suggest an alternative or let me know how to get around this issue?

Have you considered paramiko?
Here's an example ...
#!/usr/bin/env python
import paramiko
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect(hostname='example.com', port=22, username='sethu', password='****')
ssh.exec_command('nohup sleep 300 &')
ssh.close()
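If the remote script itself writes output, it can also help to redirect that output so the backgrounded process does not keep the channel busy. A variation of the exec_command line above (the script path here is hypothetical) might look like:
# Hypothetical script path; redirecting stdout/stderr lets the
# backgrounded script detach from the SSH channel cleanly.
ssh.exec_command('nohup /path/to/remote_script.sh > /dev/null 2>&1 &')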

Related

How to run two applications (daemon and client) at the same time in a single SSH session using Python Paramiko

I want to run a client and a daemon application, where the daemon responds to the client, at the same time.
The SSH connection is established using Paramiko, but I could not run both the daemon and the client at the same time.
How to do this with Paramiko?
The expectation is that the client provides input (1, 2, 3) and the daemon responds to each input.
Both run in the same SSH session.
Could anyone help me with this?
I assume you can use simple shell syntax to achieve what you need. You do not need any fancy code in Python/Paramiko.
Assuming *nix server, see How do I run multiple background commands in bash in a single line?
To run (any) command in Paramiko, see Execute command and wait for it to finish with Python Paramiko
So probably like this:
stdin, stdout, stderr = ssh_client.exec_command("daemon & client")
stdout.channel.set_combine_stderr(True)
output = stdout.readlines()
If you need to run the two commands (daemon and client) independently for better control, you can start here:
Run multiple commands in different SSH servers in parallel using Python Paramiko
Except that you do not need to open multiple connections (SSHClient). You will just call SSHClient.exec_command twice on the same SSHClient instance.
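A rough sketch of that independent variant might look like this (daemon and client are the placeholder command names from the question; the host and credentials are made up):
import paramiko

ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect('example.com', username='user', password='****')

# Each exec_command call opens its own channel on the same SSH connection,
# so the daemon keeps running while the client is driven separately.
d_stdin, d_stdout, d_stderr = ssh_client.exec_command('daemon')
c_stdin, c_stdout, c_stderr = ssh_client.exec_command('client')

# Feed the client its input and collect its output.
c_stdin.write('1\n2\n3\n')
c_stdin.flush()
print(c_stdout.read().decode())

# Closing the connection also tears down the daemon's channel.
ssh_client.close()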

How to read and write directly to the CMD prompt on Windows?

I know others have asked similar questions to this, but what I want to know is whether there is a way to simply open a cmd window (on Windows 10), leave it open, and have a Python script automatically type into it.
What I actually need to do is SSH into a remote computer and then, from that remote computer, run IDL commands. I have not found a way to do this with subprocess Popen or by SSHing with Paramiko. In the case of subprocess, I can't get past authentication to get into the host, i.e. I can't figure out how to type a password in. With Paramiko I am able to get into the remote computer, but once there I can't get my IDL code to run, and if it does run it just hangs on that command and won't output anything, even after closing.
It seems to me that rather than sending commands directly to cmd through a subprocess or os command, it would be much easier if I could just send a string to an open cmd window and have it execute the command and read the result. I'm going through a remote computer and then running IDL code on that remote computer, so there are two layers that Python has to get through. If I could just write in the cmd terminal it would make my life much easier, rudimentary as it is.
Apologies if I don't have the right lingo with all of this, I'm definitely not a network programmer.
Below is my attempt to use Paramiko. The "ls" command works and the contents of the folder I'm in are printed, so I know I've accessed the remote computer, but my IDL command does not print anything. Printing the output from the "idl" command itself just causes the program to hang there.
import paramiko
host = "<hostname>"
user = "<myusername>"
password = input("pass:")
client = paramiko.client.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(host, username=user, password=password)
_stdin, _stdout, _stderr = client.exec_command("ls", get_pty=True)
print(_stdout.read().decode())
_stdin, _stdout, _stderr = client.exec_command("idl")
_stdin, _stdout, _stderr = client.exec_command("<my command>")
print(_stdout.read().decode())
client.close()
I made much less progress with os and/or subprocess as I can't even get into the remote computer with either of those.
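For what it's worth, one possible direction (only a sketch, assuming IDL reads commands from its standard input and quits on exit) is to start IDL once and keep writing to that channel's stdin, rather than calling exec_command a second time, since each exec_command call starts a fresh session:
import getpass
import paramiko

host = "<hostname>"
user = "<myusername>"
password = getpass.getpass("pass:")

client = paramiko.client.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(host, username=user, password=password)

# Start IDL once; a second exec_command would open a new, unrelated
# session in which IDL is no longer running.
stdin, stdout, stderr = client.exec_command("idl", get_pty=True)
stdin.write("<my command>\n")   # the asker's placeholder command
stdin.write("exit\n")           # ask IDL to quit so stdout.read() can return
stdin.flush()
print(stdout.read().decode())
client.close()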

Python script to SSH into a jumphost and sftp from within that box

I'm currently trying to write an Airflow job that will allow me to SSH into an EC2 instance and then start an SFTP session with another host from within this EC2 box. The code I currently have is as follows:
def run_ssh():
    hook = SSHHook(ssh_conn_id='xyz').get_conn()  # returns an SSH client
    stdin, stdout, stderr = hook.exec_command('sftp user@host.com;')
    # This next step prompts me for a password, so I provide it
    stdin.write('password')
    logging.info(stdout.readlines())
    stdin, stdout, stderr = hook.exec_command('ls')
    logging.info(stdout.readlines())
When I print the final line I should be seeing some folders, but instead I just see ['a\n']... so it seems I'm not actually able to sftp. Are there better ways to sftp from a remote host through a Python script running locally?
Any help with this is appreciated. The answer can be geared towards a simple Python script as opposed to Airflow.
For your literal question, see:
Pass input/variables to command/script over SSH using Python Paramiko
Though implementing SFTP over a jump host this way is not a good solution.
Use port forwarding instead:
Nested SSH using Python Paramiko
Port forwarding and the open SFTP using Python Paramiko
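For reference, a minimal sketch of the port-forwarding approach might look like the following (jumphost, target, and the credentials are placeholders):
import paramiko

# Connect to the jump host (the EC2 box in the question).
jump = paramiko.SSHClient()
jump.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jump.connect('jumphost', username='user', password='****')

# Open a direct-tcpip channel from the jump host to the target's SSH port.
transport = jump.get_transport()
channel = transport.open_channel('direct-tcpip', ('target', 22), ('', 0))

# Use that channel as the socket for a second SSH connection and open SFTP on it.
target = paramiko.SSHClient()
target.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target.connect('target', username='user', password='****', sock=channel)

sftp = target.open_sftp()
print(sftp.listdir('.'))

sftp.close()
target.close()
jump.close()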

Using paramiko when a unix server is using VShell

Use case
On a Unix server, logging in manually opens a command shell of its own in which to run commands.
I am trying to automate this using Paramiko; however, I am not able to execute the command in that shell using Paramiko.
What I have done
I created a simple script that is able to make the connection, but it does not execute the command on VShell, as the output always comes back empty.
import paramiko
import sys

ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(hostname=sys.argv[1], port=sys.argv[2], username=sys.argv[3], password=sys.argv[4])
command = "show hwid"
stdin, stdout, stderr = ssh_client.exec_command(command)
out = stdout.read()
print out
err = stderr.read()
print err
ssh_client.close()
The same script runs perfectly fine when it is used on a server where VShell is not being used.
Any help or suggestions on this?
stdin,stdout,stderr=ssh_client.exec_command(command)
Regarding this line of code, I suspect that the SSH server is not properly configured to allow commands to be executed in this way (this is the equivalent of ssh myserver show hwid, rather than typing it into the terminal after login).
You might want to imitate the behaviour of typing the command in after logging into the server, and for that I think this is appropriate:
shell = ssh_client.invoke_shell()
shell.send(command + "\n")
output = shell.recv(65535).decode()  # blocks until some output arrives
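If the shell produces output gradually, a short read loop with an idle timeout (a sketch; the 5-second timeout is an arbitrary choice) is more robust than a single recv:
import socket

shell.settimeout(5.0)              # stop waiting after 5 s of silence
chunks = []
while True:
    try:
        data = shell.recv(65535)   # returns b"" once the channel closes
    except socket.timeout:
        break
    if not data:
        break
    chunks.append(data)
print(b"".join(chunks).decode())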

Python best practice: Whether to use subprocess/fabric calls or use a Linux script?

I have a use case where I need to run Linux commands on a remote machine. Currently, I am using Fabric to achieve this. However, I was wondering whether it would be better to use a shell script rather than Python.
My use case is to set up some services, set up DBs in MySQL, and additionally create some scripts that would then be executed on the remote machine. For now, I have about 50-60 lines of commands embedded in Fabric calls.
Everything has to be executed on the remote machine, and for that I have created a connection to the machine using Fabric, and I run commands with run/sudo functions. For each of the different commands, I use a separate run call.
If I were to use a shell script instead, I would have two further options:
use a Fabric call to run the script on the remote machine;
make the script SSH into the other machine and run there.
What would be the most Pythonic way to achieve this functionality?
Have you considered using paramiko?
Here's a simple example from Brandon Rhodes' 'Foundations of Python Network Programming':
import paramiko

class AllowAnythingPolicy(paramiko.MissingHostKeyPolicy):
    def missing_host_key(self, client, hostname, key):
        return

client = paramiko.SSHClient()
client.set_missing_host_key_policy(AllowAnythingPolicy())
client.connect('127.0.0.1', username='username', password='python')

for command in 'echo "Hello, world!"', 'uname -a', 'uptime':
    stdin, stdout, stderr = client.exec_command(command)
    stdin.close()
    print(repr(stdout.read()))
    stdout.close()
    stderr.close()

client.close()
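If you decide to keep the setup logic in a shell script after all, one variation (just a sketch, with hypothetical file names) is to upload the script over SFTP and run it in a single exec_command call:
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('127.0.0.1', username='username', password='python')

# Upload a local setup script (hypothetical file names) to the remote machine ...
sftp = client.open_sftp()
sftp.put('setup.sh', '/tmp/setup.sh')
sftp.close()

# ... then run it in one session, printing whatever it writes.
stdin, stdout, stderr = client.exec_command('bash /tmp/setup.sh')
print(stdout.read().decode())
print(stderr.read().decode())
client.close()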
Cheers, Arthur
