subprocess for executing unix commands - python

I basically want to connect to my simulator and execute a few commands there. From a Unix shell, I connect to the simulator with the command "gmake CONFD_NUMBER=1 nthconfdcli", but when I run the script below, my code hangs.
def Simulator():
    command = "gmake CONFD_NUMBER=50 nthconfdcli"
    p = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
    (output, err) = p.communicate()
    p.expect("#")
    p.sendline('show test cli')
    p.expect(['#', pexpect.EOF])
    show = p.before
    print(show)
    p.sendline('exit')
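The hang is expected: `communicate()` waits for the process to exit, but the gmake target sits in an interactive CLI, and `expect`/`sendline` are pexpect methods that do not exist on a `Popen` object. A minimal sketch of driving such an interactive program entirely with pexpect (the `#` prompt and the commands are taken from the snippet above; the gmake target itself is specific to that environment):

```python
import pexpect

def simulator():
    # Spawn the interactive CLI directly instead of Popen + communicate(),
    # which blocks until the child process exits.
    child = pexpect.spawn("gmake CONFD_NUMBER=50 nthconfdcli", encoding="utf-8")
    child.expect("#")                   # wait for the CLI prompt
    child.sendline("show test cli")
    child.expect(["#", pexpect.EOF])    # wait for the next prompt or EOF
    print(child.before)                 # everything printed before the prompt
    child.sendline("exit")
    child.close()
```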

Related

Connecting through openVPN with Python using config file

I am trying to gather data from a computer running in another country. From the Linux terminal, I can use openVPN with the .ovpn file to connect. However, to make automated API calls, I want to use Python.
Is there a way to connect through Python, getting the connection details from the .ovpn file? A bit similar to SSHForwarder.
Something like this:
from openvpn_api import VPN
v = VPN('199.249.9.9', 1194)
with v.connection():
    print(v.release)
Much appreciated!
Rutger
You can just run the console command from your script with subprocess.run(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True).
args should be a list like this: ['sudo', '/usr/local/sbin/openvpn', '--config', home + '/path/to/config.ovpn']
For example:
import subprocess, os

home = os.environ["HOME"]
args = [
    'sudo',
    '/Mike/local/sbin/openvpn',
    '--config',
    home + '/Mike/Downloads/office.ovpn'
]
# Note: run() blocks until the openvpn process exits, so it only returns
# once the connection is closed; use Popen (next example) if other code
# needs to run while the tunnel is up.
r = subprocess.run(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                   universal_newlines=True)
print(r.stdout)
Another (easier) way is to use subprocess.Popen():
import subprocess, psutil

# helper to kill the connection together with any child processes
def kill(proc_pid):
    process = psutil.Process(proc_pid)
    for proc in process.children(recursive=True):
        proc.kill()
    process.kill()

# use a shell command to connect openvpn
r = subprocess.Popen(shell_command, shell=True)
...
# your code
...
# kill connection
kill(r.pid)
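If pulling in psutil is undesirable, a stdlib-only variant of the same idea (assuming a POSIX system) is to start the shell command in its own process group and signal the whole group at once; here "sleep 30" stands in for the real openvpn command:

```python
import os
import signal
import subprocess
import time

# Start the command in a new session so it gets its own process group;
# "sleep 30" is a stand-in for the openvpn shell command.
r = subprocess.Popen("sleep 30", shell=True, start_new_session=True)
time.sleep(0.2)  # ... your code that needs the connection ...

# Kill the shell and every child in its process group in one call.
os.killpg(os.getpgid(r.pid), signal.SIGTERM)
r.wait()
```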

print stdout of a subprocess command in a docker container

I am having problems displaying the output of a command run with Python subprocess.Popen inside a Docker container. The terminal just hangs while the process is running and only prints the output at the end.
I have a Python script like this (simplified):
def test():
    print('start')
    process = subprocess.Popen('pytest', stdout=subprocess.PIPE, universal_newlines=True)
    for line in iter(process.stdout.readline, ''):
        print(">>> {}".format(line.rstrip()))

def docker():
    client = docker.from_env()
    command = './this_script --test'
    generator = client.containers.run('python:3', command, remove=True, init=True,
                                      working_dir='/test', stdout=True, stderr=True,
                                      stream=True)
    for log in generator:
        print('>>> {}'.format(log))

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    ...
    # if called with --test, the script calls test()
    # if called with --docker, the script calls docker()
Even the print('start') at the beginning of test() is not printed until the end of the process.
How can I force the stdout of the subprocess to be displayed real time?
I am on Ubuntu 18.04, using Python 3.6.7, Docker version 18.09.6, build 481bc77.
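For reference, the readline loop in test() does stream in real time once the child process is unbuffered. A self-contained sketch of the same pattern, with a small unbuffered (`-u`) Python child standing in for pytest:

```python
import subprocess
import sys

# Same readline loop as test() above; the -u flag makes the child's stdout
# unbuffered, so each line arrives as soon as it is printed.
child_code = "for i in range(3): print('line', i)"
process = subprocess.Popen(
    [sys.executable, "-u", "-c", child_code],
    stdout=subprocess.PIPE, universal_newlines=True)
lines = []
for line in iter(process.stdout.readline, ''):
    lines.append(line.rstrip())
process.wait()
```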
EDIT:
So the problem is that the "run" command hangs and does not return until the container process has ended.
I found a way to make it work by running the container in detached mode:
Added the -u flag in the shebang
Updated the docker() function to start the container in detached mode:
def docker():
    client = docker.from_env()
    command = 'python -u this_script.py --test'
    container = client.containers.run('python:3', command, remove=True, init=True,
                                      working_dir='/test', detach=True)
    for log in container.logs(stdout=True, stderr=True, stream=True):
        print('>>> {}'.format(log))

open a putty window and run ssh commands - Python

I am new to Python. I need to log in to a server daily (Desktop -> 1.32 -> 0.20 -> 3.26). For this I open PuTTY and log in over an SSH connection. I want to automate all of this with a Python script.
From googling I thought subprocess.Popen would do that, but it's not working fine.
1st trial:
import subprocess
pid = subprocess.Popen("putty.exe user@xxx.xx.x.32 -pw password").pid
It works fine (a window opens and logs into .32), but I am not able to give further input. I came to know that to give input to the same process I need to use pipes.
2nd trial:
from subprocess import Popen, PIPE, STDOUT
p = Popen("putty.exe user@xxx.xx.x.32 -pw password", stdout=PIPE, stdin=PIPE, stderr=STDOUT)
grep_stdout = p.communicate(input=b'ssh xx.xx.x.20\n')[0]
print(grep_stdout.decode())
With this I can't even log in to the first server. After logging in to all the servers I need the terminal to stay alive. How can I do this?
Edit
I need to do this in a new PuTTY window, and the window must stay open after logging in because I have some manual work to do.
Use PowerShell to call putty.exe so that it opens in a new window:
from subprocess import Popen
Popen("powershell putty.exe user@host -pw mypassword")
Use the paramiko library for Python.
Establish an SSH connection using:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, username=username, password=password)
Check whether the connection is alive using:
status = ssh.get_transport().is_active()
# returns True if the connection is alive/active
ssh.exec_command() is basically a single session. Use exec_command('command1; command2') to execute multiple commands in one session.
Also, you can use this to execute multiple commands in a single session:
channel = ssh.invoke_shell()
stdin = channel.makefile('wb')
stdout = channel.makefile('rb')
stdin.write(b'''
command 1
command 2
''')
print(stdout.read())
There is an SSHv2 protocol implementation for Python: http://www.paramiko.org/. You can easily install it with pip:
pip install paramiko
Then you can create an SSH client, connect to your host, and execute commands:
import paramiko
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect('hostname', username='login', password='pwd')
stdin, stdout, stderr = ssh_client.exec_command('command')
print(stdout.read().decode())
I created a bat file on Windows that references PuTTY and putty-session-specific info. This bat file can run by itself on Windows. To call it from Python, I used subprocess.run() (Python 3.5+).
Example of bat file named putty.bat:
start c:\app\PuTTy\putty.exe -load 192.168.1.230-node1-logs -l <logon user> -pw <logon user password for putty session>
Breaking down the bat file:
It begins with the Windows command "start".
c:\app\PuTTy\putty.exe --> the path to putty.exe on Windows.
-load --> tells putty to load a saved profile. The profile is the name you see in the PuTTY client under "Saved Sessions".
192.168.1.230-node1-logs --> my putty-session-specific profile.
-l --> followed by the putty logon user.
-pw --> followed by the putty logon password.
That concludes the contents of "putty.bat".
From within Python, I used the subprocess.run() command.
Example:
import subprocess
...
try:
    process = subprocess.run(["putty.bat"], check=True,
                             stdout=subprocess.PIPE, universal_newlines=True)
    print(process.stdout)
except Exception as e:
    print("subprocess call error in open putty command")
    print(str(e))
I hope you find this helpful.

Python script to execute remote command in background and get pid

I have tried subprocess with sshpass to execute a remote command. Here is my code using subprocess:
import subprocess
import sys

HOST = "192.168.20.175"
COMMAND = "cat /proc/meminfo | grep MemTotal"

ssh = subprocess.Popen(["sshpass", "-p", "unlock123",
                        "ssh", "%s" % HOST, COMMAND],
                       shell=False,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print("ERROR: %s" % error, file=sys.stderr)
else:
    print(result)
print("Return Code - %s" % ssh.returncode)
However, returncode is None.
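That None is expected: Popen.returncode stays None until the process has been reaped with poll(), wait(), or communicate(); reading stdout alone does not set it. A minimal local illustration:

```python
import subprocess

p = subprocess.Popen(["echo", "hi"], stdout=subprocess.PIPE)
print(p.returncode)       # None: the child has not been waited on yet
out, _ = p.communicate()  # waits for the child and reaps it
print(p.returncode)       # 0 once the process has exited
```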
Now I want to run the following loop command on the remote machine:
"echo"; while true; do ps -p 1820 -o %cpu,%mem | grep -v CPU >> /tmp/proc_out.log;
I want to start the command on the remote machine and not wait for it to complete.
Once the command is fired, I need to get its pid.
Then I want to kill the process later, whenever required.
Is there any possible solution for this?
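One common pattern is to background the command with nohup and have the shell echo its pid via $!. The sketch below demonstrates it locally, with "sleep 30" standing in for the monitoring loop; over ssh, the same shell line would be passed as the remote command in the sshpass/ssh argument list shown above, and the later kill would likewise be sent over ssh:

```python
import subprocess

# Background a long-running command in a shell and capture its PID via $!.
# "sleep 30" stands in for the remote monitoring loop.
shell_line = "nohup sleep 30 >/dev/null 2>&1 & echo $!"
out = subprocess.run(["sh", "-c", shell_line],
                     stdout=subprocess.PIPE, universal_newlines=True)
pid = int(out.stdout.strip())

# Later, whenever required, kill it by pid
# (remotely this would be: ssh HOST "kill <pid>").
subprocess.run(["kill", str(pid)])
```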

using subprocess to ssh and execute commands

I need to ssh into a server, execute a few commands, and process the response using subprocess. Here's my code:
command = 'ssh -t -t buildMachine.X.lan; sudo su - buildbot ; build-set sets/set123'
print("submitting command")
result = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
print("got response")
response, err = result.communicate()
print(response)
This gets stuck. I have seen other threads suggesting passing a list instead of a string to subprocess and removing shell=True. I tried that too, but it didn't work.
Ultimately I need the output of the last command, build-set, in order to extract some information from it. Help?
I figured out the solution using univerio's comment.
The command needs to be:
command = 'ssh -t -t buildMachine.X.lan \'sudo su - buildbot \'build-set sets/set123\'\''
Each command becomes an argument to the previous command. This works.
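An alternative that avoids the nested escaping is to pass the whole remote command as a single element of an argument list, so ssh receives it as exactly one argument. The sketch below demonstrates the quoting locally, with "echo" standing in for ["ssh", "-t", "-t", "buildMachine.X.lan"]; whether su -c fits the build machine's setup is an assumption:

```python
import subprocess

# Each list element reaches the program as one argv entry, so the remote
# command only needs a single level of quoting. "echo" stands in for the
# real ssh invocation here.
remote_cmd = "sudo su - buildbot -c 'build-set sets/set123'"
result = subprocess.run(["echo", remote_cmd],
                        stdout=subprocess.PIPE, universal_newlines=True)
print(result.stdout)
```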
