get PID from paramiko - python

I can't find a simple answer for this: I'm using paramiko to log in and execute a number of processes remotely and I need the PIDs of each process in order to check on them at later times. There doesn't seem to be a function in paramiko to get the PID of an executed command, so I tried using the following:
stdin, stdout, stderr = ssh.exec_command('./someScript.sh &; echo $!;')
I thought that then parsing through the stdout would return the PID, but it doesn't. I'm assuming I should run the script in the background in order to have a PID (while it is running). Is there a more simple, obvious, way of getting the PID?

Here's a way to obtain the remote process ID:
def execute(channel, command):
    # Print the shell's PID first, then exec the command in place of the shell,
    # so the PID read back from stdout is the PID of the command itself.
    command = 'echo $$; exec ' + command
    stdin, stdout, stderr = channel.exec_command(command)
    pid = int(stdout.readline())
    return pid, stdin, stdout, stderr
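A usage sketch, assuming `ssh` is a connected paramiko.SSHClient (the host and credentials below are placeholders):

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('remote.example.com', username='user', password='secret')

pid, stdin, stdout, stderr = execute(ssh, './someScript.sh')
print('remote script is running as PID %d' % pid)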

I usually use the standard UNIX command pidof <command name> when I check on the process later. AFAIK there is no simpler way.
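If you go that route over paramiko, a rough sketch might look like this (it assumes `ssh` is a connected paramiko.SSHClient and that the script name is unique on the remote host; pidof needs -x to match scripts rather than binaries):

stdin, stdout, stderr = ssh.exec_command('pidof -x someScript.sh')
pids = stdout.read().split()
if pids:
    print('someScript.sh is still running, PID(s): %r' % pids)
else:
    print('someScript.sh has exited')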
OK, given your comment, you can solve it by wrapping your ./someScript.sh in a Python process that uses the subprocess module.
wrapper.py:
#!/usr/bin/env python
import subprocess
import sys

# Start the command passed on the command line, report its PID, then wait for it.
proc = subprocess.Popen(sys.argv[1])
print proc.pid
sys.stdout.flush()  # make sure the PID reaches the caller promptly
proc.wait()
Then run
stdin,stdout,stderr = ssh.exec_command('./wrapper.py ./someScript.sh')
and read the output
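For example, a hedged sketch of picking the PID back up on the paramiko side (again assuming `ssh` is a connected SSHClient and that wrapper.py is executable on the remote host):

stdin, stdout, stderr = ssh.exec_command('./wrapper.py ./someScript.sh')
pid = int(stdout.readline())  # the first line wrapper.py prints is the PID
print('remote script is running as PID %d' % pid)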

Related

How to get PID of the subprocess with subprocess check_output

In my Python script I'm trying to run a long-lasting download process, like in the example below, and I need to find the PID of the process started by check_output:
out = subprocess.check_output(["rsync","-azh","file.log",...])
Is it possible to do?
You can run your subprocess using Popen instead:
import subprocess
proc = subprocess.Popen(["rsync","-azh","file.log",...], stdout=subprocess.PIPE)
out = proc.communicate()[0]
pid = proc.pid
Generally, a Popen object gives you better control over, and more information about, the subprocess, but requires a bit more setup. (Not much, though.) You can read more in the official documentation.
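If the point is to have the PID while the download is still in progress, a minimal sketch (the rsync destination here is a placeholder) would be:

import subprocess

proc = subprocess.Popen(["rsync", "-azh", "file.log", "user@host:/backup/"],
                        stdout=subprocess.PIPE)
print("rsync is running with PID %d" % proc.pid)  # available immediately
out, _ = proc.communicate()  # wait for the transfer to finish and read its output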

Python: Terminate subprocess = Success, but it's still running (?)

I have a simple script that calls another python script as a subprocess. I can confirm the subprocess is started and I can grab its PID.
When I attempt to terminate the subprocess (in win), I get the SUCCESS message against the correct PID, but Windows task manager shows the 2nd python.exe process to still be running.
Any suggestions to accomplish this task in Win? I'll be extending this to also work in OSX and Linux eventually:
Simplified:
#!/usr/bin/env python
import os, sys
import subprocess
from subprocess import Popen, PIPE, STDOUT, check_call
pyTivoPath="c:\pyTivo\pyTivo.py"
print "\nmyPID: %d" % os.getpid()
## Start pyTivo ##
py_process = subprocess.Popen(pyTivoPath, shell=True, stdout=PIPE, stderr=subprocess.STDOUT)
print "newPID: %s" % py_process.pid
## Terminate pyTivo ##
#py_process.terminate() - for nonWin (?)
py_kill = subprocess.Popen("TASKKILL /PID "+ str(py_process.pid) + " /f")
raw_input("\nPress Enter to continue...")
Note: Python2.7 required, psutils not available
In my implementation, the following actually creates TWO processes in Windows ("cmd.exe" and "python.exe").
py_process = subprocess.Popen(pyTivoPath, shell=True, stdout=PIPE, stderr=subprocess.STDOUT)
Noticing the "python.exe" process is a child of the "cmd.exe" process, I added the "/T" (tree kill) switch to my TASKKILL:
py_kill = subprocess.Popen("TASKKILL /PID "+ str(py_process.pid) + " /f /t")
This has the desired effect: the python subprocess is actually killed.
Two processes are created because you call Popen with shell=True. It looks like the only reason you need a shell is to rely on the .py file association to locate the interpreter. To resolve your issue you could instead invoke the interpreter directly:
from subprocess import Popen, PIPE, STDOUT
pyTivoPath = r"c:\pyTivo\pyTivo.py"
cmd = r'c:\Python27\python.exe "{}"'.format(pyTivoPath)
# start process
py_process = Popen(cmd, stdout=PIPE, stderr=STDOUT)
# kill process
py_process.terminate()
Use the /F (force) switch on the TASKKILL command. Lots of Windows commands do not have useful return values, and I don't recall whether TASKKILL's is useful.
Sorry, overlooked your /F
You could try calling the win32 api directly.
import win32api
win32api.TerminateProcess(int(process._handle), -1)
Found the ActiveState page for this. Documents a number of kill methods, including the Win32 approach above.
There are also a number of reasons why Windows will not allow you to terminate a process. Common reasons are permissions and buggy drivers with pending I/O requests that don't respond to the kill properly.
There are some programs, e.g. ProcessHacker, that are more aggressive about killing processes. I don't know the technical details for certain, but I suspect they involve forcibly closing open handles and then calling Terminate.
You can have similar issues on Linux, e.g. no permission to kill the process or the process ignoring the termination signal. It is usually easier to resolve on Linux, though: kill -9 (SIGKILL) cannot be caught or ignored, so if even that does not work the process is typically stuck in uninterruptible sleep (for example on pending I/O) or is already a zombie.
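For reference, on Linux you can send those signals yourself from Python; a minimal sketch, using sleep as a stand-in for the real child:

import os
import signal
import subprocess

proc = subprocess.Popen(['sleep', '60'])   # placeholder long-running child
os.kill(proc.pid, signal.SIGTERM)          # polite request; can be caught or ignored
proc.wait()
# If SIGTERM had no effect you would escalate:
# os.kill(proc.pid, signal.SIGKILL)        # cannot be caught or ignored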
0) You could use TASKKILL /T to kill CMD and the Python interpreter.
1) If you change your process creation to launch the python process directly (instead of invoking the .py file and relying on cmd to launch it), with the script name as a command argument, you will get the PID you expect when you create the process.
2) You could use TASKKILL /IM to kill the process by name, but the name will be the python interpreter and it could kill unintended processes.

Python: get script output knowing only the PID of the process

For example, I run a script:
os.execv('script.py',('',))
As I read in the docs, this command runs script.py in place of the current script, keeping the same PID. So I can get the PID of the process.
The question is following:
After running execv I need to get the stdout of this script, and the only thing I know is the PID of the process. Is it possible to do this with Python? Any suggestions? I need to use only execv().
Another possible solution is redirecting the output to a file:
import os, sys

# Point stdout (file descriptor 1) at a file before replacing the current
# process image with the script, so everything the script prints lands there.
sys.stdout = open("./data.out", "w")
os.dup2(sys.stdout.fileno(), 1)
os.execv('/usr/bin/python', ['python', './script.py'])
os.execv is just a binding to the execv system call. What you need is the subprocess module:
import sys
import subprocess

proc = subprocess.Popen([sys.executable, 'script.py'],
                        stdout=subprocess.PIPE)
out, _ = proc.communicate()  # waits for the script and collects its stdout
print(out)
Reading via communicate() avoids the deadlock you can hit if the script fills the pipe buffer before a plain wait() returns.
See https://docs.python.org/2/library/subprocess.html

Keep a subprocess alive and keep giving it commands? Python

If I spawn a new subprocess in python with a given command (let's say I start the python interpreter with the python command), how can I send new data to the process (via STDIN)?
Use the standard subprocess module. You use subprocess.Popen() to start the process, and it will run in the background (i.e. at the same time as your Python program). When you call Popen(), you probably want to set the stdin, stdout and stderr parameters to subprocess.PIPE. Then you can use the stdin, stdout and stderr fields on the returned object to write and read data.
Untested example code:
from subprocess import Popen, PIPE
# Run "cat", which is a simple Linux program that prints it's input.
process = Popen(['/bin/cat'], stdin=PIPE, stdout=PIPE)
process.stdin.write(b'Hello\n')
process.stdin.flush()
print(repr(process.stdout.readline())) # Should print 'Hello\n'
process.stdin.write(b'World\n')
process.stdin.flush()
print(repr(process.stdout.readline())) # Should print 'World\n'
# "cat" will exit when you close stdin. (Not all programs do this!)
process.stdin.close()
print('Waiting for cat to exit')
process.wait()
print('cat finished with return code %d' % process.returncode)
Don't feed commands through a plain pipe.
If you want to send commands to a subprocess, create a pty and then fork the subprocess with one end of the pty attached to its STDIN.
Here is a snippet from some of my code:
import pty
from subprocess import Popen

RNULL = open('/dev/null', 'r')
WNULL = open('/dev/null', 'w')
master, slave = pty.openpty()
print parsedCmd  # parsedCmd (the command list) is built elsewhere in my code
self.subp = Popen(parsedCmd, shell=False, stdin=RNULL,
                  stdout=WNULL, stderr=slave)
In this code, the pty is attached to stderr because it receives error messages rather than sending commands, but the principle is the same.
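For the stdin case described above, a rough sketch might look like this (an interactive python child is used as a stand-in for your own program, and the sleep is a crude way to wait for its reply):

import os
import pty
import time
from subprocess import Popen

master, slave = pty.openpty()
proc = Popen(['python', '-i'], stdin=slave, stdout=slave, stderr=slave,
             close_fds=True)

os.write(master, b'print(6 * 7)\n')  # send a command through the pty
time.sleep(0.5)                      # crude: give the child time to respond
print(os.read(master, 1024))         # echoed input plus the child's output
proc.terminate()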

python subprocess communicate() block

I am using the subprocess module to call an external program (plink.exe) to log in to a server, but when I call communicate to read the output, it blocks. The code is below:
import subprocess
process = subprocess.Popen('plink.exe hello@10.120.139.170 -pw 123456'.split(), shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print process.communicate()  # blocks here
I know it blocks because plink.exe is still running, but I need to read the output before the subprocess terminates. Is there any way to do that?
The whole purpose of the communicate method is to wait for the process to finish and return all the output. If you don't want to wait, don't call communicate. Instead, read from the stdout or stderr attribute to read the output.
If the process writes to both stdout and stderr (and you want to read them separately), you will have to be careful to read from both without blocking, or you can deadlock. This is fairly hard to do on Windows, and you may wish to use the pexpect module instead.
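One sketch of that incremental approach, reusing the command line from the question (host and credentials are placeholders) and merging stderr into stdout to sidestep the two-pipe problem:

import subprocess

proc = subprocess.Popen(
    'plink.exe hello@10.120.139.170 -pw 123456'.split(),
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,   # merge streams so a single read loop suffices
)
for line in iter(proc.stdout.readline, b''):
    print(line.rstrip())        # handle each line as it arrives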
Maybe because "plink.exe" needs to take in input arguments, if you don't pass them, it will block until data are given, you could try adding arguments in method communicate(input)
I faced a similar situation where I had to execute a single command lmstat -a and then get the output of the terminal.
If you just need to run a single command and then read the output, you can use the following code:
import subprocess
Username = 'your_username'
Password = 'your_password'
IP = 'IP_of_system'
Connection_type = '-ssh' #can have values -ssh -telnet -rlogin -raw -serial
p = subprocess.Popen(['plink', Connection_type, '-l', Username, '-pw', Password, IP],
                     shell=False, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out, err = p.communicate('lmstat -a\nexit\n'.encode())
print(out.decode())
