Why is a call to ssh with Python subprocess breaking the outer bash loop?

I have a bash script myscript.sh:
#!/bin/bash
while read line; do
    myprog.py
done
calling a Python program, myprog.py:
#!/usr/bin/env python
import subprocess
output = subprocess.check_output(['ssh', 'user@host', 'cmd'])
The ssh command called by subprocess executes without error, and its output is correct. But when called like this, the loop in myscript.sh only runs through the first line of input and then exits with status 0. If I replace the subprocess.check_output(...) call with subprocess.Popen(...) and don't subsequently call Popen.wait(), the outer loop works as expected, and the output from the ssh command is dumped to standard out some time after any output from the bash script. With Popen.wait() the behavior is the same as with check_output: the bash loop only goes through one iteration before exiting without error.
If another command, e.g. ls, is called with check_output instead of ssh, then the bash loop works as expected.
Can anyone help me understand why the code as shown isn't working as expected?
Note: this is a simplified version of what I am trying to do, though I do experience the same behavior with this code. In reality I am doing something with "$line" in the bash script and the subprocess call is wrapped in a try/except block.

As @larsmans guessed, the ssh call was consuming stdin, breaking the outer bash loop. Adding the -n option to the ssh command resolved the issue:
output = subprocess.check_output(['ssh', '-n', 'user@host', 'cmd'])

The problem is that ssh reads from standard input, and therefore "eats" all the remaining lines of the loop's input. You can connect its standard input to nowhere using the -n flag:
output = subprocess.check_output(['ssh', '-n', 'user@host', 'cmd'])
See the ssh man pages for the details: https://linux.die.net/man/1/ssh and https://man.openbsd.org/ssh
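If you prefer to keep the fix on the Python side instead of in the ssh arguments, the same effect can be had by detaching the child's stdin in the subprocess call; a minimal sketch (Python 3.3+ for subprocess.DEVNULL):

import subprocess

# give ssh an empty stdin so it cannot consume the bash loop's input;
# this is equivalent to passing -n on the ssh command line
output = subprocess.check_output(['ssh', 'user@host', 'cmd'],
                                 stdin=subprocess.DEVNULL)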

Related

Call python script as module with input from bash script

From a bash function, I want to call a python script which prompts for input, and I need to run that script as a module using python -m
Here is select_pod.py:
# above this will be a print out of pods
pod = input('Pick pod')
print(pod)
Here is the bash function:
function foo() {
    POD=$(python3 -m select_pod)
    kubectl exec $POD --stdin --tty bash
}
I can't get the input to work, i.e. "Pick pod" is not printed to the terminal.
When you do POD=$(python3 -m select_pod), the POD=$(...) means that any output printed to stdout within the parentheses is captured in the POD variable instead of being printed to the screen. Simply echoing POD afterwards is no good, as that only happens once the Python script has finished.
What you need to do is duplicate the output of the Python program. Assuming Linux/POSIX, this can be done using e.g.
POD=$(python3 -m select_pod | tee /dev/stderr)
Because your terminal shows both stdout and stderr, duplicating the output from stdout to stderr makes the text show up.
Hijacking the error channel for this might not be ideal, e.g. if you later want to redirect the error messages using something like 2> .... A different solution is to duplicate the output directly to the tty:
POD=$(python3 -m select_pod | tee /dev/tty)
You can redirect sys.stdout before calling input():
import sys

save_sys_stdout = sys.stdout
sys.stdout = sys.stderr       # input()'s prompt now goes to stderr
pod = input('Pick pod')
sys.stdout = save_sys_stdout  # restore stdout before printing the result
print(pod)
So POD=$(python3 -m select_pod) works as-is, and you don't need to split the captured output afterwards.
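The same save/restore dance can be written more compactly with contextlib.redirect_stdout from the standard library; a sketch of the same idea:

import contextlib
import sys

# anything written to stdout inside the block (including input()'s
# prompt) is rerouted to stderr
with contextlib.redirect_stdout(sys.stderr):
    pod = input('Pick pod')
print(pod)  # runs after the block, so only the pod name reaches stdout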

python subprocess.call making program stop after second execution

I'm generating a text file that is later processed by an external program. This must be done 1000 times, so I use a subprocess.call() inside a loop, once for each text file I want to process.
The first call of subprocess.call() works perfectly.
The second call fails and the Python program exits with a []Stop.
- There is no debug output.
- Both processes remain stopped, but still show up in the process list.
I have tried subprocess.call() and subprocess.Popen(), and the outcome is the same. I have also tried rerunning it with the same text file as the first execution and it still fails, so the culprit is the subprocess call for sure.
This is the line that calls the external program:
subprocess.call(['/bin/bash', '-i', '-c', 'nucplot textfile.txt'])
The program is a simple binary, but it must use the ENV variables of its installation to work properly, hence the usage of /bin/bash with those options. If I try using a shell, it doesn't work.
Is there anything else i need to do after calling subprocess.call() in order for it to flush its internal stuff?
Try using subprocess.check_output:
https://docs.python.org/3/library/subprocess.html#subprocess.check_output
_ = subprocess.check_output(['/path/to/nucplot', '-i', '-c', 'textfile.txt'])
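Something else worth checking, given that both processes end up stopped: bash -i asks for an interactive shell, and an interactive shell launched without a controlling terminal can be stopped by job control signals (SIGTTIN/SIGTTOU). A hedged sketch of a workaround that loads the installation's environment without -i (the nucplot-env.sh name is an assumption; substitute whatever setup file the installation actually provides):

import subprocess

# source the environment setup explicitly instead of relying on an
# interactive shell's startup files; 'nucplot-env.sh' is hypothetical
subprocess.call(['/bin/bash', '-c',
                 'source /path/to/nucplot-env.sh && nucplot textfile.txt'])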

Linux: cat to named pipe in a python script

I have a Java program that uses video from a framegrabber card. This program is launched through a Python script, launcher.py.
The easiest way I found to read the video stream is to make Java read from a named pipe, and this works perfectly. So my session looks like:
$ mkfifo videopipe
$ cat /dev/video1>videopipe
and in a second terminal (since the cat command is blocking):
$ python launcher.py
I would like to automate this process. Unfortunately, the result is always the same: the Java application starts (confirmed through a print statement in the Java program), but then the terminal stalls and nothing appears, no exception or anything else.
Since the process works manually, I guess I am doing something wrong in the Python program. To simplify things, I isolated the piping part:
from subprocess import call, Popen, PIPE, check_call
BASH_SWITCHTO_WINTV = ['v4l2-ctl', '-d /dev/video1', '-i 2', '--set-standard=4']
BASH_CREATE_FIFO_PIPE = ['mkfifo', 'videopipe']
BASH_PIPE_VIDEO = 'cat /dev/video1>videopipe'
def run():
    try:
        print('running bash commands...')
        call(BASH_SWITCHTO_WINTV)
        call(BASH_CREATE_FIFO_PIPE)
        Popen(['cat', '/dev/video1'], stdout=open('videopipe', 'w'))
    except:
        raise RuntimeError('An error occurred while piping the video')

if __name__ == '__main__':
    run()
which when run, outputs:
running bash commands...
Failed to open /dev/video1: No such file or directory
A little help would be very much appreciated :-)
If you're using shell=True, just pass a string:
BASH_PIPE_VIDEO = 'cat /dev/video1 > videopipe'
Currently, cat is passed to the shell as your script, and /dev/video1>videopipe is passed to that shell as a literal argument; it is not parsed as part of the script text at all, and it has no effect, since the script (just calling cat) doesn't look at its arguments.
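A quick standalone demo of that behavior (not from the original question): with shell=True and a list, only the first element is run as the shell script; on POSIX systems the remaining elements become the shell's positional parameters $0, $1, ...:

from subprocess import call

call(['echo first', 'ignored'], shell=True)  # prints: first
call(['echo $0 $1', 'A', 'B'], shell=True)   # prints: A B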
Alternatively, to avoid needless shell use (and thus shell-related bugs such as Shellshock, and the potential for injection attacks if you were accepting any argument from a non-hardcoded source):
Popen(['cat', '/dev/video1'], stdout=open('videopipe', 'w'))
On a note unrelated to your "cat to named pipe" question: be sure you get your spaces correct.
BASH_SWITCHTO_WINTV = ['v4l2-ctl', '-d /dev/video1', ...]
...uses the name <space>/dev/video1, with a leading space, as the input device; it's the same as running v4l2-ctl "-d /dev/video1" in a shell, which would cause the same problem.
Be sure that you split your arguments correctly:
BASH_SWITCHTO_WINTV = ['v4l2-ctl', '-d', '/dev/video1', ...]
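Putting both fixes together, a corrected sketch of the question's run() function might look like this (same commands, with the arguments split correctly and no shell involved):

from subprocess import call, Popen

BASH_SWITCHTO_WINTV = ['v4l2-ctl', '-d', '/dev/video1', '-i', '2', '--set-standard=4']
BASH_CREATE_FIFO_PIPE = ['mkfifo', 'videopipe']

def run():
    print('running bash commands...')
    call(BASH_SWITCHTO_WINTV)
    call(BASH_CREATE_FIFO_PIPE)
    # note: opening a FIFO for writing blocks until a reader (here,
    # the Java program) opens the other end
    Popen(['cat', '/dev/video1'], stdout=open('videopipe', 'w'))

if __name__ == '__main__':
    run()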

Python Popen waiting while it shouldn't (bg and output redirected)

When I run directly in a terminal:
echo "useful"; sleep 10 &> /tmp/out.txt & echo "more";
I get both outputs while sleep goes on in the background. I want this same behaviour with Popen (Python 2.7):
p = Popen('echo "useful"; sleep 10 &> /tmp/out.txt & echo "more";', shell = True, stdout = PIPE, stderr = PIPE)
print p.communicate()
It was my understanding that a background process with redirected stdout and stderr would achieve this, but it does not; I have to wait for sleep to finish. Can someone explain?
I need the other output so changing stdout/stderr arguments in Python is not a solution.
EDIT: I understand now that the behaviour I want (get the output so far, and return when there is no more output rather than when the command has completed) is not possible from Python.
However, the behaviour appears more or less automatically when using ssh:
ssh 1.2.3.4 "echo \'useful\'; cd ~/dicp/python; nohup sleep 5 &> /tmp/out.txt & echo \'more\';"
(I can ssh to this address without password). So it's not entirely impossible by working around Python; now I need a way to do it without ssh...
That's because communicate() has to wait for the process's pipes to close. The background process inherits the shell's stdout/stderr pipes, so even after the shell itself exits, communicate() keeps reading until the background process exits and the pipes reach EOF.
You don't normally realize this is happening because you normally are working in the shell where you backgrounded something. You put a process in the background so you can get control of the shell again and continue to work with it.
In other words, the background process is relative to the shell, not to your Python process.
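A self-contained way to see this (a demo, not from the original thread): the backgrounded command inherits the stdout pipe, so communicate() cannot reach EOF until it exits:

from subprocess import Popen, PIPE

# the backgrounded 'sleep 3' inherits our stdout pipe, so communicate()
# blocks for ~3 seconds even though the shell exits immediately
p = Popen('echo useful; sleep 3 & echo more', shell=True, stdout=PIPE)
print(p.communicate()[0])  # 'useful\nmore\n', but only after sleep ends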
As Martijn Pieters points out, this is not how Python behaves (or is meant to behave). However, because the desired behaviour appears when running the command through ssh with nohup, I found this similar trick:
p = Popen('bash -c "echo \'useful\'; cd ~/dicp/python; nohup sleep 5 &> /tmp/out.txt & echo \'more\';"', shell = True, stdout = PIPE, stderr = PIPE)
print p.communicate()
So if I understand correctly, this starts a new shell (bash -c), which then starts a process not attached to it (nohup). The shell terminates as soon as all its other processes complete, but the nohup-ed process keeps running. Desired behaviour achieved!
Maybe not pretty and probably not efficient, but it works.
EDIT: assuming, of course, that you are using bash. A more general answer is welcome.
EDIT2: actually, if my explanation is correct, I am not sure why nohup does not detach the process even without bash -c... It seems like bash -c would be redundant; nohup should detach it from the shell started by Popen, but that does not work.
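A likely explanation for EDIT2 (hedged, since it depends on what /bin/sh is on the system): with shell=True, Popen runs the command line with /bin/sh, and &> is a bash-ism; a plain POSIX sh parses sleep 5 &> /tmp/out.txt as sleep 5 & followed by > /tmp/out.txt, so the background process keeps the inherited stdout pipe and communicate() waits on it. bash -c helps because bash parses &> as the intended redirection, and nohup alone does not help because nohup only redirects stdout when it is a terminal, not a pipe. The shell can also be skipped entirely by doing the redirection in Python:

from subprocess import Popen

print("useful")
with open('/tmp/out.txt', 'w') as out:
    # the long-running job writes to the file, not to our pipes,
    # so there is nothing for communicate() to wait on
    p = Popen(['sleep', '5'], stdout=out, stderr=out)
print("more")
# p.poll() can be used later to check whether the job has finished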

how can I silence all of the output from a particular python command?

Autodesk Maya 2012 provides "mayapy" - a modified build of Python filled with the packages needed to load Maya files and act as a headless 3D editor for batch work. I'm calling it from a bash script. If that script opens a scene file with cmds.file(filepath, open=True), it spews pages of warnings, errors, and other info I don't want. I want to turn all of that off only while the cmds.file command is running.
I've tried redirecting from inside of the Python commands I'm sending into mayapy inside the shell script, but that doesn't work. I can silence everything by redirecting stdout/err to /dev/null in the call to the bash script. Is there any way to silence it in the call to the shell, but still allow my passed-in command inside the script to print out information?
test.sh:
#!/bin/bash
/usr/autodesk/maya/bin/mayapy -c "
cmds.file('filepath', open=True);
print 'hello'
"
calling it:
$ ./test.sh # spews info, then prints 'hello'
$ ./test.sh > /dev/null 2>&1 # completely silent
Basically, I think the best way to solve this is to implement a wrapper that executes test.sh and sanitizes its output to the shell. To sanitize the output, I would simply prepend some marker string to the lines that are good for output. My inspiration for the wrapper came from this answer: https://stackoverflow.com/a/4760274/2030274
The contents are as follows:
import subprocess

def runProcess(exe):
    p = subprocess.Popen(exe, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    # readline() returns '' only at EOF, i.e. once the process has closed
    # its output; reading until then avoids dropping lines that are still
    # buffered when poll() first reports a return code
    for line in iter(p.stdout.readline, ''):
        yield line
    p.wait()

for line in runProcess(['./test.sh']):
    if line.startswith('GARYFIXLER:'):
        print line,
Now you could imagine test.sh being something along the lines of:
#!/bin/bash
/usr/autodesk/maya/bin/mayapy -c "
cmds.file('filepath', open=True);
print 'GARYFIXLER:hello'
"
and this will only print the hello line. Since we are wrapping the python call in a subprocess, all output typically displayed to the shell should get captured and you should intercept the lines that you don't want.
Of course, to call test.sh from a python script, you need to make sure you have the correct permissions.
I knew I was just getting twisted around with pipes. Maya is indeed sending all batch output to stderr. This frees stdout entirely once you properly pipe stderr away. Here's an all-bash one-liner that works.
# load file in batch; divert Maya's output to /dev/null
# then print listing of things in file with cmds.ls()
/usr/autodesk/maya/bin/mayapy -c "import maya.standalone;maya.standalone.initialize(name='python');cmds.file('mayafile.ma', open=True);print cmds.ls()" 2>/dev/null
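If the launcher is itself a Python script, the same stderr-discarding trick can be done with subprocess (a sketch; assumes Python 3 for subprocess.DEVNULL, and adds an explicit maya.cmds import to the inner script):

import subprocess

script = ("import maya.standalone;"
          "maya.standalone.initialize(name='python');"
          "import maya.cmds as cmds;"
          "cmds.file('mayafile.ma', open=True);"
          "print(cmds.ls())")

# Maya's batch chatter goes to stderr; discard it and keep stdout
output = subprocess.check_output(
    ['/usr/autodesk/maya/bin/mayapy', '-c', script],
    stderr=subprocess.DEVNULL)
print(output.decode())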
