Capturing LIVE output of a shell script while running it in Python

I am writing a python script to ssh into a linux server and execute a shell script that is already stored on the linux server.
Here is what my code looks like so far:
command = ['ssh into the remote server',
           'cd into the directory of the shell script',
           './run the shell script',
           ]
process = subprocess.Popen(command,
                           shell=True,
                           stdin=subprocess.PIPE,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
out, err = process.communicate()  # communicate() returns (stdout, stderr)
if out:
    print "standard output of subprocess is: "
    print out
if err:
    print "standard error of subprocess is: "
    print err
print "returncode of subprocess: "
print process.returncode
1st question: I can obtain the output of my shell script through stderr, but I only receive it after the entire shell script has finished executing. So if the shell script takes 10 minutes to finish, I only see its output after 10 minutes.
I want the output of my shell script returned to me line by line, just as if I were executing the script manually on the remote server. Can this be done?
2nd question: as you can see, I have three commands in my command list (which is only a small portion of all my commands). If I put all my commands in the list, I only obtain the output of ALL of them, and only once every command has finished executing. If my 1st question cannot be done, is there a way to at least obtain the output of each command after each one has executed, instead of receiving everything at once at the end?

To see the output immediately, don't redirect it:
from subprocess import Popen, PIPE
p = Popen(['ssh', 'user@hostname'], stdin=PIPE)
p.communicate(b"""cd ..
echo 1st command
echo 2nd command
echo ...
""")
If you want both to capture the "live" output in a variable and to display it in the terminal then the solution depends on whether you need to handle stdin/stdout/stderr concurrently.
If input is small and you want to combine stdout/stderr then you could pass all commands at once and read the merged output line-by-line:
from subprocess import Popen, PIPE, STDOUT
p = Popen(['ssh', 'user@hostname'], stdin=PIPE,
          stdout=PIPE, stderr=STDOUT, bufsize=1)
p.stdin.write(b"""cd ..
echo 1st command
echo 2nd command
echo ...
""")
p.stdin.close() # no more input
lines = [] # store output here
for line in iter(p.stdout.readline, b''):  # newline=b'\n'
    lines.append(line)  # capture for later
    print line,         # display now
p.stdout.close()
p.wait()
If you want to capture "live" stdout/stderr separately, see:
Displaying subprocess output to stdout and redirecting it
Subprocess.Popen: cloning stdout and stderr both to terminal and variables
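For completeness, here is a minimal sketch of the separate-capture case, using one reader thread per pipe so that neither pipe can fill up and block the child (the hostname and the echoed commands are placeholders):
from subprocess import Popen, PIPE
from threading import Thread

def reader(pipe, sink, label):
    # drain one pipe line by line: capture and display as each line arrives
    for line in iter(pipe.readline, b''):
        sink.append(line)
        print label, line,
    pipe.close()

p = Popen(['ssh', 'user@hostname'], stdin=PIPE, stdout=PIPE, stderr=PIPE)
out_lines, err_lines = [], []
t_out = Thread(target=reader, args=(p.stdout, out_lines, 'OUT:'))
t_err = Thread(target=reader, args=(p.stderr, err_lines, 'ERR:'))
t_out.start()
t_err.start()
p.stdin.write(b"echo to stdout\necho to stderr >&2\n")
p.stdin.close()
t_out.join()
t_err.join()
p.wait()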

I'm not entirely sure, but maybe you get instant output if you pass the other two commands as arguments to ssh:
command = 'ssh user@example.com \'cd some/path/on/your/server; ./run-the-script.sh\''
The way I understand it, Python first reads and processes all the input and only then returns output. I'm not too familiar with Python, so I might be wrong on this, but if I'm right, this should help.
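If that guess is right, you could additionally stream the output line by line; a minimal sketch, using the list form of Popen instead of a single shell string (paths and hostname taken from the example above):
from subprocess import Popen, PIPE, STDOUT
# run the remote script as an ssh argument and stream the merged output line by line
p = Popen(['ssh', 'user@example.com',
           'cd some/path/on/your/server; ./run-the-script.sh'],
          stdout=PIPE, stderr=STDOUT, bufsize=1)
for line in iter(p.stdout.readline, b''):
    print line,  # each line appears as soon as the remote side sends it
p.stdout.close()
p.wait()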

Don't call .communicate() -- that waits for the process to finish.
Instead, keep reading data from .stdout pipe.
Simple example:
In [1]: import subprocess
In [2]: p = subprocess.Popen(["find", "/"], stdout=subprocess.PIPE)
In [3]: p.stdout
Out[3]: <open file '<fdopen>', mode 'rb' at 0x7f590446dc00>
In [4]: p.stdout.readline()
Out[4]: '/\n'
In [5]: p.stdout.readline()
Out[5]: '/var\n'
In [6]: p.stdout.readline()
Out[6]: '/var/games\n'
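The same idea as a loop instead of manual readline() calls (a sketch; expect a lot of output for find /):
import subprocess
p = subprocess.Popen(["find", "/"], stdout=subprocess.PIPE)
for line in iter(p.stdout.readline, b''):
    print line,  # each path is printed as soon as find emits it
p.stdout.close()
p.wait()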

Related

Pass variable to bash command with Python

I have the following code:
from subprocess import Popen, PIPE
p = Popen("C:/cygwin64/bin/bash.exe", stdin=PIPE, stdout=PIPE)
path = "C:/Users/Link/Desktop/folder/"
p.stdin.write(b"cd " + str.encode(path)))
p.stdin.close()
out = p.stdout.read()
print(out)
The output is b''
Is there any way to pass a variable to the bash command, as in p.stdin.write(b"cd " + path)?
I ask because the way it is written above doesn't work. The output is null, as if Cygwin started and nothing else happened.
EDIT
Since the question doesn't seem clear enough, I'll add this scenario:
I am on Windows and I am using Python 3.6.
I have a bash command that requires Cygwin to be executed. This command may have a variable in its string, which will change on every execution. Imagine a for loop which executes a command.
For example (an ImageMagick command):
convert image.jpg -resize 1024x768 output_file.jpg
How can I execute this command from Python, with output_file.jpg as a variable?
Bash doesn't run in interactive mode by default unless it detects that standard input and output are connected to a terminal. You PIPEd both, therefore they're definitely not connected to a terminal.
Bash does not display any prompts in non-interactive mode, hence you see nothing. You can force it to be interactive with -i switch.
However, even then, it is not going to write to stdout but to stderr; you can try piping stderr to stdout:
from subprocess import Popen, PIPE, STDOUT
p = Popen(["C:/cygwin64/bin/bash.exe", "-i"], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
and you will capture the prompts and such.
Or use your original approach with a command that does produce output - here pwd, which prints the current working directory:
p.stdin.write(b"cd " + path.encode() + b"\n")
p.stdin.write(b"pwd")
It is tricky to talk to an interactive process like this, though: read too little => deadlock; write too much => deadlock. This is why Popen has the .communicate() method for providing all of the input at once and getting stdout and stderr afterwards.
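A minimal sketch of the communicate() approach for this case (Python 3, all input supplied up front, stderr merged into stdout):
from subprocess import Popen, PIPE, STDOUT
p = Popen(["C:/cygwin64/bin/bash.exe"], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
path = "C:/Users/Link/Desktop/folder/"
# feed the whole script at once; communicate() avoids the read/write deadlocks
out, _ = p.communicate(("cd " + path + "\npwd\n").encode())
print(out.decode())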
Since you seem to be using the Cygwin Python, you should use proper POSIX paths and not Windows-like ones.
Instead of
p = Popen("C:/cygwin64/bin/bash.exe", stdin=PIPE, stdout=PIPE)
use
p = Popen("/bin/bash.exe", stdin=PIPE, stdout=PIPE)

Trouble printing text live with Python subprocess.call

Off the bat, here is what I am importing:
import os, shutil
from subprocess import call, PIPE, STDOUT
I have a line of code that calls bjam to compile a library:
call(['./bjam',
      '-j8',
      '--prefix="' + tools_dir + '"'],
     stdout=PIPE)
I want it to print out text as the compilation occurs. Instead, it prints everything out at the end.
It does not print anything when I run it like this. I have tried running the command outside of Python and determined that all of the output is to stdout (when I did ./bjam -j8 > /dev/null I got no output, and when I ran ./bjam -j8 2> /dev/null I got output).
What am I doing wrong here? I want to print the output from call live.
As a sidenote, I also noticed something when I was outputting the results of a git clone operation:
call(['git',
      'clone', 'https://github.com/moses-smt/mosesdecoder.git'],
     stdout=PIPE)
prints the stdout text live as the call process is run.
call(['git',
      'clone', 'https://github.com/moses-smt/mosesdecoder.git'],
     stdout=PIPE, stderr=STDOUT)
does not print out any text. What is going on here?
stdout=PIPE redirects the subprocess's stdout to a pipe. Don't do it unless you want to read from the subprocess's stdout in your code, using the proc.communicate() method or the proc.stdout attribute directly.
If you remove it then the subprocess should print to stdout just as it does in the shell:
from subprocess import check_call
check_call(['./bjam', '-j8', '--prefix', tools_dir])
I've used check_call() to raise an exception if the child process fails.
See Python: read streaming input from subprocess.communicate() if you want to read the subprocess's output line by line (making each line available as a variable in Python) as soon as it is available.
Try:
import subprocess

def run(command):
    proc = subprocess.Popen(command, stdout=subprocess.PIPE)
    for lineno, line in enumerate(proc.stdout):
        try:
            print(line.decode('utf-8').replace('\n', ''))
        except UnicodeDecodeError:
            print('error(%d): cannot decode %s' % (lineno, line))
The try...except logic is for Python 3 (maybe 3.2/3.3, I'm not sure), as there line is a bytes object, not a string. For earlier versions of Python, you should be able to do:
def run(command):
    proc = subprocess.Popen(command, stdout=subprocess.PIPE)
    for line in proc.stdout:
        print(line.replace('\n', ''))
Now, you can do:
run(['./bjam', '-j8', '--prefix="' + tools_dir + '"'])
call will not print anything it captures. As the documentation says: "Do not use stdout=PIPE or stderr=PIPE with this function. As the pipes are not being read in the current process, the child process may block if it generates enough output to a pipe to fill up the OS pipe buffer."
Consider using check_output and printing its return value.
In the first case, with the git call, you are not capturing stderr and therefore it flows onto your terminal as usual.
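If you do want the captured text rather than live output, here is a minimal check_output() sketch (tools_dir as in the question; stderr is merged into stdout so error text is captured too):
from subprocess import check_output, STDOUT, CalledProcessError
try:
    # captures everything; raises CalledProcessError on a nonzero exit code
    out = check_output(['./bjam', '-j8', '--prefix', tools_dir], stderr=STDOUT)
    print(out)
except CalledProcessError as e:
    print(e.returncode)
    print(e.output)  # whatever was captured before the failure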

Python: Hide sub-process print out on the terminal and continue the script while sub is running

Is there a way to use Python 2.6 with either subprocess.Popen() or os.system() to run two tasks? For example: the script will run "airodump-ng" first; this process is a subprocess and is hidden (meaning it will not print to the terminal), after which the script continues with the rest, which contains the "sniff" function of scapy. I have researched this but only found Windows versions and Python 3. By the way, I am running on Debian.
Use subprocess.Popen in combination with subprocess.PIPE:
p = Popen(['airodump-ng', …], stdin=PIPE, stdout=PIPE, stderr=PIPE)
If you want to wait until the process has finished use:
stdout, stderr = p.communicate()
If you omit the code above airodump-ng will run in the background and produce no visible output, while you can continue with your python code.
Another method would be to redirect the output of airodump-ng to os.devnull; this will completely get rid of any output produced:
devnull = os.open(os.devnull, os.O_WRONLY)
p = Popen(['airodump-ng', …], stdout=devnull, stderr=devnull)
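Putting the devnull variant together (a sketch; 'mon0' is a placeholder interface name, substitute your own):
import os
from subprocess import Popen

devnull = os.open(os.devnull, os.O_WRONLY)
p = Popen(['airodump-ng', 'mon0'], stdout=devnull, stderr=devnull)
os.close(devnull)  # the child keeps its own copy of the descriptor
# ... the rest of the script (e.g. scapy's sniff()) runs while airodump-ng stays hidden ...
p.terminate()      # stop the background process when done
p.wait()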
In the spot where you put the airodump-ng command, replace that part with timeout Xs airodump-ng monX (where X is the number of seconds to run and monX is your monitor interface).

Running shell command from Python script

I'm trying to run a shell command from within a python script which needs to do several things:
1. The shell command is 'hspice tran.deck >! tran.lis'
2. The script should wait for the shell command to complete before proceeding
3. I need to check the return code from the command and
4. Capture STDOUT if it completed successfully else capture STDERR
I went through the subprocess module and tried out a couple of things but couldn't find a way to do all of the above.
- with subprocess.call() I could check the return code but not capture the output.
- with subprocess.check_output() I could capture the output but not the code.
- with subprocess.Popen() and Popen.communicate(), I could capture STDOUT and STDERR but not the return code.
I'm not sure how to use Popen.wait() or the returncode attribute. I also couldn't get Popen to accept '>!' or '|' as arguments.
Can someone please point me in the right direction? I'm using Python 2.7.1
EDIT: Got things working with the following code
process = subprocess.Popen('ls | tee out.txt', shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = process.communicate()
if process.returncode == 0:
    print out
else:
    print err
Also, should I use a process.wait() after the process = line or does it wait by default?
Just use .returncode after .communicate(). Also, tell Popen that what you're trying to run is a shell command, rather than a raw command line:
p = subprocess.Popen('ls | tee out.txt', shell=True, ...)
p.communicate()
print p.returncode
From the docs:
Popen.returncode
The child return code, set by poll() and wait() (and indirectly by communicate()). A None value indicates that the process hasn’t terminated yet.
A negative value -N indicates that the child was terminated by signal N (Unix only).
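For example (Unix), a quick sketch of the signal convention:
import subprocess

p = subprocess.Popen(['sleep', '60'])
p.terminate()        # sends SIGTERM (signal 15)
p.wait()
print p.returncode   # -15: the child was killed by signal 15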
Here is an example of how to interact with a shell:
>>> process = subprocess.Popen(['/bin/bash'], shell=False, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
>>> process.stdin.write('echo it works!\n')
>>> process.stdout.readline()
'it works!\n'
>>> process.stdin.write('date\n')
>>> process.stdout.readline()
'wto, 13 mar 2012, 17:25:35 CET\n'
>>>

Keep a subprocess alive and keep giving it commands? Python

If I spawn a new subprocess in python with a given command (let's say I start the python interpreter with the python command), how can I send new data to the process (via STDIN)?
Use the standard subprocess module. You use subprocess.Popen() to start the process, and it will run in the background (i.e. at the same time as your Python program). When you call Popen(), you probably want to set the stdin, stdout and stderr parameters to subprocess.PIPE. Then you can use the stdin, stdout and stderr fields on the returned object to write and read data.
Untested example code:
from subprocess import Popen, PIPE
# Run "cat", which is a simple Linux program that prints it's input.
process = Popen(['/bin/cat'], stdin=PIPE, stdout=PIPE)
process.stdin.write(b'Hello\n')
process.stdin.flush()
print(repr(process.stdout.readline())) # Should print 'Hello\n'
process.stdin.write(b'World\n')
process.stdin.flush()
print(repr(process.stdout.readline())) # Should print 'World\n'
# "cat" will exit when you close stdin. (Not all programs do this!)
process.stdin.close()
print('Waiting for cat to exit')
process.wait()
print('cat finished with return code %d' % process.returncode)
Don't.
If you want to send commands to a subprocess, create a pty and then fork the subprocess with one end of the pty attached to its STDIN.
Here is a snippet from some of my code:
RNULL = open('/dev/null', 'r')
WNULL = open('/dev/null', 'w')
master, slave = pty.openpty()
print parsedCmd
self.subp = Popen(parsedCmd, shell=False, stdin=RNULL,
                  stdout=WNULL, stderr=slave)
In this code, the pty is attached to stderr because it receives error messages rather than sending commands, but the principle is the same.
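A sketch of the same principle with the pty attached to STDIN instead, feeding commands to a python child as if typed at a terminal (Unix only; the commands are illustrative):
import os
import pty
from subprocess import Popen

master, slave = pty.openpty()
# the slave end becomes the child's stdin, so it believes it has a terminal
p = Popen(['python'], stdin=slave)
os.close(slave)  # the parent keeps only the master end

os.write(master, b'print 1 + 1\n')  # send a command through the pty
os.write(master, b'exit()\n')
p.wait()
os.close(master)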
