When using Popen how can I read from stdout? - python

I'm trying to use Popen to start a shell process, execute commands, print the commands to output, and print the output of the commands (if any). Writing to stdin is working fine, but trying to read from stdout causes the program to freeze.
Here's my program:
with Popen(["/bin/sh"], stdin=PIPE, stdout=PIPE, stderr=STDOUT, text=True, bufsize=0) as proc:
with open("script.txt") as scriptFile:
for line in scriptFile.readlines():
proc.stdin.write(line)
print(f"$ {line.strip()}")
# Reading from proc.stdout locks the program.
for outputLine in proc.stdout.readlines():
print(outputLine)
And here's a simplified script.txt. (The real one creates a git repository and illustrates the use of various git commands.)
mkdir project
cd project
echo "This is line 1.\nThis is line 2." > text1.txt
cat text1.txt
If I don't try to read from stdout all of the commands in script.txt (and my real version with multiple git commands) work fine.
Is there any way to get the output of commands from stdout interspersed with writing them to stdin?
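One common way to avoid the deadlock (a sketch of a general workaround, not an answer taken from this thread) is to stop writing and reading in lock-step: feed the whole script through communicate(), which writes to stdin, closes it, and drains stdout without blocking. The trade-off is that the output is no longer interleaved with the commands.

from subprocess import Popen, PIPE, STDOUT

# Sketch: print the commands up front, then let communicate() run the whole
# script and collect everything the shell wrote to stdout/stderr.
with Popen(["/bin/sh"], stdin=PIPE, stdout=PIPE, stderr=STDOUT, text=True) as proc:
    with open("script.txt") as scriptFile:
        script = scriptFile.read()
    for line in script.splitlines():
        print(f"$ {line}")
    output, _ = proc.communicate(script)   # writes stdin, closes it, reads all output
    print(output, end="")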

Related

Send Multiple Terminal Commands in Gnome Terminals With Subprocess

So I am currently trying to run two different gnome-terminal windows in Ubuntu that I can send individual commands to after they are initially open.
def ssh_command(cmd):
    ssh_terminal_1 = subprocess.Popen(['gnome-terminal', '--', 'bash', '-c', cmd], stderr=subprocess.STDOUT, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
    ssh_terminal_2 = subprocess.Popen(['gnome-terminal', '--', 'bash', '-c', cmd], stderr=subprocess.STDOUT, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
    # Activate the conda environment for our multilateration server
    spyder_activate('conda activate flyhound')
    time.sleep(10)
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.flush()
    ssh_terminal_1.stdin.write(b'cd srsRAN22.04/build')
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.write(b'cd srsRAN22.04/build')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_1.stdin.write(b'sudo ./srsepc/src/srsepc ../srsepc/epc.conf.example --hss.db_file=../srsepc/user_db_unknown.csv.example\n')
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.write(b'bladeRF-cli -l /home/administrator/Downloads/hostedxA5-latest.rbf\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_2.stdin.write(b'bladeRF-cli -f /home/administrator/Downloads/bladeRF_fw_v2.4.0.img\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_2.stdin.write(b'sudo ./srsenb/src/srsenb ../srsenb/enb.conf.example --enb_files.sib_config=../srsenb/sib.conf.example --enb.n_prb=50 --enb_files.rr_config=../srsenb/rr.conf.example\n')
However, when I start the original subprocess command, the terminals open fine with the command given during the function call, but none of the following commands work and I get a broken pipe error (errno 32). While I run these commands I also need to keep the previous terminal open, which looks like this:
def access_command(cmd):
    while True:
        process = subprocess.Popen(shlex.split(cmd), stdout=subprocess.PIPE)
        while True:
            output = process.stdout.readline()
            if output == '' and process.poll() is not None:
                break
            if output:
                print(output.strip())
                if b"f0:9e:4a:5f:a4:5b" and b"handshake" in output:
                    ssh_command("sshpass -p earth ssh -o StrictHostKeyChecking=no administrator#ipaddress; clear; screen")
I am really not sure how I can send multiple commands to the ssh terminals after they ssh into that IP address. I am very new to subprocess and sending commands to terminals via Python, so any help would be amazing!
As I explained in the comments, your pipe goes to gnome-terminal, not to ssh or bash. gnome-terminal does not listen to stdin; it only listens to the user at the console. Here is what to do:
1. Make a FIFO (named pipe) with os.mkfifo for each terminal, and give it a name that won't collide with any other file (for example, include your process ID in it).
2. Issue the command gnome-terminal -- bash -c 'ssh <options> < <fifo name>' for each terminal (the redirection must be inside the quoted bash command). Do not make this a Popen call; use os.system or something like that.
3. Do your spydy magic (anaconda).
4. Open the FIFO as a file.
5. Write your commands to the open file; they will be executed by the bash process in the ssh connection. You will probably have to flush, unless there is a way to open it in line-buffered mode.
6. When you want to close the terminal, close the file.
What this accomplishes is that we move the pipe from gnome-terminal to ssh and hence across the connection to bash. We feed it on one end and it comes out and gets digested by the shell.
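A minimal sketch of that recipe (the FIFO path and user@host are placeholders, not values from the post):

import os

# 1. Named pipe whose name includes our PID so it won't collide.
fifo = f"/tmp/terminal_{os.getpid()}.fifo"
os.mkfifo(fifo)

# 2. Launch the terminal; the bash inside it runs ssh with stdin fed by the FIFO.
os.system(f"gnome-terminal -- bash -c 'ssh user@host < {fifo}'")

# 4./5. Opening the FIFO for writing blocks until ssh opens the reading end;
# line buffering means each command is flushed as it is written.
with open(fifo, "w", buffering=1) as f:
    f.write("cd srsRAN22.04/build\n")
    f.write("ls\n")
# 6. Closing the file closes the remote shell's stdin and ends the session.

os.unlink(fifo)   # remove the named pipe when done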

Popen reading from stdout takes a very very long time

I am trying to capture the output from a shell command (npm --version); however, only the first line is read and the process does not end.
import subprocess
proc = subprocess.Popen(['npm', '--version'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
proc.wait()
for line in proc.stdout:
    print(line.decode("utf-8").strip())
print("does not get here?!")
Any idea how I could detect the end of this process?
If I open a cmd window and execute 'npm --version', it ends as expected, so I do not know why the code above does not end.
Some extra information that may be of use:
npm is installed via nvm
this is used to manage node installs via symlinks
npm from what I can see is a .cmd file that executes node?
Running this in python command prompt...
>>> import subprocess
>>> proc = subprocess.Popen(['npm', '--version'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
>>> proc.wait()
0
>>> proc.stdout.readline()
'6.10.3\n'
>>> proc.stdout.readline()
''
Now the second .readline() takes a very very long time to complete!
Using stdout=PIPE and/or stderr=PIPE together with popen.wait() can cause a deadlock. Try using communicate() to avoid that.
This is due to the OS pipe buffer filling up and blocking the child process.
See this documentation on how to use communicate():
https://docs.python.org/2/library/subprocess.html
Hope I could help!
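For instance, a sketch of that suggestion applied to the snippet above: let communicate() do the waiting and the reading in one step instead of calling wait() first.

import subprocess

proc = subprocess.Popen(['npm', '--version'],
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                        shell=True)
out, _ = proc.communicate()            # waits for npm and drains the pipe
print(out.decode("utf-8").strip())     # e.g. 6.10.3
print("return code:", proc.returncode)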
Can you please share the console output when you manually type the command at the command prompt?
The code you have shared works on my machine, and I am assuming it may have something to do with the way npm is installed. In any case, can you share the output from the command console?
Thanks
Pushpa

Python repo Popen does not work in a Python script but works on the terminal

I have a Python script which runs a repo command.
import subprocess

processing(commandforrepo)

def processing(repocmd):
    process = None
    process = subprocess.Popen(repocmd,
                               stdout=subprocess.PIPE, stderr=None, shell=True)
    process.communicate()
In particular, I am trying to pass the following command as repocmd to compare two branches and print out the differences:
"repo forall $(repo forall -c 'echo $REPO_PROJECT') \
-c 'git log --abbrev-commit --pretty=oneline --no-merges \
--cherry-pick --left-only HEAD...$REPO_RREV'"
I attempted to run the script from the terminal, but the command did not get executed. However, when this command is issued directly on the terminal, it produces a list of differences between the two branches.
Any clue as to what is missing?
Warning: most of the standard output from Git commands actually goes to... stderr.
See here for why: informative messages are on stderr only.
So make sure to parse stderr, not stdout.
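A minimal sketch of that advice, reusing the command string from the question and capturing stderr alongside stdout:

import subprocess

repocmd = ("repo forall $(repo forall -c 'echo $REPO_PROJECT') "
           "-c 'git log --abbrev-commit --pretty=oneline --no-merges "
           "--cherry-pick --left-only HEAD...$REPO_RREV'")

process = subprocess.Popen(repocmd,
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                           shell=True)
out, err = process.communicate()
print(out.decode())   # anything written to stdout
print(err.decode())   # Git's informative messages show up here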

How to get stdout and stderr from a tmux session?

I am writing a sample Python program on a Linux system. I am using tmux to create a session and execute another script within the tmux session. I would like to get the stdout and stderr out of the tmux session back to the parent script, but that somehow does not work.
Code snippet:
cmd = "tmux new-session -d -s 'test' '/my_dir/fibonacci.py __name:=fibonacci_0'"
proc = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE)
(stdout, stderr) = proc.communicate()
print(stderr)
I have come across answers suggesting show-buffer and pipe-pane, but that did not help. Maybe I need to modify the tmux command itself.
Thank you for your support. After digging a bit, I came up with a workaround. I am just adding it here for someone with similar needs.
What I have done is create a named pipe, redirect the output of the tmux session to the named pipe, and then finally read from it.
# this attaches if a session exists or creates one, then detaches from the session
call("tmux new-session -A -s test \; detach", shell=True)
# to avoid conflicts, remove any existing named pipe, create the named pipe,
# and redirect the pane output of the tmux session into it
call("rm -f /tmp/mypipe && mkfifo /tmp/mypipe && tmux pipe-pane -t test -o 'cat > /tmp/mypipe'", shell=True)
# feed the pipe into stdout and stderr
proc = Popen(['cat', '/tmp/mypipe'], stdout=PIPE, stderr=PIPE)
# finally execute the command in the tmux session
Popen(['tmux', 'send-keys', '-t', 'test', '/my_dir/fibonacci.py __name:=fibonacci_0', 'C-m'])
(stdout, stderr) = proc.communicate()
print(stderr)
Hope this is helpful.
TL;DR: tmux is like a sandboxed terminal; you can't get into the session and reach its stdout or stderr.
tmux creates a detached process. The stdout and stderr of tmux itself would be the text messages that tmux prints, not the output of the commands you run in a tmux session.
So you can't get your commands' output from a detached process.
In order to get the stdout and stderr, you have to change how fibonacci.py dumps its text. The Python logging framework can help you in this situation: you can write everything to stdout.txt and read the contents of that file.
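For example, a sketch of that approach (the file name and logging format here are illustrative assumptions, not from the post): fibonacci.py logs to a file instead of the terminal, and the parent script reads that file afterwards.

import logging

# Inside fibonacci.py: route all output to a file instead of the tmux pane.
logging.basicConfig(filename="stdout.txt", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def fibonacci(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

logging.info("fib(10) = %d", fibonacci(10))

# In the parent script, once the tmux session has run fibonacci.py:
with open("stdout.txt") as f:
    print(f.read())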

subprocess popen Python

I am executing a shell script which starts a process in the background with &. The shell script is called from a Python script, which hangs.
Shell script:
test -f filename -d &
Python file:
cmd =["shellscript","restart"]
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE, stdin=subprocess.PIPE, **kwargs)
pid = proc.pid
out, err = proc.communicate()
returncode = proc.poll()
The Python file hangs and does not return from the process. Also, the Python process is an automated one.
The call to proc.communicate() will block until the pipes used for stderr and stdout are closed. If your shell script spawns a child process which inherits those pipes, then it will exit only after that process also has closed its writing ends of the pipes or exited.
To solve this you can either:
- redirect the output of the started subprocess to /dev/null or a logfile in your shell script, e.g.:
  subprocess_to_start >/dev/null 2>&1 &
- use subprocess.DEVNULL or an open file object for stderr and stdout in your Python script, and drop the communicate() call if you don't need the output of "shellscript" in Python (see the sketch below).
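A minimal sketch of the second option, assuming the same "shellscript restart" command as in the question:

import subprocess

# The child's streams are discarded, so a background process started by the
# shell script cannot block on an inherited pipe.
proc = subprocess.Popen(["shellscript", "restart"],
                        stdout=subprocess.DEVNULL,
                        stderr=subprocess.DEVNULL,
                        stdin=subprocess.DEVNULL)
returncode = proc.wait()   # no communicate() needed; there is nothing to read
print("return code:", returncode)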
A comma is missing in your cmd list:
cmd =["shellscript", "restart"]
