subprocess.Popen() won't run python script, python command not found - python

I'm trying to get subprocess.Popen to run a python script but I keep getting the following error: /bin/sh: python: command not found. The script takes a yaml file as an argument. I've tried this line both with and without shell=True. The script runs fine when I run it with the python command in my Linux terminal. What am I doing wrong?
process = subprocess.Popen(
    ['python', PATH_TO_PYTHON_SCRIPT, PATH_TO_CONFIG_FILE],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT, stdin=subprocess.PIPE, shell=True)
with process.stdout, open(processing_log, 'ab') as f_in:
    for line in iter(process.stdout.readline, b''):
        f_in.write(line)

If you want to run the script with the same interpreter you're currently running under, I'd suggest passing sys.executable rather than 'python', so you're not dependent on the vagaries of PATH lookup; if you do want to resolve it from the PATH, you can use shutil.which to do the lookup at the Python layer and minimize the number of things in the way.
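For illustration, a minimal sketch of both suggestions, reusing the PATH_TO_* placeholders from the question (and dropping shell=True, so the argument list goes straight to the interpreter):
import shutil
import subprocess
import sys

# Option 1: run the child script with the interpreter running this code,
# so the result does not depend on what "python" resolves to in PATH.
process = subprocess.Popen(
    [sys.executable, PATH_TO_PYTHON_SCRIPT, PATH_TO_CONFIG_FILE],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT, stdin=subprocess.PIPE)

# Option 2: resolve an interpreter from PATH at the Python layer instead;
# shutil.which returns None when nothing matches, so you can fail clearly.
python_exe = shutil.which('python3') or shutil.which('python')
if python_exe is None:
    raise FileNotFoundError('no python interpreter found on PATH')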
Side-note: Why are you running this with shell=True? That adds a number of complicating factors that you should probably avoid, and you're not taking advantage of the tiny benefits it provides anyway.
Additionally, if all you want to do is append the output to a given file, you can always just do:
with open(processing_log, 'ab') as f_out:
    process = subprocess.Popen(
        ['python', PATH_TO_PYTHON_SCRIPT, PATH_TO_CONFIG_FILE],
        stdout=f_out,
        stderr=subprocess.STDOUT, stdin=subprocess.PIPE)
and let the process write to the file directly instead of acting as a passthrough layer.

Related

How to run cmd.exe in python

I'm trying to port the following C# code into Python. It first defines a new process and then runs the Windows command prompt (cmd.exe). After that, it executes a command in the prompt and, when an external event occurs, it closes the prompt.
//Start the prompt - when an event occurred
Process winShell = new Process();
winShell.StartInfo.FileName = "cmd.exe";
winShell.StartInfo.RedirectStandardInput = true;
winShell.Start();
//Execute a command in the prompt
winShell.StandardInput.WriteLine("cd " + projectDirectory);
//Close it - when an event occurred
winShell.StandardInput.Flush();
winShell.StandardInput.Close();
winShell.WaitForExit();
I read that for Python 3 (my version is 3.7) it is recommended to use subprocess. Unfortunately, I'm a bit confused about which of these functions to use. I found call, run and Popen, but I didn't understand how to use them.
I wrote the following lines, but they don't produce any visible result.
import subprocess
subprocess.run(['cmd.exe'])
First of all, I would like the shell to appear, then to write some commands into it, and finally to close it.
Use subprocess.Popen() like this. Each call maps to the corresponding C# API almost 1:1.
p = subprocess.Popen(['cmd.exe'],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     text=True)
p.stdin.write('dir\n')
p.stdin.close()
print(p.stdout.read())
p.wait()
p.stdout.close()
Other APIs such as run(), call(), etc. are wrappers around Popen(). For example, the above code is equivalent to this one line:
print(subprocess.run(['cmd.exe'], capture_output=True, text=True, input='dir\n').stdout)

Simple Python Script not Executing Properly

The code is as follows:
fh = tempfile.NamedTemporaryFile(delete=False, suffix='.py')
stream = io.open(fh.name, 'w', newline='\r\n')
stream.write(unicode(script))
stream.flush()
stream.close()
proc = subprocess.Popen(
    [path, fh.name],
    shell=True,
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
proc.stdin.close()
proc.stderr.close()
out = proc.stdout.readline()
print out
script is a string which contains the subprocess code, in this case a simple hello world. Since it has Unix line endings, I had to use io.open in order to write it properly for Windows. path is the path to python.exe on my machine. The file is generated and looks fine in Notepad:
def main():
    print 'hello world'
However, when I run the program, the subprocess executes and does nothing.
It's not a problem with the executable path (I've tested that with other programs), so it must be either the temp file itself or the text within it. delete is set to False in order to check the contents of the file for debugging. Is there anything glaringly wrong with this code? I'm a bit new to using Popen.
The main issue in your program is that when you specify shell=True, you need to provide the entire command as a string, not a list.
Given that, there is really no need for you to use shell=True here. Also, unless absolutely necessary, you should not use shell=True; it is a security hazard, as noted in the documentation as well:
Executing shell commands that incorporate unsanitized input from an untrusted source makes a program vulnerable to shell injection, a serious security flaw which can result in arbitrary command execution. For this reason, the use of shell=True is strongly discouraged in cases where the command string is constructed from external input.
Also, if you do not want to use stdin/stderr (since you are closing them off as soon as you start the process), there is no need to use PIPE for them.
Example -
fh = tempfile.NamedTemporaryFile(delete=False, suffix='.py')
stream = io.open(fh.name, 'w', newline='\r\n')
stream.write(unicode(script))
stream.flush()
stream.close()
proc = subprocess.Popen(
    [path, fh.name],
    stdout=subprocess.PIPE,
)
out = proc.stdout.readline()
print out
Also, the script -
def main():
    print 'hello world'
would not work, since you need to call main() for it to run.
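For example (a sketch keeping the Python 2 style of the question's script), adding a call at module level makes it actually run:
def main():
    print 'hello world'

if __name__ == '__main__':
    main()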

How to call a series of bash commands in python and store output

I am trying to run the following bash script in Python and store the readlist output. readlist, which I want stored as a Python list, is a list of all files in the current directory ending in *concat_001.fastq.
I know it may be easier to do this directly in Python, i.e.
import os
readlist = [f for f in os.listdir(os.getcwd()) if f.endswith("concat_001.fastq")]
readlist = sorted(readlist)
However, this is problematic, as I need Python to sort the list in EXACTLY the same way as bash, and I was finding that bash and Python sort certain things in different orders (e.g. Python and bash deal with capitalised and uncapitalised names differently). But when I tried
readlist = np.asarray(sorted(flist, key=str.lower))
I still found that two files starting with ML_ and M_ were sorted in different orders by bash and Python. Hence I am trying to run my exact bash script through Python, and then use the sorted list generated by bash in my subsequent Python code.
input_suffix="concat_001.fastq"
ender=`echo $input_suffix | sed "s/concat_001.fastq/\*concat_001.fastq/g" `
readlist="$(echo $ender)"
I have tried
proc = subprocess.call(command1, shell=True, stdout=subprocess.PIPE)
proc = subprocess.call(command2, shell=True, stdout=subprocess.PIPE)
proc = subprocess.Popen(command3, shell=True, stdout=subprocess.PIPE)
But I just get: <subprocess.Popen object at 0x7f31cfcd9190>
Also - I don't understand the difference between subprocess.call and subprocess.Popen. I have tried both.
Thanks,
Ruth
So your question is a little confusing and does not exactly explain what you want. However, I'll try to give some suggestions to help you update it or, in the process, answer it.
I will assume the following: your Python script passes 'input_suffix' on the command line, and you want your Python program to receive the contents of 'readlist' when the external script finishes.
To make our lives simpler (and to allow for more complicated things later), I would put your commands into the following bash script:
script.sh
#!/bin/bash
input_suffix=$1
ender=`echo $input_suffix | sed "s/concat_001.fastq/\*concat_001.fastq/g"`
readlist="$(echo $ender)"
echo $readlist
You would execute this as script.sh "concat_001.fastq", where $1 takes in the first argument passed on the command line.
To use python to execute external scripts, as you quite rightly found, you can use subprocess (or as noted by another response, os.system - although subprocess is recommended).
The docs tell you that subprocess.call:
"Wait for command to complete, then return the returncode attribute."
and that
"For more advanced use cases when these do not meet your needs, use the underlying Popen interface."
Given that you want to pipe the output from the bash script to your Python script, let's use Popen as suggested by the docs. As I posted in another Stack Overflow answer, it could look like the following:
import subprocess
from subprocess import Popen, PIPE

# Execute our script and pipe the output to stdout
process = subprocess.Popen(['script.sh', 'concat_001.fastq'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)

# Obtain the standard out, and standard error
stdout, stderr = process.communicate()
and then:
>>> print stdout
*concat_001.fastq
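If you then want readlist as an actual Python list, one option (a sketch, assuming the filenames contain no spaces) is to split the captured output yourself:
# stdout holds the space-separated expansion echoed by script.sh;
# splitting on whitespace turns it into a Python list, in bash's order.
readlist = stdout.split()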

python subprocess.Popen stdin.write

I'm new to Python and would like to open a Windows cmd prompt, start a process, leave the process running, and then issue commands to that same running process.
The commands will change, so I can't just include them in the cmdline variable below. Also, the process takes 10-15 seconds to start, so I don't want to waste time waiting for it to start and run the commands each time. I just want to start the process once and then run quick commands as needed in the same process.
I was hoping to use subprocess.Popen to make this work, though I am open to better methods. Note that the process I want to run is not cmd; I'm just using it as an example.
import subprocess
cmdline = ['cmd', '/k']
cmd = subprocess.Popen(cmdline, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
cmd.stdin.write("echo hi") #would like this to be written to the cmd prompt
print cmd.stdout.readline() #would like to see 'hi' readback
cmd.stdin.write("echo hi again") #would like this to be written to the cmd prompt
print cmd.stdout.readline() #would like to see 'hi again' readback
The results aren't what I expect. It seems as though the stdin.write commands aren't actually getting through, and readline freezes up with nothing to read.
I have tried popen.communicate() instead of write/readline, but it kills the process. I have tried setting bufsize in the Popen line, but that didn't make much difference.
Your comments suggest that you are confusing command-line arguments with input via stdin. Namely, the fact that the system-console.exe program accepts a script=filename parameter does not imply that you can send it the same string as a command via stdin; e.g., the python executable accepts the -c "print(1)" command-line argument, but it is a SyntaxError if you pass it as a command to the Python shell.
Therefore, the first step is to use the correct syntax. Suppose the system-console.exe accepts a filename by itself:
#!/usr/bin/env python3
import time
from subprocess import Popen, PIPE

with Popen(r'C:\full\path\to\system-console.exe -cli -',
           stdin=PIPE, bufsize=1, universal_newlines=True) as shell:
    for _ in range(10):
        print('capture.tcl', file=shell.stdin, flush=True)
        time.sleep(5)
Note: if you've redirected more than one stream e.g., stdin, stdout then you should read/write both streams concurrently (e.g., using multiple threads) otherwise it is very easy to deadlock your program.
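As a sketch of that advice (not part of the original answer, and using cmd.exe only as a stand-in for the real program): drain stdout on a background thread so writes to stdin can never block behind a full pipe.
import threading
from subprocess import Popen, PIPE

def drain(pipe, sink):
    # Read the child's output line by line on a separate thread so the
    # parent can keep writing to stdin without risking a deadlock.
    for line in iter(pipe.readline, ''):
        sink.append(line)
    pipe.close()

proc = Popen(['cmd.exe'], stdin=PIPE, stdout=PIPE, universal_newlines=True)
captured = []
t = threading.Thread(target=drain, args=(proc.stdout, captured), daemon=True)
t.start()
proc.stdin.write('echo hi\n')
proc.stdin.flush()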
Related:
Q: Why not just use a pipe (popen())? -- mandatory reading for Unix environment but it might also be applicable for some programs on Windows
subprocess readline hangs waiting for EOF -- code example on how to pass multiple inputs, read multiple outputs using subprocess, pexpect modules.
The second and following steps might have to deal with buffering issues on the side of the child process (out of your hands on Windows), with whether system-console allows its stdin/stdout to be redirected or works with the console directly, and with character encoding issues (how the various commands in the pipeline encode text).
Here is some code that I tested and is working on Windows 10, Quartus Prime 15.1 and Python 3.5
import subprocess

class altera_system_console:
    def __init__(self):
        sc_path = r'C:\altera_lite\15.1\quartus\sopc_builder\bin\system-console.exe --cli --disable_readline'
        self.console = subprocess.Popen(sc_path, stdin=subprocess.PIPE, stdout=subprocess.PIPE)

    def read_output(self):
        rtn = ""
        loop = True
        i = 0
        match = '% '
        while loop:
            out = self.console.stdout.read1(1)
            if bytes(match[i], 'utf-8') == out:
                i = i + 1
                if i == len(match):
                    loop = False
            else:
                rtn = rtn + out.decode('utf-8')
        return rtn

    def cmd(self, cmd_string):
        self.console.stdin.write(bytes(cmd_string + '\n', 'utf-8'))
        self.console.stdin.flush()

c = altera_system_console()
print(c.read_output())
c.cmd('set jtag_master [lindex [get_service_paths master] 0]')
print(c.read_output())
c.cmd('open_service master $jtag_master')
print(c.read_output())
c.cmd('master_write_8 $jtag_master 0x00 0xFF')
print(c.read_output())
You need to use iter if you want to see the output in real time:
import subprocess
cmdline = ['cmd', '/k']
cmd = subprocess.Popen(cmdline, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
cmd.stdin.write("echo hi\n")  # would like this to be written to the cmd prompt
for line in iter(cmd.stdout.readline, ""):
    print line
cmd.stdin.write("echo hi again\n")  # would like this to be written to the cmd prompt
Not sure exactly what you are trying to do, but if you want to send certain input when you get certain output, then I would recommend using pexpect.
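A minimal sketch of that idea using pexpect's PopenSpawn class (which does not need a pty, so it also works on Windows); the '>' prompt pattern here is an assumption about cmd's output, not something taken from the answers above:
from pexpect.popen_spawn import PopenSpawn

child = PopenSpawn('cmd /k', encoding='utf-8')
child.expect('>')             # wait for the initial prompt (assumed pattern)
child.sendline('echo hi')
child.expect('>')             # wait for the next prompt
print(child.before)           # output produced before the prompt appeared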

Popen in Python 3

Here is a code snippet from my program.
I am trying to open cmd.exe on Windows and pass commands to a separate program and capture the output and parse it WITHOUT having to load cmd.exe every time.
All the examples I found for doing this were using Python 2, and there are several changes in Python 3 around pipes, which makes me unsure what is going wrong.
# DOScmd is a list of command line parameters to type into command shell.
p = subprocess.Popen('cmd.exe',
                     stdout=subprocess.PIPE,
                     stdin=subprocess.PIPE,
                     shell=True,
                     bufsize=0)
myCall = ' '.join(DOScmd) + '\n'
p.stdin.write(bytes(myCall, 'UTF-8'))
searchLines = p.stdout.readlines()
print(searchLines)
I am calling a program bowtie.exe. Now, bowtie.exe crashes when I do this. I think I might be angering the I/O gods. Any help appreciated.
I am trying to open cmd.exe on Windows and pass commands to a separate program and capture the output and parse it WITHOUT having to load cmd.exe every time.
Unless you want to run commands that are built into cmd.exe, such as dir, you don't need to start cmd.exe at all:
from subprocess import check_output

for cmd in ["first.exe", "arg1", "arg2"], ["second.exe", ".."]:
    output = check_output(cmd)
    do_whatever_you_like_with(output)
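If you do need a cmd.exe builtin such as dir, one sketch (dir is just an example builtin here) is to let cmd run the single command and exit via its /c switch:
from subprocess import check_output

# /c tells cmd.exe to run the command that follows and then exit, so
# cmd.exe is started once per builtin command rather than kept around.
output = check_output(['cmd.exe', '/c', 'dir'], universal_newlines=True)
print(output)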
