Best way to check if a shell command has executed successfully - python

import os
val = os.popen("ls | grep a").read()
Let's say I want to check if a given directory has any file with a in its name. If the directory doesn't have such a file, val is empty; otherwise val is assigned the output of the command.
Are there any cases where val could still contain something even though the output is empty? For instance, if there are no files with a, could val still have a value? Are there cases where the output looks empty when run in a terminal, but val still contains something (e.g. whitespace)?
Is this an effective approach to use in general? (I am not really trying to check for files with certain names; this is just an example.)
Are there any better ways of doing such a thing?

I'd recommend using Python 3's subprocess module, where you can use the check parameter. Your command will then raise a CalledProcessError if it does not succeed:
import subprocess
proc = subprocess.run("ls | grep a", shell=True, check=True, stdout=subprocess.PIPE)
# proc = subprocess.run("ls | grep a", shell=True, check=True, capture_output=True)  # Python 3.7+
print(proc.stdout)
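If you want to branch on failure rather than let the exception propagate, here is a minimal sketch (relying on the fact that grep exits with a non-zero status when it finds no match):
import subprocess

try:
    proc = subprocess.run("ls | grep a", shell=True, check=True,
                          stdout=subprocess.PIPE)
    print("found:", proc.stdout.decode())
except subprocess.CalledProcessError:
    # grep exits with status 1 when nothing matches, which trips check=True
    print("no match")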
but as @JohnKugelman suggested, in this case you'd be better off using glob:
import glob
files_with_a = glob.glob("*a*")
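If all you need is a yes/no answer, the truthiness of the returned list is enough, e.g.:
import glob

# an empty match list is falsy, so this is True iff some filename contains "a"
has_a = bool(glob.glob("*a*"))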

If you really do go for the approach of running an OS command from Python, I'd recommend using the subprocess package:
import subprocess

command = "ls | grep a"
process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
process.wait()             # block until the command finishes
print(process.returncode)  # 0 means success; grep exits 1 when nothing matches

Related

Call subprocess "ls -l folder | wc -l" in python can't be done [duplicate]

This question already has answers here: How do I use subprocess.Popen to connect multiple processes by pipes?
I want to run this command using subprocess.call:
ls -l folder | wc -l
My code in the Python file is:
subprocess.call(["ls","-l","folder","|","wc","-l"])
I got an error message like this:
ls: cannot access |: No such file or directory
ls: cannot access wc: No such file or directory
It's as if the | wc part can't be read by subprocess.call.
How can I fix it?
Try the shell option, using a string as the first parameter:
subprocess.call("ls -l folder | wc -l", shell=True)
Although this works, note that using shell=True is not recommended, since it can introduce a security issue through shell injection.
You can set up a command pipeline by connecting one process's stdout to another's stdin. In your example, errors and the final output are written to the screen, so I didn't try to redirect them. This is generally preferable to something like communicate because, instead of waiting for one program to complete before starting another (and incurring the expense of moving the data through the parent), they run in parallel.
import subprocess
p1 = subprocess.Popen(["ls","-l"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["wc","-l"], stdin=p1.stdout)
# close the pipe in the parent; it's still open in the children
p1.stdout.close()
p2.wait()
p1.wait()
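If you want the count back in Python rather than printed to the screen, a small variation of the same pipeline works (just pipe p2's stdout as well):
import subprocess

p1 = subprocess.Popen(["ls", "-l"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["wc", "-l"], stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # close the read end in the parent so p1 can get SIGPIPE
out, _ = p2.communicate()
print(int(out.decode()))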
You'll need to implement the piping logic yourself to make it work properly. Note that Popen, not call, is required here: call() returns an exit code, not a process object with a communicate() method.
def piped_call(prog1, prog2):
    p1 = subprocess.Popen(prog1, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = p1.communicate()
    if err:
        print(err)
        return None
    p2 = subprocess.Popen(prog2, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    return p2.communicate(out)[0]
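Usage would then look something like this (a sketch, assuming the corrected version above):
out = piped_call(["ls", "-l", "folder"], ["wc", "-l"])
if out is not None:
    print(out)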
You could try using subprocess.PIPE, assuming you want to avoid subprocess.call(..., shell=True).
import subprocess

# Run 'ls', sending output to a PIPE (shell equiv.: ls -l | ...)
ls = subprocess.Popen('ls -l folder'.split(),
                      stdout=subprocess.PIPE)

# Read output from 'ls' as input to 'wc' (shell equiv.: ... | wc -l)
wc = subprocess.Popen('wc -l'.split(),
                      stdin=ls.stdout,
                      stdout=subprocess.PIPE)

# Trap stdout and stderr from 'wc'
out, err = wc.communicate()
if err:
    print(err.strip())
if out:
    print(out.strip())
For Python 3, keep in mind that the communicate() method used here returns a bytes object rather than a string. In that case you will need to convert the output to a string using decode():
if err:
    print(err.strip().decode())
if out:
    print(out.strip().decode())

Using multiple commands in `subprocess.call`

I want to create a shell pipeline using call. For example, I want to count the number of lines containing 123.
The shell command would be:
grep "123" myfile | wc -l > sum.txt
But "123" is a variable so I want to use python:
A = ["123", "1234", "12345"]
for i in A:
    call(["grep", i, "| wc >> sum.txt"])
This code does not work!
call only calls one executable. What you want is for that executable to be a shell (e.g. bash), which parses the command line; bash is also responsible for handling pipes. You can get this with the shell=True option, which is off by default.
When you give it an array as you do, you are calling that executable with those arguments. A pipe is not an argument; grep does not know how to pipe, nor how to invoke wc.
You can do
call("grep '%s' myfile | wc >> sum.txt" % i, shell=True)
If you are using the pipe character you need shell=True; you can then pass i each time using str.format:
call('grep "{}" myfile | wc >> sum.txt'.format(i), shell=True)
You can also do it without shell=True and using python to open the file instead of shell redirection:
from subprocess import Popen, PIPE, check_call

p = Popen(["grep", i, "myfile"], stdout=PIPE)
with open('sum.txt', "a") as f:
    check_call(["wc"], stdin=p.stdout, stdout=f)
Also note that > and >> are not the same: > truncates the file while >> appends to it, so the mode you open the file in depends on which one you actually want to replicate, as shown below.
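Concretely, the mapping is just the open() mode (using the sum.txt file from the snippet above):
open('sum.txt', "w")   # shell's >  : truncate, then write
open('sum.txt', "a")   # shell's >> : append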

How to call a series of bash commands in python and store output

I am trying to run the following bash script in Python and store the readlist output. The readlist that I want stored as a Python list is a list of all files in the current directory ending in *concat_001.fastq.
I know it may be easier to do this in Python, e.g.:
import os
readlist = [f for f in os.listdir(os.getcwd()) if f.endswith("concat_001.fastq")]
readlist = sorted(readlist)
However, this is problematic, as I need Python to sort the list in EXACTLY the same way as bash, and I was finding that bash and Python sort certain things in different orders (e.g. Python and bash handle capitalised and uncapitalised names differently). When I tried
readlist = np.asarray(sorted(flist, key=str.lower))
I still found that two files starting with ML_ and M_ were sorted in a different order by bash and Python. Hence I am trying to run my exact bash script through Python, and then use the sorted list generated by bash in my subsequent Python code.
input_suffix="concat_001.fastq"
ender=`echo $input_suffix | sed "s/concat_001.fastq/\*concat_001.fastq/g" `
readlist="$(echo $ender)"
I have tried
proc = subprocess.call(command1, shell=True, stdout=subprocess.PIPE)
proc = subprocess.call(command2, shell=True, stdout=subprocess.PIPE)
proc = subprocess.Popen(command3, shell=True, stdout=subprocess.PIPE)
But I just get: <subprocess.Popen object at 0x7f31cfcd9190>
Also - I don't understand the difference between subprocess.call and subprocess.Popen. I have tried both.
Thanks,
Ruth
So your question is a little confusing and does not exactly explain what you want. However, I'll try to give some suggestions to help you update it, and in the process answer it.
I will assume the following: your Python script passes 'input_suffix' on the command line, and you want your Python program to receive the contents of 'readlist' when the external script finishes.
To keep things simple (and to allow for more complicated logic later), I would put your commands in the following bash script:
script.sh
#!/bin/bash
input_suffix=$1
ender=`echo $input_suffix | sed "s/concat_001.fastq/\*concat_001.fastq/g"`
readlist="$(echo $ender)"
echo $readlist
You would execute this as script.sh "concat_001.fastq", where $1 takes in the first argument passed on the command line.
To use Python to execute external scripts, as you quite rightly found, you can use subprocess (or, as noted by another response, os.system, although subprocess is recommended).
The docs tell you that subprocess.call:
"Wait for command to complete, then return the returncode attribute."
and that
"For more advanced use cases when these do not meet your needs, use the underlying Popen interface."
Given you want to pipe the output from the bash script into your Python script, let's use Popen as suggested by the docs. As in the other Stack Overflow answer I posted, it could look like the following:
import subprocess

# Execute our script and pipe the output to stdout
# (assumes script.sh is in the current directory and executable)
process = subprocess.Popen(['./script.sh', 'concat_001.fastq'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)

# Obtain the standard out and standard error
stdout, stderr = process.communicate()
and then:
>>> print stdout
*concat_001.fastq
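Since you ultimately want readlist as a Python list, you can split the captured output yourself (a sketch; this assumes none of the filenames contain whitespace, since the bash result is space-separated):
readlist = stdout.split()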

python subprocess and passing in shell arguments

I'm trying to utilize Python's subprocess to run a command that downloads a file, but it requires a response in order to proceed. If I run the command standalone, it prompts as shown below:
./goro-new export --branch=testing --file=corp/goro.sites/test/meta.json
Finding pages .........
The following pages will be exported from Goro to your local filesystem:
/goro.sites/test/meta.json -> /usr/local/home/$user/schools/goro.sites/test/meta.json
Export pages? [y/N]: y
Exporting 1 pages .............................................................................................................. 0% 0:00:03
Exported 1 pages in 3.66281s.
My question is, how do I answer the "y/N" in the Export pages prompt? I suspect I need to pass input to my subprocess, but I am a relative newcomer to Python, so I was hoping for some help. Below is a printout of my testing in the Python environment:
>>> import subprocess
>>> cmd = ['goro-new export --branch=test --file=corp/goro.sites/test/meta.json']
>>> p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,stderr=subprocess.PIPE, stdin=subprocess.PIPE)
>>> out, err = p.communicate()
>>> print out
Finding pages ....
The following pages will be exported from Goro to your local filesystem:
/goro.sites/test/meta.json -> /var/www/html/goro.sites/test/meta.json
Export pages? [y/N]:
How can I pass in the "y/N" so it can proceed?
You use the function you are already using, communicate(), and pass whatever you want as its input parameter. I cannot verify this works, but it should give you an idea:
>>> import subprocess
>>> cmd = ['goro-new export --branch=test --file=corp/goro.sites/test/meta.json']
>>> p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,stderr=subprocess.PIPE, stdin=subprocess.PIPE)
>>> out, err = p.communicate(input="y")
>>> print out
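One caveat (an assumption about how the prompt reads input): if the program waits for a full line, you may need a trailing newline, i.e. p.communicate(input="y\n").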
The easiest way to do this if you always want to answer yes (which I'm assuming you do) is with some bash: yes | python myscript.py. To do this directly in Python, you can make a new subprocess.Popen (say, called yes) with stdout=subprocess.PIPE, and set the stdin of p to be yes.stdout, as sketched below. Reference: Python subprocess command with pipe
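A sketch of that wiring (unverified against goro-new itself; assumes the POSIX yes utility is available):
import subprocess

cmd = 'goro-new export --branch=test --file=corp/goro.sites/test/meta.json'
yes = subprocess.Popen(["yes"], stdout=subprocess.PIPE)
p = subprocess.Popen(cmd, shell=True, stdin=yes.stdout,
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
yes.stdout.close()  # close our copy; p still holds the read end
out, err = p.communicate()
yes.terminate()     # yes runs forever, so stop it once p is done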

convert the output from Popen to an array

I am trying to find a process ID on Linux with a Python script, as follows:
PID = Popen("ps -elf | grep <proc_name> | grep -v grep | awk '{print $4}'", shell=True, stdout=PIPE).stdout
pid = PID.read()
pid = int(pid)
However, the script does not work if there is more than one PID with the same name: the program exits at the int() call because '123\n146\n' is not a valid base-10 integer.
I then tried the following:
pid = PID.read().split()
print len(pid)
print pid[0]
It seems to work on the Python command line and forms a list, pid = ['123', '156'], but somehow it does not work in the script.
Any suggestions? Thanks.
Are you trying to find out your own process id? If so, use os.getpid()
You could use subprocess.check_output() and str.splitlines():
from subprocess import check_output as qx
pids = map(int, qx(["pgrep", procname]).splitlines())
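On Python 3, check_output() returns bytes, so decode before converting:
pids = [int(s) for s in qx(["pgrep", procname]).decode().splitlines()]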
To do it without an external process you could try psutil:
import psutil # pip install psutil
pids = [p.pid for p in psutil.process_iter() if p.name == procname]
Experiment with p.name, p.cmdline and various comparisons with procname to get what you need in your particular case.
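Note that in recent psutil versions name and cmdline are methods rather than attributes, so the comparison becomes p.name() == procname.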
And there is also os.getpid() to return the current process id.
