How to run " ps cax | grep something " in Python? - python

How do I run a command with a pipe | in it?
The subprocess module seems complex...
Is there something like
output,error = `ps cax | grep something`
as in shell script?

See Replacing shell pipeline:
import subprocess
proc1 = subprocess.Popen(['ps', 'cax'], stdout=subprocess.PIPE)
proc2 = subprocess.Popen(['grep', 'python'], stdin=proc1.stdout,
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
proc1.stdout.close()  # Allow proc1 to receive a SIGPIPE if proc2 exits.
out, err = proc2.communicate()
print('out: {0}'.format(out))
print('err: {0}'.format(err))
PS. Using shell=True can be dangerous. See for example the warning in the docs.
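To see why, here is a minimal sketch (assuming a POSIX shell) of how attacker-controlled input becomes a second command under shell=True, while the list form keeps it a single literal argument:

```python
import subprocess

# Hypothetical attacker-controlled input:
pattern = "x; echo INJECTED"

# With shell=True, the "; echo INJECTED" part runs as a second shell command:
unsafe = subprocess.run("echo " + pattern, shell=True,
                        capture_output=True, text=True).stdout

# With a list and no shell, the whole string is one literal argument to echo:
safe = subprocess.run(["echo", pattern],
                      capture_output=True, text=True).stdout

print(unsafe)  # two lines: "x" then "INJECTED"
print(safe)    # one line: "x; echo INJECTED"
```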
There is also the sh module which can make subprocess scripting in Python a lot more pleasant:
import sh
print(sh.grep(sh.ps("cax"), 'something'))

You've already accepted an answer, but:
Do you really need to use grep? I'd write something like:
import subprocess
ps = subprocess.Popen(('ps', 'cax'), stdout=subprocess.PIPE)
output = ps.communicate()[0]
for line in output.decode().split('\n'):  # decode: communicate() returns bytes on Python 3
    if 'something' in line:
        ...
This has the advantages of not involving shell=True and its riskiness, doesn't fork off a separate grep process, and looks an awful lot like the kind of Python you'd write to process data from file-like objects.

import subprocess
process = subprocess.Popen("ps cax | grep something",
                           shell=True,
                           stdout=subprocess.PIPE,
                           )
stdout_list = process.communicate()[0].decode().split('\n')  # decode bytes on Python 3

Drop that 'ps' subprocess and back away slowly! :)
Use the psutil module instead.
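For instance, a rough sketch (psutil is a third-party package, installed with pip install psutil; the 'something' name is a placeholder):

```python
import psutil  # third-party: pip install psutil

# Iterate over all processes and match on the name -- no ps or grep needed.
for proc in psutil.process_iter(['pid', 'name']):
    name = proc.info['name'] or ''
    if 'something' in name:
        print(proc.info['pid'], name)
```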

import os
os.system('ps -cax|grep something')
If you want to replace the grep argument with a variable (note that concatenating untrusted input into a shell command is an injection risk):
os.system('ps -cax|grep '+your_var)

Related

How can I use run instead of communicate when providing text on stdin?

trying to figure out how to do this:
command = f"adb -s {i} shell"
proc = Popen(command, stdin=PIPE, stdout=PIPE)
out, err = proc.communicate(f'dumpsys package {app_name} | grep version'.encode('utf-8'))
but in this:
command = f"adb -s {i} shell"
proc = run(command, stdin=PIPE, stdout=PIPE, shell=True)
out, err = run(f'dumpsys package {app_name} | grep version', shell=True, text=True, stdin=proc.stdout )
The idea is to run a command that requires further input (for example, entering a shell) and afterwards send another command to that shell.
I've found a way online with communicate(), but I wonder how to do it with the run() function.
Thanks!
You only need to call run once -- pass the remote command in the input argument (and don't use shell=True in places where you don't need it).
import subprocess, shlex
proc = subprocess.run(['adb', '-s', i, 'shell'],
                      capture_output=True, text=True,
                      input=f'dumpsys package {shlex.quote(app_name)} | grep version')
shlex.quote prevents an app name that contains $(...), ;, etc from running unwanted commands on your device.
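A quick illustration of what shlex.quote does:

```python
import shlex

# A benign name passes through unchanged; a malicious one is wrapped in
# single quotes so the shell treats it as a single literal word.
print(shlex.quote("myapp"))            # myapp
print(shlex.quote("myapp; rm -rf /"))  # 'myapp; rm -rf /'
```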

python how to use subprocess pipe with linux shell

I have a Python script that searches logs; it continuously outputs the lines it finds, and I want to use a Linux pipe to filter the desired output, like this:
$ python logsearch.py | grep timeout
The problem is that the downstream command (grep, sort, wc, ...) is blocked until logsearch.py finishes, while logsearch.py outputs its results continuously.
sample logsearch.py:
p = subprocess.Popen("ping google.com", shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
for line in p.stdout:
    print(line)
UPDATE:
figured out: just change the stdout in subprocess to sys.stdout and Python will handle the pipe for you.
p = subprocess.Popen("ping -c 5 google.com", shell=True, stdout=sys.stdout)
Thanks for all of you help!
And why use grep? Why not do all the work in Python?
from subprocess import Popen, PIPE
p = Popen(['ping', 'google.com'], shell=False, stdin=PIPE, stdout=PIPE, text=True)
for line in p.stdout:
    if 'timeout' in line.split():
        # Process the error
        print("Timeout error!!")
    else:
        print(line)
UPDATE:
I changed the Popen line as recommended by @tripleee. Pros and cons in Actual meaning of 'shell=True' in subprocess.

How to implement a complex process pipe in Python 2.6?

I'd like to have the Python (2.6, sorry!) equivalent of this shell pipe:
$ longrunningprocess | sometextfilter | gzip -c
That is, I have to call a binary longrunningprocess, filter its output through sometextfilter and need to get gzip output.
I know how to use subprocess pipes, but I need the output of the pipe chunkwise (probably using yield) and not all at once. E.g. the approach in
https://security.openstack.org/guidelines/dg_avoid-shell-true.html
works only for getting all output at once.
Note, that both longrunningprocess and sometextfilter are external programs, that cannot be replaced with Python functions.
Thanks in advance for any hint or example!
Again, I thought it would be difficult, while Python is (supposed to be) easy. Just chaining the subprocesses seems to work:
import subprocess

def get_lines():
    lrp = subprocess.Popen(["longrunningprocess"],
                           stdout=subprocess.PIPE,
                           close_fds=True)
    stf = subprocess.Popen(["sometextfilter"],
                           stdin=lrp.stdout,
                           stdout=subprocess.PIPE,
                           bufsize=1,
                           close_fds=True)
    for l in iter(stf.stdout.readline, ''):
        yield l
    lrp.stdout.close()
    stf.stdout.close()
    stf.stdin.close()
    stf.wait()
    lrp.wait()
[Changes by J.F. Sebastian applied. Thanks!]
Then I can use Python's gzip for compression.
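That last step could look something like this (a sketch assuming get_lines() yields text lines as above; the output path is arbitrary):

```python
import gzip

def gzip_lines(lines, path):
    # Stream an iterable of text lines into a gzip file, chunk by chunk,
    # without holding the whole output in memory.
    with gzip.open(path, "wt") as out:
        for line in lines:
            out.write(line)

# e.g. gzip_lines(get_lines(), "output.gz")
```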
The shell syntax is optimized for one-liners, use it:
#!/usr/bin/env python2
import sys
from subprocess import Popen, PIPE

LINE_BUFFERED = 1
ON_POSIX = 'posix' in sys.builtin_module_names

p = Popen('longrunningprocess | sometextfilter', shell=True,
          stdout=PIPE, bufsize=LINE_BUFFERED, close_fds=ON_POSIX)
with p.stdout:
    for line in iter(p.stdout.readline, ''):
        print line,  # do something with the line
p.wait()
How do I use subprocess.Popen to connect multiple processes by pipes?
Python: read streaming input from subprocess.communicate()
If you want to emulate the pipeline manually:
#!/usr/bin/env python2
import sys
from subprocess import Popen, PIPE

LINE_BUFFERED = 1
ON_POSIX = 'posix' in sys.builtin_module_names

sometextfilter = Popen('sometextfilter', stdin=PIPE, stdout=PIPE,
                       bufsize=LINE_BUFFERED, close_fds=ON_POSIX)
longrunningprocess = Popen('longrunningprocess', stdout=sometextfilter.stdin,
                           close_fds=ON_POSIX)
with sometextfilter.stdin, sometextfilter.stdout as pipe:
    for line in iter(pipe.readline, ''):
        print line,  # do something with the line
sometextfilter.wait()
longrunningprocess.wait()

Python - How to call bash commands with pipe?

I can run this normally on the command line in Linux:
$ tar c my_dir | md5sum
But when I try to call it with Python I get an error:
>>> subprocess.Popen(['tar','-c','my_dir','|','md5sum'],shell=True)
<subprocess.Popen object at 0x26c0550>
>>> tar: You must specify one of the `-Acdtrux' or `--test-label' options
Try `tar --help' or `tar --usage' for more information.
You have to use subprocess.PIPE; also, to split the command you should use shlex.split() to prevent strange behaviour in some cases:
from subprocess import Popen, PIPE
from shlex import split
p1 = Popen(split("tar -c mydir"), stdout=PIPE)
p2 = Popen(split("md5sum"), stdin=p1.stdout)
But to make an archive and generate its checksum, you should use Python built-in modules tarfile and hashlib instead of calling shell commands.
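A sketch of that pure-Python route (note the digest will not match the shell pipeline byte-for-byte, since Python's tarfile and GNU tar don't produce identical archives):

```python
import hashlib
import io
import tarfile

def tar_md5(path):
    # Build a tar archive of *path* in memory and return its MD5 hex digest.
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        tar.add(path)
    return hashlib.md5(buf.getvalue()).hexdigest()

# e.g. print(tar_md5("my_dir"))
```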
Ok, I'm not sure why, but this seems to work:
subprocess.call("tar c my_dir | md5sum",shell=True)
Anyone know why the original code doesn't work?
What you actually want is to run a shell subprocess with the shell command as a parameter:
>>> subprocess.Popen(['sh', '-c', 'echo hi | md5sum'], stdout=subprocess.PIPE).communicate()
('764efa883dda1e11db47671c4a3bbd9e -\n', None)
>>> from subprocess import Popen,PIPE
>>> import hashlib
>>> proc = Popen(['tar','-c','/etc/hosts'], stdout=PIPE)
>>> stdout, stderr = proc.communicate()
>>> hashlib.md5(stdout).hexdigest()
'a13061c76e2c9366282412f455460889'
I would try this on Python 3.8.10:
import subprocess
proc1 = subprocess.run(['tar c my_dir'], stdout=subprocess.PIPE, shell=True)
proc2 = subprocess.run(['md5sum'], input=proc1.stdout, stdout=subprocess.PIPE, shell=True)
print(proc2.stdout.decode())
Key points (as outlined in my solution on the related https://stackoverflow.com/a/68323133/12361522):
subprocess.run()
no splitting of the shell command and its parameters, i.e. ['tar c my_dir'] or ["tar c my_dir"]
stdout=subprocess.PIPE for all processes
input=proc1.stdout chains the output of the previous process into the input of the next one
shell=True enables the shell

Parsing a stdout in Python

In Python I need to get the version of an external binary I need to call in my script.
Let's say that I want to use Wget in Python and I want to know its version.
I will call
os.system( "wget --version | grep Wget" )
and then I will parse the outputted string.
How can I redirect the stdout of the command into a string in Python?
One "old" way is:
fin,fout=os.popen4("wget --version | grep Wget")
print fout.read()
The other modern way is to use a subprocess module:
import subprocess
cmd = subprocess.Popen('wget --version', shell=True, stdout=subprocess.PIPE)
for line in cmd.stdout:
if "Wget" in line:
print line
Use the subprocess module:
from subprocess import Popen, PIPE
p1 = Popen(["wget", "--version"], stdout=PIPE)
p2 = Popen(["grep", "Wget"], stdin=p1.stdout, stdout=PIPE)
output = p2.communicate()[0]
Use subprocess instead.
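On Python 3.7+, a small sketch of the same idea with subprocess.run (the wget call at the end is just an example; any command works):

```python
import subprocess

def version_line(cmd, needle):
    # Run *cmd* and return the first stdout line containing *needle*, or None.
    result = subprocess.run(cmd, capture_output=True, text=True)
    for line in result.stdout.splitlines():
        if needle in line:
            return line
    return None

# e.g. version_line(["wget", "--version"], "Wget")
```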
If you are on *nix and still on Python 2, you could use the commands module (it was removed in Python 3 in favour of subprocess).
import commands
status, res = commands.getstatusoutput("wget --version | grep Wget")
print status # Should be zero in case of success, otherwise would have an error code
print res # Contains stdout
