python subprocess.check_output giving wrong output

I have a problem executing this code:
subprocess.check_output(['ps -ef | grep ftp | wc -l'], env=environ, shell=True)
When I execute
ps -ef | grep ftp | wc -l
from a terminal, I get "1" as output, which is fine.
Now, when I execute the same command from Python via subprocess.check_output, it gives me 2. That is strange. Any ideas why this is happening? Here is the complete code:
def countFunction():
    environ = dict(os.environ)
    return subprocess.check_output(['ps -ef | grep ftp | wc -l'], env=environ, shell=True)
count = countFunction()
print count
EDIT:
Just to update: I do not have any FTP connections running, so the command line printing 1 is as expected.
Thanks
Arvind

The grep command will find itself:
$ ps -ef | grep ftp
wallyk 12546 12326 0 16:25 pts/3 00:00:00 grep ftp
If you don't want that, exclude the grep command itself:
$ ps -ef | grep ftp | grep -v grep
$
It would be better to drop the -f switch to ps so that the command-line arguments are not searched. That way, it won't find the grep ftp process:
$ ps -e | grep ftp | wc -l
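If you want to stay in Python, a minimal shell-free sketch is to run ps without -f and count matches yourself, so neither a shell nor a grep process can show up in its own results (count_ftp_processes is a name made up for this example):

```python
import subprocess

def count_ftp_processes():
    # ps -e lists process names only (no command-line arguments),
    # so our own command cannot match itself; count lines in Python
    # instead of spawning grep and wc.
    out = subprocess.check_output(['ps', '-e'])
    return sum(1 for line in out.splitlines() if b'ftp' in line)

print(count_ftp_processes())
```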

Related

KeyError: 'print $2' when I use subprocess.call("ps -ef | grep wget | grep {0} | awk '{print $2}'

On macOS, I am using wget to download the www.test.com index page.
Then I have the Python code below:
#-*- coding:utf-8 -*-
import subprocess
url = "https://www.test.com"
subprocess.call("ps -ef | grep wget | grep {0} | awk '{print $2}'".format(url), shell=True)
When I run it, I get this error:
subprocess.call("ps -ef | grep wget | grep {0} | awk '{print $2}'".format(url), shell=True)
KeyError: 'print $2'
If I switch subprocess.call() to os.system(), I still get the same error.
Your error is not caused by subprocess but by the string formatting:
"ps -ef | grep wget | grep {0} | awk '{print $2}'".format(url)
str.format() treats {print $2} as a replacement field and looks up the key 'print $2', hence the KeyError. Either escape the literal braces as {{print $2}}, or use %-formatting:
"ps -ef | grep wget | grep %s | awk '{print $2}'" % (url)
You need to use subprocess.run().
From the docs:
Code needing to capture stdout or stderr should use run() instead.
Note: Do not use stdout=PIPE or stderr=PIPE with this function (subprocess.call). The child process will block if it generates enough output to fill the OS pipe buffer, because the pipes are not being read from.
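For example, a hedged sketch on Python 3.7+ (capture_output was added in 3.7):

```python
import subprocess

# subprocess.run() reads the pipes for you via communicate(), so it
# does not have the deadlock risk the note above describes for call().
result = subprocess.run("echo hello | awk '{print $1}'",
                        shell=True, capture_output=True, text=True)
print(result.stdout.strip())
```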

how to kill python program in linux using name of the program

When I use ps -ef | grep, I get the currently running programs.
Suppose the programs shown below are currently running. How can I stop a program using its name?
user 8587 8577 30 12:06 pts/9 00:03:07 python3 program1.py
user 8588 8579 30 12:06 pts/9 00:03:08 python3 program2.py
E.g. if I want to stop program1.py, how can I stop the process using the program name "program1.py"? Any suggestions on killing the program from Python would also be great.
With psutil this is fairly easy:
import psutil

# find processes whose command line contains the script name
procs = [p for p in psutil.process_iter() if 'program1.py' in p.cmdline()]
if procs:
    procs[0].kill()
To find the process from its name, filter the process list with psutil, as in Cross-platform way to get PIDs by process name in python.
Try doing this with the process name:
pkill -f "Process name"
For eg. If you want to kill the process "program1.py", type in:
pkill -f "program1.py"
Let me know if it helps!
Assuming you have pkill utility installed, you can just use:
pkill program1.py
If you don't, using more common Linux commands:
kill $(ps -ef | grep program1.py | awk '{print $2}')
If you insist on using Python for that, see How to terminate process from Python using pid?
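If you do want the pure-Python route by PID, a minimal sketch (spawning a throwaway sleep child just to have something to kill):

```python
import os
import signal
import subprocess

# Start a disposable child process, then terminate it by PID with
# os.kill, which is what the linked question boils down to.
child = subprocess.Popen(['sleep', '60'])
os.kill(child.pid, signal.SIGTERM)
child.wait()
# On POSIX, returncode is the negated signal number for a
# signal-terminated child.
print(child.returncode)
```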
grep the program and pipe the output into the next command:
1. list the processes: ps -ef
2. search for the program: grep program
3. remove the grep process itself, which appears in its own search results: grep -v grep
4. extract the PID with awk: awk '{ print $2 }'
5. apply kill to the previous output: xargs kill -9
ps -ef | grep program | grep -v grep | awk '{ print $2 }' | xargs kill -9
See here for more:
about pipe, awk, xargs
With Python you can use os.system:
import os
template = "ps -ef | grep {program} | grep -v grep | awk '{{ print $2 }}' | xargs kill -9"
os.system(template.format(program="work.py"))

Get source file path of a running python script from process id

I have a Python process running in the background. With ps -ef I can see the file name in the running command: UID PID PPID ... python ./filename.py
How can I find out where the file is located?
pwdx <PID> gives the full directory the process is running from.
So, the full script would be
ps -ef | grep 'your process' | awk '{print $2}' | xargs pwdx
Though, you can simplify this into
pgrep 'your process' | awk '{print $1}' | xargs pwdx
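On Linux you can read the same information from /proc directly; a minimal sketch (process_cwd is a name invented here) that mirrors what pwdx prints. Note this is the process's working directory, which matches the script's location only if it was started from there (e.g. as ./filename.py):

```python
import os

def process_cwd(pid):
    # Linux-only: /proc/<pid>/cwd is a symlink to the directory the
    # process is running from (the same thing pwdx resolves).
    return os.readlink('/proc/%d/cwd' % pid)

print(process_cwd(os.getpid()))
```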

Combine all the kill bash command in one line

I am using the following commands to kill my servers at the moment and want to combine them into one.
1- ps aux | grep 'python manage.py runserver'
sudo kill -9 $PID
2- ps aux | grep 'python -m SimpleHTTPServer'
sudo kill -9 $PID
3- ps aux | grep 'aptly api server'
sudo kill -9 $PID
Is there a way to kill all three processes with a single command, or at least to combine them?
EDIT: I am trying the command below, but it just prints out a single PID.
ps aux | egrep -i 'python manage.py runserver|aptly api serve|python -m SimpleHTTPServer' | awk '{print $2}' | xargs kill -9
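One hedged alternative, assuming pkill is available: call pkill -f once per pattern from Python (kill_all is a name made up for this sketch), which avoids the egrep/awk pipeline matching its own processes:

```python
import subprocess

def kill_all(patterns):
    # pkill -f matches against the full command line; -9 sends SIGKILL.
    # A return code of 1 just means "no process matched", so call()'s
    # nonzero status is not treated as an error here.
    for pat in patterns:
        subprocess.call(['pkill', '-9', '-f', pat])

# usage with the three server patterns from the question:
# kill_all(['python manage.py runserver',
#           'python -m SimpleHTTPServer',
#           'aptly api serve'])
```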

python subprocess.Popen vs os.popen

I'm trying to get the output of the following shell command in my python script,
hadoop fs -ls /projectpath/ | grep ^d | grep -v done | head -1 | awk {'print $8'}
I can successfully get the output through os.popen as follows:
import os
cmd = "hadoop fs -ls /projectpath/ | grep ^d | grep -v done | head -1 | awk {'print $8'}"
p = os.popen(cmd,"r")
while 1:
    line = p.readline()
    if not line: break
    print line
But os.popen() has been deprecated since Python 2.6, so I wanted to replace the snippet above with the subprocess.Popen() function.
However, the subprocess.Popen() snippet below gives a different result than the snippet above.
import subprocess as sub
import shlex
cmd = "hadoop fs -ls /projectpath/ | grep ^d | grep -v done | head -1 | awk {'print $8'}"
args = shlex.split(cmd)
p = sub.Popen(args,stdout=sub.PIPE,stderr=sub.PIPE)
output, errors = p.communicate()
print output
The above code only gives output for the 'hadoop fs -ls /projectpath/' part of the command.
I have tried consulting several references (http://docs.python.org/2/library/subprocess.html#popen-objects, Python, os.system for command-line call (linux) not returning what it should?) for subprocess.Popen() but cannot get it to execute the command in the string cmd. Can anyone point out what I'm doing wrong?
try this:
cmd = "hadoop fs -ls /projectpath/ | grep ^d | grep -v done | head -1 | awk {'print $8'}"
p = sub.Popen(cmd,stdout=sub.PIPE,stderr=sub.PIPE, shell=True)
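If you'd rather avoid shell=True, the same effect can be had by chaining Popen objects, each stage's stdout wired to the next stage's stdin. A minimal two-stage sketch (printf stands in for the hadoop command, which is not assumed to be installed):

```python
import subprocess

# Stage 1: produce two lines of sample output.
p1 = subprocess.Popen(['printf', 'drwx done\ndrwx keep\n'],
                      stdout=subprocess.PIPE)
# Stage 2: filter out lines containing "done", reading from stage 1.
p2 = subprocess.Popen(['grep', '-v', 'done'],
                      stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # allow p1 to receive SIGPIPE if p2 exits early
out, _ = p2.communicate()
print(out)
```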
