I have a running Python file "cepusender/main.py" (and other Python files). How can I restart/kill only main.py?
Here's a way (there are many):
ps -ef | grep 'cepusender/main.py' | grep -v grep | awk '{print $2}' | xargs kill
ps is the process snapshot command. -e prints every process on the system, and -f prints the full-format listing, which, germanely, includes the command line arguments of each process.
grep prints lines matching a pattern. We first grep for your file, which will match both the python process and the grep process. We then grep -v (invert match) for grep, paring output down to just the python process.
Output now looks like the following:
user 77864 68024 0 13:53 pts/4 00:00:00 python file.py
Next, we use awk to pull out just the second column of the output, which is the process ID or PID.
Finally we use xargs to pass the PID to kill, which asks the python process to shut down gracefully.
kill is the command to send signals to processes.
You can use kill -9 PID to kill your python process, where 9 is the number for SIGKILL and PID is the python Process ID.
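The same pipeline can be driven entirely from Python. A minimal sketch, assuming a throwaway sleep 60 child process stands in for main.py:

```python
import os
import signal
import subprocess
import time

# start a throwaway child to act as "main.py" (placeholder command)
child = subprocess.Popen(["sleep", "60"])
time.sleep(0.2)  # give the child a moment to show up in ps output

# replicate: ps -ef | grep 'sleep 60' | grep -v grep | awk '{print $2}'
ps = subprocess.run(["ps", "-ef"], capture_output=True, text=True)
pids = [int(line.split()[1]) for line in ps.stdout.splitlines()
        if "sleep 60" in line and "grep" not in line]

# replicate: xargs kill (SIGTERM asks each process to shut down gracefully)
for pid in pids:
    os.kill(pid, signal.SIGTERM)

child.wait()
```

Swap signal.SIGTERM for signal.SIGKILL only as a last resort, for the same reason kill -9 is a last resort in the shell.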
Related
Killing all Celery processes involves grepping the 'ps' output and running 'kill' on every PID found. Grepping the 'ps' output also matches the grep process itself; to skip that PID, 'grep -v grep' is added to the pipeline, inverting the match on the search key 'grep'. The 'awk' command extracts only the second column (the PID), and 'tr' translates the newline-separated awk output from rows into a single space-separated row. Piping straight into 'kill' did not work, so the whole pipeline is wrapped in a command substitution, i.e. '$()'. The redirection at the end is not mandatory; it only suppresses the command's output.
kill -9 $(ps aux | grep celery | grep -v grep | awk '{print $2}' | tr '\n' ' ') > /dev/null 2>&1
Apart from the standard unix ways of killing celery processes, Celery also provides API to kill all the workers listening on a particular broker.
To kill from Python, you can use Celery's remote control API (see the Celery docs; the shortcut method calls the underlying broadcast function internally):
app.control.shutdown()
where app is the celery app instance configured with the broker.
The Celery command-line interface can also be used for the same:
celery -A app_name control shutdown
I'm trying to kill a specific process with a command that works well in the shell but not from Python's subprocess:
import subprocess
subprocess.call(["kill", "$(ps | grep process | awk '{ print $1}' | head -n1)"], shell=False)
A work around would be to put this command into a shell script and run the shell script.
Is it possible from python subprocess directly ?
import subprocess
subprocess.Popen("kill $(ps | grep ncat | awk '{print $1}' | head -n1)", shell=True)
In your example, you don't create the process via a shell, yet you use $(...), which is shell syntax. To be precise, you create a kill process and pass it the literal argument $(...), which is never expanded.
The example above creates a shell process (with shell=True, this is /bin/sh) and tells it to interpret kill $(...). The shell expands $(...) to a value, and then it runs kill VALUE.
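The difference is easy to see with echo standing in for kill (a sketch; /bin/sh expands $(...) just like bash here):

```python
import subprocess

# shell=False: the $(...) text is passed to echo literally, never expanded
literal = subprocess.run(["echo", "$(echo hi)"],
                         capture_output=True, text=True)

# shell=True: a shell interprets the string, expanding $(...) first
expanded = subprocess.run("echo $(echo hi)", shell=True,
                          capture_output=True, text=True)

print(literal.stdout)   # $(echo hi)
print(expanded.stdout)  # hi
```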
I know how to check whether some Python process is running. I am trying to write a script that checks whether a Python script is running; if it is not, the script should rerun it.
What I have right now is:
import os
stream = os.popen("ps aux | grep combined.py")
output = stream.read()
print(output[0])
The problem is that I can't get the specific process ID this way, because the output is a string of characters, not a dict where I could look up the PID with output["PID"] and check whether the process is in the list.
How would I implement such a script?
In bash script:
#!/bin/bash
pid=`ps -ef |grep combined.py |grep -v grep |awk '{print $2}'`
echo $pid
You can use crontab to run the bash script and check every few minutes whether the Python process is running.
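The same check-and-restart logic can also be written entirely in Python. A sketch, where combined.py is the hypothetical script to watch:

```python
import subprocess

def find_pids(name):
    """Return PIDs of processes whose command line contains `name`."""
    out = subprocess.run(["ps", "-ef"], capture_output=True, text=True).stdout
    return [int(line.split()[1]) for line in out.splitlines()
            if name in line and "grep" not in line and "ps -ef" not in line]

# hypothetical usage: rerun combined.py when it is no longer running
# if not find_pids("combined.py"):
#     subprocess.Popen(["python3", "combined.py"])
```

Running this from cron every few minutes gives the same effect as the bash version, with the PID list available as Python integers instead of raw text.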
When I use ps -ef | grep I get the currently running programs.
Suppose the programs shown below are currently running. How can I stop a program using its name?
user 8587 8577 30 12:06 pts/9 00:03:07 python3 program1.py
user 8588 8579 30 12:06 pts/9 00:03:08 python3 program2.py
E.g. if I want to stop program1.py, how can I stop the process using the program name "program1.py"? Any suggestions for killing the program with Python would also be great.
Using psutil this is fairly easy:
import psutil
# match any process whose argument list contains 'program.py' exactly
procs = [p for p in psutil.process_iter() if 'program.py' in p.cmdline()]
procs[0].kill()
To find the process from the process name, filter the process list with psutil, as in Cross-platform way to get PIDs by process name in python.
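Without psutil, the same lookup can be done on Linux by scanning /proc directly. A Linux-only sketch (on other platforms, stick with psutil):

```python
import os

def pids_matching(fragment):
    """Return PIDs whose /proc/<pid>/cmdline contains `fragment` (Linux only)."""
    matches = []
    for entry in os.listdir("/proc"):
        if not entry.isdigit():
            continue  # skip non-process entries like /proc/meminfo
        try:
            with open(f"/proc/{entry}/cmdline", "rb") as f:
                # argv entries are NUL-separated in /proc
                cmdline = f.read().replace(b"\0", b" ").decode(errors="replace")
        except OSError:
            continue  # process exited or permission denied
        if fragment in cmdline:
            matches.append(int(entry))
    return matches
```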
Try doing this with the process name:
pkill -f "Process name"
For eg. If you want to kill the process "program1.py", type in:
pkill -f "program1.py"
Let me know if it helps!
Assuming you have the pkill utility installed, you can just use:
pkill -f program1.py
(The -f flag matches against the full command line; without it, pkill would only match the process name, which here is python3, not program1.py.)
If you don't, using more common Linux commands:
kill $(ps -ef | grep program1.py | grep -v grep | awk '{print $2}')
If you insist on using Python for that, see How to terminate process from Python using pid?
Grep for the program, then pipe each command's output into the next:
1. List the processes: ps -ef.
2. Search for the program: grep program.
3. Remove the grep process itself, which appears in its own search results: grep -v grep.
4. Extract the PID to kill with awk: awk '{ print $2 }'.
5. Apply kill to the previous input with xargs: xargs kill -9.
ps -ef | grep program | grep -v grep | awk '{ print $2 }' | xargs kill -9
With Python you can use os:
import os

template = "ps -ef | grep {program} | grep -v grep | awk '{{ print $2 }}' | xargs kill -9"
os.system(template.format(program="work.py"))
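Since the program name is spliced into a shell pipeline, it is worth quoting it before formatting. A sketch using shlex.quote (the space-containing name is hypothetical):

```python
import shlex

template = "ps -ef | grep {program} | grep -v grep | awk '{{ print $2 }}' | xargs kill -9"

# a plain name passes through unchanged
safe = template.format(program=shlex.quote("work.py"))

# a name with spaces or shell metacharacters gets wrapped in quotes
quoted = template.format(program=shlex.quote("my app.py"))
print(quoted)
```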
$1 &
echo $!
Is there a different way to launch a command in the background and return the PID immediately?
So when I launch bash run.sh "python worker.py" it will give me the PID of the launched job.
I am using Paramiko, a Python library that doesn't work with python worker.py &, so I want to create a bash script that does this for me on the remote server.
Since you're using bash, you can just get the list of background processes from jobs, and instruct it to return the PID via the -l flag. To quote man bash:
jobs [-lnprs] [ jobspec ... ]
jobs -x command [ args ... ]
The first form lists the active jobs. The options have the
following meanings:
-l List process IDs in addition to the normal information.
So in your case, something like
jobs -l | grep 'worker.py' | awk '{print $2}' would probably give you what you want.
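If you control the Python side, subprocess.Popen returns the PID directly, with no shell or jobs parsing involved. A sketch, with sleep standing in for worker.py:

```python
import signal
import subprocess

# launch the job in the background; Popen does not block
proc = subprocess.Popen(["sleep", "60"])  # placeholder for: python worker.py
print(proc.pid)  # the PID is available immediately

# later: ask it to shut down gracefully
proc.terminate()  # sends SIGTERM
proc.wait()
```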