How to kill a process in Windows without killing the current process - python

Hello, I just want to kill a process without killing the current process. Suppose I have a Python script that reads a string from a txt file, e.g.
import os

while True:
    with open('input.txt', 'r') as re:
        all_data = re.read()
    if all_data == 'pause':
        os.system('kill.bat')
    else:
        print("\nContinue\n")
Here it reads input.txt and, if the contents equal 'pause', it runs kill.bat. I would like to restart this script, so I wrote another script, kill.bat, which should restart it. The problem is that it does not restart, because the taskkill also kills the kill.bat process itself. I only want to kill the Python terminal and restart it. How can I do that? Here is the kill.bat code:
taskkill /IM cmd.exe
python main.py
main.py is my Python file.

Nice try, #Pryanshu.
You're doing it right. The problem with your current approach is that Python and the batch script run under the same process tree and get killed at the same time.
So just change the approach a bit. Here's the gist:
python:
    get the current process ID
    while loop:
        read the file
        if it matches the condition:
            call kill.bat as a separate process and pass it this Python process ID
batch:
    get the first parameter (the ID of the Python process to kill)
    kill that process via taskkill
    start the Python script as a separate process
    exit
Helpers
get current process ID - pid = os.getpid()
pass the ID to the batch script - os.system("kill.bat %s" % pid)
get arguments in the batch script - %1 contains the first argument passed; by default %* contains all the arguments passed to the script
start a program as a separate process from Python - use the subprocess package
start a program as a separate process from cmd - use the start command
kill a process via taskkill - taskkill /F /PID <pid> (replace <pid> with your argument)
I know you can handle the code part. You'll have to make use of the Helpers and combine them to do the task. Lemme know the output.
If you have any questions, do comment. I'll try to reply ASAP.
Cheers
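For reference, here is one way the helpers above could fit together. This is only a sketch: the file names come from the question, while the cmd /c invocation and the CREATE_NEW_CONSOLE flag used to detach kill.bat are my assumptions.

main.py

import os
import subprocess
import time

pid = os.getpid()  # ID of this Python process

while True:
    with open('input.txt', 'r') as f:
        all_data = f.read().strip()
    if all_data == 'pause':
        # run kill.bat in its own console so it survives this process,
        # passing our PID as the first argument
        subprocess.Popen(['cmd', '/c', 'kill.bat', str(pid)],
                         creationflags=subprocess.CREATE_NEW_CONSOLE)
        break  # kill.bat will terminate this process and restart main.py
    print("\nContinue\n")
    time.sleep(1)  # avoid re-reading the file in a tight loop

kill.bat

@echo off
rem %1 is the PID passed in by main.py
taskkill /F /PID %1
start "" python main.py
exit

Because kill.bat runs in its own console, the taskkill no longer takes the batch script down together with the Python process.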

Related

End execution of a subprocess when the Python script is closed

I'm running a Python script from my NGINX server that runs this command
subprocess.call(["sh", "/runscript.sh", arg1, arg2, arg3], shell=False)
The problem is that when my server kills the script execution, the spawned subprocess doesn't stop; it just runs forever.
That's a huge problem.
I've already tried switching between shell=True and shell=False.
EDIT
I've moved the code from the sh script into the Python script.
So now the process starts directly from subprocess.call.
Is there a way to save the PIDs of processes started from subprocess.call and end them when the task no longer has input?
First, you can:
import os

os.system("tasklist > task.temp")
with open("task.temp", "r") as f:
    print(f.read())
task = input("Enter process: ")
os.system("taskkill /f /im " + task)
After you find out which task name belongs to runscript.sh, you can replace it with:
import os
os.system("taskkill /f /im "+task)
Where task is the name of the process you want to terminate.
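If you would rather avoid parsing tasklist output, a different sketch (my own suggestion, not part of the original answer) is to keep the Popen handle so the parent can terminate the child itself. Note that an atexit handler only runs on a normal exit; it will not fire if the parent is killed with SIGKILL.

import atexit
import subprocess

# placeholder arguments standing in for arg1, arg2, arg3 from the question
args = ["arg1", "arg2", "arg3"]

proc = subprocess.Popen(["sh", "/runscript.sh"] + args)
print("child PID:", proc.pid)

# terminate the child when this script exits normally
atexit.register(proc.terminate)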

How to stop a python program that is running in the background shell [duplicate]

I've got a long running python script that I want to be able to end from another python script. Ideally what I'm looking for is some way of setting a process ID to the first script and being able to see if it is running or not via that ID from the second. Additionally, I'd like to be able to terminate that long running process.
Any cool shortcuts exist to make this happen?
Also, I'm working in a Windows environment.
I just recently found an alternative answer here: Check to see if python script is running
You could get your own PID (Process Identifier) through
import os
os.getpid()
and to kill a process in Unix
import os, signal
os.kill(5383, signal.SIGKILL)
to kill in Windows use
import subprocess as s

def killProcess(pid):
    s.Popen('taskkill /F /PID {0}'.format(pid), shell=True)
You can send the PID to the other program, or you could search the process list to find the name of the other script and kill it with the above function.
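For the "search the process list" route, a sketch using the third-party psutil package (an assumption on my part, not part of the original answer; the script name below is a placeholder) could look like this:

import psutil

def kill_by_script_name(script_name):
    # Terminate every process whose command line mentions script_name,
    # e.g. kill_by_script_name("long_running_script.py")
    for proc in psutil.process_iter(["pid", "cmdline"]):
        cmdline = proc.info.get("cmdline") or []
        if any(script_name in part for part in cmdline):
            try:
                proc.terminate()
            except (psutil.NoSuchProcess, psutil.AccessDenied):
                pass  # process already gone or not ours to kill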
I hope that helps you.
You're looking for the subprocess module.
import subprocess as sp
extProc = sp.Popen(['python','myPyScript.py']) # runs myPyScript.py
status = sp.Popen.poll(extProc) # status should be 'None'
sp.Popen.terminate(extProc) # closes the process
status = sp.Popen.poll(extProc) # status should now be something other than 'None' ('1' in my testing)
subprocess.Popen starts the external Python script, equivalent to typing 'python myPyScript.py' in a console or terminal.
The status from subprocess.Popen.poll(extProc) will be None if the process is still running, and (for me) 1 if it has been closed from within this script. I'm not sure what the status is if it has been closed in another way.
This worked for me under Windows 11 and PyQt5:
app = subprocess.Popen('python3 MySecondApp.py')
app.terminate()
where app is the Popen handle created in MyFirstApp.py (the caller script that is running) to launch MySecondApp.py (the called script).

Start subprocess that does not block files the parent redirects to

I'm trying to spawn a subprocess that should still be running after the main process has closed. This part works fine, but if I redirect the output of this process to a file, I can't start the script a second time because the process still blocks the log file.
This short example demonstrates the problem:
In this case the second process is "notepad" and is started by "other.cmd". While the main process/script is "test_it.py" which is started by "start_it.cmd".
start_it.cmd
#python test_it.py > test.log
test_it.py
import subprocess
from subprocess import DEVNULL, STDOUT
subprocess.Popen(["other.cmd"], stdin=DEVNULL, stdout=DEVNULL, stderr=STDOUT)
other.cmd
start notepad
When start_it.cmd is executed the second time, it will fail with this error message "The process cannot access the file because it is being used by another process".
How can I start the subprocess so that it doesn't block the log file?
A solution using a pipe.
multiplexer.py
import sys

with open('log.txt', 'a') as outputFile:
    while True:
        data = sys.stdin.read(1024)
        if not data:  # EOF: read() returns an empty string, not None
            break
        outputFile.write(data)
start_it.cmd
#python test_it.py | python multiplexer.py
Everything else stays the same.
I found a solution that is close to what I originally intended:
subprocess.Popen("explorer other.cmd", shell=True)
By letting Explorer start the .cmd file, this successfully detaches the called .cmd from the original process and thus doesn't keep the log file open.
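Another option I have seen suggested for similar situations (untested here, and an assumption on my part rather than part of the accepted answer): on Python 3.7+ you can pass explicit std handles together with close_fds=True and a detached creation flag, so the child should not inherit the parent's redirected log-file handle.

import subprocess
from subprocess import DEVNULL, STDOUT

# Detach other.cmd and give it explicit std handles; with close_fds=True
# (the default on Python 3.7+) only those handles are inherited, so the
# parent's redirected test.log handle should stay out of the child.
subprocess.Popen(
    ["other.cmd"],
    stdin=DEVNULL,
    stdout=DEVNULL,
    stderr=STDOUT,
    close_fds=True,
    creationflags=subprocess.DETACHED_PROCESS | subprocess.CREATE_NEW_PROCESS_GROUP,
)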

Issues with python scripts running simultaneously

I have two Python scripts that use two different cameras for a project I am working on, and I am trying to run them both from another script or from within each other; either way is fine.
import os
os.system('python 1.py')
os.system('python 2.py')
My problem, however, is that they don't run at the same time; I have to quit the first one for the next one to open. I also tried doing it with bash, using the & shell operator:
python 1.py &
python 2.py &
And this does in fact make them both run; however, the issue is that they then run endlessly in the background and I need to be able to close them easily. Any suggestions on how to avoid the issues with these implementations?
You could do it with multiprocessing
import os
import time
import psutil
from multiprocessing import Process

def run_program(cmd):
    # Function that the processes will run
    os.system(cmd)

# Initiating Processes with desired arguments
program1 = Process(target=run_program, args=('python 1.py',))
program2 = Process(target=run_program, args=('python 2.py',))

# Start our processes simultaneously
program1.start()
program2.start()

def kill(proc_pid):
    process = psutil.Process(proc_pid)
    for proc in process.children(recursive=True):
        proc.kill()
    process.kill()

# Wait 5 seconds and kill first program
time.sleep(5)
kill(program1.pid)
program1.join()

# Wait another 1 second and kill second program
time.sleep(1)
kill(program2.pid)
program2.join()

# Print current status of our programs
print('1.py alive status: {}'.format(program1.is_alive()))
print('2.py alive status: {}'.format(program2.is_alive()))
One possible method is to use systemd to control your process (i.e. treat them as daemons).
This is how I control my Python servers, since they need to run in the background and be completely detached from the current tty so I can exit my connection to the machine and the processes continue. You can then also stop the servers later using systemctl, as explained below.
Instructions:
Create a .service file and save it in /etc/systemd/system, with contents along the lines of:
[Unit]
Description=daemon one
[Service]
ExecStart=/path/to/1.py
and repeat with one going to 2.py.
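For example, a slightly fuller unit file (the interpreter location, script path, and extra options here are placeholders/assumptions; adjust them for your system) might look like:

[Unit]
Description=daemon one
After=network.target

[Service]
ExecStart=/usr/bin/python3 /path/to/1.py
Restart=on-failure

[Install]
WantedBy=multi-user.target

The [Install] section is only needed if you also want to enable the unit to start at boot with systemctl enable.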
Then you can use systemctl to control your daemons.
First reload all config files with:
systemctl daemon-reload
then start either of your daemons (where my_daemon.service is one of your unit files):
systemctl start my_daemon
it should now be running and you should find it in:
systemctl list-units
You can also check its status with:
systemctl status my_daemon
and stop/restart them with:
systemctl stop|restart my_daemon
Use subprocess.Popen. This will create a child process and return its pid.
from subprocess import Popen
pid = Popen(["python", "1.py"]).pid
And then check out the Popen methods (such as poll(), wait(), communicate(), and terminate()) for communicating with the child process and checking whether it is still running.
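A minimal sketch of that approach, using the script names from the question (the five-second wait is just for illustration):

import subprocess
import time

# start both scripts without waiting for either to finish
p1 = subprocess.Popen(["python", "1.py"])
p2 = subprocess.Popen(["python", "2.py"])

time.sleep(5)

# poll() returns None while a process is still running
print(p1.poll(), p2.poll())

# stop both processes and wait for them to exit
p1.terminate()
p2.terminate()
p1.wait()
p2.wait()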

Python script doesn't restart itself properly

I have a Python script and I want to have it restart itself. I found the following lines Googling around:
import os
import sys

def restart_program():
    """Restarts the current program.
    Note: this function does not return. Any cleanup action (like
    saving data) must be done before calling this function."""
    python = sys.executable
    os.execl(python, python, *sys.argv)
but problems became apparent right after trying this out. I'm running on a really small embedded system and I ran out of memory really quickly (after two or three iterations of this function). Checking the process list, I can see a whole bunch of Python processes.
Now, I realize I could check the process list and kill all processes that have a PID other than my own - is this what I have to do, or is there a better Python solution?
This spawns a new child process using the same invocation that was used to spawn the first process, but it does not stop the existing process (more precisely: the existing process waits for the child to exit).
The easier way would be to refactor your program so you don't have to restart it. Why do you need to do this?
I rewrote my restart function as follows; it kills every Python process other than itself before launching the new subprocess:
import os
import signal
import subprocess
import sys
# logger is assumed to be configured elsewhere in the script

def restart_program():
    """Restarts the current program.
    Note: this function does not return. Any cleanup action (like
    saving data) must be done before calling this function."""
    logger.info("RESTARTING SCRIPT")
    # command to extract the PIDs of all the python processes
    # in the process list
    CMD = "/bin/ps ax | grep python | grep -v grep | awk '{ print $1 }'"
    # execute the above command and redirect its stdout into a subprocess instance
    p = subprocess.Popen(CMD, shell=True, stdout=subprocess.PIPE)
    # read the output into a string (decode bytes to str on Python 3)
    pidstr = p.communicate()[0].decode()
    # load the PID string into a list by breaking at \n
    pidlist = pidstr.split("\n")
    # get the PID of this current process
    mypid = str(os.getpid())
    # iterate through the list, killing all leftover python processes other than this one
    for pid in pidlist:
        # find mypid
        if mypid in pid:
            logger.debug("THIS PID " + pid)
        else:
            # kill all others
            logger.debug("KILL " + pid)
            try:
                pidint = int(pid)
                os.kill(pidint, signal.SIGTERM)
            except:
                logger.error("CAN NOT KILL PID: " + pid)
    python = sys.executable
    os.execl(python, python, *sys.argv)
Not exactly sure if this is the best solution but it works for the interim anyways...
