Python interpreter waits for child process to die

Contents of check.py:
from multiprocessing import Process
import time
import sys

def slp():
    time.sleep(30)
    f = open("yeah.txt", "w")
    f.close()

if __name__ == "__main__":
    x = Process(target=slp)
    x.start()
    sys.exit()
On Windows 7, from cmd, if I call python check.py, it doesn't exit immediately but instead waits for 30 seconds. And if I kill cmd, the child dies too: no "yeah.txt" is created.
How do I ensure that the child continues to run even if the parent is killed, and also that the parent doesn't wait for the child process to end?

What you seem to want is to run your script as a background process. The solution in "How to start a background process in Python?" should do; you will have to add a command-line parameter that tells your script to run slp directly rather than spawning a new process.
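Along the lines of that answer, a minimal cross-platform sketch (the --worker flag and the spawn_detached helper are made-up names, not from the linked answer):

```python
import os
import subprocess
import sys

def spawn_detached(argv):
    """Launch argv detached from this process (helper name is made up).

    On Windows, DETACHED_PROCESS | CREATE_NEW_PROCESS_GROUP cut the child
    loose from the parent's console; on POSIX, start_new_session puts the
    child in its own session instead.
    """
    kwargs = {}
    if os.name == "nt":
        kwargs["creationflags"] = (subprocess.DETACHED_PROCESS
                                   | subprocess.CREATE_NEW_PROCESS_GROUP)
    else:
        kwargs["start_new_session"] = True
    return subprocess.Popen(argv, **kwargs)

# check.py would then relaunch itself with a flag instead of using
# multiprocessing:
#   spawn_detached([sys.executable, __file__, "--worker"])
# and, when invoked with "--worker", run the body of slp() directly.
```

The parent exits as soon as Popen returns; the detached child keeps running on its own.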

Have a look at the subprocess module instead.

Related

Python How to keep child process alive even after parent process dies..?

import sys, subprocess, os
path = 'child.exe path'
args = [path]
subprocess.Popen(args, creationflags=subprocess.DETACHED_PROCESS | subprocess.CREATE_NEW_PROCESS_GROUP)
sys.exit()
I ran this script, and child.exe died with the script; it didn't even look like it was executed. Is it possible to keep child.exe alive after the script dies? I am using Python 3.9.7.
I found a way that may not be good, but it worked:
subprocess.run(args)  # not Popen, not shell=True
sys.exit()
subprocess.run() blocks the thread until the child process ends, and during that time the child process kills the parent process. It did not work if I set shell=True for subprocess.run(). Please let me know if you have a better way. This was tested on Windows 10, Python 3.9.7.
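Whether this helps the asker's particular child.exe is untested, but a variant that is often suggested also redirects the standard handles, so the detached child holds no reference to the parent's dying console (a sketch; detach_kwargs is a made-up helper name):

```python
import os
import subprocess

def detach_kwargs():
    """Popen keyword arguments commonly combined to keep a child alive
    after the launching script exits (sketch; flag names are from the
    standard library).

    Redirecting stdin/stdout/stderr matters: a "detached" child that
    still inherits the parent's console handles can die with the console.
    """
    kw = dict(stdin=subprocess.DEVNULL,
              stdout=subprocess.DEVNULL,
              stderr=subprocess.DEVNULL,
              close_fds=True)
    if os.name == "nt":
        kw["creationflags"] = (subprocess.DETACHED_PROCESS
                               | subprocess.CREATE_NEW_PROCESS_GROUP)
    else:
        kw["start_new_session"] = True
    return kw

# usage: subprocess.Popen(args, **detach_kwargs()); sys.exit()
```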

not able to terminate the process in multiprocessing python (linux)

I am new to Python and multiprocessing. I am starting one process and calling a shell script from it. After terminating this process, the shell script keeps running in the background; how do I kill it? Please help.
Python script (test.py):
#!/usr/bin/python
import time
import os
import sys
import multiprocessing

# test process
def test_py_process():
    os.system("./test.sh")
    return

p = multiprocessing.Process(target=test_py_process)
p.start()
print 'STARTED:', p, p.is_alive()
time.sleep(10)
p.terminate()
print 'TERMINATED:', p, p.is_alive()
Shell script (test.sh):
#!/bin/bash
for i in {1..100}
do
    sleep 1
    echo "Welcome $i times"
done
The reason is that the child process spawned by the os.system call spawns a child process itself. As explained in the multiprocessing docs, descendant processes of the process will not be terminated – they will simply become orphaned. So p.terminate() kills the process you created, but the OS process (/bin/bash ./test.sh) is simply reparented to the system's init process and continues executing.
You could use subprocess.Popen instead:
import time
from subprocess import Popen

if __name__ == '__main__':
    p = Popen("./test.sh")
    print 'STARTED:', p, p.poll()
    time.sleep(10)
    p.kill()
    print 'TERMINATED:', p, p.poll()
Edit: @Florian Brucker beat me to it. He deserves the credit for answering the question first. I am still keeping this answer for the alternate approach using subprocess, which is recommended over os.system() in the documentation for os.system() itself.
os.system runs the given command in a separate process. Therefore, you have three processes:
The main process in which your script runs
The process in which test_py_process runs
The process in which the bash script runs
Process 2 is a child process of process 1, and process 3 is a child process of process 2.
When you call Process.terminate from within process 1, this sends the SIGTERM signal to process 2. That process will then terminate. However, the SIGTERM signal is not automatically propagated to the child processes of process 2! This means that process 3 is not notified when process 2 exits, and hence keeps on running as a child of the init process.
The best way to terminate process 3 depends on your actual problem setting, see this SO thread for some suggestions.
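One common suggestion from such threads, sketched for this example (the helper name run_with_group_kill is made up): start the child in its own process group and signal the whole group, so /bin/bash ./test.sh and its descendants all receive the signal.

```python
import os
import signal
import subprocess

def run_with_group_kill(argv, timeout):
    """Run argv in its own process group and, if it is still alive after
    `timeout` seconds, SIGTERM the whole group so that grandchildren
    (like the bash script's sleep) die too.  POSIX only."""
    p = subprocess.Popen(argv, start_new_session=True)
    try:
        p.wait(timeout=timeout)
    except subprocess.TimeoutExpired:
        # the child is the leader of its new group, so its pgid == its pid
        os.killpg(os.getpgid(p.pid), signal.SIGTERM)
        p.wait()
    return p.returncode

# e.g. run_with_group_kill(["./test.sh"], 10)
```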

How to Terminate a Python program before its child is finished running?

I have a script that is supposed to run 24/7 unless interrupted. This script is script A.
I want script A to call Script B, and have script A exit while B is running. Is this possible?
This is what I thought would work:
#script_A.py
while(1):
    do some stuff
    do even more stuff
    if true:
        os.system("python script_B.py")
        sys.exit(0)

#script_B.py
time.sleep(some_time)
do something
os.system("python script_A.py")
sys.exit(0)
But it seems as if A doesn't actually exit until B has finished executing (which is not what I want to happen).
Is there another way to do this?
What you are describing sounds a lot like a function call:
def doScriptB():
    # do some stuff
    # do some more stuff
    pass

def doScriptA():
    while True:
        # do some stuff
        if Your Condition:
            doScriptB()
            return

while True:
    doScriptA()
If this is insufficient for you, then you have to detach the process from your Python process. This normally involves spawning the process in the background, which in bash is done by appending an ampersand to the command:
yes 'This is a background process' &
and detaching that process from the current shell, which in a simple C program is done by forking the process twice. I don't know the Python incantation off-hand, but I would bet there is a module for it.
This way, when the calling Python process exits, it won't terminate the spawned child, since the child is now independent.
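The double fork mentioned above can in fact be written directly with os.fork in Python; a POSIX-only sketch (not a full daemon library, just the detachment step):

```python
import os
import sys

def daemonize():
    """Classic POSIX double fork: detach from the controlling terminal.

    After this returns, the caller is running in a process that is no
    longer a child of the original shell and, having given up session
    leadership in the second fork, can never re-acquire a terminal.
    """
    if os.fork() > 0:
        sys.exit(0)        # first parent returns control to the shell
    os.setsid()            # become session leader, drop the terminal
    if os.fork() > 0:
        sys.exit(0)        # session leader exits too
    # fully detached; redirect stdio and do the real work here
```

A real daemon would also chdir("/"), reset umask, and reopen the standard streams on /dev/null.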
It seems you want to detach a system call into another process.
script_A.py
import subprocess
import sys

while(1):
    do some stuff
    do even more stuff
    if true:
        child = subprocess.Popen([sys.executable, "script_B.py"])  # spawn subprocess
        sys.exit(0)
Anyway, it does not seem like good practice at all. Why not have script A listen to the process stack, and stop if it finds script B already running? This is another example of how you could do it:
import subprocess
import sys
import psutil

while(1):
    # This section queries the currently running processes
    for proc in psutil.process_iter():
        pinfo = proc.as_dict(attrs=['pid', 'name'])
        if pinfo['name'] == "script_B.py":
            sys.exit(0)
    do some stuff
    do even more stuff
    if true:
        child = subprocess.Popen([sys.executable, "script_B.py"])  # spawn subprocess
        sys.exit(0)

Running external program from my python program

I have a clock GUI program from which I need to run another Python program, but my clock stops, or I get a defunct process when I close the client program, and I need the parent program to continue running. What I've tried is:
os.system(run my program) "This stops the parent clock"
os.popen(run my program) "This stops the parent clock"
subprocess.call(run my program) "This stops the parent clock"
subprocess.Popen(run my program) "This works but when the client is closed goes defunct"
Is there a way to run my external program without stopping my clock and not leaving a defunct process?
Have you tried using the subprocess module with nohup attached?
Something like this should stop the signal from quitting the child process:
import subprocess
import os

def run_process(cmd):
    subprocess.Popen("nohup " + cmd, shell=True, executable='/bin/bash',
                     stdout=open('/dev/null', 'w'),
                     stderr=open('logfile.log', 'a'),
                     preexec_fn=os.setpgrp)

run_process('ls -R ~')  # or whatever you are trying to run
Here is what works for me:
import signal
signal.signal(signal.SIGCHLD, signal.SIG_IGN)
This works!
If you want to run different programs at the same time, you can use threads for each subprocess.
import threading
t = threading.Thread(target=myprogram)
t.start()

Python subprocess.call thread hang, subprocess.popen no hang

I am trying to automate the installation of a specific program using Sikuli and scripts on Windows 7. I need to start the program installer and then use Sikuli to step through the rest of the installation. I am doing this with Python 2.7.
This code works as expected: it creates a thread, calls the subprocess, and then continues the main process:
import subprocess
from threading import Thread

class Installer(Thread):
    def __init__(self):
        Thread.__init__(self)

    def run(self):
        subprocess.Popen(["msiexec", "/i", r"c:\path\to\installer.msi"], shell=True)

i = Installer()
i.run()
print "Will show up while installer is running."
print "Other things happen"
i.join()
This code does not behave as desired. It starts the installer but then hangs:
import subprocess
from threading import Thread

class Installer(Thread):
    def __init__(self):
        Thread.__init__(self)

    def run(self):
        subprocess.call(r"msiexec /i c:\path\to\installer.msi")

i = Installer()
i.run()
print "Will not show up while installer is running."
print "Other things happen"
i.join()
I understand that subprocess.call will wait for the process to terminate. Why does that prevent the main thread from continuing? Shouldn't the main thread continue execution immediately after the process call?
Why is there such a difference in behavior?
I have only just recently started using threads.
You're calling i.run(), but what you should be calling is i.start(). start() invokes run() in a separate thread, but calling run() directly will execute it in the main thread.
First.
You need to add command-line parameters to your install command to make it a silent install:
http://msdn.microsoft.com/en-us/library/aa372024%28v=vs.85%29.aspx
The subprocess is probably hung waiting for an install process that will never end, because the installer is waiting for user input.
Second.
If that doesn't work, you should be using Popen and communicate:
How to use subprocess popen Python
Third.
If that still doesn't work, your installer is hanging somewhere, and you should debug the underlying process there.
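Putting the first two suggestions together, a sketch (the /qn switch is msiexec's standard "no UI" option, so the install cannot block on a dialog; run_installer is a made-up helper name and the path is just the question's placeholder):

```python
import subprocess

def run_installer(cmd):
    """Run a command and drain its output with communicate(), so the
    child can never block on a full pipe while we wait for it."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    out, err = proc.communicate()   # waits for exit, reading both pipes
    return proc.returncode, out, err

# Windows-only example:
# rc, out, err = run_installer(
#     ["msiexec", "/i", r"c:\path\to\installer.msi", "/qn"])
```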
