I don't want to delete the temp file until the subprocess execution completes, so I invoke the subprocess script as follows:
import os
import tempfile
import subprocess

def main():
    with tempfile.NamedTemporaryFile("w", delete=False) as temp:
        temp.write("Hello World")
        temp.flush()
        print(f"Temp file is: {temp.name}")
        args = ["python3",
                os.path.dirname(__file__) + "/hello_world.py",
                "--temp-file", temp.name]
        subprocess.Popen(args)
    return

main()
hello_world.py
import argparse
import sys

def print_hello():
    print("Hello World")
    return

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="""Test case""")
    parser.add_argument('--temp-file',
                        required=True,
                        help='For test')
    args = parser.parse_args()
    print(args)
    print_hello()
    sys.exit(0)
I was hoping the temp file would be deleted once the subprocess execution finishes.
Do I need to manually delete the temp file in this case?
Calling subprocess.Popen() starts the process but does not wait for it to finish.
If you want to wait for the process to finish before exiting the with block, you can use subprocess.run() instead.
Edit: Per your comment, you don't want to wait for the process to finish. Since you are creating the file with delete=False, it won't be deleted when the file pointer is closed at the end of the with block, so you will need to manually delete the path, either in the parent or child process.
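For illustration, here is a minimal sketch of the manual-cleanup route, assuming the parent should wait for the child and then remove the file itself:

import os
import subprocess
import tempfile

with tempfile.NamedTemporaryFile("w", delete=False) as temp:
    temp.write("Hello World")
    temp.flush()
    # temp.name survives the with block because delete=False
    args = ["python3", "hello_world.py", "--temp-file", temp.name]

try:
    subprocess.run(args, check=True)  # blocks until the child exits
finally:
    os.unlink(temp.name)  # manual cleanup, since delete=False disables it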
I have a Flask Python file that I want to open when I start the main Python file.
The main Python file should:
- start the Flask Python file
- continue with its own independent processes (threading)
Which approach should I take, since I do not want the execution of the Flask app to hinder the performance of the later processes? I'm not sure whether I should use a subprocess or exec the file.
Both files are pretty independent of each other.
If I understood correctly, you can create a daemon thread for Flask and continue with the execution of the main program.
If you have an independent_module.py like this:
# independent_module.py

# your independent functions
def start():
    pass
Then your main file would look something like this:
# main.py file
import threading

from flask import Flask

import independent_module

app = Flask(__name__)

@app.route("/health")
def health():
    return "OK"

@app.route("/ping")
def ping():
    return "Et. Voila!!"

def run_server_api():
    app.run(host='0.0.0.0', port=8080)

def main():
    flask_thread = threading.Thread(target=run_server_api, daemon=True)
    flask_thread.start()
    # continue with your main program functions
    independent_module.start()

if __name__ == "__main__":
    # execute main
    main()
You can then simply execute python main.py.
I have a script main.py which calls a function fun from a library.
I want to exit only from fun while continuing the script main.py, using another script kill_fun.py for this purpose.
I tried using different bash commands (via os.system) with ps, but the PID it gives me refers only to main.py.
Example:
- main.py

from lib import fun

if __name__ == '__main__':
    try:
        fun()
    except:
        do_something
    do_something_else

- lib.py

def fun():
    do_something_of_long_time

- kill_fun.py

if __name__ == '__main__':
    kill_only_fun
You can do so by running fun in a different process.
from time import sleep
from multiprocessing import Process
# in your case: from lib import fun

def my_fun():
    tmp = 0
    for i in range(1000000):
        sleep(1)
        tmp += 1
        print('fun')
    return tmp

def should_i_kill_fun():
    try:
        with open('./kill.txt', 'r') as f:
            read = f.readline().strip()
            # print(read)
            return read == 'Y'
    except Exception:
        return False

if __name__ == '__main__':
    try:
        p = Process(target=my_fun, args=())
        p.start()
        while p.is_alive():
            sleep(1)
            if should_i_kill_fun():
                p.terminate()
    except Exception as e:
        print("do sth", e)
    print("do sth other thing")
To kill fun, simply run echo 'Y' > kill.txt, or you can write a Python script to write the file as well.
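For example, kill_fun.py could be as simple as this sketch (it just follows the kill.txt convention used above):

# kill_fun.py: ask the watcher loop above to terminate fun
with open('./kill.txt', 'w') as f:
    f.write('Y')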
Explanation
The idea is to start fun in a different process; p is a process handle that you can control. We then loop, checking the file kill.txt for the kill command 'Y'. If it is there, we call p.terminate(); the process gets killed and the script continues with the next steps.
Hope this helps.
I am trying to execute a script (say, main.py) from another file (say, test.py). I don't want to call any function of main; instead, I want to start executing the file from its if __name__ == "__main__": block.
Example:
test.py
def funA():
    ....

def funB():
    ....

# the last function of the file
def funC():
    # here I want to start executing main.py without calling any function
main.py
def fun_x(arg1):
    # do something

if __name__ == "__main__":  # execution starts here
    arg1 = 10
    fun_x(arg1)
Is it possible to call main directly like this?
You can just call the script "like a user would do" with the subprocess module.
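A minimal sketch of what funC could look like, assuming main.py sits in the same directory:

import subprocess
import sys

def funC():
    # run main.py as a separate process, so execution starts at its
    # if __name__ == "__main__": block, just like a user invoking it
    subprocess.run([sys.executable, "main.py"], check=True)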
I'm trying to create a script which uses the multiprocessing module. The script (let's call it myscript.py) will get its input from another script through a pipe.
Assume that I call the scripts like this:
$ python writer.py | python myscript.py
And here is the code:
# writer.py
import time, sys

def main():
    while True:
        print "test"
        sys.stdout.flush()
        time.sleep(1)

main()
# myscript.py
import sys
import time
from multiprocessing import Process

def get_input():
    while True:
        text = sys.stdin.readline()
        print "hello " + text
        time.sleep(3)

if __name__ == '__main__':
    p1 = Process(target=get_input, args=())
    p1.start()
This clearly does not work, since the sys.stdin objects are different in the main process and in p1. So I tried this to solve it:
# myscript.py
import sys
import time
from multiprocessing import Process

def get_input(temp):
    while True:
        text = temp.readline()
        print "hello " + text
        time.sleep(3)

if __name__ == '__main__':
    p1 = Process(target=get_input, args=(sys.stdin,))
    p1.start()
but I come across this error:
Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "in.py", line 12, in get_input
    text = temp.readline()
ValueError: I/O operation on closed file
So I guess the main process's stdin was closed and I can't read from it. At this point, how can I pass the main process's stdin to another process? If passing stdin is not possible, how can I use the main process's stdin from another process?
Update:
Okay, I need to clarify my question, since people think using multiprocessing is not really necessary.
Consider myscript.py like this:
# myscript.py
import sys
import time
from multiprocessing import Process

def get_input():
    while True:
        text = sys.stdin.readline()
        print "hello " + text
        time.sleep(3)

def do_more_things():
    while True:
        # some code here
        time.sleep(60 * 5)

if __name__ == '__main__':
    p1 = Process(target=get_input, args=())
    p1.start()
    do_more_things()
So, I really need to run the get_input() function in parallel with the main function (or other subprocesses).
Sorry for the confusion; I guess I could not make myself clear in this question. I would appreciate it if you could tell me whether I can use the main process's stdin object in another process.
Thanks in advance.
The simplest thing is to swap get_input() and do_more_things(), i.e., read sys.stdin in the parent process:
from __future__ import print_function  # for print() on Python 2

import sys
import multiprocessing as mp

def get_input(stdin):
    for line in iter(stdin.readline, ''):
        print("hello", line, end='')
    stdin.close()

if __name__ == '__main__':
    p1 = mp.Process(target=do_more_things)  # do_more_things from your script
    p1.start()
    get_input(sys.stdin)
The next best thing is to use a Thread() instead of a Process() for get_input():
from threading import Thread

if __name__ == '__main__':
    t = Thread(target=get_input, args=(sys.stdin,))
    t.start()
    do_more_things()
If the above doesn't help, you could try os.dup():
newstdin = os.fdopen(os.dup(sys.stdin.fileno()))
try:
    p = Process(target=get_input, args=(newstdin,))
    p.start()
finally:
    newstdin.close()  # close in the parent
do_more_things()
Each new process created with the multiprocessing module gets its own PID, and therefore its own standard input and output devices, even if they're both writing to the same terminal; hence the need for locks.
You're already creating two processes by separating the content into two scripts, and you create a third process with get_input(). get_input could read standard input if it were a thread instead of a process. Then there would be no need for a sleep function in the reader.
# reader.py
from threading import Thread
import sys

def get_input():
    text = sys.stdin.readline()
    while len(text) != 0:
        print 'hello ' + text
        text = sys.stdin.readline()

if __name__ == '__main__':
    thread = Thread(target=get_input)
    thread.start()
    thread.join()
This will only be a partial answer, as I'm unclear about subsequent parts of the question.
You start by saying that you anticipate calling your scripts:
$ python writer.py | python myscript.py
If you're going to do that, writer needs to write to standard out and myscript needs to read from standard input. The second script would look like this:
import sys
import time

def get_input():
    while True:
        text = sys.stdin.readline()
        print "hello " + text
        time.sleep(3)

if __name__ == '__main__':
    get_input()
There's no need for the multiprocessing.Process object... you're firing off two processes from the command line already, and you're using the shell to connect them with an (anonymous) pipe (the "|" character) that connects standard output from the first script to standard input from the second script.
The point of the Process object is to manage the launch of a second process from the first. You'd need to define a process, then start it; then you'd probably want to wait until it has terminated before exiting the first process (calling p1.join() after p1.start() would suffice for this).
If you want to communicate between a pair of processes under Python control, you'll probably want to use a multiprocessing.Pipe object to do so. You can then easily communicate between the initial process and the spawned subordinate process by reading and writing to/from the Pipe object rather than standard input and standard output. If you really want to redirect standard input and standard output, this is probably possible by messing with low-level file descriptors and/or by overriding/replacing the sys.stdin and sys.stdout objects... but, I suspect, you probably don't want (or need) to do this.
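For illustration, here is a minimal sketch of that Pipe approach; the child function and the sample messages are invented for the example:

from multiprocessing import Process, Pipe

def child(conn):
    # read from the pipe until the parent sends the None sentinel
    for line in iter(conn.recv, None):
        print('hello ' + line)
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=child, args=(child_conn,))
    p.start()
    for line in ('one', 'two', 'three'):
        parent_conn.send(line)
    parent_conn.send(None)  # tell the child to stop
    p.join()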
To read the piped-in input, use fileinput:
myscript.py
import fileinput

if __name__ == '__main__':
    for line in fileinput.input():
        # do stuff here
        process_line(line)