Run one Python code in the background from another Python code - python

I have two Python files, file 1 and file 2, that do two separate things. I want to run them together. I am using VS2017.
The pseudo code for file 1 is:
import subprocess

class A:
    def foo1(self):
        ...

    def foo2(self):
        # 'variable' is set elsewhere in the real code
        if variable < 30:
            # do this
            pass
        else:
            subprocess.Popen('py file2.py')
        # rest of the code for foo2()

if __name__ == "__main__":
    A().foo2()
Currently, with this layout, the subprocess does start file 2 and run it, but the rest of the code for foo2() after the if-else condition only runs once that process terminates (via another condition I have set up inside file 2).
I am trying to make file 2 start running in the background once the if-else condition is met, printing its output in the command window while the rest of file 1 keeps running, instead of pausing file 1 until file 2 is done. If subprocess can't do this, is there another way to start both files simultaneously while controlling the behaviour of file 2 by passing it the value of "variable"? I am trying to figure out a proper workaround.
I am new to Python.
EDIT 1:
I used the command:
process = subprocess.Popen('py file2.py', shell=True, stdin=None, stdout=None, stderr=None, close_fds=True)
Even if I use process.kill(), the subprocess still runs in the background. It won't quit even if I use the task manager.
I also wanted to pass a variable to the second file. I am looking into something like
variable = input("enter variable: ")
subprocess.Popen('py file2.py -a ' + variable, shell=True, stdin=None, stdout=None, stderr=None, close_fds=True)
But from what I have read, it seems I can only pass strings through a subprocess. Is that true?
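Command-line arguments are indeed always strings, so the child script has to convert the value back itself. A minimal sketch of one way to pass and recover the value (the -a flag matches the question, but the argparse handling inside file2.py is an assumption, not the asker's actual code); passing the command as a list also avoids the extra shell that shell=True introduces, which is one common reason process.kill() appears to leave the real child running:

# file1.py (sketch) - launch file2.py in the background with a value
import subprocess
import sys

variable = input("enter variable: ")

# A list avoids shell=True, so process.kill() targets file2.py itself,
# not an intermediate shell. sys.executable is an assumption standing
# in for the 'py' launcher used in the question.
process = subprocess.Popen([sys.executable, "file2.py", "-a", str(variable)])
# ... rest of file1 keeps running; Popen does not block ...

# file2.py (hypothetical sketch) - read the value back
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-a", type=int)   # convert the string back to a number
args = parser.parse_args()
print("received:", args.a)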

I believe you can do this with both multithreading and multiprocessing. If you want to start them both right away and then monitor the variable, you can connect them with a pipe or queue.
starting when triggered:
from file2 import your_func   # note: the module name has no ".py"
import threading

class A:
    def foo1(self):
        ...

    def foo2(self):
        if variable < 30:
            # do this
            pass
        else:
            # put something here to make sure it only starts once
            t = threading.Thread(target=your_func)
            t.start()
        # rest of the code for foo2()

if __name__ == "__main__":
    A().foo2()
starting right away:
from file2 import your_func
import threading
from queue import Queue

class A:
    def foo1(self):
        ...

    def foo2(self, your_queue):
        if variable < 30:
            # do this
            pass
        else:
            your_queue.put(variable)
        # rest of the code for foo2()

if __name__ == "__main__":
    your_queue = Queue()
    t = threading.Thread(target=your_func, args=(your_queue,))
    t.start()
    A().foo2(your_queue)

Related

Cannot kill a loading animation when using multiprocessing

I'm trying to use multiprocessing to run multiple scripts. At the start, I launch a loading animation, however I am unable to ever kill it. Below is an example...
Animation: foo.py
import sys
import time
import itertools

# Simple loading animation that runs infinitely.
for c in itertools.cycle(['|', '/', '-', '\\']):
    sys.stdout.write('\r' + c)
    sys.stdout.flush()
    time.sleep(0.1)
Useful script: bar.py
from time import sleep

# Stand-in for a script that does something useful.
sleep(5)
Attempt to run them both:
import multiprocessing
from multiprocessing import Process
import subprocess

pjt_dir = "/home/solebay/path/to/project"  # Setup paths..
foo_path = pjt_dir + "/foo.py"             # ..
bar_path = pjt_dir + "/bar.py"             # ..

def run_script(path):                  # Simple function that..
    """Launches python scripts."""     # ..allows me to set a..
    subprocess.run(["python", path])   # ..script as a process.

foo_p = Process(target=run_script, args=(foo_path,))  # Define the processes..
bar_p = Process(target=run_script, args=(bar_path,))  # ..
foo_p.start()  # start loading animation
bar_p.start()  # start 'useful' script
bar_p.join()   # Wait for useful script to finish executing
foo_p.kill()   # Kill loading animation
I get no error messages, and (my_venv) solebay#computer:~$ comes up in my terminal, but the loading animation persists (clipping over my name and environment). How can I kill it?
I've run into a similar situation before where I couldn't terminate the program using ctrl + c. The issue is (more or less) solved by using daemonic processes/threads (see the multiprocessing docs). To do this, you simply change
foo_p = Process(target=run_script, args=(foo_path,))
to
foo_p = Process(target=run_script, args=(foo_path,), daemon=True)
and similarly for any other child processes that you would like to create.
That said, I am not entirely sure whether this is the correct way to fix not being able to terminate a multiprocessing program, or whether it is just an artifact that happens to help here. I would suggest this thread, which discusses daemon threads in more detail. But essentially, from my understanding, daemon processes and threads are terminated automatically whenever their parent process terminates, regardless of whether they have finished. If a child is not daemonic, you somehow have to wait for it to finish before the program can fully terminate.
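A minimal sketch of that behaviour (the spinner function here is a hypothetical stand-in for the question's loading animation, not its actual code):

from multiprocessing import Process
import time

def spinner():
    while True:
        print("still spinning...")
        time.sleep(0.5)

if __name__ == "__main__":
    p = Process(target=spinner, daemon=True)
    p.start()
    time.sleep(2)   # pretend to do the useful work here
    # No join() or kill() needed: the daemonic child is terminated when the parent exits.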
You are creating too many processes. These two lines:
foo_p = Process(target=run_script, args=(foo_path,)) # Define the processes..
bar_p = Process(target=run_script, args=(bar_path,)) # ..
create two new processes. Let's call them "A" and "B". Each process runs this function:
def run_script(path):                  # Simple function that..
    """Launches python scripts."""     # ..allows me to set a..
    subprocess.run(["python", path])   # ..script as a process.
which then creates another subprocess. Let's call those two processes "C" and "D". In all, you have created 4 extra processes, instead of just the 2 that you need. It is actually process "C" that's producing the output on the terminal. This line:
bar_p.join()
waits for "B" to terminate, which implies that "D" has terminated. But this line:
foo_p.kill()
kills process "A" but orphans process "C". So the output to the terminal continues forever.
This is well documented - see the description of multiprocessing.Process.terminate, which says:
"Note that descendant processes of the process will not be terminated – they will simply become orphaned."
The following program works as you intended, exiting gracefully from the second process after the first one has finished. (I renamed "foo.py" to useless.py and "bar.py" to useful.py, and made small changes so I could run it on my computer.)
import subprocess
import os

def run_script(name):
    s = os.path.join(r"c:\pyproj310\so", name)
    return subprocess.Popen(["py", s])

if __name__ == "__main__":
    useless_p = run_script("useless.py")
    useful_p = run_script("useful.py")
    useful_p.wait()    # Wait for useful script to finish executing
    useless_p.kill()   # Kill loading animation
You can't use subprocess.run() to launch the new processes, since that function blocks the main script until the process completes, so I used Popen instead. I also placed the running code under an if __name__ == "__main__" guard, which is good practice (and may be necessary on Windows).
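To make the blocking difference concrete, a small sketch (the inline sleep command and the timings in the comments are illustrative assumptions):

import subprocess
import sys
import time

start = time.time()
subprocess.run([sys.executable, "-c", "import time; time.sleep(2)"])
print("run() returned after %.1f s" % (time.time() - start))    # ~2.0 s: run() blocks

start = time.time()
p = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(2)"])
print("Popen() returned after %.1f s" % (time.time() - start))  # ~0.0 s: Popen() does not block
p.wait()                                                        # block only when you choose to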

Sharing variable values between Python scripts?

For a Raspberry Pi-based project I'm working on, I want to have a main program and a "status checker" secondary script. In other words, when the first program is started, I want it to start as a background service and kick me back out to Terminal, and then the secondary program can be used to check the first program's status/progress.
I need the main program to send variable values to the status checking script, which will then print them to Terminal. I found this old post, but it doesn't seem to work.
I modified the code a bit from the old post; here it is. The from main import f doesn't just import the function, it seems to run all of main.py. I added the for loop in main.py as a placeholder for the stuff I would be doing in the main script.
# main.py
from multiprocessing import Process, Pipe
import time

def f(child_conn):
    msg = "Hello"
    child_conn.send(msg)
    child_conn.close()

for i in range(1000000):
    print(i)
    time.sleep(0.05)
# second.py
from multiprocessing import Process, Queue, Pipe
from main import f

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print(parent_conn.recv())  # prints "Hello"
The problem is that when second.py imports f from main.py, it runs everything at main.py's global scope, including the for loop. If you remove those global-scope lines, you can see that your process and pipe do work. If you want to keep that part as well, you can guard it like this:
if __name__ == '__main__':
    for i in range(1000000):
        print(i)
        time.sleep(0.05)
Refer to this answer for why this is the case:
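For completeness, a sketch of main.py with the loop moved under the guard (the same code from the question, just restructured), so that second.py can import f without triggering the loop:

# main.py (restructured sketch)
from multiprocessing import Process, Pipe
import time

def f(child_conn):
    msg = "Hello"
    child_conn.send(msg)
    child_conn.close()

if __name__ == '__main__':
    # Runs only when main.py is executed directly, not when second.py imports f.
    for i in range(1000000):
        print(i)
        time.sleep(0.05)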

How to run two python scripts simultaneously from a master script

I have two independent scripts that each run in an infinite loop. I need to call both of them from another master script and make them run simultaneously, producing results at the same time.
Here are some scripts
script1.py
y = 1000000000
while True:
    y = y - 1
    print("y is now: ", y)
script2.py
x = 0
while True:
    x = x + 1
    print("x is now: ", x)
The aim is to compile the master script into one console application with PyInstaller.
You can use Python's multiprocessing module.
import os
from multiprocessing import Process

def script1():
    os.system("python script1.py")   # os.system blocks until the script exits

def script2():
    os.system("python script2.py")

if __name__ == '__main__':
    p = Process(target=script1)
    q = Process(target=script2)
    p.start()
    q.start()
    p.join()
    q.join()
Note that print statements might not be an accurate way to check the parallelism of the processes.
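A more direct check (this sketch is not part of the answer; the worker function and timings are assumptions) is to have each process report its own PID and a timestamp, so overlapping execution is visible regardless of the order the prints appear in:

import os
import time
from multiprocessing import Process

def work(name):
    for _ in range(3):
        # Overlapping timestamps from two different PIDs show real parallelism.
        print("%s pid=%d t=%.2f" % (name, os.getpid(), time.time()))
        time.sleep(0.5)

if __name__ == '__main__':
    a = Process(target=work, args=("a",))
    b = Process(target=work, args=("b",))
    a.start(); b.start()
    a.join(); b.join()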
Python scripts are executed when imported.
So if you really want to keep your two scripts untouched, you can import each one of them in a separate thread or process, as shown below.
from threading import Thread

def one(): import script1
def two(): import script2

Thread(target=one).start()
Thread(target=two).start()
Analogously, if you want two processes instead of threads:
from multiprocessing import Process

def one(): import script1
def two(): import script2

Process(target=one).start()
Process(target=two).start()
Wrap scripts' code in functions so that they can be imported.
def main():
    # script's code goes here
    ...
Use an if __name__ == '__main__' guard to keep the ability to run them as scripts.
if __name__ == '__main__':
    main()
Use multiprocessing or threading to run the imported functions, as in the sketch below.
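A sketch of the resulting master script, assuming both scripts now expose a main() function as described (the module names follow the question):

# master.py (sketch)
from multiprocessing import Process
import script1
import script2

if __name__ == '__main__':
    p = Process(target=script1.main)
    q = Process(target=script2.main)
    p.start()
    q.start()
    p.join()
    q.join()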
If you really can't make your scripts importable, you can always use the subprocess module, but communication between the runner and your scripts (if needed) will be more complicated.
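A subprocess-based runner for that fallback might look like this sketch (sys.executable is an assumption standing in for whatever interpreter the bundled console uses):

# master.py (subprocess variant, sketch)
import subprocess
import sys

if __name__ == '__main__':
    procs = [subprocess.Popen([sys.executable, name]) for name in ("script1.py", "script2.py")]
    for p in procs:
        p.wait()   # both scripts already run concurrently; this just keeps the runner alive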

Python multiprocessing start method doesn't run the process

I'm new to multiprocessing and I'm trying to check that I can run two processes simultaneously with the following code:
import random, time, multiprocessing as mp

def printer():
    """print function"""
    z = random.randint(0, 60)
    for i in range(5):
        print(z)
        wait = 0.2
        wait += random.randint(1, 60) / 100
        time.sleep(wait)
    return

if __name__ == '__main__':
    p1 = mp.Process(target=printer)
    p2 = mp.Process(target=printer)
    p1.start()
    p2.start()
This code does not print anything to the console, although I checked that the processes are running using the is_alive() method.
However, I can print something using:
p1.run()
p2.run()
Question 1: Why doesn't the start() method run the process?
Question 2: While running the code with the run() method, why do I get a sequence like
25,25,25,25,25,11,11,11,11,11
instead of something like
25,25,11,25,11,11,11,25,11,25 ?
It seems that the processes run one after the other.
I would like to use multiprocessing to apply the same function to multiple files, to parallelize file conversion.
I made the script run by adding
from multiprocessing import Process
However, I don't get a random sequence of the two numbers; the pattern is always A,B,A,B... If you know how to show that the two processes run simultaneously, any ideas are welcome!
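Since the stated end goal is running the same function over many files, a multiprocessing.Pool sketch may be closer to what is needed (convert_file and the file list are hypothetical placeholders, not code from the question):

from multiprocessing import Pool

def convert_file(path):
    # hypothetical placeholder for the real file-conversion work
    print("converting", path)
    return path + ".converted"

if __name__ == '__main__':
    files = ["a.dat", "b.dat", "c.dat", "d.dat"]   # assumed input files
    with Pool(processes=4) as pool:
        results = pool.map(convert_file, files)    # conversions run in parallel worker processes
    print(results)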

confused about python subprocess inside for loop

I am trying to automate some big data file processing using python.
A lot of the processing is chained, i.e. script1 writes a file that is then processed by script2, then script2's output by script3, etc.
I am using the subprocess module in a threaded context.
I have one class that creates tuples of chained scripts
("scr1.sh","scr2.sh","scr3.sh").
Then another class that uses a call like
for script in scriptlist:
    subprocess.call(script)
My question is: in this for loop, is each script only called after subprocess.call(script1) returns a successful retcode?
Or do all three get called right after one another since I am using subprocess.call? Without using "sleep" or "wait", I want to make sure that the second script only starts after the first one is over.
edit: The pydoc says
"subprocess.call(*popenargs, **kwargs)
Run command with arguments. Wait for command to complete, then return the returncode attribute."
So in the for loop (above), does it wait for each retcode before iterating to the next script?
I am new to threading. I am attaching the stripped-down code for the class that runs the analysis here; the subprocess.call loop is part of this class.
import subprocess
from threading import Thread
from queue import Queue

class ThreadedDataProcessor(Thread):
    def __init__(self, in_queue, out_queue):
        # Uses Queue
        Thread.__init__(self)
        self.in_queue = in_queue
        self.out_queue = out_queue

    def run(self):
        while True:
            path = self.in_queue.get()
            if path is None:
                break
            myprocessor = ProcessorScriptCreator(path)
            scrfiles = myprocessor.create_and_return_shell_scripts()
            for index, file in enumerate(scrfiles):
                subprocess.call([file])
                print(("CALLED %s %s " % (index, file)) * 5)
            #report(myfile.describe())
            #report("Done %s" % path)
            self.out_queue.put(path)

in_queue = Queue()
The loop will serially call each script, wait until it completes, and then call the next one regardless of success or failure of the previous call. You probably want to say:
try:
    for script in script_list:
        subprocess.check_call(script)
except subprocess.CalledProcessError as e:
    # a script exited with a non-zero return code; handle the failure here
    ...
A new thread starts with each call to run and ends when run is done. You iterate over the scripts with subprocess within one thread.
You should make sure that the calls made in each thread do not impact calls made from other threads, for example by trying to read and write the same file from script calls in multiple threads at the same time.
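One common way to guard such a shared resource is a lock around the conflicting section. A sketch, assuming a hypothetical log file that several threads append to:

import threading

log_lock = threading.Lock()   # shared by all worker threads

def append_result(line):
    # Only one thread at a time may touch the shared file.
    with log_lock:
        with open("results.log", "a") as f:   # hypothetical shared file
            f.write(line + "\n")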
