How to run pytest from a Python script with subprocess

I need to run pytest test_start.py and keep the rest of my program running.
My code:
import subprocess
subprocess.run(['pytest', r'C:\Python\test_start.py'], shell=True)
print('hello')
But when I run the script, pytest starts executing and the print waits for it to finish.
How can I run pytest and continue executing the rest of the script?
UPD: when I used subprocess.Popen, I saw that print was executed, but I didn't see pytest run at all.

subprocess.run specifically waits for the process to finish. If you don't want to wait, use subprocess.Popen
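For example, a minimal non-blocking sketch (reusing the test path from the question): Popen starts pytest in the background and the script keeps going:

import subprocess

# Start pytest without waiting for it to finish.
proc = subprocess.Popen(['pytest', r'C:\Python\test_start.py'])
print('hello')  # runs immediately, while pytest is still working

# Later, if you need the result, wait for it and check the exit code.
exit_code = proc.wait()
print('pytest finished with exit code', exit_code)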

I solved this by simply adding time.sleep(5) after the subprocess call:
import subprocess
import time

subprocess.Popen(['pytest', r'E:\Parser\Python\test_start.py'], shell=True)
time.sleep(5)
print('hello')
Apparently pytest just didn't have time to start.

Related

Debugging python code in subprocess using Breakpoints in VScode

I have been trying to debug a massive PyTorch model in VS Code.
The starting point of the code first processes a configuration file and then runs a subprocess with those configs.
The issue is that after the subprocess.call line the code executes in an external sub-process, so VS Code breakpoints are no longer hit.
Example code:
def main():
    args = parse_args()
    cmd = construct_cmd(args)
    subprocess.call(cmd, shell=True)
where cmd is the string command to be executed. After this line all breakpoints are ignored (probably because a sub-process runs the command).
Any ideas how to solve this?
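One workaround (a sketch, not from the original question, and assuming cmd ultimately just launches another Python script, here hypothetically called train.py) is to skip the subprocess while debugging and run that script in the same interpreter with runpy, so the debugger stays attached:

import runpy
import sys

# Instead of: subprocess.call(cmd, shell=True)
# run the target script in the current interpreter so breakpoints inside it work.
sys.argv = ['train.py', '--config', 'config.yaml']   # hypothetical script and arguments
runpy.run_path('train.py', run_name='__main__')

Alternatively, the VS Code Python debugger has a "subProcess": true option in launch.json that makes it attach to child Python processes, which may be enough if cmd starts another Python interpreter.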

python: run external program and direct output to file and wait for finish

I want to run an external program from python, redirect output (lots of text) to a log file and wait for that program to finish. I know I can do it via bash:
#! /bin/bash
my_external_program > log_file 2>&1
echo "done"
But how can I do the same with Python? Note that with the bash command I can check the log_file while the program is running; I want that in Python as well.
See the subprocess module.
For example:
import subprocess

with open("log_file", "w") as log_file:
    subprocess.run(["my_external_program"], stdout=log_file, stderr=log_file)
print("done")
Because the child writes directly to the open file, you can still inspect log_file (e.g. with tail -f) from another shell while the program is running.
Controlling a python script from another script
You can check the link above; it is indeed a similar issue. Using Popen from subprocess (or os.popen) makes it possible to read the output in real time.
A simple os.system("your script > /tmp/mickey.log") will also run the script, but it waits for the command to finish before continuing.
Please let me know if this solves your issue.
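A sketch of that Popen approach, reusing the hypothetical my_external_program from above; the parent reads the child's output line by line as it is produced and mirrors it into the log file:

import subprocess

with open("log_file", "w") as log_file:
    proc = subprocess.Popen(
        ["my_external_program"],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        text=True,
    )
    for line in proc.stdout:      # lines arrive while the program is still running
        log_file.write(line)
        log_file.flush()          # keep the file readable from another shell
    proc.wait()
print("done")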

Run python script from another python script but not as a child process

Is it possible to run a python script from another python script without waiting for it to terminate?
Parent process will terminate immediately after creation of child process.
I tried:
subprocess.Popen([sys.executable, "main.py"], stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE)
and also:
os.system(...)
If you know that the other Python script has a main method, you could simply call that other script from your code:
import main
...
exit(main.main())
But then the other script executes in the context of the calling script. If you want to avoid that, you can use the os.exec... functions to launch a new Python interpreter:
import os
import sys
...
os.execl(sys.executable, "python", 'main.py')
The exec family of functions replaces (under Unix/Linux) the current Python interpreter with a new one.
You can also just add & to start the script in the background:
import os
os.system('/path/to/script.sh &')
exit()
In this case the launched shell script keeps working even after the main Python script exits.
But keep in mind that this can leave zombie processes in your system.
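Another option (a sketch, not from the original answers) is subprocess.Popen with the child detached from the parent so the parent can exit immediately; start_new_session=True applies on POSIX, and the DETACHED_PROCESS creation flag is the rough Windows equivalent:

import subprocess
import sys

# Put the child in its own session so it is not tied to the parent (POSIX).
subprocess.Popen(
    [sys.executable, "main.py"],
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
    start_new_session=True,
)
sys.exit(0)  # the parent terminates right away; main.py keeps running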

Running one process in parallel linux

I need to make sure to run two processes (python scripts) almost at the same time. But I want my program to continue as soon as one of them is finished. I am running these processes from a C++ program using system().
Is this the right way to run script1 and script2 at the same time and continue just after script2 is finished?
python ./script1.py & python ./script2.py
Thank you!
Your snippet won't quite work, because it continues as soon as script2 finishes; script1 may still be working in the background.
If you are using bash shell you can do the following:
python ./script1.py &
PID1=$!
python ./script2.py
wait $PID1
$! holds the process id of the most recently backgrounded command. So we run script1 in the background, then run script2 to completion, and then wait for script1 to finish (if it hasn't already).
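For completeness, a sketch of the same pattern done directly in Python instead of the shell, assuming the script1.py and script2.py from the question:

import subprocess
import sys

# Start script1 in the background.
p1 = subprocess.Popen([sys.executable, "./script1.py"])

# Run script2 and block until it finishes.
subprocess.run([sys.executable, "./script2.py"])

# Wait for script1 as well (it may already be done).
p1.wait()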

Python subprocess.popen() without waiting

I'm using Python 3.4.2 on Windows. In script1.py I'm doing this:
myProc = subprocess.Popen([sys.executable, "script2.py", "argument"])
myProc.communicate()
It works and calls script2.py.
The problem is that script2.py contains an infinite loop (it has to), and script1.py waits for script2.py to finish. How can I tell script1.py to just launch script2.py and not wait for the process to finish?
Just don't call myProc.communicate() if you don't want to wait. subprocess.Popen will start the process.
Call the script in another window.
myProc = subprocess.Popen(["start", sys.executable, "script2.py", "argument"], shell=True)
myProc.communicate()
start is a Windows shell (cmd.exe) built-in that runs a program separately, allowing the current one to continue; because it is a shell built-in, it needs shell=True. I haven't tested this as I have no access to a Windows OS, but the Linux equivalent (nohup) works as required.
If you need fine control over what happens with script2.py, refer to the multiprocessing module in the standard library.
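As a sketch of the multiprocessing route (the loop body here is a hypothetical stand-in for what script2.py does), the work runs in a separate process and, as a daemon, does not keep script1.py alive:

import multiprocessing
import time

def worker():
    # Stand-in for the infinite loop in script2.py.
    while True:
        time.sleep(1)

if __name__ == "__main__":
    proc = multiprocessing.Process(target=worker, daemon=True)
    proc.start()      # returns immediately; worker keeps looping
    print("script1 continues here")
    # ... do other work; the daemon process is terminated when script1 exits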
