How to immediately kill Python script after running subprocess?

I made a script which plays a video file by using subprocess.run().
import subprocess
DATA_DIR = 'path\\to\\video\\files'
MEDIA_PLAYER = 'path\\to\\my\\media-player'
# returns path of random video file
p = chooseOne(DATA_DIR)
print('playing {}'.format(p))
# runs chosen path
subprocess.run([MEDIA_PLAYER, p])
But I would like to kill the python script running this code immediately after opening the child subprocess.
Is this possible? And if not, is there an alternative means of opening an external process using Python which would allow the script to terminate?
Note: I am using Python v3.6

Don't use subprocess.run; use os.execl instead. That makes your media player replace your Python code in the current process, rather than starting a new process.
os.execl(MEDIA_PLAYER, MEDIA_PLAYER, p)  # the argument after the path becomes argv[0] of the new process
subprocess.run effectively does the same thing, but forks first so that there are temporarily two processes running your Python script; in one, subprocess.run returns without doing anything else to allow your script to continue. In the other, it immediately uses one of the os.exec* functions—there are 8 different varieties—to execute your media player. In your case, you just want the first process to exit anyway, so save the effort of forking and just use os.execl right away.
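Putting that together with the script from the question, a minimal sketch might look like this (chooseOne is the helper from the question, and the paths are placeholders):
import os
DATA_DIR = 'path\\to\\video\\files'
MEDIA_PLAYER = 'path\\to\\my\\media-player'
p = chooseOne(DATA_DIR)  # returns the path of a random video file, as in the question
print('playing {}'.format(p))
# Replace the current Python process with the media player; by convention
# the second argument becomes argv[0] of the new program.
os.execl(MEDIA_PLAYER, MEDIA_PLAYER, p)
# Nothing after this line ever runs: the interpreter has been replaced.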

Related

Is there a way I can store the output of a terminal command into a file using python?

I want to store the output of the terminal command top into a file, using Python.
In the terminal, when I type top and hit enter, I get an output that is real time, so it keeps updating. I want to store this into a file for a fixed duration and then stop writing.
file=open("data.txt","w")
file.flush()
import os,time
os.system("top>>data.txt -n 1")
time.sleep(5)
exit()
file.close()
I have tried to use time.sleep() and then exit(), but it doesn't work, and the only way top can be stopped is in the terminal, with Ctrl+C.
The process keeps running and the data is continuously written to the file, which is not ideal, as one would guess.
For clarity: I know how to write the output on to the file, I just want to stop writing after a period
os.system will wait for the child process to finish. If you do not want that, the Pythonic way is to use the subprocess module directly:
import subprocess
timeout = 60  # let top run for one minute
file = open("data.txt", "w")
top = subprocess.Popen(["top", "-b", "-n", "1"], stdout=file)  # -b: batch mode, needed when output is not a terminal
try:
    top.wait(timeout=timeout)  # wait at most timeout seconds
except subprocess.TimeoutExpired:
    top.terminate()  # and terminate the child if it is still running
The paranoid way (which is highly recommended for robust code) would be to use the full path of top. I have not done that here, because it may depend on the actual system...
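For instance, a small sketch using shutil.which (Python 3.3+) to resolve the full path before starting top:
import shutil
import subprocess
top_path = shutil.which("top")  # returns None if top is not found on PATH
if top_path is None:
    raise FileNotFoundError("top not found on PATH")
with open("data.txt", "w") as f:
    top = subprocess.Popen([top_path, "-b", "-n", "1"], stdout=f)  # -b: batch mode, for non-terminal output
    top.wait()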
The issue you could be facing is that os.system blocks the current process, so the rest of your script will not run until the command you started has finished.
I think what you want to be doing is executing your console command on another thread so that the thread running your python script can continue while the command runs in the background. See run a python program on a new thread for more info.
I'd suggest something like (this is untested):
import os
import time
import multiprocessing
myThread = multiprocessing.Process(target=os.system, args=("top>>data.txt -n 1",))
myThread.start()
time.sleep(5)
myThread.terminate()
That being said, you may need to consider the thread safety of os.system(); if it is not thread safe, you'll need to find an alternative that is.
Something else worth noting (and that I know little about) is that it may not be ideal to terminate threads in this way, see some of the answers here: Is there any way to kill a Thread?
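An alternative worth mentioning, if you just want top to stop writing after a fixed duration, is to let subprocess.run (Python 3.5+) do the killing itself via its timeout parameter; a rough sketch:
import subprocess
with open("data.txt", "w") as f:
    try:
        # -b (batch mode) is usually required when top's output is not a terminal
        subprocess.run(["top", "-b"], stdout=f, timeout=5)
    except subprocess.TimeoutExpired:
        pass  # top ran for 5 seconds, was killed, and the output written so far is in data.txt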

Make Python wait for commands to end

I have a Python program which dynamically moves and renames files into a Hadoop cluster. The files usually range from 10 MB (parsed) up to 1.5 GB (raw data). The move commands can take a while to finish, and from what I can tell Python races through them and none of the move commands get to finish. What is the proper way to have Python wait for the previous commands? I store the commands in variables and pass them to os.system. The relevant code is
os.system(moverawfile)
os.system(renamerawfile)
os.system(moveparsedfile)
os.system(renameparsedfile)
I know the rename commands finish basically instantaneously. Am I not supposed to use os.system? How do I ensure that Python will wait for each command to finish before moving on to the next one?
I would suggest that you use run from the subprocess module, as described in the Python documentation. It waits for your command to complete before returning.
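A minimal sketch, assuming the commands are stored as shell-command strings exactly as they are passed to os.system in the question:
import subprocess
for cmd in (moverawfile, renamerawfile, moveparsedfile, renameparsedfile):
    # run() blocks until cmd has finished; check=True raises CalledProcessError if a command fails
    subprocess.run(cmd, shell=True, check=True)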

Check if a subprocess is running in Linux using Python

I have an environment variable that allows a suite of applications to run under certain conditions, and the applications cannot run with the environment variable off.
My python script uses
p = subprocess.Popen(cmdStr3,stdout=subprocess.PIPE,stderr=subprocess.STDOUT)
to open the subprocess.
To test if the application is running, I have tried
try:
    os.kill(pid, 0)
    return True
except OSError:
    return False
and also checking p.returncode. However, these always return True, because even if the application doesn't come up on screen, there are small processes of ~1 MB that still run without the application fully running, so the OS sees these and returns True. Is there a way around this?
Another issue is that os.kill doesn't work, the only way I have found to terminate the application is os.killpg at the end.
What I've learned from the comments is that what actually happens is that the subprocess I call is a starter that calls a child which is another application. My subprocess always runs, but with the environment variable off, the child application does not run. Is there a way to see if the child is running?
You can use p.poll() to check whether the process is still running; it returns None while the process is running and the exit code once it has finished.
Another issue is that os.kill doesn't work, the only way I have found to terminate the application is os.killpg at the end.
If you want to kill a process launched using subprocess, you can use p.terminate().
Finally, if you want to wait until the process child dies, you can use p.wait().
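A short sketch putting these together (cmdStr3 is the command from the question):
import subprocess
p = subprocess.Popen(cmdStr3, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
if p.poll() is None:  # None means the process has not exited yet
    print("still running")
p.terminate()  # ask the process to stop
p.wait()       # block until it has actually exited
print("exit code:", p.returncode)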

Use python subprocess module like a command line simulator

I am writing a test framework in Python for a command line application. The application will create directories, call other shell scripts in the current directory and write output to stdout.
I am trying to treat the {Python-SubProcess, CommandLine} combo as equivalent to {Selenium, Browser}. The first component drives the second and checks whether the output is as expected. I am facing the following problems:
The Popen construct takes a command and returns only after that command is completed. What I want is a live handle to the process so I can run further commands and verifications and finally close the shell once done.
I am okay with writing some infrastructure code for achieving this since we have a lot of command line applications that need testing like this.
Here is a sample code that I am running
p = subprocess.Popen("/bin/bash", cwd=test_dir)
p.communicate(input="hostname")  # I expect the hostname to be printed out
p.communicate(input="time")      # I expect the current time to be printed out
but the process hangs, or maybe I am doing something wrong. Also, how do I "grab" the output of that subprocess so I can assert that something exists?
subprocess.Popen allows you to continue execution after starting a process. Popen objects expose wait(), poll() and many other methods for communicating with a child process while it is running. Isn't that what you need?
See Popen constructor and Popen objects description for details.
Here is a small example that runs a shell (/bin/sh) on Unix systems and executes a command:
from subprocess import Popen, PIPE
p = Popen(['/bin/sh'], stdout=PIPE, stderr=PIPE, stdin=PIPE)
sout, serr = p.communicate(b'ls\n')  # bytes in, bytes out
print('OUT:')
print(sout.decode())
print('ERR:')
print(serr.decode())
UPD: communicate() waits for process termination. If you do not need that, you may use the appropriate pipes directly, though that usually gives you rather ugly code.
UPD2: You updated the question. Indeed, you cannot call communicate() twice for a single process. You may either give all the commands you need to execute in a single call to communicate() and check the whole output, or work with the pipes directly (Popen.stdin, Popen.stdout, Popen.stderr). If possible, I strongly recommend the first solution (using communicate).
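For example, a minimal sketch that sends both commands from the question in a single communicate() call and inspects the combined output (Python 3 byte strings assumed):
from subprocess import Popen, PIPE
p = Popen(['/bin/sh'], stdin=PIPE, stdout=PIPE, stderr=PIPE)
out, err = p.communicate(b'hostname\ntime\n')  # all commands in one shot
assert out, "expected some output from hostname"  # hypothetical verification
print(out.decode())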
Otherwise you will have to write a command to the process's stdin and wait some time for the desired output. What you need is a non-blocking read to avoid hanging when there is nothing to read. Here is a recipe for emulating non-blocking reads on pipes using threads. The code is ugly and strangely complicated for such a trivial purpose, but that's how it's done.
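The thread-based idea, reduced to a rough sketch: a background thread keeps reading the pipe and puts each line on a queue, which the main thread can poll without blocking (the names here are illustrative):
import subprocess
import threading
import queue

def reader(pipe, q):
    # runs in a background thread: push every line of output onto the queue
    for line in iter(pipe.readline, b''):
        q.put(line)
    pipe.close()

p = subprocess.Popen(['/bin/sh'], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
q = queue.Queue()
threading.Thread(target=reader, args=(p.stdout, q), daemon=True).start()

p.stdin.write(b'hostname\n')
p.stdin.flush()
try:
    print(q.get(timeout=2))  # wait up to 2 seconds for one line of output
except queue.Empty:
    print('no output yet')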
Another option could be using p.stdout.fileno() for select.select() call, but that won't work on Windows (on Windows select operates only on objects originating from WinSock). You may consider it if you are not on Windows.
Instead of using plain subprocess, you might find the Python sh library very useful:
http://amoffat.github.com/sh/
Here is an example how to build in an asynchronous interaction loop with sh:
http://amoffat.github.com/sh/tutorials/2-interacting_with_processes.html
Another (old) library for solving this problem is pexpect:
http://www.noah.org/wiki/pexpect

Run external program concurrently in Python

I'm wondering how to call an external program in such a way that allows the user to continue to interact with my program's UI (built using tkinter, if it matters) while the external program is running. The program waits for the user to select files to copy, so they should still be able to select and copy files while the external program is running. The external program is Adobe Flash Player.
Perhaps some of the difficulty is due to the fact that I have a threaded "worker" class? It updates the progress bar while it does the copying. I would like the progress bars to update even if the Flash Player is open.
I tried the subprocess module. The program runs; however, it prevents the user from using the UI until the Flash Player is closed. Also, the copying still seems to occur in the background; it's just that the progress bar does not update until the Flash Player is closed.
def run_clip():
    flash_filepath = "C:\\path\\to\\file.exe"
    # halts UI until flash player is closed...
    subprocess.call([flash_filepath])
Next, I tried using the concurrent.futures module (I was using Python 3 anyway). Since I'm still using subprocess to call the application, it's not surprising that this code behaves exactly like the above example.
def run_clip():
    with futures.ProcessPoolExecutor() as executor:
        flash_filepath = "C:\\path\\to\\file.exe"
        executor.submit(subprocess.call(animate_filepath))
Does the problem lie in using subprocess? If so, is there a better way to call the external program? Thanks in advance.
You just need to keep reading about the subprocess module, specifically about Popen.
To run a background process concurrently, you need to use subprocess.Popen:
import subprocess
child = subprocess.Popen([flash_filepath])
# At this point, the child process runs concurrently with the current process
# Do other stuff
# And later on, when you need the subprocess to finish or whatever
result = child.wait()
You can also interact with the subprocess's input and output streams via members of the Popen object (in this case child).
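For example, a hedged sketch of how the tkinter UI from the question could stay responsive: start the player with Popen and poll it with root.after instead of blocking in subprocess.call (the path and widget names are placeholders):
import subprocess
import tkinter as tk

root = tk.Tk()
flash_filepath = "C:\\path\\to\\file.exe"

def check_done(child):
    if child.poll() is None:                # still running
        root.after(500, check_done, child)  # check again in 500 ms
    else:
        print("player exited with", child.returncode)

def run_clip():
    child = subprocess.Popen([flash_filepath])  # returns immediately, UI keeps working
    check_done(child)

tk.Button(root, text="Play clip", command=run_clip).pack()
root.mainloop()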
