When running a Python script in the background, the job always stops - python

I wrote a Python script, "download_vod.py", that runs a child process using the subprocess module.
download_vod.py
#!/usr/bin/python
import subprocess
url = "xxx"             # placeholder URL
filename = "xxx.mp4"    # placeholder output file name
cmd = "ffmpeg -i " + url + " -c:v copy -c:a copy " + "\"" + filename + "\""
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
proc.wait()
When I run it in the foreground in a bash shell as below, it works fine and terminates properly:
./download_vod.py
But a problem occurs when I run the script in the background as below:
./download_vod.py&
The script always stops, as shown below:
[1]+ Stopped download_vod.py
If I resume the job as below, it resumes and terminates properly.
bg
I assume it is caused by running a subprocess, because it never happens without subprocess.
Could you let me know what happens to the subprocess (child process) when I run the Python script in the background? And how can it be fixed?
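A likely explanation is that ffmpeg tries to read from the terminal's stdin; a background job that reads from its controlling terminal receives SIGTTIN, and the shell reports it as Stopped. A minimal sketch of a workaround, assuming Python 3 and that ffmpeg does not actually need stdin (the command is passed as a list instead of a shell string):
#!/usr/bin/python3
import subprocess

url = "xxx"            # placeholder, as in the question
filename = "xxx.mp4"   # placeholder, as in the question

# -nostdin tells ffmpeg not to read from the terminal; stdin=DEVNULL means the
# child never touches the controlling terminal, so the background job is not
# stopped with SIGTTIN. Output is discarded because nothing reads it here.
cmd = ["ffmpeg", "-nostdin", "-i", url, "-c:v", "copy", "-c:a", "copy", filename]
proc = subprocess.Popen(cmd, stdin=subprocess.DEVNULL,
                        stdout=subprocess.DEVNULL, stderr=subprocess.STDOUT)
proc.wait()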

Related

How to execute a non-blocking script in python and get its return code?

I am trying to execute a non-blocking bash script from Python and get its return code. Here is my function so far:
import os
import subprocess

def run_bash_script(script_fullname, logfile):
    my_cmd = ". " + script_fullname + " >" + logfile + " 2>&1"
    p = subprocess.Popen(my_cmd, shell=True)
    os.waitpid(p.pid, 0)
    print(p.returncode)
As you can see, all the output is redirected into a log file, which I can monitor while the bash process is running.
However, the last command just prints 'None' instead of a useful exit code.
What am I doing wrong here?
You should use p.wait() rather than os.waitpid(). os.waitpid() is a low-level API; it knows nothing about the Popen object, so it cannot set p.returncode.
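A minimal sketch of the corrected function, keeping the question's names and shell-based approach:
import subprocess

def run_bash_script(script_fullname, logfile):
    my_cmd = ". " + script_fullname + " >" + logfile + " 2>&1"
    p = subprocess.Popen(my_cmd, shell=True)
    p.wait()                 # wait() reaps the child and sets p.returncode
    print(p.returncode)
    return p.returncode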

Run a bash command with python in background without killing it

I'm trying to execute a bash command with Python. The problem is that the program needs to run in the background, so I tried executing the command with `&`, but the subprocess module kills it.
How can I do it?
import subprocess

def run_command(bashCommand):
    process = subprocess.Popen(bashCommand, shell=True)
    output, error = process.communicate()
    return output

command = 'bettercap -iface wlx485d60575bf2 -eval "set api.rest.username bettercap; set api.rest.password bettercap; set api.rest.address 127.0.0.1; set api.rest.port 8011; net.probe on; api.rest on" &'
run_command(command)
[SOLVED] It only gets killed when you try to get the output.
def run_command(bashCommand):
    process = subprocess.Popen(bashCommand, shell=True)
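A sketch of the same idea with the child fully detached from the Python process; start_new_session is an addition not present in the original answer and assumes a POSIX system:
import subprocess

def run_command(bashCommand):
    # Popen returns immediately; without communicate() the script neither waits
    # for the child nor consumes its output, so the command keeps running in
    # the background even after this function returns.
    return subprocess.Popen(bashCommand, shell=True, start_new_session=True)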

Kill not responding exe file in Python script

I'm running a kind of touchy .exe file in Python to receive a couple of data measurements. The file should open, take the measurement, then close. The issue is that it sometimes crashes, and I need to be able to take these measurements every 10 minutes over a long period of time.
What I need is a 'check' to see if the .exe is not responding and, if it is not, to kill the process. Or to just kill the whole script after every measurement is taken. The issue is that the script gets stuck when it tries to run the .exe file that is not responding.
Here's the script:
import os
import subprocess

FNULL = open(os.devnull, 'a')
filename = "current_pressure.log"
command = '"*SRH#\r"'
args = "httpget -r -o " + filename + " -C 2 -S " + command + IP  # IP is defined elsewhere in the script
subprocess.call(args, stdout=FNULL, stderr=FNULL, shell=False)
Basically, need something like:
"if httpget.exe not responding, then kill process"
OR
"kill above script if running after longer than 20 seconds"
Use a timer to kill the process if it has gone on too long. Here I've used two timers, one for graceful termination and one for a hard kill, but you can just do the kill if you want.
import os
import subprocess
import threading

FNULL = open(os.devnull, 'a')
filename = "current_pressure.log"
command = '"*SRH#\r"'
args = "httpget -r -o " + filename + " -C 2 -S " + command + IP  # IP as in the question
proc = subprocess.Popen(args, stdout=FNULL, stderr=FNULL, shell=False)

# terminate after 20 s, and kill outright 2 s later if it is still running
nice = threading.Timer(20, proc.terminate)
nice.start()
mean = threading.Timer(22, proc.kill)
mean.start()

proc.wait()
nice.cancel()
mean.cancel()
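On Python 3.3+ the same effect can be had without threads, by passing a timeout to wait() and escalating on expiry; a minimal sketch, reusing the argument string from the question:
import os
import subprocess

FNULL = open(os.devnull, 'a')
args = "httpget -r -o " + filename + " -C 2 -S " + command + IP  # filename, command, IP as in the question
proc = subprocess.Popen(args, stdout=FNULL, stderr=FNULL, shell=False)
try:
    proc.wait(timeout=20)              # give httpget 20 seconds to finish
except subprocess.TimeoutExpired:
    proc.terminate()                   # ask it to stop first
    try:
        proc.wait(timeout=2)
    except subprocess.TimeoutExpired:
        proc.kill()                    # then force it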
Generally, when a program hangs on Windows, we go to Task Manager and end that program's process. When this approach fails, we experiment with third-party software to terminate it. However, there is an even better way to terminate such hung programs automatically:
http://www.problogbooster.com/2010/01/automatically-kills-non-responding.html

Not able to give inputs to subprocess(process which runs adb shell command) after 100 iterations

I want to run a stress test for the adb (Android Debug Bridge) shell. (adb shell in this respect is just a command-line tool provided by Android phones.)
I create a subprocess from Python, and in this subprocess I execute the 'adb shell' command. There are some commands that have to be given to this subprocess, which I provide via its stdin.
Everything seems fine until I run a stress test: after around 100 iterations, the command I write to stdin no longer reaches the subprocess. If I run the commands in a separate terminal, they work fine, so the problem is with this stdin.
Can anyone tell me what I am doing wrong? Below is the code sample:
import subprocess
import sys

class ADB():
    def __init__(self):
        self.proc = subprocess.Popen('adb shell', stdin=subprocess.PIPE,
                                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                                     shell=True, bufsize=0)

    def provideAMcommand(self, testParam):
        try:
            cmd1 = "am startservice -n com.test.myapp/.ADBSupport -e \"" + "command" + "\" \"" + "test" + "\""
            cmd2 = " -e \"" + "param" + "\"" + " " + testParam
            print cmd1 + cmd2
            sys.stdout.flush()
            self.proc.stdin.write(cmd1 + cmd2 + "\n")
        except:
            raise Exception("Phone is not Connected to Desktop or ADB is not available \n")
If it works for the first few commands but blocks later, you might have forgotten to read from self.proc.stdout, which (as the docs warn) can lead to the OS pipe buffer filling up and blocking the child process.
To discard the output, redirect it to os.devnull:
import os
from subprocess import Popen, PIPE, STDOUT
DEVNULL = open(os.devnull, 'wb')
# ...
self.proc = Popen(['adb', 'shell'], stdin=PIPE, stdout=DEVNULL, stderr=STDOUT)
# ...
self.proc.stdin.write(cmd1 + cmd2 + "\n")
self.proc.stdin.flush()
There is the pexpect module, which might be a better tool for dialog-based interaction (if you want to both read and write intermittently).
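A minimal pexpect sketch of the same dialog; the prompt pattern and the "value" argument are assumptions for illustration, not taken from the question:
import pexpect

child = pexpect.spawn('adb shell', encoding='utf-8')
child.expect(r'\$')          # wait for a shell prompt (assumed to end in '$')
child.sendline('am startservice -n com.test.myapp/.ADBSupport '
               '-e "command" "test" -e "param" "value"')
child.expect(r'\$')          # wait until the command produces a new prompt
print(child.before)          # everything printed before that prompt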
In provideAMcommand you are writing to and flushing the stdout of your main process. That will not send anything to the stdin of the child process you created with Popen. The following code creates a new bash child process, a bit like the code in your __init__:
import subprocess as sp
cproc = sp.Popen("bash", stdin=sp.PIPE, stdout=sp.PIPE, stderr=sp.PIPE, shell=True)
Now, the easiest way to communicate with that child process is the following:
#Send command 'ls' to bash.
out, err = cproc.communicate("ls")
This will send the text "ls" and EOF to bash (equal to running a bash script with only the text "ls" in it). Bash will execute the ls command and then quit. Anything that bash or ls write to stdout and stderr will end up in the variables out and err respectively. I have not used the adb shell, but I guess it behaves like bash in this regard.
If you just want your child process to print to the terminal, don't specify the stdout and stderr arguments to Popen.
You can check the exit code of the child, and raise an exception if it is non-zero (indicating an error):
if cproc.returncode != 0:
    raise Exception("Child process returned non-zero exit code")

Failing to capture stdout from application

I have the following script:
import subprocess
arguments = ["d:\\simulator","2332.txt","2332.log", "-c"]
output=subprocess.Popen(arguments, stdout=subprocess.PIPE).communicate()[0]
print(output)
which gives me b'' as output.
I also tried this script:
import subprocess
arguments = ["d:\\simulator","2332.txt","atp2332.log", "-c"]
process = subprocess.Popen(arguments,stdout=subprocess.PIPE)
process.wait()
print(process.stdout.read())
print("ERROR:" + str(process.stderr))
which gives me the output: b'', ERROR:None
However, when I run this at the cmd prompt, I get 5 lines of text:
d:\simulator atp2332.txt atp2332.log -c
I have added a message box to the simulator which pops up when it launches. This is presented in all three cases, so I know that I successfully launch the simulator. However, the Python scripts are not capturing the stdout.
What am I doing wrong?
Barry.
If possible (i.e., the output is not an endless stream of data), you should use communicate(), as noted in the docs.
Try this:
import subprocess
arguments = ["d:\\simulator","2332.txt","atp2332.log", "-c"]
process = subprocess.Popen(arguments, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
sout, serr = process.communicate()
print(sout)
print(serr)
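On Python 3.5+, the same capture can be written more compactly with subprocess.run; a sketch, reusing the argument list from the question:
import subprocess

arguments = ["d:\\simulator", "2332.txt", "atp2332.log", "-c"]
result = subprocess.run(arguments, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print("Return code:", result.returncode)
print("stdout:", result.stdout)
print("stderr:", result.stderr)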
The following code gives me text output on stdout. Perhaps you could try it, and then substitute your command for "help":
import subprocess

arguments = ["help", "2332.txt", "atp2332.log", "-c"]
process = subprocess.Popen(arguments, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
process.wait()
print('Return code:', process.returncode)
print('stdout:', process.stdout.read())
print('stderr:', process.stderr.read())
