Executing a subprocess from Python

I think something is getting subtly mangled when I attempt to execute a subprocess from a Python script.
I am trying to execute VLC with some (a lot of) arguments.
The instance of VLC that comes up complains:
Your input can't be opened:
VLC is unable to open the MRL ' -vvv rtsp://192.168.1.201:554/ch0_multicast_one --sout=#transcode{acodec=none}:duplicate{dst=rtp{sdp=rtsp://:5544/user_hash.sdp},dst=display} :no-sout-rtp-sap :no-sout-standard-sap :ttl=1 :sout-keep'. Check the log for details.
Here is the Python code:
pid = subprocess.Popen(
    ["vlc ",
     " -vvv rtsp://%s" % target_nvc.ip_address + ":554/ch0_multicast_one --sout=#transcode{acodec=none}:duplicate{dst=rtp{sdp=rtsp://:5544/user_hash.sdp},dst=display} :no-sout-rtp-sap :no-sout-standard-sap :ttl=1 :sout-keep"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE)
I have examined the output of the subprocess call (using a shell), and if I copy-paste that string into my cmd window, the VLC instance works fine... Is this a privilege thing?

Since you're passing a list to subprocess.Popen, each parameter must be in its own element. So you'd want something like:
pid = subprocess.Popen([
    "vlc",
    "-vvv",
    "rtsp://%s:554/ch0_multicast_one" % target_nvc.ip_address,
    # etc
], ...)
Each parameter (that the shell would normally parse apart for you) must be in a separate list element.
You can also pass a single command line string and let the shell pull it apart:
pid = subprocess.Popen("vlc -vvv rtsp://...", shell=True, ...)
Using the first form is better for commands that have lots of arguments.
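If you start from a single command string but want the safety of the list form, shlex.split can do the shell-style tokenizing for you; a minimal sketch (the URL is the one from the error message):
import shlex
import subprocess

# shlex.split tokenizes the way a POSIX shell would, so each argument
# ends up in its own list element
args = shlex.split("vlc -vvv rtsp://192.168.1.201:554/ch0_multicast_one")
pid = subprocess.Popen(args, stdout=subprocess.PIPE)
Note that shlex uses POSIX quoting rules, which do not always match how cmd.exe parses a command line on Windows.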

You should use this...
pid = subprocess.Popen(["vlc", "-vvv",
"rtsp://%s" % target_nvc.ip_address + ":554/ch0_multicast_one",
"--sout=#transcode{acodec=none}:duplicate{dst=rtp{sdp=rtsp://:5544/user_hash.sdp},dst=display}",
":no-sout-rtp-sap", ":no-sout-standard-sap",
":ttl=1", ":sout-keep" ], stdout=subprocess.PIPE,
stderr=subprocess.PIPE, stdin=subprocess.PIPE)

import glob, difflib, subprocess

# Raw strings keep the backslashes in the Windows glob patterns intact
movies_path = glob.glob(r"D:\MOVIES\**\*\*\*.mp4", recursive=True) + \
              glob.glob(r"D:\MOVIES\**\*\*\*.mkv", recursive=True) + \
              glob.glob(r"D:\MOVIES\**\*\*\*.avi", recursive=True)
# probably the right movie (which_movie is the search string, defined elsewhere)
rightMoviePath = difflib.get_close_matches(which_movie, movies_path, len(movies_path), 0)
movie_name = rightMoviePath[0].split("\\")[-1]
hebrew_subtitle_path = glob.glob(rightMoviePath[0].replace(movie_name, "Hebrew.srt"))[0]
english_subtitle_path = glob.glob(rightMoviePath[0].replace(movie_name, "English.srt"))[0]
# Popen returns a single object; unpacking it into two names raises an error
player = subprocess.Popen(["C:\\Users\\yonat\\Downloads\\VLC\\vlc.exe", "--sub-file", hebrew_subtitle_path, rightMoviePath[0]],
                          stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

Related

Redirecting shell command output to a file does not work using subprocess.Popen in Python

I am using Python 2.6.6 and failed to redirect the Beeline (Hive) SQL query output, which returns multiple rows, to a file on Unix using ">". For simplicity's sake, I replaced the SQL query with a simple "ls" command on the current directory, outputting to a text file.
Please ignore the syntax of the function sendfile. I want help tweaking the function "callcmd" so it pipes the stdout to the text file.
def callcmd(cmd, shl):
    logging.info('> '+' '.join(map(str,cmd)))
    #return 0;
    start_time = time.time()
    command_process = subprocess.Popen(cmd, shell=shl, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)
    command_output = command_process.communicate()[0]
    logging.info(command_output)
    elapsed_time = time.time() - start_time
    logging.info(time.strftime("%H:%M:%S",time.gmtime(elapsed_time))+' = time to complete (hh:mm:ss)')
    if (command_process.returncode != 0):
        logging.error('ERROR ON COMMAND: '+' '.join(map(str,cmd)))
        logging.error('ERROR CODE: '+str(ret_code))
    return command_process.returncode

cmd=['ls', ' >', '/home/input/xyz.txt']
ret_code = callcmd(cmd, False)
Your command (i.e. cmd) could be ['sh', '-c', 'ls > ~/xyz.txt']. That would mean the output of ls is never passed to Python; the redirection happens entirely in the spawned shell, so you can't log the output. In that case, I'd use return_code = subprocess.call(cmd); there's no need for Popen and communicate.
Equivalently, assuming you use bash or similar, you can simply use
subprocess.call('ls > ~/test.txt', shell=True)
If you want to access the output, e.g. for logging, you could use
s = subprocess.check_output(['ls'])
and then write that to a file like you would regularly in Python. To check for a non-zero exit code, handle the CalledProcessError that is raised in such cases.
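A minimal sketch of that error handling (note that check_output needs Python 2.7+, while the question mentions 2.6):
import subprocess, logging

try:
    # check_output raises CalledProcessError on a non-zero exit code
    s = subprocess.check_output(['ls'])
except subprocess.CalledProcessError as e:
    logging.error('ERROR CODE: ' + str(e.returncode))
else:
    with open('/home/input/xyz.txt', 'w') as f:
        f.write(s)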
Here the stdout in command_output is written to a file. You don't need any redirection, although an alternative would be to have Python print to stdout and then redirect that to a file in your shell.
#!/usr/bin/python
import logging
import subprocess

cmd = ['ls']
command_process = subprocess.Popen(
    cmd,
    shell=False,  # the command is already a list, so no shell is needed
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    universal_newlines=True
)
command_output = command_process.communicate()[0]
if (command_process.returncode != 0):
    logging.error('ERROR ON COMMAND: '+' '.join(map(str,cmd)))
    logging.error('ERROR CODE: '+str(command_process.returncode))
f = open('listing.txt','w')
f.write(command_output)
f.close()
I added this piece of code to my code and it works fine. Thanks to @Snohdo.
f = open('listing.txt','w')
f.write(command_output)
f.close()

Python Popen using PIPE issue

I am trying to replicate this command using python and Popen:
echo "Acct-Session-Id = 'E4FD590583649358F3B712'" | /usr/local/freeradius/bin/radclient -r 1 1.1.1.1:3799 disconnect secret
When running this from the command line as it is above, I get the expected:
Sent Disconnect-Request Id 17 from 0.0.0.0:59887 to 1.1.1.1:3799 length 44
I want to achieve the same from a python script, so I coded it like this:
rp1 = subprocess.Popen(["echo", "Acct-Session-Id = 'E4FD590583649358F3B712'"], stdout=subprocess.PIPE)
rp2 = subprocess.Popen(["/usr/local/freeradius/bin/radclient",
                        "-r 1",
                        "1.1.1.1:3799",
                        "disconnect",
                        "secret"],
                       stdin=rp1.stdout,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
rp1.stdout.close()
result = rp2.communicate()
print "RESULT: " + str(result)
But, I must be doing this incorrectly as the "result" variable contains the radclient usage info, as if it is called incorrectly:
RESULT: ('', "Usage: radclient [options] server[:port] <command> [<secret>]\n <command>....
Anybody any idea where my mistake lies?
Thanks!
Besides @Rawing's catch of the args typo ("-r 1" must be two separate elements, "-r" and "1"), you can make it much simpler with a single Popen process. Try this:
rp = subprocess.Popen(["/usr/local/freeradius/bin/radclient",
                       "-r",
                       "1",
                       "1.1.1.1:3799",
                       "disconnect",
                       "secret"],
                      stdin=subprocess.PIPE,
                      stdout=subprocess.PIPE,
                      stderr=subprocess.PIPE)
result = rp.communicate("Acct-Session-Id = 'E4FD590583649358F3B712'")
Using communicate to handle all the I/O prevents the deadlocks that can occur when you explicitly write to stdin while also needing to read from stdout/stderr.
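On Python 3, communicate expects bytes unless you opt into text mode; a rough equivalent of the snippet above:
# universal_newlines=True (text=True on 3.7+) makes communicate()
# accept and return str instead of bytes
rp = subprocess.Popen(["/usr/local/freeradius/bin/radclient",
                       "-r", "1", "1.1.1.1:3799", "disconnect", "secret"],
                      stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                      stderr=subprocess.PIPE, universal_newlines=True)
result = rp.communicate("Acct-Session-Id = 'E4FD590583649358F3B712'")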

python: subprocess.communicate not working

I want to pass one argument to the subprocess via communicate, but the result is never what I expect.
the folder tree:
├── file1
├── file2
├── main.py
the content of main.py:
import subprocess
child = subprocess.Popen(["ls"], stdin=subprocess.PIPE, universal_newlines=True)
filepath = '/Users/haofly'
child.communicate(filepath)
Whatever I change filepath to, the result only lists the current folder (file1, file2, main.py).
Am I misunderstanding communicate? How do I send data to the Popen?
And what about the ssh command, if I want to send a password?
subprocess.Popen(['ssh', 'root@ip'], stdin=subprocess.PIPE, universal_newlines=True)
You cannot 'pipe' data into ls - it lists directories based on the provided CLI arguments - but you should be able to use xargs to achieve what you want (essentially passing your folder as an argument to ls) if you don't want to provide it with the command itself:
import subprocess
child = subprocess.Popen(["xargs", "ls"], stdin=subprocess.PIPE, universal_newlines=True)
filepath = '/Users/haofly'
child.communicate(filepath)
When you use ls manually, do you type ls alone, hit Enter, and then type in the filepath in response to a prompt? That's how you're trying to use it here - the parameter to .communicate() becomes the subprocess's standard input, which in fact ls ignores completely. It wants the directory to list as a command-line parameter, which you would specify as ["ls", filepath].
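A minimal sketch of that, reusing the filepath from the question:
import subprocess

# The directory goes on the command line, not into stdin
child = subprocess.Popen(["ls", "/Users/haofly"],
                         stdout=subprocess.PIPE, universal_newlines=True)
out, _ = child.communicate()
print(out)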
I think you are missing a shell parameter in your Popen call:
import subprocess

command = "ls /Users/haofly"  # any shell command string
process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
process.wait()
print process.returncode
'ls' is probably not the best command to illustrate the concept, but if you want to pass in arguments to a command you would have to do something similar to this:
cmd = ['cmd', 'opt1', 'optN']
p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     stdin=subprocess.PIPE)
out, err = p.communicate('args')
print out
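To see communicate actually feeding a child's stdin, it helps to pick a command that reads stdin; a small sketch with grep:
import subprocess

# grep with no file arguments reads from stdin, so the text passed
# to communicate() is what gets filtered
p = subprocess.Popen(["grep", "file"], stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE, universal_newlines=True)
out, _ = p.communicate("file1\nfile2\nmain.py\n")
print(out)  # prints the lines containing "file": file1 and file2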

Python subprocess readlines()?

So I'm trying to move away from os.popen to subprocess.Popen, as recommended by the user guide. The only trouble I'm having is that I can't seem to find a way to make readlines() work.
So I used to be able to do
list = os.popen('ls -l').readlines()
But I can't do
list = subprocess.Popen(['ls','-l']).readlines()
ls = subprocess.Popen(['ls','-l'], stdout=subprocess.PIPE)
out = ls.stdout.readlines()
or, if you want to read line-by-line (maybe the other process is more intensive than ls):
for ln in ls.stdout:
    # whatever
With subprocess.Popen, use communicate to read and write data:
out, err = subprocess.Popen(['ls','-l'], stdout=subprocess.PIPE).communicate()
Then you can always split the string from the process's stdout with splitlines().
out = out.splitlines()
Making a system call that returns the stdout output as a string:
lines = subprocess.check_output(['ls', '-l']).splitlines()
list = subprocess.Popen(['ls', '-l'], stdout=subprocess.PIPE).communicate()[0].splitlines()
straight from the help(subprocess)
A more detailed way of using subprocess:
# Set the command
command = "ls -l"

# Set up the Popen object
proc = subprocess.Popen(command,
                        shell=True,
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)

# Communicate with the command
stdout_value, stderr_value = proc.communicate()

# Once you have a valid response, split the returned output
if stdout_value:
    stdout_value = stdout_value.split()
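For what it's worth, on Python 3.5+ subprocess.run wraps this whole pattern; a rough equivalent of the snippets above:
import subprocess

# run() starts the process, waits for it, and captures its output
result = subprocess.run(["ls", "-l"], stdout=subprocess.PIPE,
                        universal_newlines=True)
lines = result.stdout.splitlines()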

catching stdout in realtime from subprocess

I want to subprocess.Popen() rsync.exe in Windows, and print the stdout in Python.
My code works, but it doesn't catch the progress until a file transfer is done! I want to print the progress for each file in real time.
Using Python 3.1 now since I heard it should be better at handling IO.
import subprocess, time, os, sys

cmd = "rsync.exe -vaz -P source/ dest/"
p, line = True, 'start'

p = subprocess.Popen(cmd,
                     shell=True,
                     bufsize=64,
                     stdin=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     stdout=subprocess.PIPE)

for line in p.stdout:
    print(">>> " + str(line.rstrip()))
    p.stdout.flush()
Some rules of thumb for subprocess.
Never use shell=True. It needlessly invokes an extra shell process to call your program.
When calling processes, arguments are passed around as lists. sys.argv in python is a list, and so is argv in C. So you pass a list to Popen to call subprocesses, not a string.
Don't redirect stderr to a PIPE when you're not reading it.
Don't redirect stdin when you're not writing to it.
Example:
import subprocess, time, os, sys

cmd = ["rsync.exe", "-vaz", "-P", "source/", "dest/"]
p = subprocess.Popen(cmd,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT)
for line in iter(p.stdout.readline, b''):
    print(">>> " + line.rstrip().decode())
That said, it is probable that rsync buffers its output when it detects that it is connected to a pipe instead of a terminal. This is the default behavior: when connected to a pipe, a program must explicitly flush stdout to get real-time results, otherwise the standard C library buffers its output.
To test for that, try running this instead:
cmd = [sys.executable, 'test_out.py']
and create a test_out.py file with the contents:
import sys
import time
print ("Hello")
sys.stdout.flush()
time.sleep(10)
print ("World")
Executing that subprocess should give you "Hello" and wait 10 seconds before giving "World". If that happens with the python code above and not with rsync, that means rsync itself is buffering output, so you are out of luck.
A solution would be to connect directly to a pty, using something like pexpect.
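For reference, here is a minimal sketch of the pty route using only the standard library (Unix only; the rsync arguments are borrowed from the question):
import os
import pty
import subprocess

# Give the child a pseudo-terminal; its C library then line-buffers
# stdout as if it were writing to a real terminal.
master, slave = pty.openpty()
p = subprocess.Popen(["rsync", "-vaz", "-P", "source/", "dest/"],
                     stdout=slave, stderr=slave, close_fds=True)
os.close(slave)  # keep only the parent's end open

while True:
    try:
        chunk = os.read(master, 1024)
    except OSError:  # EIO here means the child closed its end
        break
    if not chunk:
        break
    print(chunk.decode(), end="")
p.wait()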
I know this is an old topic, but there is a solution now: call rsync with the option --outbuf=L. Example:
cmd = ['rsync', '-arzv', '--backup', '--outbuf=L', 'source/', 'dest']
p = subprocess.Popen(cmd,
                     stdout=subprocess.PIPE)
for line in iter(p.stdout.readline, b''):
    print '>>> {}'.format(line.rstrip())
Depending on the use case, you might also want to disable the buffering in the subprocess itself.
If the subprocess will be a Python process, you could do this before the call:
os.environ["PYTHONUNBUFFERED"] = "1"
Or alternatively pass this in the env argument to Popen.
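A small sketch of the env variant (child.py is a hypothetical script):
import os
import subprocess
import sys

# Copy the current environment and disable buffering for the child only
env = dict(os.environ, PYTHONUNBUFFERED="1")
p = subprocess.Popen([sys.executable, "child.py"],
                     stdout=subprocess.PIPE, env=env)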
Otherwise, if you are on Linux/Unix, you can use the stdbuf tool. E.g. like:
cmd = ["stdbuf", "-oL"] + cmd
See also here about stdbuf or other options.
On Linux, I had the same problem of getting rid of the buffering. I finally used "stdbuf -o0" (or unbuffer from expect) to get rid of the pipe buffering.
proc = Popen(['stdbuf', '-o0'] + cmd, stdout=PIPE, stderr=PIPE)
stdout = proc.stdout
I could then use select.select on stdout.
See also https://unix.stackexchange.com/questions/25372/
for line in p.stdout:
    ...
always blocks until the next line-feed.
For "real-time" behaviour you have to do something like this:
while True:
    inchar = p.stdout.read(1)
    if inchar:  # neither empty string nor None
        print(str(inchar), end='')  # or end=None to flush immediately
    else:
        print('')  # flush for implicit line-buffering
        break
The while-loop is left when the child process closes its stdout or exits.
read()/read(-1) would block until the child process closed its stdout or exited.
Your problem is:
for line in p.stdout:
    print(">>> " + str(line.rstrip()))
    p.stdout.flush()
the iterator itself has extra buffering.
Try doing like this:
while True:
    line = p.stdout.readline()
    if not line:
        break
    print line
You cannot get stdout to print unbuffered to a pipe (unless you can rewrite the program that prints to stdout), so here is my solution:
Redirect stdout to stderr, which is not buffered. '<cmd> 1>&2' should do it. Open the process as follows: myproc = subprocess.Popen('<cmd> 1>&2', shell=True, stderr=subprocess.PIPE) (shell=True is needed so a shell can perform the redirection).
You cannot distinguish from stdout or stderr, but you get all output immediately.
Hope this helps anyone tackling this problem.
To avoid buffering of the output, you might want to try pexpect:
import pexpect

child = pexpect.spawn(launchcmd, args, timeout=None)
while True:
    try:
        child.expect('\n')
        print(child.before)
    except pexpect.EOF:
        break
PS: I know this question is pretty old, but I'm still providing the solution that worked for me.
PPS: I got this answer from another question.
p = subprocess.Popen(command,
                     bufsize=0,
                     universal_newlines=True)
I am writing a GUI for rsync in Python and had the same problem. It troubled me for several days, until I found this in the Python docs.
If universal_newlines is True, the file objects stdout and stderr are opened as text files in universal newlines mode. Lines may be terminated by any of '\n', the Unix end-of-line convention, '\r', the old Macintosh convention or '\r\n', the Windows convention. All of these external representations are seen as '\n' by the Python program.
It seems that rsync outputs '\r' while a transfer is in progress.
If you run something like this in a thread and store ffmpeg_time in an attribute you can read from outside, it works very nicely.
import re
import subprocess

input = 'path/input_file.mp4'
output = 'path/input_file.mp4'
command = "ffmpeg -y -v quiet -stats -i \"" + str(input) + "\" -metadata title=\"#alaa_sanatisharif\" -preset ultrafast -vcodec copy -r 50 -vsync 1 -async 1 \"" + output + "\""
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True, shell=True)
for line in process.stdout:
    # ffmpeg's -stats line carries the elapsed time as hh:mm:ss
    reg = re.search(r'\d\d:\d\d:\d\d', line)
    ffmpeg_time = reg.group(0) if reg else ''
    print(ffmpeg_time)
Change the stdout of the rsync process to be unbuffered.
p = subprocess.Popen(cmd,
                     shell=True,
                     bufsize=0,  # 0=unbuffered, 1=line-buffered, else buffer-size
                     stdin=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     stdout=subprocess.PIPE)
I've noticed that there is no mention of using a temporary file as an intermediate. The following gets around the buffering issues by outputting to a temporary file, and lets you parse the data coming from rsync without connecting to a pty. I tested the following on a Linux box; the output of rsync tends to differ across platforms, so the regular expressions to parse the output may vary:
import subprocess, time, tempfile, re, os

# mkstemp returns a (file descriptor, path) pair we can hand to Popen
fd, file_name = tempfile.mkstemp()
cmd = ["rsync", "-vaz", "-P", "/src/", "/dest"]
p = subprocess.Popen(cmd, stdout=fd,
                     stderr=subprocess.STDOUT)
while p.poll() is None:
    # p.poll() returns None while the program is still running
    # sleep for 1 second
    time.sleep(1)
    last_line = open(file_name).readlines()
    # it's possible that it hasn't output yet, so continue
    if len(last_line) == 0: continue
    last_line = last_line[-1]
    # Matching to "[bytes downloaded] number% [speed] number:number:number"
    match_it = re.match(r".* ([0-9]*)%.* ([0-9]*:[0-9]*:[0-9]*).*", last_line)
    if not match_it: continue
    # in this case, the percentage is stored in match_it.group(1),
    # time in match_it.group(2). We could do something with it here...
In Python 3, here's a solution that takes a command off the command line and delivers nicely decoded strings in real time as they are received.
Receiver (receiver.py):
import subprocess
import sys
cmd = sys.argv[1:]
p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
for line in p.stdout:
    print("received: {}".format(line.rstrip().decode("utf-8")))
Example simple program that could generate real-time output (dummy_out.py):
import time
import sys
for i in range(5):
    print("hello {}".format(i))
    sys.stdout.flush()
    time.sleep(1)
Output:
$ python receiver.py python dummy_out.py
received: hello 0
received: hello 1
received: hello 2
received: hello 3
received: hello 4
