I want to execute a shell script with 3 arguments from a Python script (as described here: Python: executing shell script with arguments(variable), but argument is not read in shell script).
Here is my code:
subprocess.call('/root/bin/xen-limit %s %s %s' % (str(dom),str(result),str('--nosave'),), shell=True)
The variables dom and result contain strings.
And here is the output:
/bin/sh: --nosave: not found
UPDATE:
This is how the variable "result" is built:
c1 = ['/bin/cat', '/etc/xen/%s.cfg' % (str(dom))]
p1 = subprocess.Popen(c1, stdout=subprocess.PIPE)
c2 = ['grep', 'limited']
p2 = subprocess.Popen(c2, stdin=p1.stdout, stdout=subprocess.PIPE)
c3 = ['cut', '-d=', '-f2']
p3 = subprocess.Popen(c3, stdin=p2.stdout, stdout=subprocess.PIPE)
c4 = ['tr', '-d', '\"']
p4 = subprocess.Popen(c4, stdin=p3.stdout, stdout=subprocess.PIPE)
result = p4.stdout.read()
After that, the variable result contains a number followed by mbit (for example, 16mbit).
And dom is a string like "myserver"
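One likely cause of the error above (my own assumption, not stated in the original post): p4.stdout.read() returns the value with a trailing newline, so the formatted command spans two lines and /bin/sh tries to run --nosave as a separate command. A minimal sketch of stripping it first:
# Hedged fix: strip the trailing newline from the pipeline output before formatting
result = p4.stdout.read().strip()
subprocess.call('/root/bin/xen-limit %s %s --nosave' % (dom, result), shell=True)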
from subprocess import Popen, STDOUT, PIPE
print('Executing: /root/bin/xen-limit ' + str(dom) + ' ' + str(result) + ' --nosave')
handle = Popen('/root/bin/xen-limit ' + str(dom) + ' ' + str(result) + ' --nosave', shell=True, stdout=PIPE, stderr=STDOUT, stdin=PIPE)
print(handle.stdout.read())
If this doesn't work, I honestly don't know what would.
This is the most basic, yet still error-reporting, way of launching a third-party application or script while giving you the debug output you need.
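One caveat worth adding (my note, not part of the original answer): handle.stdout.read() blocks until the script closes its output, and combining it with stdin=PIPE can deadlock if the script waits for input; communicate() sidesteps that:
out, _ = handle.communicate()  # reads all output and waits for the process to exit
print(out)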
Why not save --nosave to a variable and pass that variable to subprocess?
It's simpler (and safer) to pass a list consisting of the command name and its arguments.
subprocess.call(['/root/bin/xen-limit',
                 str(dom),
                 str(result),
                 str('--nosave')
                 ])
str('--nosave') is a no-op, as '--nosave' is already a string. The same may be true for dom and result as well.
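For reference, a minimal sketch of the same call without the redundant str() wrappers (assuming dom and result are already plain strings):
# List form: no shell involved, each element becomes one argument
subprocess.call(['/root/bin/xen-limit', dom, result, '--nosave'])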
Related
'/usr/local/bin/wave' only accepts a filename as input, so I need to invoke the process, then "send in" the commands, and wait for the output file to be written. Then my process can proceed to read the output file. Here is my code that does not write to the output file:
hdfFile = "/archive/HDF/16023343.hdf"
pngFile = "/xrfc_calib/xrfc.130.png"
lpFile = os.environ['DOCUMENT_ROOT'] + pngFile
waveCmd = "hdfview, '" + hdfFile + "', outfile='" + lpFile + "', web, view='RASTER', /neg"
os.environ['WAVE_PATH'] = "/oudvmt/wave/pro:/dvmt/wave/pro"
wfile = subprocess.Popen ('/usr/local/bin/wave >&2', shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
wfile.stdin = "\#hdf_startup\n\#hdf_common\n" + waveCmd + "\nquit\n"
I found what I was missing. The change is to the last 2 lines. They are:
wfile = subprocess.Popen ('/usr/local/bin/wave', stdin=subprocess.PIPE, stdout=subprocess.PIPE)
wfile.communicate("\#hdf_startup\n\#hdf_common\n" + waveCmd + "\nquit\n")
I needed to set "stdout" to avoid extra output from PV-Wave.
I needed to use "communicate" to wait for the process to complete.
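A small follow-up, assuming this might later run under Python 3: communicate() then expects bytes unless the pipes are opened in text mode, so a hedged variant would be:
# Text-mode pipes so communicate() accepts a str instead of bytes (Python 3)
wfile = subprocess.Popen(['/usr/local/bin/wave'], stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE, universal_newlines=True)
out, _ = wfile.communicate("\#hdf_startup\n\#hdf_common\n" + waveCmd + "\nquit\n")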
This prints the directory size, but how can I save the output to a Python variable instead of printing it?
svn list -vR http://myIP/repos/test | awk '{sum+=$3; i++} END {print sum/1024000}'
but I need to store this printed value in a Python variable:
proc = subprocess.Popen(svnproc, stdout=subprocess.PIPE, shell=True)
output = proc.stdout.read()
print str(output)
A nasty workaround is to push it out to a file and cat the file:
svn list -vR http://myIP/repos/test | awk '{sum+=$3; i++} END {print sum/1024000 > "/tmp/output.txt"}'
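Rather than going through a temp file, the shell pipeline's output can be captured directly into a variable (a hedged sketch; check_output is available from Python 2.7 on):
cmd = "svn list -vR http://myIP/repos/test | awk '{sum+=$3; i++} END {print sum/1024000}'"
# shell=True so the pipe is handled by the shell; decode() covers Python 3's bytes output
size = float(subprocess.check_output(cmd, shell=True).decode().strip())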
From the fine docstring of "subprocess" I can read:
Replacing shell pipe line
output=dmesg | grep hda
p1 = Popen(["dmesg"], stdout=PIPE)
p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
output = p2.communicate()[0]
so that in your case I'd try the following code
switches = ...
directory = ...
p1 = Popen(["svn", "list", switches, directory], stdout=PIPE)
p2 = Popen(["awk", "{sum+=$3; i++} END {print sum/1024/1024}", stdin=p1.stdout, stdout=PIPE)
output = p2.communicate()[0].strip()
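The docstring example also closes p1.stdout in the parent right after starting p2, so that svn receives SIGPIPE if awk exits early; with that added, the relevant lines would look like:
p2 = Popen(["awk", "{sum+=$3; i++} END {print sum/1024/1024}"], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()  # lets svn receive SIGPIPE if awk exits before reading everything
output = p2.communicate()[0].strip()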
PS: I have changed sum/1024000 to sum/1024/1024, assuming that you want to count in megabytes.
svnproc = "svn list -vR " + repoURL + " | awk '{sum+=$3; i++} END {print sum/1073741824}'"
proc = subprocess.Popen(svnproc, shell=True,
stdout=subprocess.PIPE)
svnbackupsize = float(proc.stdout.read())
The only problematic part of this script is that Popen does not wait until the process is done, whereas subprocess.call does wait until the process completes.
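If waiting for completion is the concern, communicate() on the Popen object both collects the output and blocks until the pipeline has finished (a minimal sketch):
proc = subprocess.Popen(svnproc, shell=True, stdout=subprocess.PIPE)
out, _ = proc.communicate()  # waits for the shell pipeline to exit
svnbackupsize = float(out)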
argument_list = ['name=Jon', 'id=100' ]
output = subprocess.check_output(
['/usr/bin/python', 'test.py', argument_list ], stderr=subprocess.STDOUT)
In simple terms, I am trying to invoke a script called test.py using subprocess; I want to pass arguments to test.py through a list. Important - the list can be of any size.
Things I tried:
output = subprocess.check_output(
['/usr/bin/python', 'test.py', ", ".join(argument_list) ], stderr=subprocess.STDOUT)
and
output = subprocess.check_output(
['/usr/bin/python', 'test.py', '%s' % argument_list ], stderr=subprocess.STDOUT)
Neither works, because subprocess.check_output expects each argument as a separate string in the list (' ', ' ', ' '), and so on.
Is there a better way to do this ?
You can make a new list by adding lists together:
output = subprocess.check_output(['/usr/bin/python', 'test.py'] + argument_list, stderr=subprocess.STDOUT)
This will run test.py with argument_list as its command-line parameters.
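For example (test.py here is hypothetical and just echoes its arguments), each element of the list becomes one entry in the child's sys.argv:
argument_list = ['name=Jon', 'id=100']
output = subprocess.check_output(['/usr/bin/python', 'test.py'] + argument_list,
                                 stderr=subprocess.STDOUT)
# inside test.py: sys.argv[1:] == ['name=Jon', 'id=100']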
In the snippet of my Python script below, I think that temp2 doesn't wait for temp to finish running. The output can be large, but it is just text. This truncates the result ('out') from temp; it stops mid-line. 'out' from temp works fine until temp2 is added. I tried adding time.wait() as well as subprocess.Popen.wait(temp). These both allow temp to run to completion so that 'out' is not truncated, but they disrupt the chaining so that there is no 'out2'. Any ideas?
temp = subprocess.Popen(call, stdout=subprocess.PIPE)
#time.wait(1)
#subprocess.Popen.wait(temp)
temp2 = subprocess.Popen(call2, stdin=temp.stdout, stdout=subprocess.PIPE)
out, err = temp.communicate()
out2, err2 = temp2.communicate()
According to the Python docs, communicate() can accept data to be sent to the child's stdin. If you change stdin of temp2 to subprocess.PIPE and pass out to communicate(), the data is properly piped.
#!/usr/bin/env python
import subprocess
import time
call = ["echo", "hello\nworld"]
call2 = ["grep", "w"]
temp = subprocess.Popen(call, stdout=subprocess.PIPE)
temp2 = subprocess.Popen(call2, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out, err = temp.communicate()
out2, err2 = temp2.communicate(out)
print("Out: {0!r}, Err: {1!r}".format(out, err))
# Out: b'hello\nworld\n', Err: None
print("Out2: {0!r}, Err2: {1!r}".format(out2, err2))
# Out2: b'world\n', Err2: None
Following "Replacing shell pipeline" section from the docs:
temp = subprocess.Popen(call, stdout=subprocess.PIPE)
temp2 = subprocess.Popen(call2, stdin=temp.stdout, stdout=subprocess.PIPE)
temp.stdout.close()
out2 = temp2.communicate()[0]
I have an application that takes input either from the terminal directly, or through a pipe that passes another program's output into its stdin. What I am trying to do is use Python to generate the output so it's formatted correctly and pass that to the stdin of this program, all from the same script. Here is the code:
#!/usr/bin/python
import os
import subprocess
import plistlib
import sys
def appScan():
    os.system("system_profiler -xml SPApplicationsDataType > apps.xml")
    appList = plistlib.readPlist("apps.xml")
    sys.stdout.write("Mac_App_List\n"
                     "Delimiters=\"^\"\n"
                     "string50 string50\n"
                     "Name^Version\n")
    appDict = appList[0]['_items']
    for x in appDict:
        if 'version' in x:
            print x['_name'] + "^" + x['version'] + "^"
        else:
            print x['_name'] + "^" + "no version found" + "^"

proc = subprocess.Popen(["/opt/altiris/notification/inventory/lib/helpers/aex-sendcustominv", "-t", "-"], shell=False, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
proc.communicate(input=appScan())
For some reason the subprocess I am calling doesn't like what is coming into stdin. However, if I remove the subprocess items, have the script print to stdout, and then call the script from the terminal (python appScan.py | aex-sendcustominv), aex-sendcustominv accepts the input just fine. Is there any way to take a function's output in Python and send it to the stdin of a subprocess?
The problem is that appScan() only prints to stdout; appScan() returns None, so proc.communicate(input=appScan()) is equivalent to proc.communicate(input=None). You need appScan to return a string.
Try this (not tested):
def appScan():
    os.system("system_profiler -xml SPApplicationsDataType > apps.xml")
    appList = plistlib.readPlist("apps.xml")
    output_str = 'Delimiters="^"\nstring50 string50\nName^Version\n'
    appDict = appList[0]['_items']
    for x in appDict:
        if 'version' in x:
            output_str = output_str + x['_name'] + "^" + x['version'] + "^"
        else:
            output_str = output_str + x['_name'] + "^" + "no version found" + "^"
    return output_str

proc = subprocess.Popen(["/opt/altiris/notification/inventory/lib/helpers/aex-sendcustominv", "-t", "-"], shell=False, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
proc.communicate(input=appScan())