If I do the following:
import subprocess
from cStringIO import StringIO
subprocess.Popen(['grep','f'],stdout=subprocess.PIPE,stdin=StringIO('one\ntwo\nthree\nfour\nfive\nsix\n')).communicate()[0]
I get:
Traceback (most recent call last):
File "<stdin>", line 1, in ?
File "/build/toolchain/mac32/python-2.4.3/lib/python2.4/subprocess.py", line 533, in __init__
(p2cread, p2cwrite,
File "/build/toolchain/mac32/python-2.4.3/lib/python2.4/subprocess.py", line 830, in _get_handles
p2cread = stdin.fileno()
AttributeError: 'cStringIO.StringI' object has no attribute 'fileno'
Apparently a cStringIO.StringIO object doesn't quack close enough to a file duck to suit subprocess.Popen. How do I work around this?
Popen.communicate() documentation:
Note that if you want to send data to the process’s stdin, you need to create the Popen object with stdin=PIPE. Similarly, to get anything other than None in the result tuple, you need to give stdout=PIPE and/or stderr=PIPE too.
Replacing os.popen*
pipe = os.popen(cmd, 'w', bufsize)
# ==>
pipe = Popen(cmd, shell=True, bufsize=bufsize, stdin=PIPE).stdin
Warning: Use communicate() rather than stdin.write(), stdout.read() or stderr.read() to avoid deadlocks due to any of the other OS pipe buffers filling up and blocking the child process.
So your example could be written as follows:
from subprocess import Popen, PIPE, STDOUT
p = Popen(['grep', 'f'], stdout=PIPE, stdin=PIPE, stderr=STDOUT)
grep_stdout = p.communicate(input=b'one\ntwo\nthree\nfour\nfive\nsix\n')[0]
print(grep_stdout.decode())
# -> four
# -> five
# ->
On Python 3.5+ (3.6+ for encoding), you can use subprocess.run to pass input as a string to an external command and get its exit status and its output as a string back in one call:
#!/usr/bin/env python3
from subprocess import run, PIPE
p = run(['grep', 'f'], stdout=PIPE,
        input='one\ntwo\nthree\nfour\nfive\nsix\n', encoding='ascii')
print(p.returncode)
# -> 0
print(p.stdout)
# -> four
# -> five
# ->
I figured out this workaround:
>>> p = subprocess.Popen(['grep','f'],stdout=subprocess.PIPE,stdin=subprocess.PIPE)
>>> p.stdin.write(b'one\ntwo\nthree\nfour\nfive\nsix\n') #expects a bytes type object
>>> p.communicate()[0]
'four\nfive\n'
>>> p.stdin.close()
Is there a better one?
There's a beautiful solution if you're using Python 3.4 or better. Use the input argument instead of the stdin argument; it accepts a bytes argument:
output_bytes = subprocess.check_output(
    ["sed", "s/foo/bar/"],
    input=b"foo",
)
This works for check_output and run, but not call or check_call for some reason.
In Python 3.7+, you can also add text=True to make check_output take a string as input and return a string (instead of bytes):
output_string = subprocess.check_output(
    ["sed", "s/foo/bar/"],
    input="foo",
    text=True,
)
I'm a bit surprised nobody suggested creating a pipe, which is in my opinion by far the simplest way to pass a string to stdin of a subprocess:
import os
import subprocess
read, write = os.pipe()
os.write(write, b"stdin input here")  # os.write() takes bytes on Python 3
os.close(write)                       # close the write end so the child sees EOF
subprocess.check_call(['your-command'], stdin=read)
os.close(read)
I am using Python 3 and found out that you need to encode your string before you can pass it into stdin:
from subprocess import Popen, PIPE
p = Popen(['grep', 'f'], stdout=PIPE, stdin=PIPE, stderr=PIPE)
out, err = p.communicate(input='one\ntwo\nthree\nfour\nfive\nsix\n'.encode())
print(out)
Apparently a cStringIO.StringIO object doesn't quack close enough to a file duck to suit subprocess.Popen
I'm afraid not. The pipe is a low-level OS concept, so it absolutely requires a file object that is represented by an OS-level file descriptor. Your workaround is the right one.
from subprocess import Popen, PIPE
from tempfile import SpooledTemporaryFile as tempfile
f = tempfile()
f.write('one\ntwo\nthree\nfour\nfive\nsix\n')
f.seek(0)
print Popen(['/bin/grep','f'],stdout=PIPE,stdin=f).stdout.read()
f.close()
"""
Ex: Dialog (2-way) with a Popen()
"""
p = subprocess.Popen('Your Command Here',
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     stdin=subprocess.PIPE,
                     shell=True,
                     bufsize=0)
p.stdin.write('START\n')
out = p.stdout.readline()
while out:
    line = out
    line = line.rstrip("\n")
    if "WHATEVER1" in line:
        pr = 1
        p.stdin.write('DO 1\n')
        out = p.stdout.readline()
        continue
    if "WHATEVER2" in line:
        pr = 2
        p.stdin.write('DO 2\n')
        out = p.stdout.readline()
        continue
    """
    ..........
    """
    out = p.stdout.readline()
p.wait()
On Python 3.7+ do this:
my_data = "whatever you want\nshould match this f"
subprocess.run(["grep", "f"], text=True, input=my_data)
and you'll probably want to add capture_output=True to get the output of running the command as a string.
On older versions of Python, replace text=True with universal_newlines=True:
subprocess.run(["grep", "f"], universal_newlines=True, input=my_data)
Beware that Popen.communicate(input=s) may give you trouble if s is too big, because apparently the parent process will buffer it before forking the child subprocess, meaning it needs "twice as much" used memory at that point (at least according to the "under the hood" explanation and linked documentation found here). In my particular case, s was a generator that was first fully expanded and only then written to stdin, so the parent process was huge right before the child was spawned, and no memory was left to fork it:
File "/opt/local/stow/python-2.7.2/lib/python2.7/subprocess.py", line 1130, in _execute_child
self.pid = os.fork()
OSError: [Errno 12] Cannot allocate memory
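If you run into this, one workaround is to stream the data to the child's stdin piece by piece instead of handing communicate() one huge string. A rough sketch, assuming the child's output is not captured (if you also capture stdout, drain it, e.g. from another thread, to avoid deadlock):
from subprocess import Popen, PIPE

def generate_input():
    # Stand-in for the original generator; yields small bytes chunks lazily.
    for i in range(1000000):
        yield ("line %d\n" % i).encode()

p = Popen(['grep', 'f'], stdin=PIPE)   # stdout is inherited, not captured
for chunk in generate_input():
    p.stdin.write(chunk)               # stream the data instead of buffering it all
p.stdin.close()                        # signal EOF so the child can finish
p.wait()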
This is overkill for grep, but through my journeys I've learned about the Linux command expect and the Python library pexpect:
expect: dialogue with interactive programs
pexpect: Python module for spawning child applications; controlling them; and responding to expected patterns in their output.
import pexpect
child = pexpect.spawn('grep f', timeout=10)
child.sendline('text to match')
print(child.before)
Working with interactive shell applications like ftp is trivial with pexpect
import pexpect
child = pexpect.spawn ('ftp ftp.openbsd.org')
child.expect ('Name .*: ')
child.sendline ('anonymous')
child.expect ('Password:')
child.sendline ('noah@example.com')
child.expect ('ftp> ')
child.sendline ('ls /pub/OpenBSD/')
child.expect ('ftp> ')
print child.before # Print the result of the ls command.
child.interact() # Give control of the child to the user.
import time
from subprocess import Popen, PIPE, STDOUT

p = Popen(['grep', 'f'], stdout=PIPE, stdin=PIPE, stderr=STDOUT)
p.stdin.write(b'one\n')
time.sleep(0.5)
p.stdin.write(b'two\n')
time.sleep(0.5)
p.stdin.write(b'three\n')
time.sleep(0.5)
testresult = p.communicate()[0]
time.sleep(0.5)
print(testresult)
This question already has answers here:
Store output of subprocess.Popen call in a string [duplicate]
(15 answers)
Closed 4 years ago.
How can I get the output of a process run using subprocess.call()?
Passing a StringIO.StringIO object to stdout gives this error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/subprocess.py", line 444, in call
return Popen(*popenargs, **kwargs).wait()
File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/subprocess.py", line 588, in __init__
errread, errwrite) = self._get_handles(stdin, stdout, stderr)
File "/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/subprocess.py", line 945, in _get_handles
c2pwrite = stdout.fileno()
AttributeError: StringIO instance has no attribute 'fileno'
>>>
If you have Python version >= 2.7, you can use subprocess.check_output, which basically does exactly what you want (it returns standard output as a string).
Simple example (Linux version, see note):
import subprocess
print subprocess.check_output(["ping", "-c", "1", "8.8.8.8"])
Note that the ping command uses Linux notation (-c for count). If you try this on Windows, remember to change it to -n for the same result.
As commented below you can find a more detailed explanation in this other answer.
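If you need the snippet to work on both platforms, one option is to pick the flag at runtime; a small sketch (the command and host are just the example from above):
import platform
import subprocess

# -c on Linux/macOS, -n on Windows, as noted above.
count_flag = "-n" if platform.system() == "Windows" else "-c"
output = subprocess.check_output(["ping", count_flag, "1", "8.8.8.8"])
print(output.decode())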
Output from subprocess.call() should only be redirected to files.
You should use subprocess.Popen() instead. Then you can pass subprocess.PIPE for the stderr, stdout, and/or stdin parameters and read from the pipes by using the communicate() method:
from subprocess import Popen, PIPE
p = Popen(['program', 'arg1'], stdin=PIPE, stdout=PIPE, stderr=PIPE)
output, err = p.communicate(b"input data that is passed to subprocess' stdin")
rc = p.returncode
The reasoning is that the file-like object used by subprocess.call() must have a real file descriptor, and thus implement the fileno() method. Just using any file-like object won't do the trick.
See here for more info.
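If you do want to keep subprocess.call(), anything backed by a real file descriptor is accepted; a minimal sketch using a temporary file (the ls command is just an example):
import subprocess
import tempfile

# A real file has a fileno(), so subprocess.call() accepts it as stdout.
with tempfile.TemporaryFile() as f:
    subprocess.call(['ls', '-l'], stdout=f)
    f.seek(0)
    output = f.read().decode()
print(output)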
For Python 3.5+ it is recommended that you use the run function from the subprocess module. This returns a CompletedProcess object, from which you can easily obtain the output as well as the return code.
from subprocess import PIPE, run
command = ['echo', 'hello']
result = run(command, stdout=PIPE, stderr=PIPE, universal_newlines=True)
print(result.returncode, result.stdout, result.stderr)
I have the following solution. It captures the exit code, the stdout, and the stderr too of the executed external command:
import shlex
from subprocess import Popen, PIPE

def get_exitcode_stdout_stderr(cmd):
    """
    Execute the external command and get its exitcode, stdout and stderr.
    """
    args = shlex.split(cmd)
    proc = Popen(args, stdout=PIPE, stderr=PIPE)
    out, err = proc.communicate()
    exitcode = proc.returncode
    #
    return exitcode, out, err

cmd = "..."  # arbitrary external command, e.g. "python mytest.py"
exitcode, out, err = get_exitcode_stdout_stderr(cmd)
I also have a blog post on it here.
Edit: the solution was updated to a newer one that doesn't need to write to temp. files.
I recently figured out how to do this, and here's some example code from a current project of mine:
#Getting the random picture.
#First find all pictures:
import shlex, subprocess
cmd = 'find ../Pictures/ -regex ".*\(JPG\|NEF\|jpg\)" '
#cmd = raw_input("shell:")
args = shlex.split(cmd)
output,error = subprocess.Popen(args,stdout = subprocess.PIPE, stderr= subprocess.PIPE).communicate()
#Another way to get output
#output = subprocess.Popen(args,stdout = subprocess.PIPE).stdout
ber = raw_input("search complete, display results?")
print output
#... and on to the selection process ...
You now have the output of the command stored in the variable "output". "stdout = subprocess.PIPE" tells the class to create a file object named 'stdout' from within Popen. The communicate() method, from what I can tell, just acts as a convenient way to return a tuple of the output and the errors from the process you've run. Also, the process is run when instantiating Popen.
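For comparison, a rough modern (Python 3.5+) equivalent of the same idea using subprocess.run; the find pattern is kept only as an illustration:
import shlex
import subprocess

cmd = r'find ../Pictures/ -regex ".*\(JPG\|NEF\|jpg\)"'
result = subprocess.run(shlex.split(cmd),
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        universal_newlines=True)
print(result.stdout)   # same data that communicate()[0] returned above
print(result.stderr)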
The key is to use the function subprocess.check_output
For example, the following function captures stdout and stderr of the process and returns that as well as whether or not the call succeeded. It is Python 2 and 3 compatible:
from subprocess import check_output, CalledProcessError, STDOUT

def system_call(command):
    """
    params:
        command: list of strings, ex. `["ls", "-l"]`
    returns: output, success
    """
    try:
        output = check_output(command, stderr=STDOUT).decode()
        success = True
    except CalledProcessError as e:
        output = e.output.decode()
        success = False
    return output, success

output, success = system_call(["ls", "-l"])
If you want to pass commands as strings rather than arrays, use this version:
from subprocess import check_output, CalledProcessError, STDOUT
import shlex

def system_call(command):
    """
    params:
        command: string, ex. `"ls -l"`
    returns: output, success
    """
    command = shlex.split(command)
    try:
        output = check_output(command, stderr=STDOUT).decode()
        success = True
    except CalledProcessError as e:
        output = e.output.decode()
        success = False
    return output, success

output, success = system_call("ls -l")
In an IPython shell:
In [8]: import subprocess
In [9]: s=subprocess.check_output(["echo", "Hello World!"])
In [10]: s
Out[10]: 'Hello World!\n'
Based on sargue's answer. Credit to sargue.
I am trying to pass an XML file via stdin. I read the XML file from Subversion, updated a line in it, and now I am trying to create a Jenkins job using subprocess.Popen and stdin:
test = subprocess.Popen('svn cat http://localhost/svn/WernerTest/JenkinsJobTemplates/trunk/smartTemplate.xml --username admin --password admin', stdout=subprocess.PIPE, universal_newlines=True)
job = test.stdout.read().replace("#url#", "http://localhost/svn/WernerTest/TMS/branches/test1")
output = io.StringIO()
output.write(job)
subprocess.Popen('java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7', stdin=output)
and I am getting the following error:
Traceback (most recent call last):
File "D:\scripts\jenkinsGetJobs.py", line 20, in <module>
subprocess.Popen('java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7', stdin=output)
File "D:\applications\Python 3.5\lib\subprocess.py", line 914, in __init__
errread, errwrite) = self._get_handles(stdin, stdout, stderr)
File "D:\applications\Python 3.5\lib\subprocess.py", line 1127, in _get_handles
p2cread = msvcrt.get_osfhandle(stdin.fileno())
io.UnsupportedOperation: fileno
So how do I pass in the updated file to the next subprocess?
Use a pipe and write the data directly to that pipe:
test = subprocess.Popen(
'svn cat http://localhost/svn/WernerTest/JenkinsJobTemplates/trunk/smartTemplate.xml --username admin --password admin',
stdout=subprocess.PIPE, universal_newlines=True)
job = test.stdout.read().replace("#url#", "http://localhost/svn/WernerTest/TMS/branches/test1")
jenkins = subprocess.Popen(
'java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7',
stdin=subprocess.PIPE, universal_newlines=True)
jenkins.communicate(job)
The Popen.communicate() method takes the first argument and sends that as stdin to the subprocess.
Note that I set the universal_newlines argument to True for Jenkins as well; the alternative would be for you to explicitly encode the job string to a suitable codec that Jenkins will accept.
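For completeness, a sketch of that alternative: drop universal_newlines on the Jenkins process and encode the string yourself (UTF-8 here is an assumption; use whatever encoding Jenkins actually expects):
jenkins = subprocess.Popen(
    'java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7',
    stdin=subprocess.PIPE)
# Without universal_newlines, the stdin pipe expects bytes, so encode the job XML explicitly.
jenkins.communicate(job.encode('utf-8'))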
Popen() accepts only real files (valid .fileno() at least).
@Martijn Pieters' answer shows how to pass the data if you can load it all at once in memory (also, the jenkins process is not started until svn produces all of its output).
Here's how to read one line at a time (svn and jenkins processes run in parallel):
#!/usr/bin/env python3
from subprocess import Popen, PIPE

with Popen(svn_cmd, stdout=PIPE, bufsize=1, universal_newlines=True) as svn, \
     Popen(java_cmd, stdin=PIPE, bufsize=1, universal_newlines=True) as java:
    for line in svn.stdout:
        line = line.replace('#url#', 'http://localhost/svn/WernerTest/TMS/branches/test1')
        java.stdin.write(line)
if java.returncode != 0:
    "handle error"
See the svn_cmd and java_cmd definitions below (you don't need shlex.split(cmd) on Windows -- note: no shell=True).
If you didn't need to replace #url#, it would look like you are trying to emulate the svn_cmd | java_cmd pipeline, where:
svn_cmd = 'svn cat http://localhost/svn/WernerTest/JenkinsJobTemplates/trunk/smartTemplate.xml --username admin --password admin'
java_cmd = 'java -jar D:\\applications\\Jenkins\\war\\WEB-INF\\jenkins-cli.jar\\jenkins-cli.jar -s http://localhost:8080/ create-job test7'
The simplest way is to invoke the shell:
#!/usr/bin/env python
import subprocess
subprocess.check_call(svn_cmd + ' | ' + java_cmd, shell=True)
You could emulate it in Python:
#!/usr/bin/env python3
from subprocess import Popen, PIPE
#NOTE: use a list for compatibility with POSIX systems
with Popen(java_cmd.split(), stdin=PIPE) as java, \
     Popen(svn_cmd.split(), stdout=java.stdin):
    java.stdin.close()  # close unused pipe in the parent
    # no more code here (the for-loop equivalent is inside the OS code that implements pipes)
if java.returncode != 0:
    "handle error here"
See How do I use subprocess.Popen to connect multiple processes by pipes?
This question already has answers here:
Store output of subprocess.Popen call in a string [duplicate]
(15 answers)
Closed 4 years ago.
I want to run a command in Python, using the subprocess module, and store the output in a variable. However, I do not want the command's output to be printed to the terminal.
For this code:
def storels():
    a = subprocess.Popen("ls", shell=True)
storels()
I get the directory listing in the terminal, instead of having it stored in a. I've also tried:
def storels():
    subprocess.Popen("ls > tmp", shell=True)
    a = open("./tmp")
    [Rest of Code]
storels()
This also prints the output of ls to my terminal. I've even tried this command with the somewhat dated os.system method, since running ls > tmp in the terminal doesn't print ls to the terminal at all, but stores it in tmp. However, the same thing happens.
Edit:
I get the following error after following marcog's advice, but only when running a more complex command, cdrecord --help. Python spits this out:
Traceback (most recent call last):
File "./install.py", line 52, in <module>
burntrack2("hi")
File "./install.py", line 46, in burntrack2
a = subprocess.Popen("cdrecord --help",stdout = subprocess.PIPE)
File "/usr/lib/python2.6/subprocess.py", line 633, in __init__
errread, errwrite)
File "/usr/lib/python2.6/subprocess.py", line 1139, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
To get the output of ls, use stdout=subprocess.PIPE.
>>> proc = subprocess.Popen('ls', stdout=subprocess.PIPE)
>>> output = proc.stdout.read()
>>> print output
bar
baz
foo
The command cdrecord --help outputs to stderr, so you need to pipe that instead. You should also break up the command into a list of tokens as I've done below; the alternative is to pass the shell=True argument, but this fires up a full-blown shell, which can be dangerous if you don't control the contents of the command string.
>>> proc = subprocess.Popen(['cdrecord', '--help'], stderr=subprocess.PIPE)
>>> output = proc.stderr.read()
>>> print output
Usage: wodim [options] track1...trackn
Options:
-version print version information and exit
dev=target SCSI target to use as CD/DVD-Recorder
gracetime=# set the grace time before starting to write to #.
...
If you have a command that outputs to both stdout and stderr and you want to merge them, you can do that by piping stderr to stdout and then catching stdout.
subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
As mentioned by Chris Morgan, you should be using proc.communicate() instead of proc.read().
>>> proc = subprocess.Popen(['cdrecord', '--help'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
>>> out, err = proc.communicate()
>>> print 'stdout:', out
stdout:
>>> print 'stderr:', err
stderr:Usage: wodim [options] track1...trackn
Options:
-version print version information and exit
dev=target SCSI target to use as CD/DVD-Recorder
gracetime=# set the grace time before starting to write to #.
...
If you are using Python 2.7 or later, the easiest way to do this is to use the subprocess.check_output() command. Here is an example:
output = subprocess.check_output('ls')
To also redirect stderr you can use the following:
output = subprocess.check_output('ls', stderr=subprocess.STDOUT)
In the case that you want to pass parameters to the command, you can either use a list or invoke a shell and use a single string.
output = subprocess.check_output(['ls', '-a'])
output = subprocess.check_output('ls -a', shell=True)
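If you do go the shell=True route with values you don't fully control, consider quoting them first; a small sketch assuming Python 3.3+ for shlex.quote (the directory name is hypothetical):
import shlex
import subprocess

target_dir = "My Pictures"   # hypothetical value that the shell would split if left unquoted
# shlex.quote() escapes the value so the shell treats it as a single argument.
output = subprocess.check_output('ls -a ' + shlex.quote(target_dir), shell=True)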
With a = subprocess.Popen("cdrecord --help",stdout = subprocess.PIPE)
, you need to either use a list or use shell=True;
Either of these will work. The former is preferable.
a = subprocess.Popen(['cdrecord', '--help'], stdout=subprocess.PIPE)
a = subprocess.Popen('cdrecord --help', shell=True, stdout=subprocess.PIPE)
Also, instead of using Popen.stdout.read/Popen.stderr.read, you should use .communicate() (refer to the subprocess documentation for why).
proc = subprocess.Popen(['cdrecord', '--help'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = proc.communicate()