I'm trying to collect stderr in memory instead of writing it directly to a file or stdout, so that I can generate the error log file in a particular format. I found that StringIO provides an in-memory 'file', but I don't think it does the trick. Here's my code:
buffer = StringIO.StringIO()
status = subprocess.call(args, stdout=log_fps["trace"], stderr=buffer)
if status and self.V_LEVEL:
    sys.stderr.write(buffer.getvalue())
    print "generated error"
if status:
    log_fps["fail"].write("==> Error with files %s and %s\n" % (domain_file, problem_file))
    log_fps["fail"].write(buffer.getvalue())
I get the following error:
Traceback (most recent call last):
File "./runit.py", line 284, in <module>
launcher.run_all_cff_domain_examples("ring")
File "./runit.py", line 259, in run_all_cff_domain_examples
result = self.run_clg(in_d["domain"], in_d["problem"], in_d["prefix"])
File "./runit.py", line 123, in run_clg
status = subprocess.call(args, stdout=log_fps["trace"], stderr=buffer)
File "/usr/lib/python2.7/subprocess.py", line 493, in call
return Popen(*popenargs, **kwargs).wait()
File "/usr/lib/python2.7/subprocess.py", line 672, in __init__
errread, errwrite) = self._get_handles(stdin, stdout, stderr)
File "/usr/lib/python2.7/subprocess.py", line 1075, in _get_handles
errwrite = stderr.fileno()
AttributeError: StringIO instance has no attribute 'fileno'
I guess this means that I can't use StringIO to collect stderr in memory. What else can I do, short of writing to a file in /tmp?
stdout = subprocess.check_output(args)
See the check_output documentation for more options.
If you don't want to capture stdout, use Popen.communicate:
from subprocess import Popen, PIPE
p = Popen(args, stdout=log_fps["trace"], stderr=PIPE)
_, stderr = p.communicate()
import subprocess
p = subprocess.Popen(args, stdout=log_fps["trace"], stderr=subprocess.PIPE)
_, stderr = p.communicate()
print stderr,
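A self-contained sketch of the Popen/PIPE approach in Python 3 syntax; a `python -c` child that writes to stderr stands in for the real command from the question:

```python
import subprocess
import sys

# A child process that writes a message to stderr stands in for `args`.
p = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stderr.write('oops\\n')"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()   # err now holds the full stderr output, in memory
status = p.returncode
print(err.decode())
```

`err` can then be written to the failure log exactly as `buffer.getvalue()` was meant to be in the question.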
I have a main process where I open a multiprocessing.Pipe(False) and send the writing end to a worker Process. In the worker process, I run a Java program using subprocess.Popen(['java', 'myprogram'], stdin=subprocess.PIPE, stdout=subprocess.PIPE), and I need to redirect the stderr of this subprocess to the writing end of the multiprocessing.Pipe.
For this I referred to this answer by Ilija, as it is exactly what I want to achieve, but on my machine (Windows) it throws OSError: [Errno 9] Bad file descriptor.
Machine details:
OS - Windows 10 (64bit)
Python version - 3.7.4
Code:
Method 1 (Ilija's answer)
def worker(w_conn):
    os.dup2(w_conn.fileno(), 2)
    sp = subprocess.Popen(['java', 'myprogram'], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    sp.wait()
    w_conn.close()
def main():
    r_conn, w_conn = multiprocessing.Pipe(False)
    process = multiprocessing.Process(target=worker, args=(w_conn,))
    process.start()
    while not r_conn.poll() and not w_conn.closed:
        pass  # Do stuff
    else:
        # Read error from r_conn, and handle it
        r_conn.close()
    process.join()

if __name__ == '__main__':
    main()
Error:
Process Process-1:
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\multiprocessing\process.py", line 297, in _bootstrap
self.run()
File "C:\ProgramData\Anaconda3\lib\multiprocessing\process.py", line 99, in run
self._target(*self._args, **self._kwargs)
File "C:\Users\User\Desktop\Workspace\Error.py", line 14, in worker
os.dup2(w_conn.fileno(), 2)
OSError: [Errno 9] Bad file descriptor
Method 2: In the worker function, passing w_conn as the stderr argument to Popen
def worker(w_conn):
    sp = subprocess.Popen(['java', 'myprogram'], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=w_conn)
    sp.wait()
    w_conn.close()
Error:
Process Process-1:
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\multiprocessing\process.py", line 297, in _bootstrap
self.run()
File "C:\ProgramData\Anaconda3\lib\multiprocessing\process.py", line 99, in run
self._target(*self._args, **self._kwargs)
File "C:\Users\User\Desktop\Workspace\Error.py", line 13, in worker
sp = subprocess.Popen(['java', 'myprogram'], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=w_conn)
File "C:\ProgramData\Anaconda3\lib\subprocess.py", line 728, in __init__
errread, errwrite) = self._get_handles(stdin, stdout, stderr)
File "C:\ProgramData\Anaconda3\lib\subprocess.py", line 1077, in _get_handles
errwrite = msvcrt.get_osfhandle(stderr.fileno())
OSError: [Errno 9] Bad file descriptor
Is there any workaround/alternative method to achieve this on Windows?
I still don't know why "Method 1" is not working. Any information regarding this will be appreciated.
"Method 2" is wrong altogether: a Connection object (returned by multiprocessing.Pipe()) cannot be used as a file handle in subprocess.Popen.
What works is checking for data on the stderr of subprocess sp and sending it through w_conn to the main process.
def worker(w_conn):
    sp = subprocess.Popen(['java', 'myprogram'], stdin=subprocess.PIPE,
                          stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    # pipes are not seekable, so read stderr via communicate() instead
    _, err = sp.communicate()
    if err:
        w_conn.send(err)
    w_conn.close()
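A runnable sketch of that pattern; a `python -c` child that writes to stderr stands in for the `['java', 'myprogram']` command, and the names mirror the question:

```python
import multiprocessing
import subprocess
import sys

def worker(w_conn):
    # stand-in for ['java', 'myprogram']: a child that writes to stderr
    sp = subprocess.Popen(
        [sys.executable, "-c", "import sys; sys.stderr.write('boom')"],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    _, err = sp.communicate()  # read both pipes fully, wait for exit
    if err:
        w_conn.send(err)       # forward the captured stderr to the parent
    w_conn.close()

if __name__ == '__main__':
    r_conn, w_conn = multiprocessing.Pipe(False)
    proc = multiprocessing.Process(target=worker, args=(w_conn,))
    proc.start()
    w_conn.close()             # drop the parent's copy of the write end
    if r_conn.poll(10):        # wait up to 10 s for the worker's message
        print(r_conn.recv())
    proc.join()
```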
I want to open Git Bash and write git commands into it.
I used the following code:
from subprocess import Popen, PIPE
process = subprocess.run(['D:\\casdev\\SmartGit\\git\\git-bash.exe'],shell= "True", bufsize=0,stdin="git status",stdout=PIPE, stderr=PIPE, encoding="UTF8")
stdoutput, stderroutput = process.communicate()
response=process.stdout.read()
Output:
runfile('C:/Users/uib25171/Desktop/MiniProject/Trials/untitled3.py', wdir='C:/Users/uib25171/Desktop/MiniProject/Trials')
Traceback (most recent call last):
File "<ipython-input-66-00c2c0a9827c>", line 1, in <module>
runfile('C:/Users/uib25171/Desktop/MiniProject/Trials/untitled3.py', wdir='C:/Users/uib25171/Desktop/MiniProject/Trials')
File "C:\Users\uib25171\AppData\Local\Continuum\anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 827, in runfile
execfile(filename, namespace)
File "C:\Users\uib25171\AppData\Local\Continuum\anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 110, in execfile
exec(compile(f.read(), filename, 'exec'), namespace)
File "C:/Users/uib25171/Desktop/MiniProject/Trials/untitled3.py", line 37, in <module>
process = subprocess.run(['D:\\casdev\\SmartGit\\git\\git-bash.exe'],shell= "True", bufsize=0,stdin="git status",stdout=PIPE, stderr=PIPE, encoding="UTF8")
File "C:\Users\uib25171\AppData\Local\Continuum\anaconda3\lib\subprocess.py", line 472, in run
with Popen(*popenargs, **kwargs) as process:
File "C:\Users\uib25171\AppData\Local\Continuum\anaconda3\lib\site-packages\spyder_kernels\customize\spydercustomize.py", line 143, in __init__
super(SubprocessPopen, self).__init__(*args, **kwargs)
File "C:\Users\uib25171\AppData\Local\Continuum\anaconda3\lib\subprocess.py", line 728, in __init__
errread, errwrite) = self._get_handles(stdin, stdout, stderr)
File "C:\Users\uib25171\AppData\Local\Continuum\anaconda3\lib\subprocess.py", line 1039, in _get_handles
p2cread = msvcrt.get_osfhandle(stdin.fileno())
AttributeError: 'str' object has no attribute 'fileno'
Can somebody help me?
It seems as if you're passing a string instead of an input stream at stdin="git status".
You are mixing stdin, which should be a file handle, with input, which lets you pass a string as input to your subprocess. But more fundamentally, you are mixing apples and oranges: communicate makes sense for a bare Popen object, but that's not what subprocess.run returns. The output from your command is simply process.stdout.
But really you should probably be running
process = subprocess.run(['git', 'status'],
    text=True, encoding="UTF8", capture_output=True)
response = process.stdout
If you need shell=True (which you don't here) you can pass the path to Bash in the executable= keyword parameter.
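A runnable version of that pattern, with a `python -c` child standing in for `git status` (capture_output is available in subprocess.run since Python 3.7):

```python
import subprocess
import sys

# capture_output=True wires up both stdout and stderr pipes;
# text=True decodes them, so process.stdout is already a str.
process = subprocess.run(
    [sys.executable, "-c", "print('On branch master')"],  # stand-in for git status
    capture_output=True, text=True)
response = process.stdout
print(response)
```

No communicate() call is needed on a run() result; the process has already finished by the time run() returns.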
You messed up your code; @tripleee already explained. This is probably what you want:
from subprocess import PIPE
import subprocess

process = subprocess.run(['your_file_name'], shell=True, bufsize=0,
                         stdin=PIPE, stdout=PIPE, stderr=PIPE, encoding="UTF8")
response = process.stdout
How do I run a Maven command in Python? I saw this example online, but it doesn't seem to work on Windows.
def local2(command, print_command=False):
    from subprocess import Popen, PIPE
    p = Popen(command, stdout=PIPE, stderr=PIPE)
    if print_command: print " ".join(command)
    output, errput = p.communicate()
    return p.returncode, output, errput
def uploadQAJavaToNexus():
    url = "example"
    groupId = "example"
    artifactId = "example"
    repositoryId = "example"
    # filePath =
    version = "version"
    status, stdout, stderr = local2([
        "mvn",
        "deploy:deploy-file",
        "-Durl=" + url,
        "-DrepositoryId=" + repositoryId,
        "-Dversion=" + version,
        "-Dfile=" + "path"
        "-DartifactId=" + artifactId,
        "-Dpackaging=" + "jar",
        "-DgroupId" + groupId,
    ])
    return status, stdout, stderr
UPDATE: This is the error I'm getting:
Traceback (most recent call last):
File "C:\PythonProject\src\Get.py", line 355, in <module>
uploadQAJavaToNexus()
File "C:\Get.py", line 250, in uploadQAJavaToNexus
"-DgroupId" + groupId,
File "C:\Get.py", line 227, in local2
p = Popen(command, stdout=PIPE, stderr=PIPE)
File "C:\Python27\lib\subprocess.py", line 710, in __init__
errread, errwrite)
File "C:\Python27\lib\subprocess.py", line 958, in _execute_child
startupinfo)
WindowsError: [Error 2] The system cannot find the file specified
I prefer to use Fabric (http://www.fabfile.org/):
from fabric.api import local

def maven_start():
    local('<cmd>')
$ fab maven_start
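Without Fabric, the WindowsError above usually means Popen could not find mvn at all: on Windows, Maven is a batch script (mvn.cmd or mvn.bat depending on the version), and Popen does not resolve those the way a shell does. A hedged sketch reusing the question's local2 shape, with the current Python interpreter standing in for Maven so it runs anywhere:

```python
import sys
from subprocess import Popen, PIPE

def local2(command):
    # same shape as the question's helper, minus the print_command flag
    p = Popen(command, stdout=PIPE, stderr=PIPE)
    output, errput = p.communicate()
    return p.returncode, output, errput

# Cross-platform smoke test; the interpreter stands in for mvn here.
status, out, err = local2([sys.executable, "-c", "print('BUILD SUCCESS')"])

# On Windows, name the Maven batch script explicitly, e.g.
#   local2(["mvn.cmd", "deploy:deploy-file", "-Durl=" + url, ...])
# or pass an absolute path; bare "mvn" will not be resolved to mvn.cmd.
```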
I need my script to execute a binary file a number of times and gather statistics about its execution time using the "time" command. However, the following code crashes:
cmd = ["time", "./executable", "<", "input_file"]
result = subprocess.Popen(cmd, stdout = subprocess.PIPE, stderr = subprocess.PIPE)
with the following message:
File "exec_script.py", line 15, in
result = subprocess.Popen("time ./quake < small_input", stdout = subprocess.PIPE, stderr = subprocess.PIPE)
File "/usr/lib/python2.7/subprocess.py", line 710, in init
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1327, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
I have been racking my brain for hours and can't figure this out. Note that the command works if I run it from a shell in the same directory, but not through the script.
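For what it's worth, the `<` here is shell syntax: in a plain argv list there is no shell, so no redirection happens (and `time` is often a shell builtin rather than a binary). A sketch that opens the input file and passes it as stdin instead; `cat` stands in for the `./executable` from the question:

```python
import subprocess
import tempfile
import os

# create a demo input file; in the question this is "input_file"
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as fh:
    fh.write("hello\n")

# pass the open file object as stdin instead of using shell redirection;
# 'cat' stands in for the './executable' from the question
with open(path, "rb") as fh:
    p = subprocess.Popen(["cat"], stdin=fh,
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = p.communicate()

os.remove(path)
print(out)
```

For the timing statistics themselves, wrapping the communicate() call with time.perf_counter(), or using resource.getrusage on POSIX, can replace the external time command.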
Given the function
def get_files_from_sha(sha, files):
    from subprocess import Popen, PIPE
    import tarfile
    if 0 == len(files):
        return {}
    p = Popen(["git", "archive", sha], bufsize=10240, stdin=PIPE, stdout=PIPE, stderr=PIPE)
    tar = tarfile.open(fileobj=p.stdout, mode='r|')
    p.communicate()
    contents = {}
    doall = files == '*'
    if not doall:
        files = set(files)
    for entry in tar:
        if (isinstance(files, set) and entry.name in files) or doall:
            tf = tar.extractfile(entry)
            contents[entry.name] = tf.read()
            if not doall:
                files.discard(entry.name)
    if not doall:
        for fname in files:
            contents[fname] = None
    tar.close()
    return contents
which is called in a loop for some values of sha, after a while (in my case, 4 iterations) it starts to fail at the call to tf.read(), with the message:
Traceback (most recent call last):
File "../yap-analysis/extract.py", line 243, in <module>
commits, identities, identities_by_name, identities_by_email, identities_freq = build_commits(commits)
File "../yap-analysis/extract.py", line 186, in build_commits
commit = get_commit(commit)
File "../yap-analysis/extract.py", line 84, in get_commit
contents = get_files_from_sha(commit['sha'], files)
File "../yap-analysis/extract.py", line 42, in get_files_from_sha
contents[entry.name] = tf.read()
File "/usr/lib/python2.7/tarfile.py", line 817, in read
buf += self.fileobj.read()
File "/usr/lib/python2.7/tarfile.py", line 737, in read
return self.readnormal(size)
File "/usr/lib/python2.7/tarfile.py", line 746, in readnormal
return self.fileobj.read(size)
File "/usr/lib/python2.7/tarfile.py", line 573, in read
buf = self._read(size)
File "/usr/lib/python2.7/tarfile.py", line 581, in _read
return self.__read(size)
File "/usr/lib/python2.7/tarfile.py", line 606, in __read
buf = self.fileobj.read(self.bufsize)
ValueError: I/O operation on closed file
I suspect there is some parallelization that subprocess attempts to make (?).
What is the actual cause and how to solve it in a clean and robust way on python2?
Do not use .communicate() on the Popen instance; it'll read the stdout stream until it is finished. From the documentation:
Interact with process: Send data to stdin. Read data from stdout and stderr, until end-of-file is reached.
The code for .communicate() even adds an explicit .close() call on the stdout of the pipe.
Simply removing the call to .communicate() should be enough, but do also add a .wait() after reading the tarfile contents:
tar.close()
p.stdout.close()
p.wait()
It could be that tar.close() also closes p.stdout, but an extra .close() there should not hurt.
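A condensed sketch of the fixed pattern; a plain `tar -cf -` over a temporary directory stands in for `git archive` so the example is self-contained:

```python
import subprocess
import tarfile
import tempfile
import os

def read_streamed_tar(cmd):
    # Stream the tar from the child's stdout. Note: no communicate() call,
    # which would drain and close the pipe before tarfile could read it.
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    tar = tarfile.open(fileobj=p.stdout, mode='r|')  # '|' = non-seekable stream
    contents = {}
    for entry in tar:
        if entry.isfile():
            contents[entry.name] = tar.extractfile(entry).read()
    tar.close()
    p.stdout.close()
    p.wait()        # reap the child only after everything has been read
    return contents

d = tempfile.mkdtemp()
with open(os.path.join(d, "a.txt"), "wb") as fh:
    fh.write(b"hi")
files = read_streamed_tar(["tar", "-C", d, "-cf", "-", "a.txt"])
# files == {'a.txt': b'hi'}
```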
I think your problem is the p.communicate() call. This method sends to stdin, reads from stdout and stderr, and waits for the process to terminate.
tarfile is trying to read from the process's stdout, and by the time it does, the process is finished, hence the error.
I have not tried running your code (I don't have access to git), but you probably don't want the p.communicate() at all; try commenting it out.