Silence loggers and printing to screen - Python

I'm having a problem with my python script.
It's printing massive amounts of data to the screen, and I would like to prevent all printing to the screen.
Edit:
The library I'm using is mechanize, and it's printing a LOT of data on screen.
I have set these to false with no luck!
br.set_debug_redirects(False)
br.set_debug_responses(False)
br.set_debug_http(False)
Any ideas?
Help would be amazing and very much appreciated!

(Based on your 2nd edit)
If you don't want to disable all output, you can try to be specific to mechanize itself. http://wwwsearch.sourceforge.net/mechanize/ provides a snippet, which I've modified (though I'm not sure if it will work):
import logging
logger = logging.getLogger("mechanize")
# only log really bad events
logger.setLevel(logging.ERROR)
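If that doesn't quiet it, a heavier hammer (a sketch, assuming mechanize routes its debug output through the standard logging module) is to silence the logger completely:
import logging

# raise the level past CRITICAL and stop propagation so nothing
# reaches the root logger's handlers
logger = logging.getLogger("mechanize")
logger.setLevel(logging.CRITICAL + 1)
logger.propagate = False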
When you print something it goes to the screen through the sys.stdout file. You can change this file to any other file (eg, a log file you open) so that nothing is printed to the screen:
import sys
# save the old stdout so you can print later (do sys.stdout = OLD_STDOUT)
OLD_STDOUT = sys.stdout
sys.stdout = open("logfile.txt", 'w')
Of course, if you're talking about some library that you're calling, it may be printing to sys.stderr. Luckily, you can do the exact same thing for this one (continuing from above):
OLD_STDERR = sys.stderr
sys.stderr = open("errorLog.txt", 'w')
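To be safe about restoring them, wrap the noisy call in try/finally (a minimal sketch; noisy_call is a stand-in for whatever produces the unwanted output):
import sys

OLD_STDOUT, OLD_STDERR = sys.stdout, sys.stderr
try:
    sys.stdout = open("logfile.txt", 'w')
    sys.stderr = open("errorLog.txt", 'w')
    noisy_call()
finally:
    sys.stdout.close()
    sys.stderr.close()
    sys.stdout, sys.stderr = OLD_STDOUT, OLD_STDERR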
Now if, for some reason, you want to completely ignore stdout (or stderr) and never see it again, you can define your own file-like class that simply discards whatever is written to it:
class Discarder(object):
    def write(self, text):
        pass  # do nothing

# now discard everything coming out of stdout
sys.stdout = Discarder()
And, to add to the din of possible solutions, here is a solution that works in Unix shells:
# discards all output (change /dev/null to a file name to keep track of it)
python yourScript.py > /dev/null

You may redirect sys.stdout and sys.stderr to a file or any file-like object of yours, e.g.:
class EatLog(object):
    def write(self, text):
        pass

sys.stdout = EatLog()
but I would not recommend that; a simpler option is to use OS-level redirection, e.g.:
python myscript.py > out.log

You can use the StringIO module, too, instead of rolling your own stdout stream. Occasionally, the replacement stdout needs more than a write method (flush is another common one), which StringIO handles.
import StringIO
import sys
sys.stdout = StringIO.StringIO()
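A sketch of retrieving what was captured and restoring the real stdout afterwards:
import StringIO
import sys

old = sys.stdout
sys.stdout = StringIO.StringIO()
print "hidden"                    # goes into the StringIO buffer
captured = sys.stdout.getvalue()  # "hidden\n"
sys.stdout = old                  # back to the real stdout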

Related

streaming python commands without using flag -u [duplicate]

Is output buffering enabled by default in Python's interpreter for sys.stdout?
If the answer is positive, what are all the ways to disable it?
Suggestions so far:
Use the -u command line switch
Wrap sys.stdout in an object that flushes after every write
Set PYTHONUNBUFFERED env var
sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)
Is there any other way to set some global flag in sys/sys.stdout programmatically during execution?
If you just want to flush after a specific write using print, see How can I flush the output of the print function?.
From Magnus Lycka's answer on a mailing list:
You can skip buffering for a whole Python process using python -u or by setting the environment variable PYTHONUNBUFFERED.
You could also replace sys.stdout with some other stream-like wrapper which does a flush after every call.
class Unbuffered(object):
    def __init__(self, stream):
        self.stream = stream
    def write(self, data):
        self.stream.write(data)
        self.stream.flush()
    def writelines(self, datas):
        self.stream.writelines(datas)
        self.stream.flush()
    def __getattr__(self, attr):
        return getattr(self.stream, attr)

import sys
sys.stdout = Unbuffered(sys.stdout)
print 'Hello'
I would rather put my answer in How to flush output of print function? or in Python's print function that flushes the buffer when it's called?, but since they were marked as duplicates of this one (which I do not agree with), I'll answer it here.
Since Python 3.3, print() supports the keyword argument "flush" (see documentation):
print('Hello World!', flush=True)
# reopen stdout file descriptor with write mode
# and 0 as the buffer size (unbuffered)
import io, os, sys
try:
    # Python 3: open as binary, then wrap in a TextIOWrapper with write-through.
    sys.stdout = io.TextIOWrapper(open(sys.stdout.fileno(), 'wb', 0), write_through=True)
    # If flushing on newlines is sufficient, as of 3.7 you can instead just call:
    # sys.stdout.reconfigure(line_buffering=True)
except TypeError:
    # Python 2
    sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)
Credits: "Sebastian", somewhere on the Python mailing list.
Yes, it is.
You can disable it on the commandline with the "-u" switch.
Alternatively, you could call .flush() on sys.stdout on every write (or wrap it with an object that does this automatically)
This relates to Cristóvão D. Sousa's answer, but I couldn't comment yet.
A straight-forward way of using the flush keyword argument of Python 3 in order to always have unbuffered output is:
import functools
print = functools.partial(print, flush=True)
afterwards, print will always flush the output directly (unless flush=False is given).
Note (a) that this answers the question only partially, as it doesn't redirect all the output. But I guess print is the most common way of producing output to stdout/stderr in Python, so these two lines probably cover most of the use cases.
Note (b) that it only works in the module/script where you defined it. This can be an advantage when writing a module, as it doesn't mess with sys.stdout.
Python 2 doesn't provide the flush argument, but you could emulate a Python 3-style print function as described here: https://stackoverflow.com/a/27991478/3734258
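A minimal sketch of that emulation, assuming Python 2 with the print function enabled via __future__:
from __future__ import print_function
import sys

def flush_print(*args, **kwargs):
    # Python 2's print() function has no flush keyword, so flush the
    # target stream explicitly after each call.
    stream = kwargs.get('file', sys.stdout)
    print(*args, **kwargs)
    stream.flush()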
import gc, os, subprocess, sys

def disable_stdout_buffering():
    # Appending to gc.garbage is a way to stop an object from being
    # destroyed. If the old sys.stdout is ever collected, it will
    # close() stdout, which is not good.
    gc.garbage.append(sys.stdout)
    sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)

# Then this will give output in the correct order:
disable_stdout_buffering()
print "hello"
subprocess.call(["echo", "bye"])
Without saving the old sys.stdout, disable_stdout_buffering() isn't idempotent, and multiple calls will result in an error like this:
Traceback (most recent call last):
  File "test/buffering.py", line 17, in <module>
    print "hello"
IOError: [Errno 9] Bad file descriptor
close failed: [Errno 9] Bad file descriptor
Another possibility is:
def disable_stdout_buffering():
    fileno = sys.stdout.fileno()
    temp_fd = os.dup(fileno)
    sys.stdout.close()
    os.dup2(temp_fd, fileno)
    os.close(temp_fd)
    sys.stdout = os.fdopen(fileno, "w", 0)
(Appending to gc.garbage is not such a good idea because it's where unfreeable cycles get put, and you might want to check for those.)
The following works in Python 2.6, 2.7, and 3.2:
import os
import sys
buf_arg = 0
if sys.version_info[0] == 3:
    os.environ['PYTHONUNBUFFERED'] = '1'
    buf_arg = 1
sys.stdout = os.fdopen(sys.stdout.fileno(), 'a+', buf_arg)
sys.stderr = os.fdopen(sys.stderr.fileno(), 'a+', buf_arg)
Yes, it is enabled by default. You can disable it by using the -u option on the command line when calling python.
In Python 3, you can monkey-patch the print function, to always send flush=True:
_orig_print = print

def print(*args, **kwargs):
    _orig_print(*args, flush=True, **kwargs)
As pointed out in a comment, you can simplify this by binding the flush parameter to a value, via functools.partial:
import functools
print = functools.partial(print, flush=True)
You can also run Python with stdbuf utility:
stdbuf -oL python <script>
You can create an unbuffered file and assign it to sys.stdout (Python 2 only; in Python 3, a buffer size of 0 is only allowed in binary mode):
import sys
myFile = open("a.log", "w", 0)
sys.stdout = myFile
You can't magically change the system-supplied stdout, since it's supplied to your Python program by the OS.
You can also use fcntl to change the file flags on the fly:
import fcntl, os

fl = fcntl.fcntl(fd.fileno(), fcntl.F_GETFL)
fl |= os.O_SYNC  # or os.O_DSYNC (if you don't care about file timestamp updates)
fcntl.fcntl(fd.fileno(), fcntl.F_SETFL, fl)
(Note that this makes the OS flush writes to disk synchronously; it does not remove Python's own userspace buffering.)
One way to get unbuffered output would be to use sys.stderr instead of sys.stdout or to simply call sys.stdout.flush() to explicitly force a write to occur.
You could easily redirect everything printed by doing:
import sys; sys.stdout = sys.stderr
print "Hello World!"
Or to redirect just for a particular print statement:
print >>sys.stderr, "Hello World!"
To reset stdout you can just do:
sys.stdout = sys.__stdout__
It is possible to override only the write method of sys.stdout with one that calls flush. A suggested implementation is below:
def write_flush(args, w=stdout.write):
    w(args)
    stdout.flush()

The default value of the w argument keeps a reference to the original write method. After write_flush is defined, the original write can be overridden:
stdout.write = write_flush
The code assumes that stdout is imported this way: from sys import stdout.
A variant that works without crashing (at least on win32; Python 2.7, IPython 0.12) when called subsequently (multiple times):
def DisOutBuffering():
    if sys.stdout.name == '<stdout>':
        sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)
    if sys.stderr.name == '<stderr>':
        sys.stderr = os.fdopen(sys.stderr.fileno(), 'w', 0)
(I've posted a comment, but it got lost somehow. So, again:)
As I noticed, CPython (at least on Linux) behaves differently depending on where the output goes. If it goes to a tty, the output is flushed after each '\n';
if it goes to a pipe/process, then it is buffered, and you can use the flush()-based solutions or the -u option recommended above.
Slightly related to output buffering:
If you iterate over the lines in the input with
for line in sys.stdin:
...
then the for implementation in CPython will collect the input for a while and then execute the loop body for a batch of input lines. If your script writes output for each input line, this might look like output buffering, but it's actually batching, and therefore none of the flush() techniques will help with it.
Interestingly, you don't have this behaviour in pypy.
To avoid this, you can use
while True:
    line = sys.stdin.readline()
    ...
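A fuller sketch of the same loop (the per-line processing is just an example):
import sys

while True:
    line = sys.stdin.readline()
    if not line:                     # EOF
        break
    sys.stdout.write(line.upper())   # example per-line processing
    sys.stdout.flush()               # push it out immediately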


Python: Printing file to stream as file changes

I am running a large suite of unit tests in a subprocess through another application (Autodesk Maya). Maya runs a special Python interpreter with its own libraries that cannot be used outside of the application, hence the need to test within the application. I would like the parent process to print the results of the testing as it happens. The subprocess is very 'noisy', though, so I do not want to simply redirect the subprocess's stdout to the parent process's stdout. Instead, I would like the test runner to somehow stream directly to the parent process's stdout.
I am currently using a TextTestRunner in the subprocess with its stdout set to an open file. The parent process knows where this file exists, and writes the contents of the file to stdout once the subprocess is complete. Since the tests can take a long time to run, though, I would prefer that the parent process could somehow 'stream' the contents of this file as it is being created by the subprocess. But I am not sure how to do this or if there is a better approach.
Here's an example of how this is currently set up.
import subprocess

module_path = 'my.test.module'
suite_callable = 'suite'
stream_fpath = '/tmp/the_test_results.txt'
script_fpath = '/tmp/the_test_script.py'

script = '''
import sys
if sys.version_info[0] <= 2 and sys.version_info[1] <= 6:
    import unittest2 as unittest
else:
    import unittest
import {module_path}

suite = {module_path}.{suite_callable}()
with open("{stream_fpath}", "w") as output:
    runner = unittest.TextTestRunner(stream=output)
    runner.run(suite)
'''.format(**locals())

with open(script_fpath, 'w') as f:
    f.write(script)

subprocess.call(['maya', '-command', '\'python("execfile(\\"{script_fpath}\\")")\''.format(**locals())])

with open(stream_fpath, 'r') as f:
    print f.read()
Thanks for any info!
Rather than writing to a file, you should be able to make a file-like object to replace stderr. The object's write method could do something with each chunk of text as it comes in; you could squirt it to something listening on TCP, or print it to a Tk window, or anything else, in addition to logging to a file if you still want the results.
Implementing a stream replacement is pretty simple; in this case you probably only need to implement write, writelines, open, and close (unless your test runner also uses flush).
class FakeStdErr(object):
    def __init__(self):
        self.lines = []
    def write(self, text):
        self.lines.append(text)
    def writelines(self, *args):
        for item in args:
            self.lines.append(item)
    def open(self):
        self.lines = []
    def close(self):
        pass
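A hypothetical usage sketch (suite here is the test suite built as in the question):
import unittest

fake = FakeStdErr()
runner = unittest.TextTestRunner(stream=fake)
runner.run(suite)
report = ''.join(fake.lines)  # everything the runner wrote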
In your use case you might want to use a silencer class (which is a variant on the same trick) to replace the default stdout (to shut up your chatty process) and direct your test-runner stream to this object; after all the tests are done, you can dump the contents to disk as a file or print them to the screen by restoring the default stdout (the link shows how to do that if you're not familiar).
(Edited - suggest using stderr, or parsing)
Alternative 1: Intercept the output rather than having it go to a file.
Have the script write to sys.stderr instead of the open() of stream_fpath:
runner = unittest.TextTestRunner(stream=sys.stderr)
Replace subprocess.call with running = subprocess.Popen(<existing parameters>, stderr=subprocess.PIPE). Then read running.stderr until EOF or until running.poll() returns something other than None. You can do what you want with the data; for example, you can print it to the screen and also write it to stream_fpath.
This assumes that the noisy output comes from maya, which will still be dumping to stdout.
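A minimal sketch of that loop (command stands for the Maya invocation string built in the question, and stream_fpath is as defined there):
import subprocess, sys

running = subprocess.Popen(['maya', '-command', command], stderr=subprocess.PIPE)
with open(stream_fpath, 'w') as log:
    for line in running.stderr:   # yields lines until the pipe hits EOF
        sys.stdout.write(line)    # stream to the parent's stdout...
        log.write(line)           # ...and keep a copy on disk
running.wait()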
Alternative 2: parse the noisy output from stdout=PIPE. If you can differentiate the test runner's output by adding some tag to each line, you can search for that tag and print only the lines that match.
Popen documentation (Python 2)

To prevent a function from printing in the batch console in Python

Well, the headline seems sufficient to me.
I use some functions that at certain points print something to the console.
As I can't modify them, I would like to know if there is a way to prevent them from printing while these functions are being used.
Thanks a lot!
Nico
Yes, you can redirect sys.stdout:
import sys
import os
old_stdout = sys.stdout # backup current stdout
sys.stdout = open(os.devnull, "w")
my_nasty_function()
sys.stdout = old_stdout # reset old stdout
Just replace my_nasty_function with your actual function.
EDIT: Now it should work on Windows as well.
EDIT: Using a backup variable to restore stdout is better in case someone wraps your function again.
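A slightly safer variant of the same idea (a sketch): restore stdout even if the function raises:
import os
import sys

old_stdout = sys.stdout
sys.stdout = open(os.devnull, "w")
try:
    my_nasty_function()
finally:
    sys.stdout.close()
    sys.stdout = old_stdout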
Constantinius' answer is OK; however, there is no need to actually open the null device. And by the way, if you want a portable null device, there is os.devnull.
Actually, all you need is a class which will ignore whatever you write to it, so a more portable version would be:
from io import StringIO  # Python 2: from StringIO import StringIO
import sys

class NullIO(StringIO):
    def write(self, txt):
        pass

sys.stdout = NullIO()
my_nasty_function()
sys.stdout = sys.__stdout__
Another option would be to wrap your function in a decorator.
from contextlib import redirect_stdout
from io import StringIO

class NullIO(StringIO):
    def write(self, txt):
        pass

def silent(fn):
    """Decorator to silence functions."""
    def silent_fn(*args, **kwargs):
        with redirect_stdout(NullIO()):
            return fn(*args, **kwargs)
    return silent_fn

def nasty():
    """Useful function with nasty prints."""
    print('a lot of annoying output')
    return 42

# Wrap in decorator to prevent printing.
silent_nasty = silent(nasty)
# Same output, but prints only once.
print(nasty(), silent_nasty())
You could use a modified version of this answer to create a "null" output context to wrap the call to the function in.
That can be done by just passing os.devnull as the new_stdout argument to the stdout_redirected() context manager function when it's used.
Constantinius' solution will work on *nix, but this should work on any platform:
import sys
import tempfile
sys.stdout = tempfile.TemporaryFile()
# Do crazy stuff here
sys.stdout.close()
#now the temp file is gone
sys.stdout = sys.__stdout__
The currently accepted answer by Constantinius works great in most circumstances, but not in Jupyter notebooks.
Here's how to get it to work (with a reusable function)...
TL;DR: Instead of using sys.__stdout__, back up sys.stdout and restore it later on.
In a Jupyter notebook, running sys.stdout == sys.__stdout__ returns False. This is because each cell has a separate output stream (instead of the one terminal instance, which is sys.__stdout__).
So for everyone working with Jupyter notebooks, make sure to back up the old sys.stdout and restore it afterwards.
Here's a function for it:
import sys, os

def deafen(function, *args):
    real_stdout = sys.stdout
    sys.stdout = open(os.devnull, "w")
    output = function(*args)
    sys.stdout = real_stdout
    return output
Pass to deafen a function along with its arguments/parameters (args). It backs up the old sys.stdout, switches to os.devnull and back again itself.
For a complete example we can create a second function (test_function):
def test_function(first_argument, second_argument, *args):
    print(first_argument)
    print(second_argument)
    print(args)
Now if we try using the test_function like normal (a.k.a. without deafen) we will get a bunch of output printed onto the screen:
print("Printing should work fine here")
test_function(32, 12, 1, 32, 1)
However, when using deafen, we'll get no new output:
print("That'll be the last thing you see...")
deafen(test_function, 32, 12, 1, 32, 1)
On a side note, the deafen function still returns the function's output. You can also use deafen with sys.__stdout__ by replacing sys.stdout = real_stdout with sys.stdout = sys.__stdout__ (and you may as well remove real_stdout = sys.stdout while you're at it).
Hope that helps anyone who is looking for a slightly more robust or flexible solution (likely for Jupyter notebooks, or use with multiple functions)!

Capturing stdout within the same process in Python

I've got a python script that calls a bunch of functions, each of which writes output to stdout. Sometimes when I run it, I'd like to send the output in an e-mail (along with a generated file). I'd like to know how I can capture the output in memory so I can use the email module to build the e-mail.
My ideas so far were:
use a memory-mapped file (but it seems like I have to reserve space on disk for this, and I don't know how long the output will be)
bypass all this and pipe the output to sendmail (but this may be difficult if I also want to attach the file)
I modified None's answer to make it a context manager:
import sys, StringIO, contextlib

class Data(object):
    pass

@contextlib.contextmanager
def capture_stdout():
    old = sys.stdout
    capturer = StringIO.StringIO()
    sys.stdout = capturer
    data = Data()
    yield data
    sys.stdout = old
    data.result = capturer.getvalue()
Usage:
with capture_stdout() as capture:
    print 'Hello'
    print 'Goodbye'
assert capture.result == 'Hello\nGoodbye\n'
It's pretty simple to capture output.
import sys, StringIO
old_stdout = sys.stdout
capturer = StringIO.StringIO()
sys.stdout = capturer
#call functions
print "Hi"
#end functions
sys.stdout = old_stdout
output = capturer.getvalue()
You said that your script "calls a bunch of functions", so I'm assuming that they're Python functions accessible from your program. I'm also assuming you're using print to generate the output in all these functions. If that's the case, you can just replace sys.stdout with a StringIO.StringIO, which will intercept all the stuff you're writing. Then you can finally call the .getvalue() method on your StringIO to get everything that has been sent to the output channel. Note that this will not capture output from external programs run via the subprocess module, since they write to the underlying file descriptor rather than to the Python-level sys.stdout object.
This is a cheap way. I'd recommend that you do your output using the logging module. You'll have much more control over how it does its output, and you can control it more easily as well.
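A minimal sketch of the logging approach (the logger name is arbitrary): attach a handler that writes to an in-memory stream, then use its contents as the e-mail body:
import logging, StringIO

log = logging.getLogger("myscript")
buf = StringIO.StringIO()
log.addHandler(logging.StreamHandler(buf))
log.setLevel(logging.INFO)

log.info("Hi")         # instead of print
body = buf.getvalue()  # everything logged so far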
And I modified Gary Robinson's answer to make sure that stdout is always restored, even if there's an exception:
import sys, StringIO, contextlib

class Data(object):
    pass

@contextlib.contextmanager
def capture_stdout():
    old = sys.stdout
    capturer = StringIO.StringIO()
    data = Data()
    try:
        sys.stdout = capturer
        yield data
    finally:
        sys.stdout = old
        data.result = capturer.getvalue()
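Usage is the same as before, and stdout is restored even if the body raises:
with capture_stdout() as capture:
    print 'Hello'
print capture.result  # 'Hello\n' (the trailing newline comes from print)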
