Python: how to send output to both file and terminal

I want to use Python to send output to both a file log.txt and STDOUT on the terminal. Here is what I have:
import sys

class Logger(object):
    def __init__(self, filename="Default.log"):
        self.terminal = sys.stdout
        self.log = open(filename, "a")

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

sys.stdout = Logger("log.txt")
print "Hello world !"  # This line is saved in log.txt and STDOUT
This program sends output to both the file and stdout. My question is: how does my write method end up being called when I use print?

From the documentation for sys.stdout:
stdout and stderr needn’t be built-in file objects: any object is acceptable as long as it has a write() method that takes a string argument.

More specifically, the print function (in Python 2.x it is a statement rather than a function, but that doesn't matter here) does something like this:

import sys

# illustrative only, not the real implementation
def print(message):
    sys.stdout.write(message)

so when you call it, it writes your message to sys.stdout. However, if you replace sys.stdout with an object that has a .write method, print will call that method instead.
That's the magic of duck-typing.
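As a minimal illustration of that duck typing (a made-up example, not from the question): any object with a write() method can stand in for sys.stdout.

import sys

class Shouter(object):
    # a stand-in stream: upper-cases everything before passing it on
    def write(self, message):
        sys.__stdout__.write(message.upper())

sys.stdout = Shouter()
print("hello")               # Shouter.write is called, printing "HELLO"
sys.stdout = sys.__stdout__  # restore the real stdout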

Related

Save C-level stream print to file in Python 2.7

I am trying to save the terminal output of a Python script to a file while also printing it to the terminal. Using the logging script below I was able to save the Python output, but my script calls GNU Radio as a class, and that output is not saved correctly. I believe the problem is that GNU Radio is compiled C code, so its output is not captured by the stdout redirect. I have tried using Eli Bendersky's script as a redirect, but I am not understanding how to implement this in Python 2.7. Ideally I would be able to add this C-level redirect to the Logger class and save the GNU Radio output in the same generated log file. Thanks for the help!
import sys

class Logger(object):
    def __init__(self, filename="Default.log"):
        self.terminal = sys.stdout
        self.log = open(filename, "a")

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

sys.stdout = Logger("yourlogfilename.txt")
print "Hello world !"  # this should be saved in yourlogfilename.txt

Redirect all stdout/stderr globally to logger

Background
I have a very large python application that launches command-line utilities to get pieces of data it needs. I currently just redirect the python launcher script to a log file, which gives me all of the print() output, plus the output of the command-line utilities, i.e.:
python -m launcher.py &> /root/out.log
Problem
I've since implemented a proper logger via logging, which lets me format the logging statements more precisely, limit log file size, etc. I've swapped out most of my print() statements with calls to my logger. However, I have a problem: none of the output from the command-line applications appears in my log; it gets dumped to the console instead. Also, the programs aren't all launched the same way: some are launched via popen(), some by exec(), some by os.system(), etc.
Question
Is there a way to globally redirect all stdout/stderr text to my logging function, without having to re-write/modify the code that launches these command-line tools? I tried setting the following, which I found in another question:
sys.stderr.write = lambda s: logger.error(s)
However, it fails with "sys.stderr.write is read-only".
While this is not a full answer, it may show you a redirect to adapt to your particular case. This is how I did it a while back, although I cannot remember why, or what limitation I was trying to circumvent. The following redirects stdout and stderr to a class for print() statements; the class then writes to screen and to file:
import os
import sys
import datetime

class DebugLogger():
    def __init__(self, filename):
        timestamp = datetime.datetime.strftime(datetime.datetime.utcnow(),
                                               '%Y-%m-%d-%H-%M-%S-%f')
        # build up full path to filename
        logfile = os.path.join(os.path.dirname(sys.executable),
                               filename + timestamp)
        self.terminal = sys.stdout
        self.log = open(logfile, 'a')

    def write(self, message):
        timestamp = datetime.datetime.strftime(datetime.datetime.utcnow(),
                                               ' %Y-%m-%d-%H:%M:%S.%f')
        # write to screen
        self.terminal.write(message)
        # write to file
        self.log.write(timestamp + ' - ' + message)
        self.flush()

    def flush(self):
        self.terminal.flush()
        self.log.flush()
        os.fsync(self.log.fileno())

    def close(self):
        self.log.close()

def main(debug=False):
    if debug:
        filename = 'blabla'
        sys.stdout = DebugLogger(filename)
        sys.stderr = sys.stdout
    print('test')

if __name__ == '__main__':
    main(debug=True)
import sys
import io

class MyStream(io.IOBase):
    def write(self, s):
        logger.error(s)

sys.stderr = MyStream()
print('This is an error', file=sys.stderr)

This makes all calls to sys.stderr go to the logger.
The original stream is always available as sys.__stderr__.
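One wrinkle worth noting (my addition, not part of the original answer): print() typically issues separate write() calls for the message and the trailing newline, so a forwarding stream often filters out whitespace-only writes to avoid empty log records. A hedged variant:

import io
import sys

class FilteredStream(io.IOBase):
    def write(self, s):
        # skip the bare-newline writes that print() emits separately
        if s.strip():
            logger.error(s.strip())  # assumes `logger` is configured as above

sys.stderr = FilteredStream()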

How to set custom output handlers for argparse in Python?

I have configured logger to print both onto terminal stdout and to a file so I can have an archive of logging messages that I can refer to.
That is easily accomplished by adding a FileHandler to your logging object. Easy peasy.
What I want to accomplish now is to make argparse also log to the same file, along with its output to stdout, when it encounters parsing errors. So far it only prints to stdout. I looked in the argparse documentation but couldn't find anything about setting a different output stream or pipe for argparse.
Is it possible to do? How?
Looking at the argparse.py source code there doesn't seem to be a way to configure this behaviour.
My suggestion(s) would be:
File a bug report with a patch :)
Override/patch:
print_* method(s)
error method.
The print_* method(s) seem to take an optional file argument which defaults to _sys.stdout.
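For example (a hedged illustration with made-up names), you can pass any writable object through that file argument:

import argparse

parser = argparse.ArgumentParser(prog='demo')
parser.add_argument('--count', type=int)

# print_help() and print_usage() accept an optional file argument
with open('argparse_help.log', 'a') as f:
    parser.print_help(file=f)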
Update: Alternatively you could do something like this whereby you redirect sys.stdout temporarily while you parse arguments:
import sys
from contextlib import contextmanager

@contextmanager
def redirect_stdout_stderr(stream):
    old_stdout = sys.stdout
    old_stderr = sys.stderr
    sys.stdout = stream
    sys.stderr = stream
    try:
        yield
    finally:
        sys.stdout = old_stdout
        sys.stderr = old_stderr

with redirect_stdout_stderr(logstream):
    args = parser.parse_args()
There seems to be no way to do this through the API.
However, you can do the following:
class LoggingArgumentParser(argparse.ArgumentParser):
    """Custom ArgumentParser that overrides _print_message."""

    def _print_message(self, message, file=None):
        if message:
            # forward to the logging object from the question
            # (logging.Logger has no write() method)
            logger.error(message)
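A hedged usage sketch (the argument names are made up): on a parse error, argparse calls error(), which routes its usage and error text through _print_message() before raising SystemExit, so it reaches the logger as well.

parser = LoggingArgumentParser(prog='demo')
parser.add_argument('--count', type=int)

# a bad value triggers error() -> _print_message() -> logger, then SystemExit
args = parser.parse_args(['--count', 'not-a-number'])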
While the answer given by @James Mills is great and solves the issue, there is no need for a generator in this case, so the yield is redundant. Another way of achieving the same thing (without the generator) is to write your own context manager without the built-in contextlib.contextmanager decorator, like the following:
import sys

class redirect_stdout_stderr(object):
    def __init__(self, stream):
        # save the old std streams
        self.old_stream = sys.stdout
        self.old_error_stream = sys.stderr
        self.fstream = stream

    def __enter__(self):
        # switch the std streams to your stream when entering
        sys.stdout = self.fstream
        sys.stderr = self.fstream

    def __exit__(self, exc_type, exc_value, exc_traceback):
        # switch the std streams back to the originals when exiting
        sys.stdout = self.old_stream
        sys.stderr = self.old_error_stream

In your case you can do something as follows.

with redirect_stdout_stderr(logstream):
    # __enter__() has run at this point
    args = parser.parse_args()
# __exit__() runs when the block ends
Hope this helps!

Logging realtime stdout to a file in Python?

I want to log the real-time stdout from a different class to a log file.
Here is my demo code. The code works, but it waits for the sample function to finish. I want it to run in parallel, so the log file is written at the same time the stdout prints appear.
import sys
import time

class Logger(object):
    def __init__(self):
        self.terminal = sys.stdout
        self.log = open("log.txt", "a")

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

class sample:
    def __init__(self):
        print "TEST"
        time.sleep(5)

if __name__ == "__main__":
    sys.stdout = Logger()
    a = sample()
Calls to file.write do not necessarily write the contents to disk immediately; it depends on the buffering policy of the file object. If you want to force writing to disk at a certain moment, you can use the flush() method (see also this).
Note that the flushing policy of sys.stdout depends on the configuration of your installation and on environment variables, so if you want to guarantee "parallel" writes between standard output and the log file, you must flush() both streams:
def write(self, message):
    self.terminal.write(message)
    self.log.write(message)
    self.terminal.flush()
    self.log.flush()
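An alternative sketch (my variation, Python 3 text mode): ask for line buffering when opening the log, so every newline pushes the data to the OS without explicit flush() calls. The class name here is made up.

import sys

class LineBufferedLogger(object):
    def __init__(self, filename="log.txt"):
        self.terminal = sys.stdout
        # buffering=1 selects line buffering for text-mode files in Python 3
        self.log = open(filename, "a", buffering=1)

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

    def flush(self):
        self.terminal.flush()
        self.log.flush()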

Intercept python's `print` statement and display in GUI

I have this somewhat complicated command-line function in Python (let's call it myFunction()), and I am working to integrate it into a graphical interface (using PySide/Qt).
The GUI is used to help select inputs and display outputs. However, myFunction is designed to work as a stand-alone command-line function, and it occasionally prints out its progress.
My question is: how can I intercept these print calls and display them in the GUI?
I know it would be possible to modify myFunction() to send processEvents() to the GUI, but I would then lose the ability to execute myFunction() in a terminal.
Ideally, I would like something similar to Ubuntu's graphical software updater, which has a small embedded terminal-looking widget displaying what apt-get would display were it executed in a terminal.
You can redirect stdout and restore it afterwards. For example:
import StringIO
import sys
# somewhere to store output
out = StringIO.StringIO()
# set stdout to our StringIO instance
sys.stdout = out
# print something (nothing will print)
print 'herp derp'
# restore stdout so we can really print (__stdout__ stores the original stdout)
sys.stdout = sys.__stdout__
# print the stored value from previous print
print out.getvalue()
Wrap it with a function that hijacks stdout:
def stdout2file(func, file):
    def innerfunc(*args, **kwargs):
        old = sys.stdout
        sys.stdout = file
        try:
            return func(*args, **kwargs)
        finally:
            sys.stdout = old
    return innerfunc

Then simply provide a file-like object that supports write():

class GUIWriter:
    def write(self, stuff):
        # send stuff to the GUI
        pass

MyFunction = stdout2file(MyFunction, GUIWriter())
The wrapper can be turned into a decorator too:
def redirect_stdout(file):
    def stdout2file(func):
        def innerfunc(*args, **kwargs):
            old = sys.stdout
            sys.stdout = file
            try:
                return func(*args, **kwargs)
            finally:
                sys.stdout = old
        return innerfunc
    return stdout2file

Then use it when declaring MyFunction():

@redirect_stdout(GUIWriter())
def MyFunction(a, b, c, d):
    # any calls to print will go through GUIWriter.write
    # do stuff
    pass
Here is a Python 3 pattern using contextmanager that both encapsulates the monkey-patch technique and also ensures that sys.stdout is restored in case of an exception.
from io import StringIO
import sys
from contextlib import contextmanager

@contextmanager
def capture_stdout():
    """
    context manager encapsulating a pattern for capturing stdout writes
    and restoring sys.stdout even upon exceptions

    Examples:
    >>> with capture_stdout() as get_value:
    >>>     print("here is a print")
    >>>     captured = get_value()
    >>>     print('Gotcha: ' + captured)

    >>> with capture_stdout() as get_value:
    >>>     print("here is a print")
    >>>     raise Exception('oh no!')
    >>> print('Does printing still work?')
    """
    # Redirect sys.stdout
    out = StringIO()
    sys.stdout = out
    # Yield a method clients can use to obtain the value
    try:
        yield out.getvalue
    finally:
        # Restore the normal stdout
        sys.stdout = sys.__stdout__
All printing is done via sys.stdout, which is an ordinary file-like object: IIRC, it only needs a write() method that takes a string. As long as your replacement has that method, it's quite easy to drop in your hook:
import sys

class CaptureOutput:
    def write(self, message):
        log_message_to_textbox(message)

sys.stdout = CaptureOutput()
The actual contents of log_message_to_textbox are up to you.
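If the GUI runs myFunction() in a worker thread, a common refinement (a hedged sketch assuming PySide; OutputBridge and text_edit are illustrative names) is to emit the captured text through a Qt signal, so the widget is only touched from the GUI thread:

import sys
from PySide.QtCore import QObject, Signal

class OutputBridge(QObject):
    text_written = Signal(str)

    def write(self, message):
        # emitting a signal defers the widget update to the GUI event loop
        self.text_written.emit(message)

    def flush(self):
        pass

bridge = OutputBridge()
bridge.text_written.connect(text_edit.insertPlainText)  # text_edit: your QTextEdit
sys.stdout = bridge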
