Logging real-time stdout to a file in Python?

I want to log the real-time stdout from a different class to a log file.
Here is my demo code. The code works, but it waits for the sample function to finish. I want the logging to run in parallel, so that the log file is written at the same time as the stdout prints.
import sys
import time

class Logger(object):
    def __init__(self):
        self.terminal = sys.stdout
        self.log = open("log.txt", "a")

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

class sample:
    def __init__(self):
        print "TEST"
        time.sleep(5)

if __name__ == "__main__":
    sys.stdout = Logger()
    a = sample()

Calls to file.write do not necessarily write the contents to disk immediately; it depends on the buffering policy of the file object. If you want to force a write to disk at a certain moment, you can use the flush() method (see also this).
Note that sys.stdout's flushing policy depends on the configuration of your installation and on environment variables, so if you want to guarantee "parallel" writes between standard output and the log file, you must flush() both streams:
def write(self, message):
    self.terminal.write(message)
    self.log.write(message)
    self.terminal.flush()
    self.log.flush()
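Alternatively, you can sidestep most of the manual flushing by opening the log file line-buffered: in Python 2, the third argument to open() is the buffer size, and 1 means line-buffered. A minimal sketch (LineBufferedLogger is an illustrative name, not from the question):
import sys

class LineBufferedLogger(object):
    def __init__(self):
        self.terminal = sys.stdout
        # bufsize=1: the file's buffer is flushed at every newline
        self.log = open("log.txt", "a", 1)

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

    def flush(self):
        # still needed for partial lines and explicit flush() callers
        self.terminal.flush()
        self.log.flush()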

Related

Save C-level stream print to file in Python 2.7

I am trying to save the terminal output of a Python script to a file and print the output to the terminal. Using the logging script below I was able to successfully save the Python output, but my script calls GNU Radio as a class, and that output is not saved correctly. I believe the problem is that GNU Radio is compiled C code whose output is not captured by the stdout redirect. I have tried using Eli Bendersky's script as a redirect, but I do not understand how to implement it in Python 2.7. Ideally I would be able to add this C-level redirect to the Logger class and save the GNU Radio output in the same generated log file. Thanks for the help!
import sys

class Logger(object):
    def __init__(self, filename="Default.log"):
        self.terminal = sys.stdout
        self.log = open(filename, "a")

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

sys.stdout = Logger("yourlogfilename.txt")
print "Hello world !"  # this should be saved in yourlogfilename.txt

How to log console records for stopped Python script?

I have the following code inside my script to create a log of console records every time the script is run:
import os
import sys
import datetime

class Logger(object):
    def __init__(self):
        pathLogs = 'logs/'
        if not os.path.isdir(pathLogs):
            os.makedirs(pathLogs)
        date_stamp = str(datetime.datetime.now()).split('.')[0]
        date_stamp = date_stamp.replace(" ", "_").replace(":", "")
        file_name = pathLogs + date_stamp + ".log"
        self.terminal = sys.stdout
        self.log = open(file_name, "a")

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

sys.stdout = Logger()
It works fine when the script's run finishes by itself, but if you interrupt it, none of the records you see in the console get saved to the log file. How can I save records to a log file every time the script runs, regardless of whether it was interrupted?
I think what you are looking for is the signal module.
Let's say you're interrupting your code by pressing CTRL+C. To perform anything after the interruption signal is received, you need the following:
import signal
import sys

def my_callback(signal, frame):
    # Do your stuff...
    print("I have been killed!")
    sys.exit(0)

signal.signal(signal.SIGINT, my_callback)  # this is your handler
Basically, what happens is, as soon as CTRL+C is received (= the SIGINT signal), your callback is called and does whatever you want it to do.
For more details, as always, the signal doc is really clear and helpful.
Hope this helps!
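Applied to the Logger class from the question, the handler can flush and close the log before exiting so that buffered records reach the disk. A minimal sketch, assuming sys.stdout has already been replaced by a Logger instance:
import signal
import sys

def save_log_and_exit(signal_number, frame):
    # sys.stdout is the Logger instance here; flush its file so the
    # console records buffered so far actually reach the disk
    sys.stdout.log.flush()
    sys.stdout.log.close()
    sys.exit(0)

signal.signal(signal.SIGINT, save_log_and_exit)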

python: multiprocessing.Pipe and redirecting stdout

I am using the multiprocessing package to spawn a second process from which I would like to redirect stdout and stderr into the first process. I am using a multiprocessing.Pipe object:
dup2(output_pipe.fileno(), 1)
Where output_pipe is an instance of multiprocessing.Pipe. However, when I try to read from the other end, it just hangs. I tried reading using Pipe.recv_bytes with a limit, but that raises an OSError. Is this possible at all, or should I just switch to some lower-level pipe functions?
After experimenting in Python 2.7 I got this working example. With os.dup2, the pipe's file descriptor is copied over the standard output file descriptor, so each print statement ends up writing to the pipe.
import os
import multiprocessing

def tester_method(w):
    os.dup2(w.fileno(), 1)
    for i in range(3):
        print 'This is a message!'

if __name__ == '__main__':
    r, w = multiprocessing.Pipe()
    reader = os.fdopen(r.fileno(), 'r')
    process = multiprocessing.Process(None, tester_method, 'TESTER', (w,))
    process.start()
    for i in range(3):
        print 'From pipe: %s' % reader.readline()
    reader.close()
    process.join()
Output:
From pipe: This is a message!
From pipe: This is a message!
From pipe: This is a message!
The existing answer works with raw file descriptors, but this may be useful if you want to use Pipe.send() and recv():
import sys

class PipeTee(object):
    def __init__(self, pipe):
        self.pipe = pipe
        self.stdout = sys.stdout
        sys.stdout = self

    def write(self, data):
        self.stdout.write(data)
        self.pipe.send(data)

    def flush(self):
        self.stdout.flush()

    def __del__(self):
        sys.stdout = self.stdout
To use this, create the object in your multiprocessing function, pass it the write side of a multiprocessing.Pipe, and then use the read side in the parent process with recv, using poll to check whether data exists.
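A sketch of how the two ends might fit together, assuming PipeTee is defined as above (run_worker and the connection names are illustrative):
import multiprocessing

def run_worker(write_conn):
    tee = PipeTee(write_conn)     # installs itself as sys.stdout
    print 'hello from the child'  # goes to the real stdout and into the pipe

if __name__ == '__main__':
    parent_conn, child_conn = multiprocessing.Pipe()
    proc = multiprocessing.Process(target=run_worker, args=(child_conn,))
    proc.start()
    # keep child_conn open in the parent so recv never hits EOF;
    # drain the pipe until the child is done and no data remains
    while proc.is_alive() or parent_conn.poll():
        if parent_conn.poll(0.1):  # wait up to 100 ms for data
            print 'From pipe: %r' % parent_conn.recv()
    proc.join()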

Redirect all stdout/stderr globally to logger

Background
I have a very large python application that launches command-line utilities to get pieces of data it needs. I currently just redirect the python launcher script to a log file, which gives me all of the print() output, plus the output of the command-line utilities, i.e.:
python -m launcher.py &> /root/out.log
Problem
I've since implemented a proper logger via logging, which lets me format the logging statements more precisely, limit the log file size, etc. I've swapped out most of my print() statements for calls to my logger. However, I have a problem: none of the output from the command-line applications appears in my log; it gets dumped to the console instead. Also, the programs aren't all launched the same way: some via popen(), some by exec(), some by os.system(), etc.
Question
Is there a way to globally redirect all stdout/stderr text to my logging function, without having to rewrite/modify the code that launches these command-line tools? I tried setting the following, which I found in another question:
sys.stderr.write = lambda s: logger.error(s)
However, it fails with "sys.stderr.write is read-only".
While this is not a full answer, it may show you a redirect to adapt to your particular case. This is how I did it a while back, although I cannot remember why, or what limitation I was trying to circumvent. The following redirects stdout and stderr to a class for print() statements; the class subsequently writes to both the screen and a file:
import os
import sys
import datetime

class DebugLogger():
    def __init__(self, filename):
        timestamp = datetime.datetime.strftime(datetime.datetime.utcnow(),
                                               '%Y-%m-%d-%H-%M-%S-%f')
        # build up full path to filename
        logfile = os.path.join(os.path.dirname(sys.executable),
                               filename + timestamp)
        self.terminal = sys.stdout
        self.log = open(logfile, 'a')

    def write(self, message):
        timestamp = datetime.datetime.strftime(datetime.datetime.utcnow(),
                                               ' %Y-%m-%d-%H:%M:%S.%f')
        # write to screen
        self.terminal.write(message)
        # write to file
        self.log.write(timestamp + ' - ' + message)
        self.flush()

    def flush(self):
        self.terminal.flush()
        self.log.flush()
        os.fsync(self.log.fileno())

    def close(self):
        self.log.close()

def main(debug=False):
    if debug:
        filename = 'blabla'
        sys.stdout = DebugLogger(filename)
        sys.stderr = sys.stdout
    print('test')

if __name__ == '__main__':
    main(debug=True)
import sys
import io
import logging

logger = logging.getLogger(__name__)  # any configured logger will do

class MyStream(io.IOBase):
    def write(self, s):
        logger.error(s)

sys.stderr = MyStream()
print('This is an error', file=sys.stderr)  # Python 3; on 2.x add `from __future__ import print_function`
This makes all calls to sys.stderr go to the logger.
The original stream is always available in sys.__stderr__.
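Note that neither replacing sys.stdout/sys.stderr nor the class above catches the output of child processes, which write to the OS-level descriptors directly. One way to capture that globally is to point descriptors 1 and 2 at a pipe and drain it into the logger from a background thread. A sketch of the idea (redirect_fd_to_logger is a made-up helper; the logger must write to a file, not to stderr, or it would feed its own output back into the pipe):
import os
import logging
import threading

logger = logging.getLogger(__name__)  # must log to a file handler

def redirect_fd_to_logger(fd, level):
    read_fd, write_fd = os.pipe()
    os.dup2(write_fd, fd)  # children inherit fd 1/2, so their output lands here too
    os.close(write_fd)

    def drain():
        with os.fdopen(read_fd) as reader:
            for line in reader:
                logger.log(level, line.rstrip('\n'))

    t = threading.Thread(target=drain)
    t.daemon = True
    t.start()

redirect_fd_to_logger(1, logging.INFO)   # stdout
redirect_fd_to_logger(2, logging.ERROR)  # stderr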

Python, how to send output to both file and terminal

I want to use Python to send output to both a file log.txt and STDOUT on the terminal. Here is what I have:
import sys

class Logger(object):
    def __init__(self, filename="Default.log"):
        self.terminal = sys.stdout
        self.log = open(filename, "a")

    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)

sys.stdout = Logger("log.txt")
print "Hello world !"  # This line is saved in log.txt and STDOUT
This program sends output to both the file and stdout. My question is: how does the write function on the file get called?
From the documentation for sys.stdout:
stdout and stderr needn’t be built-in file objects: any object is acceptable as long as it has a write() method that takes a string argument.
More specifically, the print function (in Python 2.X it is still a keyword, but it doesn't matter here) does something like this:
import sys

def print(message):
    sys.stdout.write(message)
so that, when you call it, it prints your message on sys.stdout. However, if you overwrite sys.stdout with an object containing a .write method, it will call that method instead.
That's the magic of duck-typing.
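A toy illustration of that duck typing (Shout is a made-up stand-in; any object with a write() method works):
import sys

class Shout(object):
    def write(self, message):
        # sys.__stdout__ always holds the original stream
        sys.__stdout__.write(message.upper())

sys.stdout = Shout()
print "hello"                # the print statement calls Shout.write -> "HELLO"
sys.stdout = sys.__stdout__  # put the real stdout back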
