log messages to an array/list with logging - python

Currently I am using Python logging to log messages to a log file and to the console (if --verbose).
How can I configure logging to also record messages into an array/list?

I figured this out after posting: I used a StreamHandler pointed at an in-memory string buffer.
Here is a snippet of the code, not including the stdout stream handler and the normal logger file handler:
import io
import logging

logger = logging.getLogger()
errors = io.StringIO()  # in-memory buffer that collects the formatted records
formatter = logging.Formatter('%(asctime)s - %(module)s.%(funcName)s() - %(levelname)s - %(message)s', "%Y-%m-%d %H:%M:%S")
eh = logging.StreamHandler(errors)  # stream handler writing into the StringIO buffer
eh.setFormatter(formatter)
logger.addHandler(eh)

logger.error("This is a test error message")

contents = errors.getvalue()  # everything logged so far, as a single string
print("error string=>{}".format(contents))
errors.close()
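Since the goal was an array/list specifically, another option (not part of the original snippet, sketched here under the assumption that a plain Python list is acceptable) is a small custom handler that appends each formatted record to a list:

import logging

class ListHandler(logging.Handler):
    """Collects each formatted log record in self.records (a plain Python list)."""
    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append(self.format(record))

logger = logging.getLogger()
list_handler = ListHandler()
list_handler.setFormatter(logging.Formatter('%(asctime)s - %(levelname)s - %(message)s'))
logger.addHandler(list_handler)

logger.error("This is a test error message")
print(list_handler.records)  # a list with one formatted error message

This avoids splitting the StringIO contents back into lines when individual messages are needed later.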

Related

Prevent Generation of Log File with Python logging

I have a simple script that I run as an exe file on Windows. However, when I am developing the script, I run it from the command line and use the logging module to output debug info to a log file. I would like to turn off the generation of the log file for my production code. How would I go about doing that?
This is the logging config I have setup now:
import logging
...
logging.basicConfig(filename='file.log',
                    filemode="w",
                    level=logging.DEBUG,
                    format="%(asctime)s: %(name)s - %(levelname)s - %(message)s",
                    datefmt='%d-%b-%y %H:%M:%S',
                    )
...
logging.debug("Debug message")
If you don't mind an empty log file being generated in production, you can simply raise the logging threshold to a level above logging.DEBUG, such as logging.INFO, so that messages logged with logging.debug are not written to the log file:
logging.basicConfig(filename='file.log',  # still creates an empty file.log
                    filemode="w",
                    level=logging.INFO,
                    format="%(asctime)s: %(name)s - %(levelname)s - %(message)s",
                    datefmt='%d-%b-%y %H:%M:%S',
                    )
logging.debug("Debug message")  # nothing would happen
logging.info("FYI")             # logs 'FYI'
If you don't want logging to function at all, an easy approach is to override logging with a Mock object:
import logging
from unittest.mock import Mock
environment = 'production'
if environment == 'production':
    logging = Mock()
...
logging.basicConfig(...) # nothing would happen
logging.debug(...) # nothing would happen
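A simpler alternative (not from the original answer, sketched here with a hypothetical environment flag) is to branch on the environment and only configure file logging in development, so production never creates the file at all:

import logging

environment = 'production'  # hypothetical flag; use your own config mechanism

if environment == 'production':
    # no filename: nothing is created on disk, records go to the console only
    logging.basicConfig(level=logging.INFO)
else:
    logging.basicConfig(filename='file.log',
                        filemode="w",
                        level=logging.DEBUG,
                        format="%(asctime)s: %(name)s - %(levelname)s - %(message)s",
                        datefmt='%d-%b-%y %H:%M:%S')

logging.debug("Debug message")  # written to file.log only in the development branch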

python logging debug to file errors to stdout

Is there any way in the Python logging module to send info and error messages to stdout and debug messages to a file?
Some commands in my script produce long output which I don't want sent to stdout.
I am using the following logging function, which writes logs to a file and to stdout:
import logging
import os
import sys

def mylog(release_num, logdir='/tmp'):
    applog = logging.getLogger()
    applog.setLevel(logging.DEBUG)
    formatter = logging.Formatter("%(asctime)s %(levelname)s %(message)s", "%b %d %Y %H:%M:%S")
    logfile = "{}/{}.log".format(logdir, release_num)
    if not os.path.exists(logdir):
        os.makedirs(logdir)
    fileHandler = logging.FileHandler(logfile, 'a')  # text append mode ('ab' would fail on Python 3)
    fileHandler.setLevel(logging.DEBUG)
    fileHandler.setFormatter(formatter)
    applog.addHandler(fileHandler)
    cformat = logging.Formatter("[%(levelname)8s] : %(message)s")
    consoleHandler = logging.StreamHandler(sys.stdout)
    consoleHandler.setFormatter(cformat)
    applog.addHandler(consoleHandler)  # was log.addHandler, which is undefined here
    return applog
You need to set the log level of consoleHandler to logging.INFO so that only messages of level INFO or higher pass through that handler:
consoleHandler.setLevel(logging.INFO)
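For clarity, here is a condensed, self-contained sketch of the resulting setup (the file path is illustrative): the logger passes everything, the file handler keeps DEBUG, and the console handler filters at INFO:

import logging
import sys

applog = logging.getLogger()
applog.setLevel(logging.DEBUG)  # the logger itself must let DEBUG records through

fileHandler = logging.FileHandler("/tmp/release.log")  # illustrative path
fileHandler.setLevel(logging.DEBUG)
applog.addHandler(fileHandler)

consoleHandler = logging.StreamHandler(sys.stdout)
consoleHandler.setLevel(logging.INFO)  # the fix: console only shows INFO and above
applog.addHandler(consoleHandler)

applog.debug("only in the file")
applog.info("in the file and on stdout")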

Python TimedRotatingFileHandler not rotating at midnight till something new is written to the log file. How to fix this?

In a Flask server that runs 24/7, I set up TimedRotatingFileHandler to rotate the log file every day at midnight. However, the rotation is delayed until something new is written to the log file after midnight.
How can I fix this issue?
Here is the code I've used for log rotation:
import logging
import ConfigParser  # Python 2; the module is called configparser on Python 3
from logging.handlers import TimedRotatingFileHandler

config = ConfigParser.ConfigParser()
config.read("config.cnf")

# Logging
log_path = config.get('logging', 'log_path')
log = logging.getLogger(__name__)
log.setLevel(logging.DEBUG)
# add a file handler that rotates at midnight
fh = TimedRotatingFileHandler(log_path, when='midnight', interval=1, backupCount=0)
fh.setLevel(logging.DEBUG)
# create a formatter and set it on the handler
frmt = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
fh.setFormatter(frmt)
# add the handler to the logger
log.addHandler(fh)
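No answer was recorded here; one common workaround (a sketch, not from the original thread) relies on the fact that TimedRotatingFileHandler only checks whether a rollover is due inside emit(), so a background thread that logs a single heartbeat message just after midnight forces the rotation even when the server is otherwise idle:

import threading
import time
from datetime import datetime, timedelta

def midnight_heartbeat(logger):
    """Sleep until shortly after each midnight, then emit one trivial record."""
    while True:
        now = datetime.now()
        next_run = (now + timedelta(days=1)).replace(hour=0, minute=0, second=5, microsecond=0)
        time.sleep((next_run - now).total_seconds())
        logger.info("heartbeat - forcing the daily log rollover")

# 'log' is the logger configured above
threading.Thread(target=midnight_heartbeat, args=(log,), daemon=True).start()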

How to share a file between modules for logging in python

I want to log messages from different modules in Python to a file. I also need to print some messages to the console for debugging purposes. I used the logging module for this, but it logs everything at the given severity and above to the file or the console.
I want only some messages logged to the file, and they should not include the console messages.
Similarly, the console output should not contain the messages logged to the file.
My approach would be a singleton class that shares the file write operation between the various modules.
Is there an easier approach than this in Python?
EDIT:
I am new to Python. Here is a sample program I tried:
import logging

logger = logging.getLogger('simple_example')
logger.setLevel(logging.INFO)
# create file handler which only logs CRITICAL messages
fh = logging.FileHandler('spam.log')
fh.setLevel(logging.CRITICAL)
# create console handler which logs ERROR and above
ch = logging.StreamHandler()
ch.setLevel(logging.ERROR)
# create formatter and add it to the handlers
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
ch.setFormatter(formatter)
fh.setFormatter(formatter)
# add the handlers to the logger
logger.addHandler(ch)
logger.addHandler(fh)
# 'application' code
logger.debug('debug message')
logger.info('info message')
logger.warning('warn message')
logger.error('error message')
logger.critical('critical message')
Console prints:
2015-02-03 15:36:00,651 - simple_example - ERROR - error message
2015-02-03 15:36:00,651 - simple_example - CRITICAL - critical message
# I don't want critical messages in the console.
Here is a script that creates two loggers; use whichever one you want to log to a file or to stdout. The question is: by what criterion do you choose between stdout and the file, given that (from your question) you don't want the criterion to be the log level (debug, error, critical, ...)?
#!/usr/bin/python
import logging
logger_stdout = logging.getLogger('logger_stdout')
logger_stdout.setLevel(logging.DEBUG)
sh = logging.StreamHandler()
sh.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
logger_stdout.addHandler(sh)
logger_stdout.debug('stdout debug message')
logger_file = logging.getLogger('logger_file')
logger_file.setLevel(logging.DEBUG)
fh = logging.FileHandler("foo.log")
fh.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
logger_file.addHandler(fh)
logger_file.debug('file debug message')
And when I run this script:
D:\jrx\jrxpython
λ python essai.py
2015-02-03 11:12:07,210 - logger_stdout - DEBUG - stdout debug message
D:\jrx\jrxpython
λ cat foo.log
2015-02-03 11:12:07,224 - logger_file - DEBUG - file debug message
D:\jrx\jrxpython
λ
CRITICAL is higher than ERROR:
You can also verify this yourself:
>>> import logging
>>> print(logging.CRITICAL)
50
>>> print(logging.ERROR)
40
>>>
There are two cases in logging:
Logging within the same process - you should have several handlers with different logging levels based on how verbose the output should be. A higher level means less output; that is why DEBUG is the lowest predefined log level - it writes everything, for debugging purposes.
Logging different processes - you should have several loggers set up; they can be accessed from anywhere in your code using logging.getLogger(name). This returns the same logger every time, so the logger setup persists through the code and only needs to be executed once.
The first case demonstrates that you can't have an "error but not critical" log, since that is the opposite of how levels are meant to work. You can have a "critical but not error" log, which is less verbose. That is probably what you want.
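If both destinations really have to hang off a single logger, another possibility (a sketch not taken from these answers, using a custom attribute name of my own choosing) is to attach a logging.Filter to each handler and route individual records with the extra argument:

import logging

class DestinationFilter(logging.Filter):
    """Lets through only the records whose 'dest' attribute matches this filter."""
    def __init__(self, dest):
        super().__init__()
        self.dest = dest

    def filter(self, record):
        return getattr(record, "dest", "console") == self.dest

logger = logging.getLogger("routed")
logger.setLevel(logging.DEBUG)

fh = logging.FileHandler("spam.log")
fh.addFilter(DestinationFilter("file"))
logger.addHandler(fh)

ch = logging.StreamHandler()
ch.addFilter(DestinationFilter("console"))
logger.addHandler(ch)

logger.info("shown on the console")                         # 'dest' defaults to console
logger.info("written to the file", extra={"dest": "file"})  # routed via the extra dict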

python logging module is not writing anything to file

I'm trying to write a server that logs exceptions both to the console and to a file. I pulled some code off the cookbook. Here it is:
import logging

logger = logging.getLogger('server_logger')
logger.setLevel(logging.DEBUG)
# create file handler which logs even debug messages
fh = logging.FileHandler('server.log')
fh.setLevel(logging.DEBUG)
# create console handler with a higher log level
ch = logging.StreamHandler()
ch.setLevel(logging.ERROR)
# create formatter and add it to the handlers
formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s', datefmt='%Y-%m-%d %H:%M:%S')
ch.setFormatter(formatter)
fh.setFormatter(formatter)
# add the handlers to logger
logger.addHandler(ch)
logger.addHandler(fh)
This code logs perfectly fine to the console, but nothing is logged to the file. The file is created, but nothing is ever written to it. I've tried closing the handler, but that doesn't do anything. Neither does flushing it. I searched the Internet, but apparently I'm the only one with this problem. Does anybody have any idea what the problem is? Thanks for your answers.
Try calling
logger.error('This should go to both console and file')
instead of
logging.error('this will go to the default logger which you have not changed the config of')
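The distinction matters because the module-level logging.error() goes through the root logger, which the snippet above never attaches the file handler to, while logger.error() goes through 'server_logger', which has both handlers. A minimal illustration (not from the original answer):

import logging

logger = logging.getLogger('server_logger')
logger.addHandler(logging.FileHandler('server.log'))

logging.error("root logger: not written to server.log")
logger.error("named logger: written to server.log")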
Try putting the import and the basicConfig at the very beginning of the script. Something like this:
import logging
logging.basicConfig(filename='log.log', level=logging.INFO)
.
.
import ...
import ...
Put this
for handler in logging.root.handlers[:]:
    logging.root.removeHandler(handler)
in front of the
logging.basicConfig(...)
see also
Logging module not writing to file
I know this question might be a bit old, but I found the above method a bit of an overkill. I ran into a similar issue and was able to solve it with:
import logging
logging.basicConfig(format='%(asctime)s %(message)s',
                    datefmt='%m/%d/%Y %I:%M:%S %p',
                    filename='example.log',
                    level=logging.DEBUG)
This writes to example.log all records of level DEBUG or higher.
logging.debug("This is a debug message") will write "This is a debug message" to example.log. The level is important for this to work.
To write both to the terminal and to a file, you can do something like this:
import logging.config

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s [%(levelname)s] %(message)s",
    handlers=[
        logging.FileHandler("log_file.log"),
        logging.StreamHandler()
    ]
)
logger = logging.getLogger(__name__)
Usage in the code:
logger.info('message')
logger.error('message')
If root.handlers is not empty, the log file will not be created, because basicConfig() does nothing when the root logger already has handlers. We should empty root.handlers before calling basicConfig(). (source)
Snippet:
for handler in logging.root.handlers[:]:
    logging.root.removeHandler(handler)
The full code is below:
import logging

# logging setup
for handler in logging.root.handlers[:]:
    logging.root.removeHandler(handler)

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(message)s',
                    datefmt='%a, %d %b %Y %H:%M:%S',
                    filename='log.txt',
                    filemode='w')
console = logging.StreamHandler()
console.setLevel(logging.INFO)
# add the handler to the root logger
logging.getLogger().addHandler(console)

logging.info("\nParameters:")
for i in range(10):
    logging.info(i)
logging.info("end!")
