code:
import logging
logging.root.setLevel(logging.NOTSET)
logger = logging.getLogger('name')
logger.setLevel(logging.NOTSET)
def func():
    logger.info('1')
    print(logging.getLevelName(logger.getEffectiveLevel()))
    print(logger.handlers)

    **some code**

    logger.info('2')
    print(logging.getLevelName(logger.getEffectiveLevel()))
    print(logger.handlers)

    logger.setLevel(logging.NOTSET)
    logger.info('3')
    print(logging.getLevelName(logger.getEffectiveLevel()))
    print(logger.handlers)
output:
1
NOTSET
[]
WARNING
[]
WARNING
[]
I assume some code changes the level of the logger (and I'm sure it is not the logger named 'name'). But setting the level again after that code runs does not work.
How can I set it back?
You are only setting the level on the logger itself. If a logger's level is NOTSET, the effective level is inherited from its ancestors, since log records propagate up the logger hierarchy. If the root logger has level WARNING and the 'name' logger has level NOTSET, the effective level of the 'name' logger is WARNING. (The root logger is an ancestor of every logger.)
The root logger can be accessed with any of these: logging.root, logging.getLogger(), or logging.getLogger('root'). Its level is set the same way as on any other logger:
root = logging.getLogger()
root.setLevel(logging.NOTSET)
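To see the inheritance in action, here is a small sketch (the logger name demo.child is made up for illustration) showing how getEffectiveLevel() resolves a NOTSET level against the root logger:

```python
import logging

# A child logger with no level of its own (NOTSET)
child = logging.getLogger('demo.child')
print(child.level)  # 0, i.e. NOTSET

# With NOTSET, the effective level comes from the nearest ancestor
# that has an explicit level -- ultimately the root logger.
logging.getLogger().setLevel(logging.WARNING)
print(logging.getLevelName(child.getEffectiveLevel()))  # WARNING

# An explicit level on the child overrides the inherited one
child.setLevel(logging.DEBUG)
print(logging.getLevelName(child.getEffectiveLevel()))  # DEBUG
```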
Related
I'm setting the log level based on a configuration. Currently I call Settings() from inside Logger, but I'd like to pass it in instead, or set it globally for all loggers.
I do not want to call getLogger(name, debug=Settings().isDebugMode()).
Any ideas? Thanks!
class Logger(logging.getLoggerClass()):
    def __init__(self, name):
        super().__init__(name)
        debug_mode = Settings().isDebugMode()
        if debug_mode:
            self.setLevel(level=logging.DEBUG)
        else:
            self.setLevel(level=logging.INFO)

def getLogger(name):
    logging.setLoggerClass(Logger)
    return logging.getLogger(name)
The usual way to achieve this would be to set a level only on the root logger and keep all other loggers at NOTSET. This has the effect that every logger behaves as if it had the level set on root. You can read about the mechanics of how that works in the documentation of setLevel().
Here is what that would look like in code:
import logging
root = logging.getLogger()
root.setLevel(logging.DEBUG) # set this based on your Settings().isDebugMode()
logger = logging.getLogger('some_logger')
sh = logging.StreamHandler()
sh.setFormatter(logging.Formatter('%(name)s: %(message)s'))
logger.addHandler(sh)
logger.debug('this will print')
root.setLevel(logging.INFO) # change level of all loggers (global log level)
logger.debug('this will not print')
How can I change the root logger level from a submodule?
#main.py
logging.basicConfig(filename=filename,format='%(asctime)s %(levelname)s %(message)s',filemode='w')
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
After a certain event I want to change all logging to CRITICAL:
sub_logger = logging.getLogger('jdm_health')
#submodule.py
if event:
    logger.setLevel(logging.CRITICAL)  # root logger
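Since logging.getLogger() with no arguments always returns the same root logger object, the submodule can fetch it and change the root level itself; a minimal sketch (the event flag is hypothetical):

```python
import logging

# main.py side: configure the root logger
root = logging.getLogger()
root.setLevel(logging.DEBUG)

# submodule side: getLogger() returns the very same root object,
# so the level can be raised from anywhere
event = True  # hypothetical trigger
if event:
    logging.getLogger().setLevel(logging.CRITICAL)

print(logging.getLevelName(root.level))  # CRITICAL
```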
I want to log to a single log file from main and all submodules.
The log messages sent from the main file, where I define the logger, work as expected, but the ones sent from a call to an imported function are missing.
It works if I use logging.basicConfig as in Example 1 below.
But the second example, which allows for more custom settings, does not.
Any ideas why?
# in the submodule I have this code
import logging
logger = logging.getLogger(__name__)
EXAMPLE 1 - Working
Here I create two handlers and just pass them to basicConfig:
# definition of root logger in main module
formatter = logging.Formatter(fmt="%(asctime)s %(name)s.%(levelname)s: %(message)s", datefmt="%Y.%m.%d %H:%M:%S")
handler = logging.FileHandler('logger.log')
handler.setFormatter(formatter)
handler.setLevel(logging.DEBUG)
handler2 = logging.StreamHandler(stream=None)
handler2.setFormatter(formatter)
handler2.setLevel(logging.DEBUG)
logging.basicConfig(handlers=[handler, handler2], level=logging.DEBUG)
logger = logging.getLogger(__name__)
EXAMPLE 2 - Not working
Here I create two handlers and addHandler() them to the root logger:
# definition of root logger in main module
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
handler = logging.FileHandler('logger.log')
handler.setFormatter(formatter)
handler.setLevel(logging.DEBUG)
#handler.setLevel(logging.ERROR)
logger.addHandler(handler)
handler = logging.StreamHandler(stream=None)
handler.setFormatter(formatter)
handler.setLevel(logging.DEBUG)
logger.addHandler(handler)
You need to configure the (one and only) root logger in the main module of your software. This is done by calling
logger = logging.getLogger() #without arguments
instead of
logger = logging.getLogger(__name__)
(Python doc on logging)
The second example creates a separate child logger named after your script.
If no handlers are defined on a submodule's logger, the log record is passed up to the root logger to handle.
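Here is a minimal sketch of that propagation (the logger name mypkg.submodule is made up): the submodule-style logger has no handlers of its own, yet its records are emitted by the handler attached to root:

```python
import logging

# Root logger carries the only handler
root = logging.getLogger()
root.setLevel(logging.DEBUG)
root.addHandler(logging.StreamHandler())

# A submodule-style logger: no handlers attached here
sub = logging.getLogger('mypkg.submodule')
print(sub.handlers)  # []

# The record propagates up and is emitted by root's handler
sub.info('handled by the root handler via propagation')
```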
A related question can be found here:
Python Logging - How to inherit root logger level & handler
I am having some difficulties using Python's logging. I have two files, main.py and mymodule.py. Generally main.py is run, and it imports mymodule.py and uses some functions from there. But sometimes I run mymodule.py directly.
I tried to configure logging in only one location, but something seems wrong.
Here is the code.
# main.py
import logging
import mymodule
logger = logging.getLogger(__name__)
def setup_logging():
    # only configure logger if script is main module
    # configuring logger in multiple places is bad
    # only top-level module should configure logger
    if not len(logger.handlers):
        logger.setLevel(logging.DEBUG)
        # create console handler with a higher log level
        ch = logging.StreamHandler()
        ch.setLevel(logging.DEBUG)
        formatter = logging.Formatter('%(levelname)s: %(asctime)s %(funcName)s(%(lineno)d) -- %(message)s', datefmt='%Y-%m-%d %H:%M:%S')
        ch.setFormatter(formatter)
        logger.addHandler(ch)

if __name__ == '__main__':
    setup_logging()
    logger.info('calling mymodule.myfunc()')
    mymodule.myfunc()
and the imported module:
# mymodule.py
import logging
logger = logging.getLogger(__name__)
def myfunc():
    msg = 'myfunc is being called'
    logger.info(msg)
    print('done with myfunc')

if __name__ == '__main__':
    # only configure logger if script is main module
    # configuring logger in multiple places is bad
    # only top-level module should configure logger
    if not len(logger.handlers):
        logger.setLevel(logging.DEBUG)
        # create console handler with a higher log level
        ch = logging.StreamHandler()
        ch.setLevel(logging.DEBUG)
        formatter = logging.Formatter('%(levelname)s: %(asctime)s %(funcName)s(%(lineno)d) -- %(message)s', datefmt='%Y-%m-%d %H:%M:%S')
        ch.setFormatter(formatter)
        logger.addHandler(ch)
    logger.info('myfunc was executed directly')
    myfunc()
When I run the code, I see this output:
$>python main.py
INFO: 2016-07-14 18:13:04 <module>(22) -- calling mymodule.myfunc()
done with myfunc
But I expect to see this:
$>python main.py
INFO: 2016-07-14 18:13:04 <module>(22) -- calling mymodule.myfunc()
INFO: 2016-07-14 18:15:09 myfunc(8) -- myfunc is being called
done with myfunc
Anybody have any idea why the second logger.info call doesn't print to the screen? Thanks in advance!
Loggers exist in a hierarchy, with a root logger (retrieved with logging.getLogger(), no arguments) at the top. Each logger inherits configuration from its parent, with any configuration on the logger itself overriding the inherited configuration. In this case, you are never configuring the root logger, only the module-specific logger in main.py. As a result, the module-specific logger in mymodule.py is never configured.
The simplest fix is probably to use logging.basicConfig in main.py to set options you want shared by all loggers.
Chepner is correct. I got absorbed into this problem. The problem is simply in your main script:
16 log = logging.getLogger() # use this form to initialize the root logger
17 #log = logging.getLogger(__name__) # never use this one
If you use line 17, your imported Python modules will not log any messages.
In your submodule.py:
import logging
logger = logging.getLogger()
logger.debug("You will not see this message if you use line 17 in main")
Hope this posting can help someone who got stuck on this problem.
While the logging package is conceptually arranged in a namespace hierarchy using dots as separators, all loggers implicitly inherit from the root logger (much as every class in Python 3 implicitly inherits from object). Each logger passes log records on to its parent.
In your case, your loggers are incorrectly chained. Try adding print(logger.name) in both of your modules and you'll realize that the instantiation of logger in main.py is equivalent to
logger = logging.getLogger('__main__')
while in mymodule.py, you effectively produce
logger = logging.getLogger('mymodule')
The call that logs the INFO message from myfunc() goes directly to the root logger (since the logger in main.py is not among its ancestors), which has no handler set up (in this case the default last-resort message dispatch is triggered, see here).
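This can be checked directly; a small sketch (logger names taken from the question's modules) showing that the two loggers are siblings under root, not ancestor and descendant:

```python
import logging

main_logger = logging.getLogger('__main__')  # what main.py ends up with
mod_logger = logging.getLogger('mymodule')   # what mymodule.py ends up with

# Both are direct children of the root logger -- siblings, so records
# from mymodule never pass through the __main__ logger's handlers
print(mod_logger.parent is logging.getLogger())  # True
print(mod_logger.parent is main_logger)          # False
```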
I would like to log my python script that uses elasticsearch-py. In particular, I want to have three logs:
General log: log INFO and above both to the stdout and to a file.
ES log: only ES related messages only to a file.
ES tracing log: Extended ES logging (curl queries and their output for instance) only to a file.
Here is what I have so far:
import logging
import logging.handlers
es_logger = logging.getLogger('elasticsearch')
es_logger.setLevel(logging.INFO)
es_logger_handler=logging.handlers.RotatingFileHandler('top-camps-base.log',
maxBytes=0.5*10**9,
backupCount=3)
es_logger.addHandler(es_logger_handler)
es_tracer = logging.getLogger('elasticsearch.trace')
es_tracer.setLevel(logging.DEBUG)
es_tracer_handler=logging.handlers.RotatingFileHandler('top-camps-full.log',
maxBytes=0.5*10**9,
backupCount=3)
es_tracer.addHandler(es_tracer_handler)
logger = logging.getLogger('mainLog')
logger.setLevel(logging.DEBUG)
# create file handler
fileHandler = logging.handlers.RotatingFileHandler('top-camps.log',
maxBytes=10**6,
backupCount=3)
fileHandler.setLevel(logging.INFO)
# create console handler
consoleHandler = logging.StreamHandler()
consoleHandler.setLevel(logging.INFO)
# create formatter and add it to the handlers
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
consoleHandler.setFormatter(formatter)
fileHandler.setFormatter(formatter)
# add the handlers to logger
logger.addHandler(consoleHandler)
logger.addHandler(fileHandler)
My problem is that INFO messages from es_logger are also displayed on the terminal, even though the log messages are saved to the right files!
If I remove the part related to logger, then the ES logging works fine, i.e. messages are only saved to the corresponding files, but then I lose the other part... What am I doing wrong in the last part of the settings?
Edit
Possible hint: In the sources of elasticsearch-py there's a logger named logger. Could it be that it conflicts with mine? I tried to change the name of logger to main_logger in the lines above but it didn't help.
Possible hint 2: If I replace logger = logging.getLogger('mainLog') with logger = logging.getLogger(), then the format of the output to the console of es_logger changes and becomes identical to the one defined in the snippet.
I think you are being hit by the somewhat confusing logger hierarchy propagation. Everything that is logged in "elasticsearch.trace" that passes the loglevel of that logger, will propagate first to the "elasticsearch" logger and then to the root ("") logger. Note that once the message passes the loglevel of the "elasticsearch.trace" logger, the loglevels of the parents ("elasticsearch" and root) are not checked, but all messages will be sent to the handlers. (The handlers themselves have log levels that do apply.)
Consider the following example that illustrates the issue, and a possible solution:
import logging
# The following line will basicConfig() the root handler
logging.info('DUMMY - NOT SEEN')
ll = logging.getLogger('foo')
ll.setLevel('DEBUG')
ll.addHandler(logging.StreamHandler())
ll.debug('msg1')
ll.propagate = False
ll.debug('msg2')
Output:
msg1
DEBUG:foo:msg1
msg2
You see that "msg1" is logged both by the "foo" logger and by its parent, the root logger (as "DEBUG:foo:msg1"). Then, when propagation is turned off with ll.propagate = False before "msg2", the root logger no longer logs it. Now, if you were to comment out the first line (logging.info('DUMMY - NOT SEEN')), the behavior would change so that the root logger line is not shown. This is because the logging module's top-level functions info(), debug() etc. configure the root logger with a handler when no handler has yet been defined. That is also why you see different behavior in your example when you modify the root logger by doing logger = logging.getLogger().
I can't see anything in your code that touches the root logger, but as you see, a stray logging.info() call or the like in your code or in library code would cause a handler to be added.
So, to answer your question: I would set logger.propagate = False on the loggers where that makes sense for you, and where you do want propagation, check that the log levels of the handlers themselves are set as you want them.
Here is an attempt:
es_logger = logging.getLogger('elasticsearch')
es_logger.propagate = False
es_logger.setLevel(logging.INFO)
es_logger_handler=logging.handlers.RotatingFileHandler('top-camps-base.log',
maxBytes=0.5*10**9,
backupCount=3)
es_logger.addHandler(es_logger_handler)
es_tracer = logging.getLogger('elasticsearch.trace')
es_tracer.propagate = False
es_tracer.setLevel(logging.DEBUG)
es_tracer_handler=logging.handlers.RotatingFileHandler('top-camps-full.log',
maxBytes=0.5*10**9,
backupCount=3)
es_tracer.addHandler(es_tracer_handler)
logger = logging.getLogger('mainLog')
logger.propagate = False
logger.setLevel(logging.DEBUG)
# create file handler
fileHandler = logging.handlers.RotatingFileHandler('top-camps.log',
maxBytes=10**6,
backupCount=3)
fileHandler.setLevel(logging.INFO)
# create console handler
consoleHandler = logging.StreamHandler()
consoleHandler.setLevel(logging.INFO)
# create formatter and add it to the handlers
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
consoleHandler.setFormatter(formatter)
fileHandler.setFormatter(formatter)
# add the handlers to logger
logger.addHandler(consoleHandler)
logger.addHandler(fileHandler)