For example, I have a script test1.py with code like this:
import logging
from logging.handlers import RotatingFileHandler
import some_module
handler = RotatingFileHandler('TEST1.log', maxBytes=18000, backupCount=7)
logger = logging.getLogger(__name__)
logger.addHandler(handler)
logging.getLogger("some_module").addHandler(handler)
def do_something():
    some_module.do_smth()
do_something()
And I have another script test2.py with code like this:
import logging
from logging.handlers import RotatingFileHandler
import some_module
handler = RotatingFileHandler('TEST2.log', maxBytes=18000, backupCount=7)
logger = logging.getLogger(__name__)
logger.addHandler(handler)
logging.getLogger("some_module").addHandler(handler)
def do_something():
    some_module.do_smth_else()
do_something()
Then I import both scripts into app.py, which calls one of them depending on some condition.
The problem is that all log messages for some_module produced while test1.py is running are written to both log files: TEST1.log and TEST2.log.
As I understand it, the problem is that the logging module behaves like a singleton: it is shared globally by all scripts running in the same process. So when I import test1.py into app.py it adds a handler to the some_module logger, then when I import test2.py it adds another one, and that logger now has 2 handlers.
Is there a way to add handlers for this module separately, so that all debug messages triggered by test1.py are written to TEST1.log but not to TEST2.log?
UPDATE:
In my case I am trying this with the TeleBot module, and it doesn't seem to work with it:
logging.getLogger("TeleBot.test1").setLevel(logging.DEBUG)
logging.getLogger("TeleBot.test1").addHandler(handler)
Nothing gets written to my log file, but if I simply do:
logging.getLogger("TeleBot").setLevel(logging.DEBUG)
logging.getLogger("TeleBot").addHandler(handler)
it works, but, as I mentioned in the question, it writes the debug messages to all files.
So, is this a bug in that particular module?
Doing logging.getLogger("some_module") in both files returns the same Logger object, as you have already observed.
To get a separate Logger in each file, simply provide a different name to getLogger() each time.
E.g. in test1.py
logging.getLogger("some_module.test1").addHandler(handler)
and in test2.py
logging.getLogger("some_module.test2").addHandler(handler)
I'm trying to establish logging in all modules I'm using. My project structure is
# driver.py
import logging
logger = logging.getLogger(__name__)
class driver:
    ....
# driver_wrapper.py
import logging
from driver import driver
device = driver(...)
def driver_func():
    logging.info("...")
    ....
# main.py
import sys
import logging
import driver_wrapper
logging.basicConfig(stream=sys.stdout, level=logging.WARNING)
driver_wrapper.driver_func()
My problem now is that I still get INFO-level messages, and the output shows 'INFO:root'. But I would expect the module name instead of root.
Is there a way to set the logging level in main.py for all modules, or is what I am doing already correct? There are a lot of posts about this problem, but the solutions don't seem to work for me.
All your modules that use logging should have the logger = logging.getLogger(__name__) line, and thereafter you should always log via e.g. logger.info(...), and never call e.g. logging.info(...). The latter is equivalent to logging to the root logger, not the module's logger. That "all your modules" includes driver_wrapper.py in your example.
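Applied to the example above, a sketch of driver_wrapper.py with that change (the rest of the module is left as in the question):

# driver_wrapper.py
import logging
from driver import driver

logger = logging.getLogger(__name__)   # module-level logger named "driver_wrapper"

device = driver(...)

def driver_func():
    logger.info("...")                 # goes through the module's logger, not the root logger

The record is then attributed to driver_wrapper instead of root, and the basicConfig call in main.py still controls the level and output stream, since the record propagates up to the root logger.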
Sorry, it is a simple question, but I don't quite understand what I have to do. There are two scripts: main.py and methods.py.
main.py
import methods
methods.fnc_1()
methods.fnc_2()
methods.py
import logging
logger = logging.getLogger(__name__)
f_handler = logging.FileHandler('file.log')
f_handler.setLevel(logging.DEBUG)
c_handler = logging.StreamHandler()
c_handler.setLevel(logging.DEBUG)
logger.addHandler(c_handler)
logger.addHandler(f_handler)
def fnc_1():
    logger.warning('warning_1!')
def fnc_2():
    logger.warning('warning_2!')
How can I use the same logger object in main.py, so that when I run main.py I get every log message from both main.py and methods.py in the same file, in the order of execution?
Use methods.logger in main.py to get the variable from the methods module.
import methods
methods.fnc_1()
methods.logger.warning("warning from main!")
methods.fnc_2()
I am getting duplicate (double) logs when using Python logging. I have 3 files:
1. main.py
2. dependencies.py
3. resources.py
I am making only one call to the Python logger constructor, which is done inside main.py.
The following are my import statements in the 3 files:
main.py
import xml.etree.ElementTree as et
from configparser import ConfigParser
from Craftlogger import Craftlogger
logger = Craftlogger().getLogger()
dependencies.py
import os,sys
from main import getJobDetails,postRequest,logger
from configparser import ConfigParser
resources.py
import os,sys
import xml.etree.ElementTree as et
And inside the main method in main.py, I have the imports:
def main():
    from resources import getResourceDetails, setResources
    from dependencies import setDependencies
    ..... Remaining code .....
My logging file looks like this
import logging

class Craftlogger:
    def __init__(self):
        self.logger = logging.getLogger(__name__)
        handler = logging.StreamHandler()
        formatter_string = '%(asctime)s | %(levelname)-8s | %(filename)s-%(funcName)s-%(lineno)04d | %(message)s'
        formatter = logging.Formatter(formatter_string)
        handler.setFormatter(formatter)
        self.logger.addHandler(handler)
        self.logger.setLevel(logging.DEBUG)
        self.logger.propagate = False

    def getLogger(self):
        return self.logger
Note: I had to do the imports inside of main in order to deal with the circular imports.
My guess would be that two Craftlogger objects exist and both have the same self.logger member. logging.getLogger(__name__) probably returns the same object for another Craftlogger object, resulting in two addHandler calls on the same logger. This is just a guess, no guarantee.
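A quick way to check that guess (a self-contained sketch, not the original project) is to construct the class twice and inspect the shared logger:

import logging

class Craftlogger:                         # stripped-down version of the class above
    def __init__(self):
        self.logger = logging.getLogger(__name__)
        self.logger.addHandler(logging.StreamHandler())

    def getLogger(self):
        return self.logger

a = Craftlogger().getLogger()
b = Craftlogger().getLogger()
print(a is b)                # True: getLogger(__name__) returns the same object both times
print(len(a.handlers))       # 2: each constructor call added another handler
a.warning("appears twice")   # one record, emitted by both handlers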
Logging is a cross-cutting concern. As such, I frown upon classes which set up logging on their own. The responsibility to configure logging (especially handlers) should lie solely with the main executing function, e.g. your main function. No submodule / class / function should modify logging, except for getting a logger via logging.getLogger(name).
This avoids most of these pitfalls and allows easy composition of modules.
Imagine you had to import two modules that both modify the logging system... fun.
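A minimal sketch of that split (module and function names are made up for illustration): the submodule only asks for a logger, and the entry point is the single place that configures handlers.

# mylib.py (hypothetical submodule): no handler setup here
import logging

logger = logging.getLogger(__name__)

def do_work():
    logger.info("working")

# app.py (hypothetical entry point): the only place that configures logging
import logging
import mylib

if __name__ == "__main__":
    logging.basicConfig(
        level=logging.DEBUG,
        format="%(asctime)s | %(levelname)-8s | %(name)s | %(message)s",
    )
    mylib.do_work()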
I want to create my custom logger class
import logging
class mylogger:
    def __init__(self, module_name):
        self.logger = logging.getLogger(module_name)
And I want to use the logger in my main file and two other files:
main.py
test1.py
test2.py
I want to decide my logfile path in main.py and keep the logs from main.py, test1.py and test2.py in the same file.
Now suppose later I want to import test1.py and test2.py in some other file, e.g. main1.py. Then I want to decide my logfile path from main1.py and keep the logs from main1.py, test1.py and test2.py in the same file.
That's not how stdlib logging is designed. If you want to use the same logger in multiple modules, just get the same logger:
# in main.py
import logging

logger = logging.getLogger("mylogger")

def main():
    ...
    logging.basicConfig(...)
    logger.info("some event")
And:
# in test1.py
import logging

logger = logging.getLogger("mylogger")

def some_lib_function():
    ...
    logger.debug("some other event")
The logging framework itself maintains global mutable state, so these calls resolve to the same logger and hence to the same formatters/handlers.
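For the "decide the logfile path in whichever main you run" part of the question, a sketch following the same pattern (the file name is just an example): each entry point calls basicConfig with its own path before anything is logged.

# in main1.py (or main.py): this file alone decides the destination
import logging
import test1

logger = logging.getLogger("mylogger")

def main():
    logging.basicConfig(
        filename="main1.log",
        level=logging.DEBUG,
        format="%(asctime)s %(name)s %(levelname)s %(message)s",
    )
    logger.info("starting")
    test1.some_lib_function()   # its records end up in the same main1.log

if __name__ == "__main__":
    main()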
Let's say I have code like this:
ModuleA.py:
import logging
logger = logging.getLogger('A')
def utility_func():
    logger.info('hi')
print utility_func()
ModuleB.py:
import logging
logger = logging.getLogger('B')
from ModuleA import utility_func
print utility_func()
When utility_func is called from within ModuleA.py I want it to use the 'A' logger, and when it's called from ModuleB.py I want it to use the 'B' logger.
Is there a way to do this? Or a better way to set things up?
Update:
What about the idea of changing utility_func to:
def utility_func():
    logging.info('hi')
Would that bubble up to whichever logger the calling code is using?
Here's what I ended up going with. But I'm still curious to hear if there's a more elegant way.
ModuleB.py:
import logging
import ModuleA
logger = logging.getLogger('B')
ModuleA.logger = logger
print ModuleA.utility_func()
Put this in a separate file (like utils.py)
def utility_func(logger):
    logger.info('hi')
Then in file ModuleA.py
import logging
import utils
logger = logging.getLogger(__name__)
print utils.utility_func(logger)
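And ModuleB.py can call the same helper with its own logger (a sketch of the same idea):

# ModuleB.py
import logging
import utils

logger = logging.getLogger('B')

utils.utility_func(logger)   # same helper, but the record goes to the 'B' logger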