I'm trying to import from the following module:
https://github.com/dmlc/gluon-cv/blob/master/gluoncv/torch/data/gluoncv_motion_dataset/dataset.py
However this includes the lines:
logging.basicConfig(level=logging.INFO)
log = logging.getLogger()
These calls mess up the logging settings I'm trying to apply in my main file. How can I import from this module but override those log settings?
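One approach that generally works is to reconfigure the root logger after the import. This is only a sketch: the import path is inferred from the URL above and assumed to be importable. On Python 3.8+, logging.basicConfig(force=True) removes whatever handlers the imported module installed and applies your own settings; on older versions you can clear the root logger's handlers yourself.
import logging
# Importing the module runs its logging.basicConfig(level=logging.INFO)
from gluoncv.torch.data.gluoncv_motion_dataset import dataset  # assumed import path
# Python 3.8+: force=True discards any handlers the import already installed
logging.basicConfig(level=logging.WARNING, force=True)
# On older Pythons, remove the root logger's handlers before reconfiguring:
# for handler in logging.root.handlers[:]:
#     logging.root.removeHandler(handler)
# logging.basicConfig(level=logging.WARNING)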
I'm trying to establish logging in all modules I'm using. My project structure is
# driver.py
import logging
logger = logging.getLogger(__name__)
class driver:
    ....
# driver_wrapper.py
import logging
from driver import driver
device = driver(...)
def driver_func():
    logging.info("...")
    ....
# main.py
import logging
import sys
import driver_wrapper
logging.basicConfig(stream=sys.stdout, level=logging.WARNING)
driver_wrapper.driver_func()
My problem now is that I still get INFO-level messages, and the output is prefixed with 'INFO:root'. But I would expect the module name instead of root.
Is there a way to set the logging level in main.py for all modules, or is what I am doing already correct? There are a lot of posts about this problem, but the solutions don't seem to work for me.
All your modules that use logging should have the logger = logging.getLogger(__name__) line, and thereafter you should always log via logger.info(...) and never call logging.info(...) directly. The latter logs to the root logger, not the module's logger. That "all your modules" includes driver_wrapper.py in your example.
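For example, driver_wrapper.py from the question might look like this after the change (a sketch; the module-level logger replaces the direct logging.info call):
# driver_wrapper.py
import logging
from driver import driver
logger = logging.getLogger(__name__)  # named after the module, e.g. 'driver_wrapper'
device = driver(...)
def driver_func():
    logger.info("...")  # goes through the module's logger, so the name shows up in the output
With that in place, the basicConfig(...) call in main.py controls the level and format for every module's logger, and messages are attributed to the module name instead of root.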
I'm trying to enable logging to stdout for requests_oauthlib. The example in the docs suggests this:
# Uncomment for detailed oauthlib logs
#import logging
#import sys
#log = logging.getLogger('oauthlib')
#log.addHandler(logging.StreamHandler(sys.stdout))
#log.setLevel(logging.DEBUG)
But it doesn't seem to have any effect. What's the proper way to do it?
The logger name should be requests_oauthlib, i.e. the package name. The modules in the package define loggers like this
logger = logging.getLogger(__name__)
so configuring the package-level logger as described in the example should work:
import logging
import sys
log = logging.getLogger('requests_oauthlib')
log.addHandler(logging.StreamHandler(sys.stdout))
log.setLevel(logging.DEBUG)
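If you also want the logs from the underlying oauthlib package, which is a separate package from requests_oauthlib (an assumption about what you want to see, not something the example requires), the same pattern can be applied to both loggers:
import logging
import sys
# Attach a stdout handler to both package-level loggers; module loggers inside each
# package propagate their records up to these.
for name in ('requests_oauthlib', 'oauthlib'):
    log = logging.getLogger(name)
    log.addHandler(logging.StreamHandler(sys.stdout))
    log.setLevel(logging.DEBUG)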
I created a Flask app. All Flask code is inside api.py. This app uses other files, for example utils.py. This file contains functions that are called from api.py.
Inside api.py I am using app.logger for logging, like
app.logger.debug('HI')
This log is displayed in the console.
but in utils.py I am using:
import logging
logger = logging.getLogger('utils')
...
logger.debug('SOME MESSAGE')
But nothing is displayed in the console.
One awful, awful, awful hack that I am using now is importing app from api.py:
from . import api
api.app.logger.debug('SOME MESSAGE')
And this message is displayed in the console. But I know I am doing something wrong here. Is there a better way?
Flask uses the global app object to store this stuff. What you most likely want to do is
from flask import current_app
current_app.logger.debug('hi')
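So in utils.py you could do something like the following. This is a sketch (the function name some_util is made up), and it assumes the function is only called while a request or application context is active, since current_app is not available outside one:
# utils.py
from flask import current_app
def some_util():
    current_app.logger.debug('SOME MESSAGE')  # logs through the Flask app's own logger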
Alternatively, you could configure the default (root) logger to log wherever you want using standard Python logging methods, the same way Flask configures its own logger.
# app_logger.py
import logging
import logging.config
logging.config.fileConfig('logging_config.ini')
logger = logging.getLogger()
Then in other files...
# utils.py
from app_logger import logger
logger.debug('hi')
This second way is how I set up logging for my Flask apps: create a global logger object and import it everywhere I want to log. That makes it easy to find the one place to change the config when I want to log to stdout, a file, or anything else.
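For reference, a minimal logging_config.ini for the setup above might look like this. It is only a sketch in the standard fileConfig format (a root logger with a single console handler); adjust handlers and levels to taste:
[loggers]
keys=root
[handlers]
keys=consoleHandler
[formatters]
keys=simpleFormatter
[logger_root]
level=DEBUG
handlers=consoleHandler
[handler_consoleHandler]
class=StreamHandler
level=DEBUG
formatter=simpleFormatter
args=(sys.stdout,)
[formatter_simpleFormatter]
format=%(asctime)s %(name)s %(levelname)s %(message)s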
For example, I have a script test1.py with code like this:
import logging
from logging.handlers import RotatingFileHandler
import some_module
handler = RotatingFileHandler('TEST1.log', maxBytes=18000, backupCount=7)
logger = logging.getLogger(__name__)
logger.addHandler(handler)
logging.getLogger("some_module").addHandler(handler)
def do_something():
    some_module.do_smth()
do_something()
And I have another script test2.py with code like this:
import logging
from logging.handlers import RotatingFileHandler
import some_module
handler = RotatingFileHandler('TEST2.log', maxBytes=18000, backupCount=7)
logger = logging.getLogger(__name__)
logger.addHandler(handler)
logging.getLogger("some_module").addHandler(handler)
def do_something():
    some_module.do_smth_else()
do_something()
Then I import both scripts in a file app.py, which may call either of them depending on some condition.
The problem is that all log messages for the module some_module from script test1.py are written to both log files, TEST1.log and TEST2.log.
As I understand it, this is because of the singleton pattern: the logging module is effectively global for all scripts running in the same process. So when I import test1.py into app.py it adds a handler for some_module, and when I then import test2.py it adds another handler for some_module, so that logger ends up with two handlers.
Is there a way to add handlers for this module separately, so that all debug messages produced while test1.py is running are written to TEST1.log but not to TEST2.log?
UPDATE:
In my case I am trying to do it with this module, and it doesn't seem to work:
logging.getLogger("TeleBot.test1").setLevel(logging.DEBUG)
logging.getLogger("TeleBot.test1").addHandler(handler)
Nothing is written to my log file. But if I simply do:
logging.getLogger("TeleBot").setLevel(logging.DEBUG)
logging.getLogger("TeleBot").addHandler(handler)
it works, but, as I mentioned in the question, it writes debug messages to all the files.
So, is it a bug in this particular module?
Doing logging.getLogger("some_module") in both files returns the same Logger object as you have already observed.
To get a separate Logger in each file simply provide a different name in getLogger() each time.
E.g. in test1.py
logging.getLogger("some_module.test1").addHandler(handler)
and in test2.py
logging.getLogger("some_module.test2").addHandler(handler)
I have a module which should do some logging:
import logging
logging.basicConfig(filename='example.log',level=logging.DEBUG)
def do_something():
logging.info("I did something")
Now if I use the module (let's call it module.py), it does not do any logging:
import module
module.do_something()
Not even a logfile is created! Where is the bug?
Sometimes you have to specify the full path of the log file. Try that. For example:
import logging
logging.basicConfig(filename='C:/workspace/logging_proj/src/example.log',level=logging.DEBUG)
or you can have Python do it for you:
import os
import logging
LOG_FILENAME = os.path.join(os.path.dirname(__file__), 'example.log')
logging.basicConfig(filename=LOG_FILENAME,level=logging.DEBUG)
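The reason the full path matters is that a relative filename passed to basicConfig is resolved against the current working directory of the process, not the module's own directory. A quick way to see where a relative filename would actually end up (a small sketch, just printing the resolved path) is:
import os
# A relative filename in basicConfig is resolved against the current working directory
print(os.path.abspath('example.log'))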