How to use Python logging for a single package

I am developing a package and using logging for my debug/info printing during development. Is there a good way to enable logging for just my package without enabling it for everything below root?
Say I have my_package:
# Some package from elsewhere that I need but don't want to see logging from
import other_package
import logging
from logging import NullHandler
logger = logging.getLogger(__name__)
logger.addHandler(NullHandler())
def my_func():
    logger.debug("a message")
and a main function to use the package:
import my_package
# Some package from elsewhere that I need but don't want to see logging from
import another_package
import logging
logging.basicConfig(level=logging.DEBUG)
my_package.my_func()
This setup will let me see my_func()'s call to logger.debug(), but it will also show any logger.debug() calls from other_package and another_package, which I don't want to see. How can I set things where I only see the logging from my_package?
I could do hacky things like hard-coding logging.propagate = False for each other package, but it feels like there should be a better way.

You've already defined a unique logger for your package; now you just need to configure it. Since the package logger inherits from the root logger, anything you specify with basicConfig applies to it as well, so perform the logger-specific override after calling basicConfig.
import logging
logging.basicConfig(level=logging.WARNING)
logging.getLogger('my.package').setLevel(logging.DEBUG)
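A minimal sketch of the combined effect; "my_package" and "other_package" are placeholder names standing in for your package and the noisy third-party ones:

```python
import logging

# Root logger at WARNING silences DEBUG/INFO from everything by default;
# the per-logger override then re-enables DEBUG for 'my_package' only.
logging.basicConfig(level=logging.WARNING)
logging.getLogger("my_package").setLevel(logging.DEBUG)

# Effective levels confirm the override applies only to my_package:
print(logging.getLogger("my_package").getEffectiveLevel())     # 10 (DEBUG)
print(logging.getLogger("other_package").getEffectiveLevel())  # 30 (WARNING)
```

Child loggers such as "my_package.submodule" inherit the DEBUG level too, so a single override covers the whole package.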

Related

Python Logging from Library

Hi this is hopefully a basic question. I have a python library with a lot of logger messages such as this:
log = logging.getLogger(__name__)
log.info("blah")
log.warning("blah")
...
Then I have a separate code that imports and runs this library. In that code, I added this, thinking it would cause all logging messages to go to this file:
log = logging.getLogger(__name__)
fh = logging.FileHandler("/some/file/location/log.txt")
log.addHandler(fh)
This does successfully pass all log messages in that script to direct to that file, but the logging messages from the library aren't being passed along. I don't want to specify the file path from within the library, that doesn't make much sense, I want it specified in the code that runs the library. Most of the examples I'm seeing show imports happening with parent/child modules, but what about one module that calls a completely different module? Does my library need to accept a logger as an argument to use, or can I use the logging module to handle this?
Thanks.
Looks like you are creating two instances of the Logger class. Only the instance in your script is being configured to write to the file location.
Each time the 'getLogger' method is called, it provides a reference to the Logger with the specified name. If a Logger with that name doesn't exist, a new one is created. Note that in Python, __name__ specifies the module name. Since you are calling the library from a script, I'd assume you have two separate modules, hence two different Loggers.
For a quick-and-dirty approach, you can use:
import logging
import my_library

log = logging.getLogger(my_library.__name__)
fh = logging.FileHandler("/some/file/location/log.txt")
log.addHandler(fh)
where my_library is your library. This retrieves the logger the library instantiated, instead of creating a new one.
Another approach would be to define a module-level function like this:
# In your script
import my_library
log_location = "/some/file/location/log.txt"
my_library.set_log_location(log_location)
# In your newly defined library
import logging

def set_log_location(path):
    log = logging.getLogger(__name__)
    fh = logging.FileHandler(path)  # use the path argument, not a hardcoded one
    log.addHandler(fh)
The second approach wouldn't require the user knowing that your library uses the logging module.
In your application, e.g. in the if __name__ == '__main__' clause, configure the root logger with the handlers you want, using e.g.
logging.basicConfig(level=logging.DEBUG,
                    filename='/some/file/location/log.txt',
                    filemode='w',
                    format='%(asctime)s %(message)s')
and then all logging from your application, your libraries and third-party libraries should write log messages to the file. The sources of logged events (application or libraries) don't need to know or care where the events they log end up - that's taken care of by the configuration. If you need more involved configuration than basicConfig() provides, you can use logging's dictConfig() API to configure logging.
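For the dictConfig() route, a minimal sketch might look like this (the handler name and the temp-file path are illustrative, not from the question):

```python
import logging.config
import os
import tempfile

# The log file path is invented for this sketch; substitute your own.
log_path = os.path.join(tempfile.gettempdir(), "app-log.txt")

LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "plain": {"format": "%(asctime)s %(name)s %(levelname)s %(message)s"},
    },
    "handlers": {
        "file": {
            "class": "logging.FileHandler",
            "filename": log_path,
            "mode": "w",
            "formatter": "plain",
        },
    },
    # One root-level handler: events from the application, your libraries
    # and third-party libraries all propagate up to it.
    "root": {"level": "DEBUG", "handlers": ["file"]},
}

logging.config.dictConfig(LOGGING_CONFIG)
logging.getLogger("my_library").info("configured centrally")
```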

Initialise logging config from conf file in Python

I want to initialise the logging config using a config file(json or yaml) only once when I call my main module.
Is there a concept of context in Python, like in Java, where I can get the logger from the config whenever I need it?
Something like this in the main module -
logging.config.fileConfig('log-conf.json')
I want to use the loaded config in my entire application without having to load the config in each module.
Also, should I do log = logging.getLogger(__name__) at module level or at method level? What is the advantage of doing it at method level?
This blog post of mine answers most of your question (using YAML):
http://glenfant.github.io/the-zen-of-logging-and-yaml.html
You might also take inspiration from this recipe, if you prefer a pure-Python config file:
http://glenfant.github.io/simple-customizable-configuration.html
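One detail worth knowing: logging.config.fileConfig() only reads configparser-style .ini files, so for a JSON config you parse the file yourself and hand the resulting dict to dictConfig(). A self-contained sketch (the file path and config contents are invented for illustration):

```python
import json
import logging
import logging.config
import os
import tempfile

# A hypothetical log-conf.json, written out here only to keep the sketch
# self-contained; in a real project the file ships with your application.
conf_path = os.path.join(tempfile.gettempdir(), "log-conf.json")
config = {
    "version": 1,
    "disable_existing_loggers": False,
    "handlers": {"console": {"class": "logging.StreamHandler"}},
    "root": {"level": "DEBUG", "handlers": ["console"]},
}
with open(conf_path, "w") as f:
    json.dump(config, f)

# Load the config exactly once, at application startup.
with open(conf_path) as f:
    logging.config.dictConfig(json.load(f))

# Any other module just asks for a logger by name; the configuration is
# process-wide state inside the logging module, so nothing is re-loaded.
log = logging.getLogger("some.module")
```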

How to use logger with one basic config across app in Python

I want to improve my understanding of how to use logging correctly in Python. I want to use .ini file to configure it and what I want to do:
define basic logger config through .fileConfig(...) in some .py file
call logger = logging.getLogger(__name__) across the app and be sure it uses the config file I loaded earlier in a different .py file
I have read a few resources on the Internet, but they describe configuration tricks; what I want to understand is whether .fileConfig works across the whole app or only in the file/module where it was called.
It looks like I'm missing some small detail.
It works across the whole app. Be sure to configure the correct loggers in the config. logger = logging.getLogger(__name__) works well if you know how to handle having a different logger in every module, otherwise you might be happier just calling logger = logging.getLogger("mylogger") which always gives you the same logger. If you only configure the root logger you might even skip that and simply use logging.info("message") directly.
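A minimal sketch of the .ini route (the file contents here are illustrative, and the file is written to a temp path only so the sketch runs standalone); the key point is that fileConfig() is called once, in the entry-point module, and every other module just asks for a logger:

```python
import logging
import logging.config
import os
import tempfile
import textwrap

# Illustrative logging.ini; normally this file lives in your project.
ini_path = os.path.join(tempfile.gettempdir(), "logging.ini")
with open(ini_path, "w") as f:
    f.write(textwrap.dedent("""
        [loggers]
        keys=root

        [handlers]
        keys=console

        [formatters]
        keys=plain

        [logger_root]
        level=DEBUG
        handlers=console

        [handler_console]
        class=StreamHandler
        level=DEBUG
        formatter=plain
        args=(sys.stderr,)

        [formatter_plain]
        format=%(levelname)s %(name)s: %(message)s
    """))

# Called once, in the entry-point module; disable_existing_loggers=False
# keeps loggers that other modules created at import time working.
logging.config.fileConfig(ini_path, disable_existing_loggers=False)

# Every other module just does this; no config loading needed there.
log = logging.getLogger("app.submodule")
log.debug("reaches the console handler via the root logger")
```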

How to see the debug messages included in third-party modules of flask?

I would like to see the debug messages for the libraries/modules that I'm using to be displayed along with the debug messages that are present in my code:
app.logger.debug("Print this")
How do I go about it?
You could probably use logging.basicConfig, but the recommended way is to handle loggers individually. Here is an example from the official documentation:
from logging import getLogger

# mail_handler and file_handler are assumed to be configured earlier,
# as in the Flask documentation this example comes from.
loggers = [app.logger, getLogger('sqlalchemy'),
           getLogger('otherlibrary')]
for logger in loggers:
    logger.addHandler(mail_handler)
    logger.addHandler(file_handler)
You may also find this interesting: How to configure all loggers in an application
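If you only need to flip levels rather than attach mail/file handlers, the same per-logger idea looks like this ("my_app", "sqlalchemy" and "werkzeug" are example names, not from the question):

```python
import logging

# Global default: WARNING keeps most libraries quiet.
logging.basicConfig(level=logging.WARNING)

# Re-enable DEBUG only for the loggers you care about.
for name in ["my_app", "sqlalchemy", "werkzeug"]:
    logging.getLogger(name).setLevel(logging.DEBUG)
```

With Flask specifically, app.logger is a standard logging.Logger, so it can go through the same loop as the third-party loggers.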

Python logging with a library (namespaced packages)

My project consists of a number of namespaced packages and I want to set up logging properly for them: they are meant to be used as a library by other "frontends".
Suppose I have the case, for package foo.xyz:
foo/
    __init__.py
    xyz/
        __init__.py
        bar.py
        baz.py
My idea would be to retain information from where the log is being generated, so for example in bar.py
import logging
log = logging.getLogger(__name__)
log.addHandler(logging.NullHandler()) # Python 2.7
log.setLevel(....)
However I'm not sure how to call this from the frontend (which imports several bits from different packages) to display everything without hassle. For example, I'm using foo.abc and foo.xyz, set up like above for logging. I would like to use propagation, but currently this doesn't work:
import logging

from foo.xyz import bar
from foo.abc import baz

log = logging.getLogger()
log.addHandler(logging.StreamHandler())
log.setLevel(logging.DEBUG)
do_my_stuff()
However, no output is being generated from the library's loggers. What am I doing wrong?
EDIT: So far I can get output if I get the logger corresponding to the parent module's namespace:
log = logging.getLogger("foo.xyz")
However I'm trying to grab everything in one call: I wonder if I can do that since, as I wrote earlier, this set of packages uses a namespace.
You don't need to add a NullHandler to all sub-packages of foo - you can just add the NullHandler to the foo logger, which could be done in foo/__init__.py.
The NullHandler is only added to library code to handle situations where the library is used but logging isn't configured by the using application. Thomas Vander Stichele is wrong to state that adding a NullHandler would cause messages to be dropped.
In your application, you can configure logging as you wish - whatever handlers, levels etc. Level setting should not (in general) be done in the modules themselves, but in some central place, typically called similarly to this:
if __name__ == '__main__':
    configure_logging()  # whatever configuration you need to do
    main()
This allows REPL usage without logging being printed (other than WARNING or above), while logging occurs if the application is run.
What happens if you remove addHandler and setLevel from bar.py? I don't see why you would want your package to actually do the logging instead of just generating log messages, and the NullHandler sounds like it would drop your messages altogether.
