Resetting file handler name in python logging - python

I have a code structure that looks like this:
.
├── scripts
│   └── test.py
└── utils
    ├── __init__.py
    └── lib.py
In my utils/__init__.py file I've set up logging like so:
import logging
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
ch = logging.FileHandler("logfile", mode="w")
ch.setLevel(logging.DEBUG)
formatter = logging.Formatter(
    "%(levelname)s:%(name)s:[%(funcName)s]:%(message)s", "%m/%d/%Y %I:%M:%S %p"
)
ch.setFormatter(formatter)
logger.addHandler(ch)
This works, and if I log in lib.py using code like so
LOGGER = logging.getLogger(__name__)

def a_function():
    LOGGER.info('This works')
then everything runs and gets logged.
However, I often find myself running different scripts in the same environment, and I'd like the log to be named differently for each run (I don't want the runs overwriting or appending to the same log). What is the best way to change the log file name when logging is set up like this? I've tried things like the following in scripts/test.py:
import logging
from utils.lib import a_function
def main():
    # rename handler
    fh = logging.FileHandler("logfile2", "w")
    formatter = logging.Formatter(
        "%(levelname)s:%(name)s:[%(funcName)s]:%(message)s", "%m/%d/%Y %I:%M:%S %p"
    )
    fh.setFormatter(formatter)
    log = logging.getLogger()
    log.handlers.clear()
    log.addHandler(fh)
    a_function()

if __name__ == '__main__':
    main()
This works in that it adds a new log called logfile2, but I still get logging in the original logfile. I would have thought that log.handlers.clear() would have cleared that out.
What am I doing wrong here?

Related

Getting Logging Working in a Python Click Application

Here is the current tree of my code:
Click_test/
├── cli.py
├── cool_stuff
│   ├── commands.py
│   └── __init__.py
└── __init__.py
Parent CLI File looks like this:
import click
import cool_stuff.commands as cool_stuff

@click.group()
@click.argument("secret", case_sensitive=False)
@click.pass_context
def cli(ctx, secret):
    ctx.obj['secret'] = secret

if __name__ == '__main__':
    cli.add_command(cool_stuff.do_something)
    cli(obj={})
Commands.py file looks like this:
import click
from tqdm import tqdm
import logging

FORMAT = "%(asctime)s - %(levelname)s - %(message)s"
logging.basicConfig(format=FORMAT, filename="a_log.log", filemode="w", level=logging.DEBUG)

@click.command("do_something")
@click.pass_context
def do_something(ctx):
    click.echo(f'Doing some cool stuff...')
    for number in tqdm(range(100), desc="Do some cool things"):
        logging.debug(f"Secret is {ctx.obj['secret']} {number}")
For some reason, when I run it in this format, the logger doesn't work properly. There is no "a_log.log" file while this runs or after. I even removed the filename to have it print to stdout and it still wouldn't.
When I don't use the .group() construct and just have it as a single command, there's no problem. Has anyone experienced this before? The intention of this layout is to avoid one giant file full of commands. Another option would be to eliminate the group portion, if that still lets me break the commands out into separate files by category.

How to prevent logging to file and capture logs for assertions using Pytest

I have a suite of unit tests for a python module and use Pytest to run those tests. I am now starting to use the standard logging library in my library and would need help solving two issues:
How to prevent logs into files when running the test suite, to prevent files growing and to prevent unhelpful entries in real log files.
How to capture the logs inside unit tests, to be able to run assertions on the logs that the library generates.
The module that I am trying to test configures the logging library in __init__.py to log into a file, and logs entries in the module code using the info method. That works fine: when I run the code, the right log entries appear in the file (see code below).
I have tried to use the caplog fixture in pytest (see code and output below), but the effect I get is:
log entries are still written to the file (both from the test using caplog and from all other tests)
caplog.text is empty
Unit Tests Code
import library.module

class TestFunction:
    def test_something_else(self):
        library.module.function()
        assert True

    def test_logs(self, caplog):
        library.module.function()
        assert "desired" in caplog.text
Test Output
(...)
> assert "desired" in caplog.text
E AssertionError: assert 'desired' in ''
E + where '' = <_pytest.logging.LogCaptureFixture object at (...).text
(...)
Log entries after running test suite
2021-12-07 11:10:05,915 - library.module - INFO - desired
2021-12-07 11:10:05,917 - library.module - INFO - desired
Logging module configuration
__init__.py
import logging.config
import yaml

with open("logging.yaml") as f:
    conf_dict = yaml.safe_load(f)
logging.config.dictConfig(conf_dict)
logging.yaml
version: 1
formatters:
  simple:
    format: '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
handlers:
  training:
    class: logging.FileHandler
    level: DEBUG
    formatter: simple
    filename: logs/test_log
loggers:
  library.module:
    level: DEBUG
    handlers: [training]
    propagate: no
root:
  level: DEBUG
  handlers: []
Module under test
import logging

logger = logging.getLogger(__name__)

def function():
    logger.info("desired")
File Structure
.
├── library
│   ├── module.py
│   └── __init__.py
├── tests
│   └── test_module.py
└── logs
    └── test_log
To avoid writing to the log file, I suggest that, in test_module.py, you simply mock the logger (using the mocker fixture from the pytest-mock plugin) and use it in your test, like this:
import pytest
import library.module

@pytest.fixture
def logger(mocker):
    return mocker.patch("library.module.logger.info")

class TestFunction:
    def test_something_else(self):
        library.module.function()
        assert True

    def test_logs(self, logger):
        library.module.function()
        logger.assert_called_with("desired")
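A complementary sketch of why caplog came back empty in the first place: pytest's caplog captures records through a handler it installs on the root logger, and the `propagate: no` entry in logging.yaml stops records from library.module from ever reaching the root logger. The standard-library-only demonstration below mimics that setup (the ListHandler stands in for pytest's capture handler):

```python
import logging

class ListHandler(logging.Handler):
    """Stand-in for the capture handler pytest attaches to the root logger."""
    def __init__(self):
        super().__init__()
        self.messages = []

    def emit(self, record):
        self.messages.append(record.getMessage())

capture = ListHandler()
root = logging.getLogger()
root.addHandler(capture)
root.setLevel(logging.DEBUG)

mod_logger = logging.getLogger("library.module")
mod_logger.setLevel(logging.DEBUG)

mod_logger.propagate = False       # mirrors `propagate: no` in logging.yaml
mod_logger.info("desired")         # never reaches the root-level handler
assert capture.messages == []

mod_logger.propagate = True        # mirrors `propagate: yes`
mod_logger.info("desired")
assert capture.messages == ["desired"]
```

So an alternative to mocking is to set `propagate: yes` for library.module in the config (or re-enable propagation inside the test), at the cost of records also reaching any root-level handlers.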

How to set the log level for an imported module?

Write your code with a nice logger
import logging
import os

def init_logging():
    logFormatter = logging.Formatter("[%(asctime)s] %(levelname)s::%(module)s::%(funcName)s() %(message)s")
    rootLogger = logging.getLogger()

    LOG_DIR = os.getcwd() + '/' + 'logs'
    if not os.path.exists(LOG_DIR):
        os.makedirs(LOG_DIR)
    fileHandler = logging.FileHandler("{0}/{1}.log".format(LOG_DIR, "g2"))
    fileHandler.setFormatter(logFormatter)
    rootLogger.addHandler(fileHandler)
    rootLogger.setLevel(logging.DEBUG)

    consoleHandler = logging.StreamHandler()
    consoleHandler.setFormatter(logFormatter)
    rootLogger.addHandler(consoleHandler)
    return rootLogger

logger = init_logging()
works as expected. Logging using logger.debug("Hello! :)") logs to file and console.
In a second step you want to import an external module which is also logging using logging module:
Install it using pip3 install pymisp (or any other external module)
Import it using from pymisp import PyMISP (or any other external module)
Create an object of it using self.pymisp = PyMISP(self.ds_model.api_url, self.ds_model.api_key, False, 'json') (or any other...)
What happens now is that every debug log line from the imported module is logged to the log file and the console. The question is how to set a different (higher) log level for the imported module.
As Meet Sinoja and anishtain4 pointed out in the comments, the best and most generic method is to retrieve the logger by the name of the imported module as follows:
import logging
import some_module_with_logging
logging.getLogger("some_module_with_logging").setLevel(logging.WARNING)
Another option (though not recommended if the generic method above works) is to extract the module's logger variable and customize it to your needs. Most third-party modules store it in a module-level variable called logger or _log. In your case:
import logging
import pymisp
pymisp.logger.setLevel(logging.INFO)
# code of module goes here
A colleague of mine helped with this question:
Get a named logger: yourLogger = logging.getLogger('your_logger')
Add a filter to each handler to prevent it from printing/saving logs other than yours:
for handler in logging.root.handlers:
    handler.addFilter(logging.Filter('your_logger'))
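A runnable sketch of the filter approach above (logger names are illustrative): logging.Filter("your_logger") passes records from that logger and its children, and drops everything else that reaches the handler.

```python
import logging

records = []

class ListHandler(logging.Handler):
    """Collects logger names of records that pass the handler's filters."""
    def emit(self, record):
        records.append(record.name)

handler = ListHandler()
handler.addFilter(logging.Filter("your_logger"))
root = logging.getLogger()
root.addHandler(handler)
root.setLevel(logging.DEBUG)

logging.getLogger("your_logger").warning("kept")
logging.getLogger("your_logger.child").warning("kept too")
logging.getLogger("pymisp").warning("dropped by the filter")

assert records == ["your_logger", "your_logger.child"]
```

Note this silences third-party loggers only on the handlers you attach the filter to; the third-party records are still created, just never emitted there.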

Python How can I set up global logger setting in multi python files?

I just started with Python and I am struggling to use the logger. I have two Python files: app.py and liba.py. I want to set up logging in app.py and use it from liba.py (and other libraries). Do you have any good ideas, or can you share any references?
file structure
entry.py
lib/liba.py
app.py
#! /usr/bin/env python3
import logging
logger = logging.getLogger(__name__)
from lib import liba
handler = logging.FileHandler('/tmp/app.log', 'a+')
logger.addHandler(handler)
logger.warn('sample')
liba.out()
lib/liba.py
#! /usr/bin/env python3
import logging

logger = logging.getLogger(__name__)

def out():
    logger.warn('liba')
run python
$ python3 app.py
liba
app.py output log to the logfile. liba.py does not output the log into the file. I want to save logs in the same file.
Do like so:
app.py
#! /usr/bin/env python3
import logging
logger = logging.getLogger()
handler = logging.FileHandler('/tmp/app.log', 'a+')
logger.addHandler(handler)
logger.warn('sample')
from lib import liba
liba.out()
lib/liba.py
#! /usr/bin/env python3
import logging
def out():
    logging.warn('liba')
You don't need to instantiate a logger in your modules unless you want to configure handlers, which you only do in your main script. All logging then goes to the root logger, which is what you get when you call logging.getLogger() with no name. I like to use it this way because you don't need to match names across all your modules for it to work. In your modules you just send log messages out with logging.warn('blabla'). You do need to make sure all your handlers are defined before any call to logging.warn is made, otherwise a default handler will take their place.
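A small, self-contained sketch of the pattern described above (file path made temporary here so the example is runnable anywhere): handlers are configured once on the root logger, and any logger, named or not, propagates to them, which is why liba's messages end up in the same file.

```python
import logging
import os
import tempfile

# Stand-in for /tmp/app.log so the example runs anywhere.
log_path = os.path.join(tempfile.mkdtemp(), "app.log")

# Configure handlers once, on the root logger, before any logging calls.
root = logging.getLogger()
root.addHandler(logging.FileHandler(log_path, "a+"))
root.setLevel(logging.WARNING)

# Stands in for liba.out(); a named logger propagates to the root handler.
logging.getLogger("lib.liba").warning("liba")

for h in root.handlers:
    h.flush()
with open(log_path) as f:
    contents = f.read()
assert "liba" in contents
```

This also means modules are free to use logging.getLogger(__name__) instead of the bare logging.warn(...) calls shown above; propagation delivers their records to the root handlers either way.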

Python global common path for logger output

I am dealing with a path problem in the usage of a logger. I've created a logger class located in the file logger.py, and all I want is to use that logger across my project. Here is the structure:
project/
    /docs
    /log
    /dir1
        /dir2
            __init__.py
            a.py
            ...
        __init__.py
        b.py
        logger.py
        ...
    setup.py
Well, the problem is when I want to use the logger in files a.py and b.py: I am unable to set the logger output path properly. I found out that using relative paths is not the right way. How can I force the logger to put the log files a.log and b.log into the /log folder? Inside the logger.py file I am using logging.FileHandler(). Thanks in advance.
UPDATE:
logger.py:
import logging

class Logger:
    LEVEL = logging.DEBUG

    def __init__(self, name, logger_path, level=None):
        self.logger = logging.getLogger(name)
        formatter = logging.Formatter('%(asctime)s - %(name)s.%(lineno)d - [%(levelname)s] - %(message)s')
        file_handler = logging.FileHandler("{0}/{1}.log".format(logger_path, name))
        file_handler.setFormatter(formatter)
        self.logger.addHandler(file_handler)
        if level:
            self.logger.setLevel(level)
        else:
            self.logger.setLevel(self.LEVEL)
Well I passed the relative path from a.py and it worked while I was running the script from the directory /dir2. But it doesn't work elsewhere in the project...
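One common fix, sketched below (make_logger, PROJECT_ROOT and LOG_DIR are illustrative names): anchor the log directory to logger.py's own location via __file__ instead of the current working directory, so the resulting path no longer depends on where the script was launched from. Adjust the number of .parent hops to match where logger.py actually sits relative to the /log folder.

```python
import logging
from pathlib import Path

# Resolve paths relative to this file, not the current working directory.
# (Assumes the log folder is a sibling of this file; adjust .parent hops.)
PROJECT_ROOT = Path(__file__).resolve().parent
LOG_DIR = PROJECT_ROOT / "log"
LOG_DIR.mkdir(exist_ok=True)

def make_logger(name, level=logging.DEBUG):
    """Create a logger writing to <LOG_DIR>/<name>.log regardless of cwd."""
    logger = logging.getLogger(name)
    handler = logging.FileHandler(LOG_DIR / f"{name}.log")
    handler.setFormatter(logging.Formatter(
        "%(asctime)s - %(name)s.%(lineno)d - [%(levelname)s] - %(message)s"))
    logger.addHandler(handler)
    logger.setLevel(level)
    return logger
```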
