Python global common path for logger output

I am dealing with a path problem when using a logger. I've created a logger class, located in the file logger.py, and all I want to do is use the logger across my project. Here is the structure:
project/
    /docs
    /log
    /dir1
    /dir2
        __init__.py
        a.py
        ...
    __init__.py
    b.py
    logger.py
    ...
setup.py
The problem is that when I want to use the logger in files a.py and b.py, I am unable to set the logger output path properly. I found out that using relative paths is not the right way. How can I force the logger to put the log files a.log and b.log into the /log folder? Inside the logger.py file I am using logging.FileHandler(). Thanks in advance.
UPDATE:
logger.py:
import logging

class Logger:
    LEVEL = logging.DEBUG

    def __init__(self, name, logger_path, level=None):
        self.logger = logging.getLogger(name)
        formatter = logging.Formatter('%(asctime)s - %(name)s.%(lineno)d - [%(levelname)s] - %(message)s')
        file_handler = logging.FileHandler("{0}/{1}.log".format(logger_path, name))
        file_handler.setFormatter(formatter)
        self.logger.addHandler(file_handler)
        if level:
            self.logger.setLevel(level)
        else:
            self.logger.setLevel(self.LEVEL)
I passed a relative path from a.py and it worked while I was running the script from the directory /dir2, but it doesn't work from anywhere else in the project...
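One way to make the path independent of the current working directory is to anchor it to the location of logger.py itself via `__file__`. A minimal sketch, assuming logger.py sits next to the /log folder at the project root (the names `PROJECT_ROOT`, `LOG_DIR`, and `make_logger` are illustrative, not part of the original code):

```python
import logging
import os

# Resolve the log directory relative to this file (logger.py), so the
# resulting path is the same no matter where the script is launched from.
PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))
LOG_DIR = os.path.join(PROJECT_ROOT, "log")

def make_logger(name, level=logging.DEBUG):
    os.makedirs(LOG_DIR, exist_ok=True)  # create project/log if it is missing
    logger = logging.getLogger(name)
    handler = logging.FileHandler(os.path.join(LOG_DIR, "{0}.log".format(name)))
    handler.setFormatter(logging.Formatter(
        '%(asctime)s - %(name)s.%(lineno)d - [%(levelname)s] - %(message)s'))
    logger.addHandler(handler)
    logger.setLevel(level)
    return logger
```

With this, `make_logger("a")` from a.py and `make_logger("b")` from b.py both write into the same /log folder regardless of the working directory.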

Related

Resetting file handler name in python logging

I have a code structure that looks like this:
.
├── scripts
│   └── test.py
└── utils
    ├── __init__.py
    └── lib.py
In my utils/__init__.py file I've set up logging like so:
import logging

logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
ch = logging.FileHandler("logfile", mode="w")
ch.setLevel(logging.DEBUG)
formatter = logging.Formatter(
    "%(levelname)s:%(name)s:[%(funcName)s]:%(message)s", "%m/%d/%Y %I:%M:%S %p"
)
ch.setFormatter(formatter)
logger.addHandler(ch)
This works, and if I log in lib.py using code like so
LOGGER = logging.getLogger(__name__)

def a_function():
    LOGGER.info('This works')
then everything runs and gets logged.
However, I often find myself wanting to run different scripts while in the same environment and I'd like to have a log that's named differently for each run (I don't want them writing over each other or appending to the same log). What is the best way to modify the logging name when logging is set up like this? I've tried doing things like the following in scripts/test.py:
import logging
from utils.lib import a_function

def main():
    # rename handler
    fh = logging.FileHandler("logfile2", "w")
    formatter = logging.Formatter(
        "%(levelname)s:%(name)s:[%(funcName)s]:%(message)s", "%m/%d/%Y %I:%M:%S %p"
    )
    fh.setFormatter(formatter)
    log = logging.getLogger()
    log.handlers.clear()
    log.addHandler(fh)
    a_function()

if __name__ == '__main__':
    main()
This works in adding a new log called logfile2, but I still get logging in the original logfile. I would have thought that the log.handlers.clear() command would have cleared that out.
What am I doing wrong here?

How to prevent logging to file and capture logs for assertions using Pytest

I have a suite of unit tests for a python module and use Pytest to run those tests. I am now starting to use the standard logging library in my library and would need help solving two issues:
How to prevent logs into files when running the test suite, to prevent files growing and to prevent unhelpful entries in real log files.
How to capture the logs inside unit tests, to be able to run assertions on the logs that the library generates.
The module that I am trying to test configures the logging library in __init__.py to log into a file, and logs entries in the module code using the info method. That works fine: when I run the code, the right log entries appear in the file (see code below).
I have tried to use the caplog fixture in pytest -- see code and output below -- but the effect I get is:
log entries are included into the file (logs generated in the runs from the test using caplog and all other tests)
caplog.text is empty
Unit Tests Code
import library.module

class TestFunction:
    def test_something_else(self):
        library.module.function()
        assert True

    def test_logs(self, caplog):
        library.module.function()
        assert "desired" in caplog.text
Test Output
(...)
> assert "desired" in caplog.text
E AssertionError: assert 'desired' in ''
E + where '' = <_pytest.logging.LogCaptureFixture object at (...).text
(...)
Log entries after running test suite
2021-12-07 11:10:05,915 - library.module - INFO - desired
2021-12-07 11:10:05,917 - library.module - INFO - desired
Logging module configuration
__init__.py
import logging.config
import yaml

with open("logging.yaml") as f:
    conf_dict = yaml.safe_load(f)
logging.config.dictConfig(conf_dict)
logging.yaml
version: 1
formatters:
  simple:
    format: '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
handlers:
  training:
    class: logging.FileHandler
    level: DEBUG
    formatter: simple
    filename: logs/test_log
loggers:
  library.module:
    level: DEBUG
    handlers: [training]
    propagate: no
root:
  level: DEBUG
  handlers: []
Module under test
import logging

logger = logging.getLogger(__name__)

def function():
    logger.info("desired")
File Structure
.
├── library
│   ├── module.py
│   └── __init__.py
├── tests
│   └── test_module.py
└── logs
    └── test_log
To avoid writing to the log file, I suggest that, in test_module.py, you simply mock the logger and use it in your test, like this (the mocker fixture comes from the pytest-mock plugin):
import pytest
import library.module

@pytest.fixture
def logger(mocker):
    return mocker.patch("library.module.logger.info")

class TestFunction:
    def test_something_else(self):
        library.module.function()
        assert True

    def test_logs(self, logger):
        library.module.function()
        logger.assert_called_with("desired")
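As a side note on why caplog.text was empty: pytest's caplog captures records through a handler installed on the root logger, so `propagate: no` in the YAML above stops records from ever reaching it. A small self-contained sketch of that propagation behavior (the `ListHandler` class and logger names are illustrative):

```python
import logging

records = []

class ListHandler(logging.Handler):
    # Collects formatted messages instead of writing them anywhere.
    def emit(self, record):
        records.append(record.getMessage())

root = logging.getLogger()
root.addHandler(ListHandler())  # plays the role of caplog's root handler

lib = logging.getLogger("library.module")
lib.setLevel(logging.DEBUG)

lib.propagate = False
lib.info("desired")  # not propagated: the root handler sees nothing

lib.propagate = True
lib.info("desired")  # propagated: the root handler records it

print(records)  # ['desired']
```

So an alternative to mocking is to set `propagate: yes` (or drop the key) in the test configuration, at the cost of records also flowing to any root handlers.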

How to set the log level for an imported module?

Write your code with a nice logger
import logging
import os

def init_logging():
    logFormatter = logging.Formatter("[%(asctime)s] %(levelname)s::%(module)s::%(funcName)s() %(message)s")
    rootLogger = logging.getLogger()
    LOG_DIR = os.getcwd() + '/' + 'logs'
    if not os.path.exists(LOG_DIR):
        os.makedirs(LOG_DIR)
    fileHandler = logging.FileHandler("{0}/{1}.log".format(LOG_DIR, "g2"))
    fileHandler.setFormatter(logFormatter)
    rootLogger.addHandler(fileHandler)
    rootLogger.setLevel(logging.DEBUG)
    consoleHandler = logging.StreamHandler()
    consoleHandler.setFormatter(logFormatter)
    rootLogger.addHandler(consoleHandler)
    return rootLogger

logger = init_logging()
works as expected. Logging with logger.debug("Hello! :)") goes to both file and console.
In a second step you import an external module which also logs using the logging module:
Install it using pip3 install pymisp (or any other external module)
Import it using from pymisp import PyMISP (or any other external module)
Create an object of it using self.pymisp = PyMISP(self.ds_model.api_url, self.ds_model.api_key, False, 'json') (or any other...)
What now happens is that every debug log line from the imported module gets logged to the log file and the console. The question is how to set a different (higher) log level for the imported module only.
As Meet Sinoja and anishtain4 pointed out in the comments, the best and most generic method is to retrieve the logger by the name of the imported module as follows:
import logging
import some_module_with_logging
logging.getLogger("some_module_with_logging").setLevel(logging.WARNING)
Another option (though not recommended if the generic method above works) is to extract the module's logger variable and customize it to your needs. Most third-party modules store it in a module-level variable called logger or _log. In your case:
import logging
import pymisp
pymisp.logger.setLevel(logging.INFO)
# code of module goes here
A colleague of mine helped with this question:
Get a named logger: yourLogger = logging.getLogger('your_logger')
Add a filter to each handler to prevent it from printing/saving logs other than yours:
for handler in logging.root.handlers:
    handler.addFilter(logging.Filter('your_logger'))
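A minimal runnable sketch of that filter approach (logger names and the `ListHandler` class are illustrative): `logging.Filter('your_logger')` passes only records emitted by that logger or its children, so other modules' records are dropped at the handler.

```python
import logging

captured = []

class ListHandler(logging.Handler):
    # Collects logger names of records that survive the filter.
    def emit(self, record):
        captured.append(record.name)

handler = ListHandler()
handler.addFilter(logging.Filter("your_logger"))  # keep only this logger's records
logging.getLogger().addHandler(handler)

logging.getLogger("your_logger").warning("kept")
logging.getLogger("your_logger.child").warning("kept too")  # children pass as well
logging.getLogger("noisy_module").warning("dropped")

print(captured)  # ['your_logger', 'your_logger.child']
```

Note the trade-off versus `logging.getLogger("module").setLevel(...)`: filtering silences everything that is not yours, rather than raising the threshold for one specific module.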

How can I set up global logger settings across multiple Python files?

I just started with Python and I am struggling to use the logger. I have two Python files: app.py and liba.py. I want to set up logging in app.py and use it from liba.py (and other libraries). Do you have any good ideas or references you can share?
file structure
entry.py
lib/liba.py
app.py
#! /usr/bin/env python3
import logging
logger = logging.getLogger(__name__)
from lib import liba
handler = logging.FileHandler('/tmp/app.log', 'a+')
logger.addHandler(handler)
logger.warn('sample')
liba.out()
lib/liba.py
#! /usr/bin/env python3
import logging

logger = logging.getLogger(__name__)

def out():
    logger.warn('liba')
run python
$ python3 app.py
liba
app.py writes its log line to the logfile, but liba.py does not write its line into the file. I want to save both logs in the same file.
Do it like so:
app.py
#! /usr/bin/env python3
import logging
logger = logging.getLogger()
handler = logging.FileHandler('/tmp/app.log', 'a+')
logger.addHandler(handler)
logger.warn('sample')
from lib import liba
liba.out()
lib/liba.py
#! /usr/bin/env python3
import logging

def out():
    logging.warn('liba')
You don't need to instantiate a logger unless you want to configure handlers, which you only do in your main script. All logging then goes to the root logger, which is what you get when calling logging.getLogger() with no name. I like to use it this way because you don't need to match names across all your modules for it to work: in your modules you just emit log messages with logging.warn('blabla'). You further need to make sure you define all your handlers before any call to logging.warn is made, otherwise a default handler will take their place.
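The last caveat can be seen directly: the first module-level call such as `logging.warning(...)` on an unconfigured root logger triggers an implicit basicConfig(), which installs a default StreamHandler. A short sketch:

```python
import logging

# Logging before any handler is configured makes the logging module run an
# implicit basicConfig(), attaching a default StreamHandler to the root logger.
logging.warning("too early")

# The root logger now carries that default console handler; a FileHandler
# added later would coexist with it, so messages would keep going to stderr
# in addition to the file.
print(logging.getLogger().handlers)
```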

How does Python logging get its configuration?

I have used Python logging and it works fine: logging.basicConfig(...) is set in one module (a some.py file), and then we can use logging everywhere. Obviously, logging is global.
The question is: how does logging find its settings when we never call the module where basicConfig(...) appears (the some.py file)? Does logging scan all the packages?
Even when logging.basicConfig(...) is put into an any.py and the module (any.py) never gets imported or used anywhere, the logging settings take effect!
To understand logging you have to dive into Python's standard library sources.
Here is the trick:
# /usr/lib/python3.2/logging/__init__.py
...
root = RootLogger(WARNING)
Logger.root = root
Logger.manager = Manager(Logger.root)
...

# and
def basicConfig(**kwargs):
    ...
    hdlr = StreamHandler(stream)
    fs = kwargs.get("format", BASIC_FORMAT)
    dfs = kwargs.get("datefmt", None)
    style = kwargs.get("style", '%')
    fmt = Formatter(fs, dfs, style)
    hdlr.setFormatter(fmt)
    root.addHandler(hdlr)
So, when you call basicConfig() with certain parameters, the root logger is set up.
Finally getLogger:
def getLogger(name=None):
    """
    Return a logger with the specified name, creating it if necessary.
    If no name is specified, return the root logger.
    """
    if name:
        return Logger.manager.getLogger(name)
    else:
        return root
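This is also why the configuration appears "global": every module that imports logging sees the same module object, and therefore the same root logger created at import time. A tiny sketch, using only the stdlib:

```python
import logging

a = logging.getLogger()             # no name -> the root logger
b = logging.getLogger("x").parent   # top-level named loggers hang off the same root

print(a is logging.root)  # True
print(b is a)             # True
```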
I think there is no magic scanning here.
Try to test it this way in a separate test directory:
test/main.py:
import logging
logging.info('test')
test/any.py:
import logging
logging.basicConfig(filename='test.log', level=logging.INFO)
python main.py
Result: NO test.log file.
Now let's update the test:
test/main.py:
import logging
import any
logging.info('test')
python main.py
Result: new test.log file with INFO:root:test string inside.
So I guess that any.py in your case is imported somehow,
despite your expectations.
You may find the way any.py is imported easily,
just add few lines there:
test/any.py:
from traceback import print_stack
print_stack()
...
python main.py
Result:
File "main.py", line 2, in
import any
File "any.py", line 2, in
print_stack()
This stack shows that any.py is imported from main.py.
I hope you will find where it is imported from in your case.
