I have the following file structure in my Python project:
Main.py
classes
|--- __init__.py
|--- Common.py
|--- Logger.py
|--- Dictionary.py
I'm setting a static variable of the Common class from within my Main file:
from classes.Common import Common
# set path to the log file for log output
Common.set_log_file_path("C:\\logging\\output.txt")
which gets set in the Common class:
class Common():
    log_file_path = ''

    @staticmethod
    def set_log_file_path(log_file_path):
        Common.log_file_path = log_file_path
Now I instantiate a Logger object from within my Main file:
from classes.Logger import Logger
# initiate logger
log = Logger()
The Logger object reads the log file path from the Common object which works fine:
from Common import Common

class Logger():
    log_file_path = ''

    def __init__(self, log_file_path=''):
        # if not specified, take default log file path from Common class
        if log_file_path == '':
            self.log_file_path = Common.log_file_path
Now comes the problem: From my Main file I instantiate a Dictionary object:
from classes.Dictionary import Dictionary
# load dictionary
dictionary = Dictionary()
In the dictionary object I also want to have a logger, so I create one there:
from Logger import Logger

class Dictionary():
    log = Logger()

    def __init__(self):
        Dictionary.log.info('Dictionary > __init__()')
But this one doesn't work. Somehow when the Logger from within the Dictionary tries to load the log file path from the Common class, it is empty.
Why is that so? Shouldn't that be the same Common class and therefore holding the same static information here? Am I missing something? Do I do the imports in a wrong way?
I'm working with Python 2.6.5, and my imports are as follows:
Main.py imports Dictionary, Logger, Common
Dictionary.py imports Logger
Logger.py imports Common
Common has no imports
Almost everything in a Python module is executed dynamically, including import and class statements.
When you import the Dictionary module for the first time, the module is executed: the import statement that imports Logger runs, and after Logger is imported, the class Dictionary: statement is executed, which instantiates a Logger object at class-definition time. At that point your Common.set_log_file_path() has not been called yet, so that Logger object picks up the default value that was set when class Common: was executed.
The "solution" would be to import Common and set the default path before executing anything that actually uses that default path attribute:
from classes.Common import Common
Common.log_file_path = 'C:/logging/output.txt'
from classes.Dictionary import Dictionary
from classes.Logger import Logger
"Solution" is in quotes because imports that depend on other code having been executed first can very easily turn into little nightmares, which is why this pattern is very seldom seen in production code.
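One way to avoid the ordering dependency altogether is to create the Logger lazily, at instantiation time rather than at import time. A sketch based on the classes above (assuming Logger provides the info() method used here):
# Dictionary.py -- the Logger is created inside __init__, so by the time
# a Dictionary is instantiated, Main.py has already called
# Common.set_log_file_path() and the path is no longer empty
from Logger import Logger

class Dictionary():
    def __init__(self):
        self.log = Logger()
        self.log.info('Dictionary > __init__()')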
Related
Trying to understand how to create a single instance of a logging class in a separate module. I am trying to access the logging module from my Python script. I have several files in this automation script, and I want to record the logs in a file while displaying them in the console at the same time, so I am using two handlers: FileHandler() and StreamHandler(). The initialization of the logger is in a separate file called debugLogs.py, and I access this file from multiple Python modules in my script. But when the separate modules call debugLogs.py, it creates multiple instances of the logger, which means the logs get printed multiple times, which is not what I want. That is why I need to use the singleton pattern to create just one instance. How do you suggest I go about doing that? Should I create a method inside the class called setLogger and use it to initialize the logger, and if so, how do I declare a method inside a singleton class? I have included my version of debugLogs.py below:
#debugLogs.py
import logging
import logging.handlers
#open readme file and read name of latest log file created
def get_name():
    with open("latestLogNames.txt") as f:
        for line in f:
            pass
    latestLog = line
    logfile_name = latestLog[:-1]
    return logfile_name
class Logger(object):
    _instance = None

    def __new__(self, logfile_name):
        if not self._instance:
            self._instance = super(Logger, self).__new__(self)
            logger = logging.getLogger(__name__)
            logger.setLevel(logging.INFO)
            formatter = logging.Formatter('%(message)s')
            file_handler = logging.FileHandler(logfile_name)
            file_handler.setFormatter(formatter)
            stream_handler = logging.StreamHandler()
            logger.addHandler(file_handler)
            logger.addHandler(stream_handler)
        return self._instance
So get_name() reads the name of the latest log file from the readme file latestLogNames.txt, and the logs go into that existing file. I understand that my singleton class code is not right, and I am confused about how to initialize the whole class structure. But somehow I have to pass that logfile_name value to the class. So I am planning to call this logger from a different module with something like this:
#differentModule.py
import debugLogs
logger = debugLogs.Logger(debugLogs.get_name())
And then I would use logger.info("...") to print the logs as well as store them in the file. Please tell me how to restructure debugLogs.py and how to call it from the different modules of my script.
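One possible restructuring (a sketch, not the only way) is to drop the class entirely and configure a module-level logger exactly once; every module that imports debugLogs then shares the same configured object:
#debugLogs.py -- sketch: configure the shared logger on first use only
import logging

_logger = None

def get_logger():
    global _logger
    if _logger is None:  # first call: attach the handlers exactly once
        _logger = logging.getLogger('debugLogs')
        _logger.setLevel(logging.INFO)
        formatter = logging.Formatter('%(message)s')
        file_handler = logging.FileHandler(get_name())
        file_handler.setFormatter(formatter)
        _logger.addHandler(file_handler)
        _logger.addHandler(logging.StreamHandler())
    return _logger
Any module then calls logger = debugLogs.get_logger() and uses logger.info("...") as before; because the handlers are attached only on the first call, the records are no longer printed multiple times.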
I want to use a memory logger in my project. It keeps track of the last n logging records. A minimal example main file looks like this:
import sys
import logging
from logging import StreamHandler
from test_module import do_stuff
logger = logging.getLogger(__name__)
class MemoryHandler(StreamHandler):
    def __init__(self, n_logs: int):
        StreamHandler.__init__(self)
        self.n_logs = n_logs
        self.my_records = []

    def emit(self, record):
        self.my_records.append(self.format(record))
        self.my_records = self.my_records[-self.n_logs:]

    def to_string(self):
        return '\n'.join(self.my_records)
if __name__ == '__main__':
    logging.basicConfig(stream=sys.stdout, level=logging.INFO)
    mem_handler = MemoryHandler(n_logs=10)
    logger.addHandler(mem_handler)
    logger.info('hello')
    do_stuff()
    print(mem_handler.to_string())
The test module I am importing do_stuff from looks like this:
import logging
logger = logging.getLogger(__name__)
def do_stuff():
    logger.info('doing stuff')
When I run the main file, two log statements appear, the one from main and the one from do_stuff, but the memory handler only receives "hello" and not "doing stuff":
INFO:__main__:hello
INFO:test_module:doing stuff
hello
I assume that this is because mem_handler is not added to the test_module logger. I can fix this by adding the mem_handler explicitly:
logging.getLogger('test_module').addHandler(mem_handler)
But in general I don't want to list all modules and add the mem_handler manually. How can I add the mem_handler to all loggers in my project?
The Python logging system is federated: there is a tree-like structure similar to the package structure. The structure works by logger name, and the levels are separated by dots.
If you use the module's __name__ to get the logger, it will be equivalent to the dotted name of the package, for example:
package.subpackage.module
In this federated system a message is sent up the logger structure (unless one of the loggers is explicitly configured with propagate=False).
So the best way to add a handler is to add it to the root logger at the top of the structure and make sure all loggers below it propagate.
You can get the root logger with logging.getLogger() (without any name) and then add handlers or other configuration as you like.
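Applied to the example above, this means attaching mem_handler to the root logger instead of the __main__ logger:
if __name__ == '__main__':
    logging.basicConfig(stream=sys.stdout, level=logging.INFO)
    mem_handler = MemoryHandler(n_logs=10)
    # root logger: receives records propagated from __main__, test_module, ...
    logging.getLogger().addHandler(mem_handler)
    logger.info('hello')
    do_stuff()
    print(mem_handler.to_string())
With this change, mem_handler.to_string() returns both "hello" and "doing stuff".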
I am getting duplicate (double) logs when using Python's logging module. I have 3 files:
1. main.py
2. dependencies.py
3. resources.py
I am making only one call to the logger constructor, which is done inside main.py.
Following are my import statements in the 3 files:
main.py
import xml.etree.ElementTree as et
from configparser import ConfigParser
from Craftlogger import Craftlogger
logger = Craftlogger().getLogger()
dependencies.py
import os,sys
from main import getJobDetails,postRequest,logger
from configparser import ConfigParser
resources.py
import os,sys
import xml.etree.ElementTree as et
And inside the main method in main.py, I have the imports:
def main():
    from resources import getResourceDetails,setResources
    from dependencies import setDependencies
    ..... Remaining code .....
My logging file looks like this
import logging
class Craftlogger:
    def __init__(self):
        self.logger = logging.getLogger(__name__)
        handler = logging.StreamHandler()
        formatter_string = '%(asctime)s | %(levelname)-8s | %(filename)s-%(funcName)s-%(lineno)04d | %(message)s'
        formatter = logging.Formatter(formatter_string)
        handler.setFormatter(formatter)
        self.logger.addHandler(handler)
        self.logger.setLevel(logging.DEBUG)
        self.logger.propagate = False

    def getLogger(self):
        return self.logger
Note: I had to do the imports inside main() to make the circular imports work.
My guess would be that two Craftlogger objects exist and both have the same self.logger member. logging.getLogger(__name__) probably returns the same underlying logger for the second Craftlogger object, resulting in two addHandler calls on the same logger. This is just a guess, no guarantee.
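If that guess is right, a minimal guard is to configure the logger only when it has no handlers attached yet; a sketch:
import logging

class Craftlogger:
    def __init__(self):
        self.logger = logging.getLogger(__name__)
        if not self.logger.handlers:  # configure only on first construction
            handler = logging.StreamHandler()
            handler.setFormatter(logging.Formatter('%(message)s'))
            self.logger.addHandler(handler)
            self.logger.setLevel(logging.DEBUG)
            self.logger.propagate = False

    def getLogger(self):
        return self.logger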
Logging is a cross-cutting concern. As such, I frown upon classes which set up logging on their own. The responsibility to configure logging (especially handlers) should lie solely with the main executing function, e.g. your main function. No submodule / class / function should modify logging, except for getting a logger via logging.getLogger(name).
This avoids most of these pitfalls and allows easy composition of modules.
Imagine you had to import two modules that both modify the logging system... fun.
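A sketch of that advice applied here: all configuration happens once at the entry point, and every module merely fetches a named logger:
#main.py -- sketch: logging is configured here, exactly once
import logging

logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)s | %(levelname)-8s | '
           '%(filename)s-%(funcName)s-%(lineno)04d | %(message)s',
)

#every other module only does the following and never touches handlers:
logger = logging.getLogger(__name__)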
Let's say I have code like this:
ModuleA.py:
import logging
logger = logging.getLogger('A')
def utility_func():
    logger.info('hi')
print utility_func()
ModuleB.py:
import logging
logger = logging.getLogger('B')
from ModuleA import utility_func
print utility_func()
When utility_func is called from within ModuleA.py I want it to use the 'A' logger, and when it's called from ModuleB.py I want it to use the 'B' logger.
Is there a way to do this? Or a better way to set things up?
Update:
What about the idea of changing utility_func to:
def utility_func():
    logging.info('hi')
Would that bubble up to whichever logger the calling code is using?
Here's what I ended up going with. But I'm still curious to hear if there's a more elegant way.
ModuleB.py:
import logging
import ModuleA
logger = logging.getLogger('B')
ModuleA.logger = logger
print ModuleA.utility_func()
Put this in a separate file (like utils.py)
def utility_func(logger):
    logger.info('hi')
Then in file ModuleA.py
import logging
import utils
logger = logging.getLogger(__name__)
print utils.utility_func(logger)
I created some Python files, keeping my functions a bit separated to make working on and fixing them easier. All files are in one directory. The structure breaks down to something like:
a.py (a class A with basic stuff)
b.py (a class B with basic stuff)
modA.py (create a class C deriving from A and B)
modB.py (create a class D deriving from A and B)
...
main_a.py (using class C)
main_b.py (using class D)
Every module uses Python's logging facilities, yet for whatever reason only the root logger's messages are written, and I don't see my error.
Here is a minimal example.
a.py
import logging
logger = logging.getLogger(__name__)
class A(object):
    def __init__(self):
        logger.debug("Instance of A")
b.py
import logging
logger = logging.getLogger(__name__)
class B(object):
    def __init__(self):
        logger.debug("Instance of B")
ab.py
import a
import b
import logging
logger = logging.getLogger(__name__)
class AB(a.A, b.B):
    def __init__(self):
        logger.debug("Instance of AB")
        a.A.__init__(self)
        b.B.__init__(self)
main_one.py
import sys
import ab
import logging
import logging.handlers
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
handler = logging.StreamHandler(stream=sys.stderr)
handler.setLevel(logging.DEBUG)
handler.setFormatter(logging.Formatter('%(name)s: %(message)s'))
logger.addHandler(handler)
logger.warning("The trouble starts")
ab = ab.AB()
I also tried using something like self.logger = logging.getLogger(type(self).__name__) to log on a per-class basis, but the result is the same. So can one of you point out where I went wrong in reading the Python logging manuals?
TIA.
EDIT 1: My solution
Thanks to both @falsetru and @Jakub M.; using both answers leads to a working solution.
First I put everything in a hierarchy.
main_one.py
lib/
__init__.py
ab.py
basic/
__init__.py
a.py
b.py
Second, I changed logger = logging.getLogger(__name__) in main_one.py to logger = logging.getLogger() (no name: the root logger!).
That did the trick.
A code snippet on GitHub was also very helpful.
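For clarity, the relevant change in main_one.py (everything else stays the same):
# no name means the root logger, which the module loggers
# (lib.ab, lib.basic.a, lib.basic.b) all propagate to
logger = logging.getLogger()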
Print __name__ in each of your modules and see what you actually get. You should put your modules into proper directories so that __name__ is a "period-separated hierarchical value". For example, if your file hierarchy looked like:
main_a.py
libs/
__init__.py
a.py
modA.py
then __name__ of your modules (a.py and modA.py) would be libs.a and libs.modA
The name is potentially a period-separated hierarchical value, like
foo.bar.baz (though it could also be just plain foo, for example).
Loggers that are further down in the hierarchical list are children of
loggers higher up in the list. For example, given a logger with a name
of foo, loggers with names of foo.bar, foo.bar.baz, and foo.bam are
all descendants of foo. The logger name hierarchy is analogous to the
Python package hierarchy, and identical to it if you organise your
loggers on a per-module basis using the recommended construction
logging.getLogger(__name__). That’s because in a module, __name__ is
the module’s name in the Python package namespace.
Use the same logger name for all modules. __name__ is the module name: a, b, ab, main_one, ...
logger = logging.getLogger('daniel_lib')
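A sketch of that suggestion: configure the shared name once in main_one.py, and have a.py, b.py, and ab.py all fetch the logger under that same name:
# main_one.py -- sketch: configure the common logger once by name
import logging

logger = logging.getLogger('daniel_lib')
logger.setLevel(logging.DEBUG)
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter('%(name)s: %(message)s'))
logger.addHandler(handler)

# a.py, b.py and ab.py each replace logging.getLogger(__name__) with:
#   logger = logging.getLogger('daniel_lib')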