I can't get logger to work as a global name. I tried it inside a normal script, and later tried to debug inside the Python CLI, but apparently it's out of my reach.
(As you will notice, I tried declaring logger as global everywhere, but I also had no success without that.)
Inside the Python CLI (interactive interpreter):
import time
import datetime
import subprocess
import re
import glob
import logging
from daemon import runner
from lockfile import LockTimeout
import RPIO
import picamera
import atexit
# From here on, it should be global, right?
global logger
logger = logging.getLogger("DoorcamLog")
import DoorcamExample
doorcam = DoorcamExample.Doorcam()
Error returned:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "DoorcamExample.py", line 28, in __init__
    logger.info('Doorcam started capturing')
NameError: global name 'logger' is not defined
DoorcamExample.py:
#!/usr/bin/python
import sys
import os

if os.geteuid() != 0:
    # This works perfectly on a Raspbian system because there's no root password required
    os.execvp("sudo", ["sudo"] + sys.argv)
    print('too far!')  # This should NEVER be reached, there is something wrong...
    sys.exit(1)

import time
import datetime
import subprocess
import re
import glob
import logging
from daemon import runner
from lockfile import LockTimeout
import RPIO
import picamera
import atexit

class Doorcam:
    global logger

    def __init__(self):
        logger.info('Doorcam started capturing')
        self.pollLightFile = '/var/tmp/picam-pollLight'
        atexit.register(self.stopListening)

    def socket_callback(self, socket, val):
        vals = val.split()
        if len(vals) == 0 or len(vals) > 4:
            number = 1
            notify = True
            trigger = 'Socket'
            triggernotify = 'Socket (value %s)' % val
        elif len(vals) == 1:
            number = int(vals[0])
            notify = True
            trigger = 'Socket'
            triggernotify = 'Socket (value %s)' % val
        elif len(vals) == 2:
            number = int(vals[1])
            notify = True
            trigger = vals[0]
            triggernotify = vals[0]
        elif len(vals) == 3:
            number = int(vals[1])
            trigger = vals[0]
            triggernotify = vals[0]
            notify = self.boolval(vals[2])
        elif len(vals) == 4:
            number = int(vals[2])
            trigger = vals[0]
            triggernotify = vals[0], vals[1]
            notify = self.boolval(vals[3])
        socket.send('self.upload(self.shot(filename=self.filename, number=number, trigger=trigger), notify=notify,trigger=triggernotify)')
        RPIO.close_tcp_client(socket.fileno())

    def startListening(self, channel, port=8080, threaded=True):
        # RPIO.add_interrupt_callback(channel, self.gpio_callback, pull_up_down=RPIO.PUD_DOWN, debounce_timeout_ms=1000)
        RPIO.add_tcp_callback(port, self.socket_callback)
        RPIO.wait_for_interrupts(threaded=threaded)

    def stopListening(self):
        logger.info('Stop listening')
        RPIO.stop_waiting_for_interrupts()

global logger
"Global" variables are only global within a single module, so your DoorcamExample.py doesn't have access to logger that you defined in some other module.
In this case, you don't need a global variable, because the logging module already maintains a truly global (i.e., visible from all modules) registry of loggers. So if you do logging.getLogger("DoorcamLog") in any of your modules, you'll get a reference to the same logger.
You don't need a program-wide global variable in this case. The logging module tracks all loggers created via calls to getLogger, so as long as you use the same name, you'll get the same logging object. So calls to logging.getLogger("DoorcamLog") will return the same object in both scripts.
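For example, DoorcamExample.py can simply fetch its own reference at the top of the module - a minimal sketch:

# at the top of DoorcamExample.py
import logging

logger = logging.getLogger("DoorcamLog")    # same logger object the CLI script gets

# any module asking for the same name receives the identical object:
assert logging.getLogger("DoorcamLog") is logger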
As BrenBarn commented, in Python "global" only refers to the current module's namespace - which is a good thing, since you wouldn't want a module to depend on some name being defined in the importing module's namespace.
Also, the "global" keyword is only meaningful in a function - every name defined at the module's top-level is by definition global to the module - and it's only useful if you actually want to rebind that name within the function (so Python knows you're not creating a local variable instead).
With regard to logger naming, the simplest and most effective solution is to create a logger for each module and just pass the current module's name (i.e. the "magic" variable __name__), except for executable scripts, whose __name__ is "__main__".
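A minimal sketch of that per-module pattern (module name hypothetical):

# somelibrarymodule.py
import logging

logger = logging.getLogger(__name__)    # e.g. "mypackage.somelibrarymodule"

def do_work():
    logger.info('doing work')           # records are tagged with the module's name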
Related
I have a main module, one module to hold constants, and another module that reads a config text file.
The main module contains the following:
# Main module
import constants

def main():
    for _ in range(constants.CONFIG['DURATION']):
        ...

if __name__ == '__main__':
    main()
The constants module:
# constants module
import helpers.configure

# Constants
...

CONFIG = helpers.configure.configure()
and the helpers.configure module:
# helpers.configure module
def configure():
    with open('./config.txt', 'r') as configure_file:
        CONFIG = eval(configure_file.read())
    return CONFIG
I'm concerned about the helpers.configure.configure() call sitting loose in the constants module. Is it bad practice to put a function call at the top level of a (non-main) module like this? It is only called because the constants module is loaded when the main module runs, but there is no direct connection between the helpers.configure.configure() function and what's in the main function (nothing in the main function calls helpers.configure.configure()). Would it be better to call helpers.configure.configure() directly within the main module, like what's below?
import constants
import helpers.configure

def main():
    CONFIG = helpers.configure.configure()
    for _ in range(CONFIG['DURATION']):
        ...
logging.getLogRecordFactory() and logging.setLogRecordFactory() simplify the manipulation of logging records by letting you swap in your own record factory. I use a module of mine, imported in each piece of code, to homogenize the logging:
# this is mylogging.py
import logging
import logging.handlers

def mylogging(name):
    old_factory = logging.getLogRecordFactory()

    def record_factory(*args, **kwargs):
        record = old_factory(*args, **kwargs)
        # send an SMS for critical events (level = 50)
        if args[1] == 50:
            pass  # here is the code which sends an SMS
        return record

    logging.setLogRecordFactory(record_factory)

    # common logging info
    log = logging.getLogger(name)
    log.setLevel(logging.DEBUG)
    (...)
All my scripts bootstrap logging via a
log = mylogging.mylogging("name_of_the_project")
This works fine.
I now would like to keep track of the number of SMS sent. For this I would like to set a counter within mylogging.py, common to all scripts which import mylogging. The problem is that such a variable will be local to each script.
On the other hand, logging is peculiar in the sense that when different scripts call logging.getLogger(name) with the same name, the handler is reused - which means that there is some persistence between scripts (even though each of them does an independent import logging).
With this in mind, is there a way to use a variable that would be common to all logging, placed right after the "here is the code which sends an SMS" line, and which would be incremented no matter which script the logging request comes from?
An import such as
from mylogging import mycounter
mycounter += 1
adds a new reference to mycounter in the importing module's local namespace. For an immutable type such as an integer counter, the += then rebinds the name in that local namespace only - other modules still see the value from the point at which they imported it.
One solution is to keep the original namespace, so that the rebinding happens in mylogging itself:
import mylogging
mylogging.mycounter += 1
This is fragile. It's not very obvious that it only works because of the way the import was done.
A better solution is to use a mutable type. itertools.count is interesting but doesn't let you view the current value of the counter. Here's a simple class that will do it. I've added locking so that it works in a multithreaded environment as well.
Add this to mylogging.py:
import threading

class MyCounter(object):
    def __init__(self):
        self.val = 0
        self.lock = threading.Lock()

    def inc(self):
        with self.lock:
            self.val += 1
            return self.val

sms_counter = MyCounter()
Some other module
from mylogger import sms_counter
print('sms count is {}'.format(sms_counter.inc()))
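To actually count the messages, the increment can go inside record_factory in mylogging.py, right where the SMS is sent - a sketch based on the code above:

# inside mylogging(name), replacing the record_factory shown earlier
def record_factory(*args, **kwargs):
    record = old_factory(*args, **kwargs)
    if args[1] == 50:          # critical events trigger an SMS
        sms_counter.inc()      # shared, thread-safe module-level counter
        # here is the code which sends an SMS
    return record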
I have a package with test modules, and inside the package's __init__.py file I have a setUp method with some operations. These operations are executed correctly before any unit test in the package's modules runs. Inside the setUp method I'd like to initialize a global variable and then access it from other modules of the package, but this doesn't work.
# TestPackage/__init__.py
def setUp():
    global spec_project
    core_manager = get_core_manager()
    spec_project = core_manager.get_spec()

# TestPackage/test_module.py
from TestPackage import spec_project
import unittest

class TestRules(unittest.TestCase):
    def setUp(self):
        spec_project.get_new_device()
Like this I get an
ImportError: cannot import name spec_project
If I initialize the spec_project variable outside of the setUp method in __init__.py, I can access it, but its content is not updated by the operations in the setUp method.
# TestPackage/__init__.py
spec_project = None

def setUp():
    global spec_project
    core_manager = get_core_manager()
    spec_project = core_manager.get_spec()

# TestPackage/test_module.py
from TestPackage import spec_project
import unittest

class TestRules(unittest.TestCase):
    def setUp(self):
        spec_project.get_new_device()
Like this I get an
AttributeError: 'NoneType' object has no attribute 'get_new_device'
How can I initialize the spec_project variable inside the setUp method of __init__.py and still have access to it from other modules in the package?
It looks like setUp() isn't being called, but if you are certain that it is, then it could be the way that you are importing TestPackage. Try importing like this:
# TestPackage/test_module.py
import TestPackage
import unittest

class TestRules(unittest.TestCase):
    def setUp(self):
        TestPackage.spec_project.get_new_device()
The setUp() method has to be called before you use the global. This same thing should apply to the second way you tried. But again, that is assuming that setUp is run. You can alias TestPackage if you feel it is necessary, or you should be able to import it if it is defined outside the method.
Since you are importing the name explicitly with from TestPackage import spec_project, you bind whatever object that name refers to at import time (nothing at all in the first case, None in the second); rebinding spec_project later inside setUp() does not update the name you imported into test_module.
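A minimal sketch of the difference (package and names hypothetical):

# pkg/__init__.py
value = None

def setup():
    global value
    value = 42              # rebinds pkg.value

# consumer.py
import pkg
from pkg import value       # binds the object the name refers to *right now*, i.e. None

pkg.setup()
print(pkg.value)            # 42   -- attribute lookup sees the rebinding
print(value)                # None -- the from-imported name still points at the old object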
I have the following file structure in my Python project:
Main.py
classes
|--- __init__.py
|--- Common.py
|--- Logger.py
|--- Dictionary.py
I'm setting a static variable of the Common class from within my Main file:
from classes.Common import Common

# set path to the log file for log output
Common.set_log_file_path("C:\\logging\\output.txt")
which gets set in the Common class:
class Common():
    log_file_path = ''

    @staticmethod
    def set_log_file_path(log_file_path):
        Common.log_file_path = log_file_path
Now I instantiate a Logger object from within my Main file:
from classes.Logger import Logger
# initiate logger
log = Logger()
The Logger object reads the log file path from the Common object which works fine:
from Common import Common

class Logger():
    log_file_path = ''

    def __init__(self, log_file_path=''):
        # if not specified, take default log file path from Common class
        if log_file_path == '':
            self.log_file_path = Common.log_file_path
Now comes the problem: From my Main file I instantiate a Dictionary object:
from classes.Dictionary import Dictionary
# load dictionary
dictionary = Dictionary()
In the dictionary object I also want to have a logger, so I create one there:
from Logger import Logger

class Dictionary():
    log = Logger()

    def __init__(self):
        Dictionary.log.info('Dictionary > __init__()')
But this one doesn't work. Somehow when the Logger from within the Dictionary tries to load the log file path from the Common class, it is empty.
Why is that so? Shouldn't it be the same Common class, and therefore hold the same static information here? Am I missing something? Am I doing the imports the wrong way?
I'm working with Python 2.6.5, and my imports are as follows:
Main.py imports Dictionary, Logger, Common
Dictionary.py imports Logger
Logger.py imports Common
Common has no imports
Almost everything in a Python module is dynamically executed — including import and class statements.
When you import the Dictionary module for the first time, the module body is executed. That means the import statement which imports Logger runs, and after Logger is imported, the class Dictionary: block is executed, which instantiates a Logger object. At that point Common.set_log_file_path() has not been called yet, so that Logger object picks up the default value that was set when class Common: was executed.
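Condensed into a sketch of the execution order (the question's classes reduced to stand-in stubs):

class Common:
    log_file_path = ''                     # default, set when Common's class body runs

class Logger:
    def __init__(self):
        # reads whatever the path is at the moment the Logger is created
        self.log_file_path = Common.log_file_path

class Dictionary:
    log = Logger()                         # created while the class body executes,
                                           # i.e. at import time, before Main calls
                                           # Common.set_log_file_path()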
The "solution" would be to import Common and set the default path before executing anything that actually uses that default path attribute:
from classes.Common import Common

Common.log_file_path = 'C:/logging/output.txt'

from classes.Dictionary import Dictionary
from classes.Logger import Logger
"Solution" in quotes because having imports depend on other code being executed first is something that can turn into little nightmares very easily. Therefore it is something very seldom seen in production code.
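An alternative sketch (my suggestion, assuming Logger keeps the info() method used above): create the Logger per Dictionary instance instead of in the class body, so the path is only read after Main has configured it:

from Logger import Logger

class Dictionary():
    def __init__(self):
        # the Logger is created when a Dictionary is instantiated, which in
        # Main.py happens after Common.set_log_file_path() has been called
        self.log = Logger()
        self.log.info('Dictionary > __init__()')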
Many Python modules preserve an internal state without defining classes, e.g. logging maintains several loggers accessible via getLogger().
How do you test such a module? Using the standard unittest tools, I would like the various tests inside a TestCase class to re-import my module-under-test so that each time it loses its context. Can this be done?
import unittest
import sys

class Test(unittest.TestCase):
    def tearDown(self):
        try:
            del sys.modules['logging']
        except KeyError:
            pass

    def test_logging(self):
        import logging
        logging.foo = 1

    def test_logging2(self):
        import logging
        print(logging.foo)

if __name__ == '__main__':
    unittest.sys.argv.insert(1, '--verbose')
    unittest.main(argv=unittest.sys.argv)
% test.py Test.test_logging passes:
test_logging (__main__.Test) ... ok
but
% test.py Test.test_logging2 does not:
test_logging2 (__main__.Test) ... ERROR
since the internal state of logging has been reset.
This will reimport the module as new for you:
import sys
del sys.modules['my_module']
import my_module
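Applied to a module under test inside a TestCase, a sketch along the same lines (my_module standing in for the real module):

import sys
import unittest

class TestFreshModule(unittest.TestCase):
    def setUp(self):
        # drop any previously imported copy so the next import re-executes the module
        sys.modules.pop('my_module', None)
        import my_module
        self.my_module = my_module    # each test starts from freshly initialized state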