Best practice to write logs in /var/log from a Python script?

I want to write some log information from a Python main script into a file in /var/log.
When I call logger.info("Starting"), I get a PermissionError on the file, which is expected, since files in /var/log belong to root and my program is not run as root.
I could of course change the permissions on /var/log/my.log so that myapp can write to it (give it the same group, for instance). But that doesn't look like good practice to me: what if I install myapp on another computer? Would I then have to change the permissions on the log file during the install process? Or is there a more generic way to do this? (Like a generic way to hand the logs over to "the system"? By generic I also mean portable: something that would work on Linux, FreeBSD, etc.)
Though I'm not sure it's relevant, here are some portions of my code for reference:
Main script:
import logging, logging.config
from lib import settings
settings.init()
logging.config.fileConfig(settings.logging_conf_file)
logger = logging.getLogger(__name__)
The relevant handler section in the logging config file (the one settings.logging_conf_file points to):
[handler_mainHandler]
class=FileHandler
level=INFO
formatter=defaultFormatter
# fileConfig has no filemode= key; the open mode goes into args
args=('/var/log/myapp.log', 'w')

If syslogd is running on your box, you can use SysLogHandler to avoid the folder permission issue
(https://docs.python.org/2/library/logging.handlers.html#sysloghandler).
To choose your category, set the facility parameter to the desired value, for example LOG_LOCAL5. In that case it will correspond to the local5.* category in syslogd.
Since you give the handler a facility rather than a file name, you need to adjust the syslog configuration to tell syslogd to write those log records to a particular file. On FreeBSD, the syslog configuration file is /etc/syslog.conf (see syslog.conf(5)).
You can also add a catch-all mapping such as *.* to /var/log/all.log to capture the logs from all syslog producers. That is helpful for checking that logging works at all and which category your application ends up in, if there is any doubt.
For rsyslogd, more information is available here: How to configure rsyslog for use with SysLogHandler logging class?
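For illustration, here is a minimal sketch of attaching a SysLogHandler in code; the socket address and the 'myapp' ident tag are assumptions (on Linux the daemon usually listens on /dev/log, on FreeBSD on /var/run/log):

import logging
from logging.handlers import SysLogHandler

# Assumed socket path: /dev/log on Linux, /var/run/log on FreeBSD.
handler = SysLogHandler(address='/dev/log',
                        facility=SysLogHandler.LOG_LOCAL5)
# The leading 'myapp:' tag helps syslogd identify and filter these records.
handler.setFormatter(logging.Formatter('myapp: %(levelname)s %(message)s'))

logger = logging.getLogger('myapp')
logger.setLevel(logging.INFO)
logger.addHandler(handler)
logger.info('Starting')  # no write access to /var/log needed

With the example facility above, a line such as local5.* /var/log/myapp.log in syslog.conf would route these records to their own file.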

Related

Python logging using modules

I have multiple modules, which are called by a main script. Each one logs messages using the built-in logging package.
How can I log a session ID, set during the execution of the main script, across all modules, without having to push this variable into each module?
I have set up a Python configuration file, called config, with:
import logging
logging.basicConfig(
    level=logging.DEBUG,
    format="%(filename)s:%(lineno)s|%(funcName)3s()|%(asctime)s|%(levelname)s|%(message)s",
    handlers=[
        logging.FileHandler("debug.log"),
        logging.StreamHandler()
    ]
)
Other modules use this pre-configured logging object via import, so I am doing:
from config import logging
But I need to log an ID from the current session; my log should look like:
module_name.py:25|function_name()|2020-04-27 18:28:26,518|INFO|Session_ID=abc123|some_message_here
I have tried putting this variable in the config file, setting it, and then using it in wrapper functions named log_info and log_debug in that file, but then my output log no longer traces the Python script name and function name (the %(filename)s and %(funcName)s fields report the wrapper instead of the real caller).
Does anyone know how to handle this situation?
I don't know if this is the 100% best solution, or for sure whether it will work in your situation, but I used it for something similar (verbosity printing levels that had to persist across several scripts).
Create a Python file named, say, sessionid.py. Inside it, define a top-level variable named id. To set your ID, import sessionid and assign sessionid.id = 'some_id'. Then have your config file import sessionid as well and use sessionid.id as needed.
It took me a bit to figure out that it has to be accessed through the module like that; changing it via from sessionid import id; id = 'some_id' only rebinds the name within the script that does so.
It would make sense to put the id variable in your config file, but only if you don't need to set it in the same script that also needs to from config import logging.
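A minimal sketch of that module-attribute pattern, using the names suggested above (sessionid and id are just illustrative names):

# sessionid.py -- holds a single top-level, mutable setting
id = None

# main.py -- set the session ID once, early on
import sessionid
sessionid.id = 'abc123'

# config.py (or any other module) -- read it through the module
# object, so the assignment made in main.py is visible here too
import sessionid

def current_session():
    return sessionid.id  # 'abc123' once main.py has set it

Reading the attribute through the module object is what makes later assignments visible everywhere; from sessionid import id copies the binding at import time and misses them.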

How to use logger with one basic config across app in Python

I want to improve my understanding of how to use logging correctly in Python. I want to use an .ini file to configure it, and what I want to do is:
define the basic logger config through .fileConfig(...) in some .py file
import logging, call logger = logging.getLogger(__name__) across the app, and be sure that it uses the config file I loaded earlier in a different .py file
I have read a few resources on the Internet, of course, but they describe configuration tricks and the like; what I want to understand is whether .fileConfig() works across the whole app or only in the file/module where it was called.
It looks like I am missing some small tip or something like that.
It works across the whole app. Just be sure to configure the correct loggers in the config file. logger = logging.getLogger(__name__) works well if you know how to handle having a different logger in every module; otherwise you might be happier just calling logger = logging.getLogger("mylogger"), which always gives you the same logger. If you only configure the root logger, you can even skip that and simply use logging.info("message") directly.
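A minimal sketch of that one-time setup (the module and file names here are assumptions):

# log_setup.py -- executed once at application startup
import logging.config
logging.config.fileConfig('logging.ini', disable_existing_loggers=False)

# anymodule.py -- no configuration code needed here
import logging
logger = logging.getLogger(__name__)  # inherits handlers from the root logger

def do_work():
    logger.info('formatted and routed by the .ini loaded in log_setup.py')

Because logging keeps its loggers in a process-wide registry, the configuration applied once in log_setup.py affects every getLogger() call in the same process.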

Linux: get the location of logfiles

I would like my (Python) script (located anywhere) to write a logfile
and place that logfile in an appropriate location. On Linux (Debian), this could be /var/log.
I was wondering whether such a logfile location can be retrieved from the system. Is there an environment variable or something similar?
The typical way to log on Linux/UNIX is to use the system logger. From your application (or daemon), you call the syslog system function (see its manpage).
This function forwards your log messages to the system logger. From there, the system logger takes care of writing them to a file. You can then also customize the system logger's behavior to write some of your messages to a special file, or to ignore them.
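For example, a minimal sketch using Python's standard-library syslog bindings (the ident string is an assumption):

import syslog

# Register an identifier so records are attributable to this script.
syslog.openlog(ident='myscript', facility=syslog.LOG_USER)
syslog.syslog(syslog.LOG_INFO, 'hello from my script')
# Where this ends up (/var/log/syslog, /var/log/messages, the journal, ...)
# is decided by the system logger's configuration, not by the script.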
For a direct answer to your question: /var/log/ is defined by the FHS as the location where log files must be written. There is no need to rely on an environment variable.

Incompatibility between import-time logger naming and logging configuration

I am setting up my Python logging in main.py by reading in a file and using the fileConfig option. I want to be able to switch between testing and live logging configurations, so I first read in a separate config file and extract the logging config file path from there.
The problem is that the other files imported from main.py grab their own logger via log = getLogger(__name__), and this happens at import time. These links then get broken when the new configuration is loaded, and those modules end up with logging that does not work the way I expect.
I can't easily delay the importing of these modules without a lot of refactoring, so is there any other way to keep this method of setting up loggers by module name while still loading the logging configuration later?
I'm not sure from your question exactly how things are breaking, but here's how I see it. The various modules that do log = logging.getLogger(__name__) will have valid names for their loggers (logger name = package/module name), unless you somehow actually move the modules to some other package location.
At import time, the logging configuration may or may not have been set, and there shouldn't be any actual logging calls made as a side-effect of the import (if there are, the messages may have nowhere to go).
Loading a new configuration using fileConfig typically just sets handlers, formatters and levels on loggers.
When you subsequently call code in the imported modules, they log via their loggers, which have handlers attached by your previous configuration call - so they will output according to the configuration.
You should be aware that on older versions of Python (<= 2.5), calls to fileConfig would unavoidably disable existing loggers which weren't named in the configuration - in more recent versions of Python (>= 2.6), this is configurable using a disable_existing_loggers=False keyword argument passed to fileConfig. You may want to check this, as it sometimes leads to unexpected behaviour (the default for that parameter is True, for compatibility with behaviour under the older Python versions).
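As an illustration, a minimal sketch of loading a configuration without silencing the loggers that modules already created at import time (the .ini file name is an assumption):

import logging.config

# Without this flag (it defaults to True), fileConfig disables every
# pre-existing logger that the config file does not name explicitly.
logging.config.fileConfig('logging_live.ini',
                          disable_existing_loggers=False)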
If you post more details about what seems broken, I might be able to provide a better diagnosis of what's going on.

Python - logging and configuration change

OK, my problem is a little hard to describe, so I will try to explain it as simply as possible.
When my application starts up, it creates a Logger from the logging library.
Before the configuration file is loaded, the only handler on the logger writes to stdout.
After several logs have been printed, the application finally loads the configuration file, which includes the Logger configuration options, such as the file in which to store the logs. I then modify my Logger to use a file handler for this file alongside the stdout handler. But I also need to store, in this file and with the formatting from the configuration, all of the earlier logs generated by my script. I was thinking of a MemoryHandler running alongside the stdout handler, and, after loading the configuration, writing all the buffered logs from memory into the file before creating the file handler.
The problem is that MemoryHandler is not well documented, and this way of solving the problem does not look pretty to me. So, in brief: I am looking for a way to save logs from a MemoryHandler into a file, or for a better way to solve this problem.
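For what it's worth, here is a minimal sketch of the MemoryHandler approach described above; the logger name, capacity, format string, and file path are all assumptions:

import logging
import sys
from logging.handlers import MemoryHandler

logger = logging.getLogger('myapp')
logger.setLevel(logging.DEBUG)
logger.addHandler(logging.StreamHandler(sys.stdout))  # stdout from the start

buffer = MemoryHandler(capacity=1000)  # buffers records; no target yet
logger.addHandler(buffer)

logger.info('starting, configuration not loaded yet')  # printed and buffered

# ... later, once the configuration file has been read ...
file_handler = logging.FileHandler('/var/log/myapp.log')  # path from config
file_handler.setFormatter(
    logging.Formatter('%(asctime)s %(levelname)s %(message)s'))  # from config

buffer.setTarget(file_handler)
buffer.flush()                    # replay everything buffered so far
logger.removeHandler(buffer)      # from here on, log straight to the file
buffer.close()
logger.addHandler(file_handler)

The flush() call is what hands the buffered records to the file handler, which writes them out using the formatter taken from the configuration.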
