Linux: get the location of logfiles - python

I would like my (Python) script (located anywhere) to write a logfile and place that logfile in an appropriate location. On Linux (Debian, for instance) this could be /var/log.
I was wondering whether such a logfile location can be retrieved from the system. Is there an environment variable or something?

The typical way to log on Linux/UNIX is to use the system logger: from your application (or daemon) you call the syslog function (see the manpage).
This function forwards your log messages to the system logger, which takes care of writing them to a file. You can also customize the system logger's behavior to write some of your messages to a special file, or to ignore them.
For a direct answer to your question: /var/log/ is defined by the FHS as the location where log files must be written. No need to rely on an environment variable.
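A minimal sketch of the syslog route described above, using the standard-library syslog module (POSIX only). The ident string "myscript" is just an example; where the message ends up (e.g. /var/log/syslog on Debian) depends on the system logger's configuration.

```python
import syslog

# Register an identifier for our messages and pick a facility.
syslog.openlog(ident="myscript", facility=syslog.LOG_USER)

# Hand the message to the system logger; it decides which file
# (if any) the record is written to.
syslog.syslog(syslog.LOG_INFO, "script started")

syslog.closelog()
```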

Related

How to set line-buffering in the FileHandler of the Python logging module?

I would like to be able to watch the messages in the log file as they are written, but nothing appears until the script exits. One fix is the buffering argument of the open() function: specifying buffering=1 gives you a line-buffered stream.
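One way to apply that suggestion is to open the file line-buffered yourself and wrap the stream in a StreamHandler instead of using FileHandler; a sketch (the path "app.log" and logger name are just examples):

```python
import logging

# buffering=1 requests line buffering for a text-mode file, so each
# newline-terminated record reaches the file as soon as it is emitted.
stream = open("app.log", "a", buffering=1)

handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

logger = logging.getLogger("buffered_demo")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("visible in the file right away")
```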

Best practice to write logs in /var/log from a python script?

I want to write some log information from a Python main script into a file in /var/log.
When I call logger.info("Starting"), I get a PermissionError on the file, which is quite normal since files in /var/log belong to root and my program is not run as root.
I could of course set /var/log/my.log's permissions to let myapp write into it (set the same group, for instance). But that doesn't look like good practice to me: what if I install myapp on another computer? Should I then change the permissions on the log file during the install process? Or is there a more generic way to do this, like a generic way to send the logs to "the system"? By generic I also mean portable: something that would work on Linux, FreeBSD, etc.
Though I'm not sure it's relevant, for information, here are some portions of my code:
Main script:
import logging, logging.config
from lib import settings
settings.init()
logging.config.fileConfig(settings.logging_conf_file)
logger = logging.getLogger(__name__)
The handler matching settings.logging_conf_file, in the logging config file:
[handler_mainHandler]
class=FileHandler
level=INFO
formatter=defaultFormatter
filemode=w
args=('/var/log/myapp.log',)
If syslogd is running on your box, you can use SysLogHandler to avoid issues with directory permissions
(https://docs.python.org/2/library/logging.handlers.html#sysloghandler).
To specify your category, set the facility parameter to the desired value, for example LOG_LOCAL5. In that case it will correspond to the local5.* category of syslogd.
Since you specify the facility as a handler parameter rather than a file name, you need to adjust the syslog configuration to tell syslogd to write those log records to a particular file. On FreeBSD, the syslog configuration file is /etc/syslog.conf (syslog.conf(5)).
You can also add a catch-all syslog mapping such as *.* to /var/log/all.log to handle all logs from all syslog producers. That is helpful for determining whether logging works at all, and what your application's category is, if there is any doubt.
For rsyslogd, more information is available here: How to configure rsyslog for use with SysLogHandler logging class?
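A sketch of the SysLogHandler setup described above. The address here is the default UDP endpoint; on Linux you would typically pass the local socket path "/dev/log" instead (or "/var/run/log" on FreeBSD), and the logger name "myapp" is just an example:

```python
import logging
from logging.handlers import SysLogHandler

# LOG_LOCAL5 maps to the local5.* category in syslogd, so a line like
#   local5.*    /var/log/myapp.log
# in syslog.conf routes these records to that file.
handler = SysLogHandler(address=("localhost", 514),
                        facility=SysLogHandler.LOG_LOCAL5)
handler.setFormatter(logging.Formatter("myapp: %(levelname)s %(message)s"))

logger = logging.getLogger("myapp")
logger.addHandler(logger_handler := handler)
logger.setLevel(logging.INFO)

logger.info("started")  # delivered to syslogd, not to a file directly
```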

How do I define a different logger for an imported module in Python?

I'm using Advanced Python Scheduler in a Python script. The main program defines a log by calling logging.basicConfig with the file name of the log that I want. This log is also set to "DEBUG" as the logging level, since that's what I need at present for my script.
Unfortunately, because logging.basicConfig has been set up in this manner, apscheduler writes its log entries to the same log file. There are an awful lot of these, especially since I have one scheduled task that runs every minute.
Is there any way to redirect apscheduler's log output to another log file (without changing apscheduler's code) while using my log file for my own script? I.e. is there a way to change the file name for each module's output within my script?
I tried reading the module page and the HOWTO for logging, but could not find an answer to this.
Set the logger level for apscheduler to your desired value (e.g. WARNING, to avoid seeing DEBUG and INFO messages from apscheduler), like this:
logging.getLogger('apscheduler').setLevel(logging.WARNING)
You will still get messages for WARNING and higher severities. To direct messages from apscheduler into a separate file, use
aplogger = logging.getLogger('apscheduler')
aplogger.propagate = False
aplogger.setLevel(logging.WARNING) # or whatever
aphandler = logging.FileHandler(...) # as per what you want
aplogger.addHandler(aphandler)
Ensure the above code is only called once (otherwise you will add multiple FileHandler instances - probably not what you want).
Maybe you want to call logging.getLogger("apscheduler") and set up its log file there? See this answer: https://stackoverflow.com/a/2031557/782168

Incompatibility between import-time logger naming with logging configuration

I am setting up my Python logging in main.py via reading in a file and using the fileConfig option. I want to be able to switch between testing and live logging configurations, so I want to read in a separate config file first and extract the logging config file path from there.
The problem here is that other files that I import from main.py grab their own logger via log = getLogger(__name__), and this happens at import time. Those loggers then get broken when the new configuration is loaded, and these modules end up with logging that doesn't work the way I expect.
I can't easily delay the importing of these modules without a lot of refactoring, so is there any other way of being able to keep this method of setting up loggers by module name while still loading in the log configuration later?
I'm not sure from your question exactly how things are breaking, but here's how I see it. The various modules which do log = logging.getLogger(__name__) will have valid names for their loggers (logger name = package name), unless you were to somehow actually move the modules to some other package location.
At import time, the logging configuration may or may not have been set, and there shouldn't be any actual logging calls made as a side-effect of the import (if there are, the messages may have nowhere to go).
Loading a new configuration using fileConfig typically just sets handlers, formatters and levels on loggers.
When you subsequently call code in the imported modules, they log via their loggers, which have handlers attached by your previous configuration call - so they will output according to the configuration.
You should be aware that on older versions of Python (<= 2.5), calls to fileConfig would unavoidably disable existing loggers which weren't named in the configuration - in more recent versions of Python (>= 2.6), this is configurable using a disable_existing_loggers=False keyword argument passed to fileConfig. You may want to check this, as it sometimes leads to unexpected behaviour (the default for that parameter is True, for compatibility with behaviour under the older Python versions).
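The effect of that keyword argument can be checked with a small self-contained script. The config below is a minimal example written to a temporary file, not the asker's actual configuration; "mymodule" stands in for a logger created at import time:

```python
import logging
import logging.config
import os
import tempfile

conf = """
[loggers]
keys=root

[handlers]
keys=console

[formatters]
keys=plain

[logger_root]
level=INFO
handlers=console

[handler_console]
class=StreamHandler
level=INFO
formatter=plain
args=(sys.stderr,)

[formatter_plain]
format=%(name)s %(levelname)s %(message)s
"""

# A logger created before configuration, as an imported module would do.
early = logging.getLogger("mymodule")

path = os.path.join(tempfile.mkdtemp(), "logging.conf")
with open(path, "w") as f:
    f.write(conf)

# With disable_existing_loggers=False, the pre-existing logger survives.
logging.config.fileConfig(path, disable_existing_loggers=False)
print(early.disabled)  # False
```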
If you post more details about what seems broken, I might be able to provide a better diagnosis of what's going on.

Python - logging and configuration change

OK, it's a little hard to describe my problem; I will try to explain it as simply as possible.
When my application starts up, it creates a Logger from the logging lib.
Before the configuration file is loaded, the only handler on the logger is stdout.
After several logs have been printed, the application finally loads the configuration file, which includes Logger configuration options such as the file in which to store logs. I modify my Logger options to use a file handler for this file, alongside the stdout handler. But I need to store all previous logs generated by my script in this file, using the formatting from the configuration. I was thinking about a MemoryHandler running alongside the stdout handler, and after loading the configuration, writing all the logs from memory to the file before creating the file handler.
The problem is that MemoryHandler is not well documented, and this way of solving the problem doesn't look pretty to me. So, in brief: I'm looking for a way to save the log to a file from a MemoryHandler, or a better way to solve this problem.
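A sketch of the MemoryHandler approach described in the question: early records are buffered in memory, then replayed into a FileHandler once the configuration is known. The path "/tmp/app.log", the logger name, and the format string stand in for values that would come from the config file:

```python
import logging
from logging.handlers import MemoryHandler

root = logging.getLogger("app")
root.setLevel(logging.DEBUG)
root.addHandler(logging.StreamHandler())   # stdout logging from the start

# Buffer up to 1000 records; no target yet, so nothing is flushed.
buffer = MemoryHandler(capacity=1000)
root.addHandler(buffer)

root.info("started before config was loaded")   # goes to stream + buffer

# ... later, once the configuration file has been read:
file_handler = logging.FileHandler("/tmp/app.log")
file_handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

buffer.setTarget(file_handler)   # formatting is applied at flush time
buffer.flush()                   # replay the buffered records into the file

# Swap the MemoryHandler for the FileHandler going forward.
root.removeHandler(buffer)
root.addHandler(file_handler)
```

Records are only formatted when the target handler emits them, so the early messages pick up the formatting from the configuration even though they were logged before it was loaded.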
