How to set line buffering in the FileHandler of the Python logging module? - python

I would like to be able to watch the messages in the log file as they are written, but nothing appears in the file until the script exits. For comparison, the built-in open() function accepts a buffering=1 argument to request line buffering; is there an equivalent for FileHandler?
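No answer is recorded here, but one common workaround is a small FileHandler subclass that opens its file line-buffered. This is a sketch only; the class name is mine, not part of the logging API:

import logging

class LineBufferedFileHandler(logging.FileHandler):
    # Hypothetical helper: open the log file with buffering=1 (line buffering)
    # so each record reaches disk as soon as its line is written.
    def _open(self):
        return open(self.baseFilename, self.mode, buffering=1, encoding=self.encoding)

logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
logger.addHandler(LineBufferedFileHandler('watch.log'))
logger.info('visible in watch.log immediately')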

Related

Dump curses window output to a file in python

I am trying to dump the output of a curses window into a file without displaying it on stdout. Currently I am using the addstr() function to print to stdout, and at the end I call the instr() function to dump the entire screen into a file. In certain cases ncurses does not work properly on xterm, so I need to redirect the output to a file without actually printing it on stdout. I thought of using the logging module, but then I lose the color coding that addstr() provides. What is the best way to achieve this?
For example:
If I run the following command
$ python get_stats.py
it should display on stdout and when I run the command
$ python get_stats.py --dump-to-file
it should dump to a file without displaying on stdout.
Does addstr() take additional parameters to determine whether the output should go to a file or to stdout?
No, addstr does not take additional parameters. By default (using initscr, that is), curses writes to the standard output. You could simply do
python get_stats.py >myfile
But for making a program option, you would have to tell Python (or curses) to write to the given file. Python's binding for ncurses does not include newterm, which is the usual way of controlling the output stream. But in Python, you can redirect the standard output as discussed in these questions:
Temporarily Redirect stdout/stderr
Redirect stdout to a file in Python?
how to redirect stdout to file and console with scripting
Because curses saves a copy of the output stream locally, you must do the redirection before calling initscr. Also, keep in mind that curses may use the file descriptor rather than the buffered output stream.
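A rough sketch of that file-descriptor-level redirection (the filename, the --dump-to-file handling, and the flag choices are illustrative assumptions, not from the original answer; curses still needs a usable TERM to emit anything):

import curses
import os
import sys

def run(stdscr):
    stdscr.addstr(0, 0, 'collected stats go here')
    stdscr.refresh()

if '--dump-to-file' in sys.argv:
    # Redirect fd 1 itself before initscr, because curses may write to the
    # file descriptor rather than to Python's buffered sys.stdout object.
    dump_fd = os.open('screen.dump', os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
    os.dup2(dump_fd, sys.stdout.fileno())

curses.wrapper(run)  # wrapper calls initscr after the redirection above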

How to log all print statements

My program is quite big and I want all of its print statements to be logged, so I implemented the following:
F = open('testy.txt', 'w')
sys.stdout = F
if app.button_press() == True and app.return_data():
    data = app.return_data()
    main(data[0], data[1], data[2], data[3], data[4], data[5], data[6], data[7], data[8])
F.close()
This is what I used to do with small programs that only had a few print statements, but this program has a few hundred of them, and when I run it, it freezes; I suspect the volume of print output causes a memory problem. How can I log all my print statements to a .txt file without affecting the functionality of the program?
You shouldn't be using print statements for logging, nor should you be redirecting standard out (or any other standard stream) in production code. Rather, you should be using the logging module to write messages, and setting up whatever handlers, probably a file handler or rotating file handler in your case, that you need to record your log messages to the right place.
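A minimal sketch of that setup (the file name and rotation limits are illustrative, not taken from the question):

import logging
from logging.handlers import RotatingFileHandler

handler = RotatingFileHandler('testy.txt', maxBytes=1_000_000, backupCount=3)
handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info('button pressed, starting main()')  # instead of print(...)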
If you already have too much existing code printing log messages to refactor in one sitting, I'd suggest implementing logging, using it for all logging going forward, setting up a file-like object that shunts standard out to logging to capture existing log messages, then refactoring out all your print-based logging over time.
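The "shunt standard out to logging" step is usually done with a small file-like shim such as the one below (a well-known pattern, not a standard-library class):

import logging
import sys

class StreamToLogger:
    # File-like object that forwards anything written to it to a logger.
    def __init__(self, logger, level=logging.INFO):
        self.logger = logger
        self.level = level

    def write(self, message):
        for line in message.rstrip().splitlines():
            self.logger.log(self.level, line.rstrip())

    def flush(self):
        pass  # records are flushed by the logging handlers, not here

logging.basicConfig(filename='testy.txt', level=logging.INFO)
sys.stdout = StreamToLogger(logging.getLogger('stdout'))
print('existing print statements now land in the log file')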

How do I define a different logger for an imported module in Python?

I'm using Advanced Python Scheduler in a Python script. The main program defines a log by calling logging.basicConfig with the file name of the log that I want. This log is also set to "DEBUG" as the logging level, since that's what I need at present for my script.
Unfortunately, because logging.basicConfig has been set up in this manner, apscheduler writes its log entries to the same log file. There are an awful lot of these, especially since I have one scheduled task that runs every minute.
Is there any way to redirect apscheduler's log output to another log file (without changing apscheduler's code) while using my log file for my own script? I.e. is there a way to change the file name for each module's output within my script?
I tried reading the module page and the HOWTO for logging, but could not find an answer to this.
Set the logger level for apscheduler to your desired value (e.g. WARNING, to avoid seeing DEBUG and INFO messages from apscheduler), like this:
logging.getLogger('apscheduler').setLevel(logging.WARNING)
You will still get messages for WARNING and higher severities. To direct messages from apscheduler into a separate file, use
aplogger = logging.getLogger('apscheduler')
aplogger.propagate = False
aplogger.setLevel(logging.WARNING) # or whatever
aphandler = logging.FileHandler(...) # as per what you want
aplogger.addHandler(aphandler)
Ensure the above code is only called once (otherwise you will add multiple FileHandler instances - probably not what you want).
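Put together with the main script's own configuration, the whole setup might look like this (file names are examples, not from the question):

import logging

# Main script log, as configured by the original basicConfig call.
logging.basicConfig(filename='my_script.log', level=logging.DEBUG,
                    format='%(asctime)s %(levelname)s %(message)s')

# Separate, quieter file for apscheduler; propagate=False keeps its records
# out of the root logger's file configured above.
aplogger = logging.getLogger('apscheduler')
aplogger.propagate = False
aplogger.setLevel(logging.WARNING)
aphandler = logging.FileHandler('apscheduler.log')
aphandler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
aplogger.addHandler(aphandler)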
Maybe you want to call logging.getLogger("apscheduler") and set up its log file on that logger? See this answer: https://stackoverflow.com/a/2031557/782168

Linux: get the location of logfiles

I would like to have my (Python) script, located anywhere, write a logfile
and place that logfile in an appropriate location. On Debian Linux this could be /var/log.
I was wondering if such a logfile location could be retrieved from the system? Is there an environment variable or something?
The typical way to log on Linux/UNIX is to use the system logger. From your application (or daemon) you call the syslog system function (see manpage).
This function will forward your log to the system logger. From there, the system logger will take care of writing these log messages to a file. You can also then customize the behavior of the system logger to write some of your messages to a special file, or to ignore them.
For a direct answer to your question, /var/log/ is defined by the FHS as the location where log files must be written to. No need to rely on an environment variable.
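From Python you can reach the system logger through the standard library; here is a minimal sketch using logging.handlers.SysLogHandler (the /dev/log socket path is typical on Debian, but an assumption here):

import logging
from logging.handlers import SysLogHandler

# Hand records to syslog; the system logger decides which file under
# /var/log (if any) they end up in.
handler = SysLogHandler(address='/dev/log')
logger = logging.getLogger('myscript')
logger.setLevel(logging.INFO)
logger.addHandler(handler)
logger.info('hello from myscript')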

Mysterious logging.basicConfig problem (Python)

I'm writing a Python script to retrieve data from Flickr. For logging purposes, I have the following setup function:
def init_log(logfile):
    format = '%(asctime)s - %(levelname)s - %(message)s'
    logging.basicConfig(filename=logfile, level=logging.DEBUG, format=format)
I've tested this using the python shell and it works as expected, creating a file if one doesn't already exist. But calling it from within my program is where it stops working. The function is definitely being called, and the logfile parameter is working properly – logging.basicConfig just isn't creating any file. I'm not even getting any errors or warnings.
My use of the Python Flickr API may be the culprit, but I doubt it. Any ideas?
The logging.basicConfig function only does anything if the root logger has no handlers configured. If called when there are already some handlers attached to the root, it's basically a no-op (as is documented).
Possibly the Python Flickr API does some logging, in which case you may find that basicConfig should be called earlier in your code.
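If you cannot guarantee that your setup runs before any other code touches logging, one workaround on Python 3.8+ is basicConfig's force argument, which removes any handlers already attached to the root logger before reconfiguring it. A sketch based on the question's init_log:

import logging

def init_log(logfile):
    fmt = '%(asctime)s - %(levelname)s - %(message)s'
    # force=True (Python 3.8+) clears handlers another import may have
    # already attached to the root logger, so this call is not a no-op.
    logging.basicConfig(filename=logfile, level=logging.DEBUG,
                        format=fmt, force=True)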
