I am trying to dump the output of a curses window into a file without displaying it on stdout. Currently I am using the addstr() function to print on stdout, and at the end I call the instr() function to dump the entire screen into a file. In certain cases ncurses does not work properly on xterm, so I need to redirect the output to a file without actually printing it on stdout. I thought of using the logging module, but then I lose the color coding that addstr() provides. What is the best method to achieve this?
For example:
If I run the following command
$ python get_stats.py
it should display on stdout and when I run the command
$ python get_stats.py --dump-to-file
it should dump to a file without displaying on stdout.
Does addstr() take an additional parameter to determine whether the output should go to a file or stdout?
No, addstr does not take additional parameters. By default (using initscr, that is), curses writes to the standard output. You could simply do
$ python get_stats.py > myfile
But for making a program option, you would have to tell Python (or curses) to write to the given file. Python's binding for ncurses does not include newterm, which is the usual way of controlling the output stream. But in Python, you can redirect the standard output as discussed in these questions:
Temporarily Redirect stdout/stderr
Redirect stdout to a file in Python?
how to redirect stdout to file and console with scripting
Because curses saves a copy of the output stream locally, you must do the redirection before calling initscr. Also, keep in mind that curses may use the file descriptor rather than the buffered output stream.
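For the --dump-to-file option, a minimal sketch (the dump file name is illustrative) that redirects file descriptor 1 before initscr, since curses may write to the descriptor directly:
import curses
import os
import sys

def run(stdscr):
    stdscr.addstr(0, 0, "collected off-screen")  # normal curses drawing
    stdscr.refresh()

if "--dump-to-file" in sys.argv:
    saved = os.dup(1)  # keep the real stdout so it can be restored
    dump = os.open("screen.dump", os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    os.dup2(dump, 1)   # fd 1 now points at the file
    try:
        curses.wrapper(run)   # wrapper calls initscr after the redirect
    finally:
        os.dup2(saved, 1)     # restore the real stdout
        os.close(dump)
        os.close(saved)
else:
    curses.wrapper(run)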
Related
I'm building an Electron app with a Python backend, using the pythonshell module. The Electron application is supposed to log any messages my Python script prints to the console, but rather than printing each message when it's supposed to be printed, it waits until the script has finished executing and then prints everything. I've tried using sys.stdout.write("message") followed by sys.stdout.flush(), but it still doesn't work.
The question I'm linking describes a problem similar to mine, but the answer that worked for them didn't work for me in the Electron application. The Python backend is flushing properly; the frontend is what's causing the problem.
Similar question: Python sys.stdout.flush() doesn't work
file.flush() does not necessarily write the file's data to disk! You need to use flush() followed by os.fsync(fd) to ensure that behavior.
See below:
import os
import sys

sys.stdout.flush()               # empty Python's internal buffer
os.fsync(sys.stdout.fileno())    # ask the OS to commit its buffers to disk
os.fsync(fd) documentation from the Python docs:
Force write of file with file descriptor fd to disk. On Unix, this
calls the native fsync() function; on Windows, the MS _commit()
function.
If you’re starting with a Python file object f, first do f.flush(),
and then do os.fsync(f.fileno()), to ensure that all internal buffers
associated with f are written to disk.
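For instance, applying that advice to an ordinary file object (the file name here is illustrative):
import os

# Minimal sketch: flush the Python-level buffer, then fsync the
# descriptor so the OS writes its own buffers to disk as well.
with open("progress.log", "w") as f:
    f.write("step 1 done\n")
    f.flush()
    os.fsync(f.fileno())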
I am using a 3rd-party Python module which is normally invoked from the command line. When invoked that way, it has a verbose option which prints to the terminal in real time.
I then have another python program which calls the 3rd-party program through subprocess. Unfortunately, when called through subprocess the terminal output no longer flushes, and is only returned on completion (the process takes many hours so I would like real-time progress).
I can see the source code of the 3rd-party module and it does not set printing to be flushed such as print('example', flush=True). Is there a way to force the flushing through my module without editing the 3rd-party source code? Furthermore, can I send this output to a log file (again in real time)?
Thanks for any help.
The issue is most likely that many programs work differently if run interactively in a terminal or as part of a pipeline (i.e. called using subprocess). It has very little to do with Python itself, and more with the Unix/Linux architecture.
As you have noted, it is possible to force a program to flush stdout even when run in a pipeline, but it requires changes to the source code, by adding explicit stdout.flush() calls (or print(..., flush=True)).
Another way to get real-time printing is to "trick" the program into thinking it is working with an interactive terminal, using a so-called pseudo-terminal. There is a supporting module for this in the Python standard library, namely pty. Using that, you will not explicitly call subprocess.run (or Popen or ...). Instead you have to use the pty.spawn call:
import os
import pty

def prout(fd):
    # master_read callback: keep reading from the pseudo-terminal's
    # master side until the child closes it, echoing as we go.
    data = os.read(fd, 1024)
    while data:
        print(data.decode(), end="")
        data = os.read(fd, 1024)

pty.spawn(["./callee.py"], prout)
As can be seen, this requires a special function for handling stdout. Here above, I just print it to the terminal, but of course it is possible to do other things with the text as well (such as logging or parsing it); one such variant is sketched below.
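For the logging half of your question, a variant of the callback (the log file name is illustrative) could append each chunk to a file as it arrives:
import os
import pty

def prout_and_log(fd):
    # Echo the child's output and append it to a log file in real time.
    with open("progress.log", "a") as log:
        data = os.read(fd, 1024)
        while data:
            text = data.decode()
            print(text, end="")
            log.write(text)
            log.flush()  # keep the file current while the child runs
            data = os.read(fd, 1024)

pty.spawn(["./callee.py"], prout_and_log)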
Another way to trick the program is to use an external program called unbuffer. Unbuffer will run your script and make the program think (as with the pty call) that it is called from a terminal. This is arguably simpler if unbuffer is installed, or you are allowed to install it on your system (it is part of the expect package). All you have to do then is change your subprocess call to
import subprocess

p = subprocess.Popen(["unbuffer", "./callee.py"], stdout=subprocess.PIPE)
and then of course handle the output as usual, e.g. with some code like
for line in p.stdout:
    print(line.decode(), end="")
print(p.communicate()[0].decode(), end="")
or similar. But this last part I think you have already covered, as you seem to be doing something with the output.
Currently I am redirecting a script to a log file with the following command:
python /usr/home/scripts/myscript.py 2>&1 | tee /usr/home/logs/mylogfile.log
This seems to work, but it does not write to the file as soon as there is a print statement. Rather, it waits until there is a group of lines that it can write. I want the console and the log file to be written to simultaneously. How can this be done with output redirection? Note that running the script on the console alone prints everything when it should, but doing a tail -f on the log file is not smooth, since it writes about 50 lines at a time. Any suggestions?
It sounds like the shell is actually what's doing the buffering, since you say it outputs as expected to the console when not tee'd.
You could look at this post for potential solutions to undo that shell buffering: https://unix.stackexchange.com/questions/25372/turn-off-buffering-in-pipe
But I would recommend doing it entirely within Python, so you have more direct control, and instead of printing to stdout, use the logging module.
This would give you additional flexibility: multiple logging levels, the ability to attach multiple destinations to the logging object centrally (e.g. stdout and a file, one which rotates with size if you'd like via logging.handlers.RotatingFileHandler), and you wouldn't be subject to the external buffering of the shell.
More info: https://docs.python.org/2/howto/logging.html
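A minimal sketch of that setup (the logger name, file size, and backup count are illustrative; the log path is the one from your command):
import logging
from logging.handlers import RotatingFileHandler

logger = logging.getLogger("myscript")
logger.setLevel(logging.INFO)
# One handler for the console (stderr by default) ...
logger.addHandler(logging.StreamHandler())
# ... and one for a file that rotates once it reaches ~1 MB.
logger.addHandler(RotatingFileHandler("/usr/home/logs/mylogfile.log",
                                      maxBytes=1_000_000, backupCount=3))
logger.info("this line reaches the console and the file immediately")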
I wrote a Python script with several print statements. The printed information helps me monitor the progress of the script. But when I qsub the bash script, which contains python my_script &> output, onto computing nodes, the output file contains nothing even while the script is running and printing something. The output file only contains the output once the script is done. How can I get the output in real time through the output file while the script is running?
Actually write to the file rather than piping, and flush after each write (see the sketch below); or, if you keep printing, call sys.stdout.flush() after each print. But you are better off using a logger function and replacing the prints with log calls.
From Comments:
A logger function is one that you call instead of print; it outputs the text somewhere, possibly timestamped and with other information. Loggers usually let you send varying amounts of information to various destinations, including stdout and files. See the Python 2 or 3 documentation for Python's built-in logging module.
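A minimal sketch of the direct-write approach (the loop is illustrative; "output" is the file from your qsub command):
# Write the log file directly and flush after every line, instead of
# relying on the redirection's buffering.
with open("output", "w") as log:
    for step in range(5):
        log.write("finished step %d\n" % step)
        log.flush()  # the line is visible to tail -f immediately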
I like to write data to sys.stderr sometimes for this sort of thing. It obviates the need to flush so much. But if you're generating output for piping sometimes, you remain better off with sys.stdout.
I need to monitor a screen session in real time using a Python script. It needs to know when the display changes. I believe this can be described as whenever stdout is flushed, or a character is entered to stdin. Is there some way to do this; perhaps with pipes?
I have some code found here that gets a character from stdin, and I assume it works on a pipe (if I modify the code, or change sys.stdin)?
Does the flush function of a stream (like stdout) get called in a pipe, or is it just called explicitly? My understanding is that the display is only updated when stdout is flushed.
Probably you want to take a look at script, which already does pretty much everything you want.
Have you tried Python's curses module? It is similar to Linux curses and provides a good way to handle terminal-related I/O.