Due to Apache gateway timeouts, and a desire to display more information to the end user, I'd like to be able to flush STDOUT in a Python CGI script hosted on PCF, essentially giving updates on the status of the script.
I have tried enabling the -u flag in Python (#!/usr/bin/python -u at the head of my script), calling sys.stdout.flush(), and even using subprocess.call to execute a Perl script that is set to flush to STDOUT and prints some progress text ($| = 1; at the beginning of the Perl script). Furthermore, I've double-checked that I'm not using any Apache modules that would require buffering (no mod_deflate, for example). Finally, I'll mention that executing a standard Perl CGI rather than a Python CGI does allow the STDOUT flushing, so I figure it must be something in my Python/Apache/PCF configuration.
I'm fresh out of ideas here, and would like some advice.
I would have expected any of the above methods to flush stdout, but none of them have worked!
Thanks in advance for any assistance.
You can disable buffering with something like this in your Python 2 code:
import os
import sys

# reopen sys.stdout as non-buffered
if hasattr(sys.stdout, 'fileno'):
    fileno = sys.stdout.fileno()
    tmp_fd = os.dup(fileno)    # keep a copy of the original descriptor
    sys.stdout.close()         # close the buffered file object
    os.dup2(tmp_fd, fileno)    # restore the descriptor at its old number
    os.close(tmp_fd)
    sys.stdout = os.fdopen(fileno, "w", 0)  # 0 = unbuffered (Python 2)
That reopens sys.stdout with no buffer (the 0 as the third argument). After you do that, anything written to sys.stdout should not be buffered.
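With that in place, a CGI script can emit status updates as it works. A minimal sketch, assuming a plain-text response (the header and the sleep loop below are illustrative, not from the original question):
import time

print "Content-Type: text/plain"
print  # blank line ends the CGI headers

for step in range(5):
    print "step %d of 5 complete" % (step + 1)
    time.sleep(1)  # stand-in for real work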
Related
I am including in my Python code a function compiled in C via a Cython wrapper. I have to take that function as given and cannot change it. Unfortunately, when I run that function, I see output that is bothering me.
I have tried a lot of tricks that are supposed to get rid of it, all of which play with sys.stdout or sys.stderr -- most notably, the new contextlib.redirect_stdout. However, nothing I tried managed to silence the output.
At the most basic level, I simply tried setting
import os
import sys

sys.stdout = open(os.devnull, 'w')
sys.stderr = open(os.devnull, 'w')
That is not a safe or practical way of doing it, but it should at least shut the function up. Unfortunately, I can still see the output. What am I missing? Is there perhaps another "output type" besides stdout that this function might be using?
If it helps, I am inside a Pycharm debugging session and see this output in my debugging console.
Updated question to reflect that changing stderr did not help
A C function prints to a file descriptor (1 for stdout, 2 for stderr). If you want to prevent the printing, redirect that FD; this can also be done temporarily. Here is a little demo:
import os

STDOUT = 1
saved_fd = os.dup(STDOUT)                   # save the original stdout descriptor
null_fd = os.open(os.devnull, os.O_WRONLY)
os.dup2(null_fd, STDOUT)
os.system('echo TEST 1')                    # redirected to /dev/null
os.dup2(saved_fd, STDOUT)
os.system('echo TEST 2')                    # normal
# note: close null_fd and saved_fd when no longer needed
If the C code opens the terminal device itself, there is very little you can do to prevent it. But that would be very unusual (I would even say a bug) unless there is a specific reason to do so.
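To make the temporary redirection convenient, the same dup/dup2 dance can be wrapped in a context manager. A sketch, assuming a POSIX system (the suppress_fd helper is my name, not from the answer):
import os
from contextlib import contextmanager

@contextmanager
def suppress_fd(fd):
    # hypothetical helper: silence a file descriptor for the duration of the block
    saved = os.dup(fd)
    null_fd = os.open(os.devnull, os.O_WRONLY)
    os.dup2(null_fd, fd)
    os.close(null_fd)
    try:
        yield
    finally:
        os.dup2(saved, fd)  # restore the original descriptor
        os.close(saved)

with suppress_fd(1):           # 1 = stdout
    os.system('echo hidden')   # goes to /dev/null
os.system('echo visible')      # prints normally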
Is there perhaps another "output type" besides stdout that this function might be using?
Yes, there is stderr, which is unaffected by a stdout redirect. As a simple example, let printer.py contain:
import sys
sys.stderr.write("printing to stderr")
Then running in a terminal:
python printer.py > output.txt
still prints
printing to stderr
to the terminal, because > output.txt redirects only stdout.
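If you want to capture stderr as well, redirect it into the same file (standard shell syntax, nothing specific to this script):
python printer.py > output.txt 2>&1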
I am using Python to call a shell script with:
import subprocess

def run_command(cmd):
    print "Start to run: " + cmd
    run = subprocess.Popen(cmd, shell=True,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    while True:
        line = run.stdout.readline().decode()[:-1]
        if line == '' and run.poll() is not None:
            break
        print line  # relay the log from the shell
    recode = run.returncode
    if recode != 0:
        raise Exception("Error occurs!")
    print "End to run: " + cmd
Then I run
run_command("sh /home/tome/a.sh")
I notice the console output from a.sh is not in real time; it looks like stdout is buffered, and the output is only printed once the buffer fills up.
How can I disable stdout buffering for my script a.sh?
Thanks!
The buffering in question would largely be a problem on the script's side, not the Python side; while Python would buffer the reads, it wouldn't block unless the buffer was emptied and there was nothing available to read.
So really, you need to disable buffering in the script itself. Adding stdbuf -oL (or -o0 for completely unbuffered, but line buffering should cover you since you read by line as well) to your commands should help in some cases (where the programs don't adjust their own buffering internally).
If you're seeing this behavior only by looking at Python's output, be aware that Python itself can buffer output as well. You can disable this by passing -u when running Python, or by setting the environment variable PYTHONUNBUFFERED=1 before running it; from within a script, you can manually call sys.stdout.flush() after any writes (direct, or implicit via print) to stdout. On Python 3.3+, print takes a flush=True argument to force a flush after printing, but since you're on Python 2.x, that's not an option.
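For example, a sketch of the stdbuf approach (this assumes GNU coreutils is available on the machine):
# line-buffer a.sh's stdout so each line arrives as it is written
run_command("stdbuf -oL sh /home/tome/a.sh")
On the Python side, adding sys.stdout.flush() after the print inside the loop keeps the relayed output prompt as well.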
I'm fairly new to Linux/Python programming. I tried googling about this but could not find anything useful.
I wrote a simple script that reads lines from a serial port and prints them (as they are read) to stdout. Here's the relevant code:
ser = serial.Serial(args.port)
while True:
    print(ser.readline())
I also wrote a script (this is only for testing purposes) that echoes lines read from stdin to stdout. Here's the code for that:
while True:
    print(args.prefix + input())
I'm using python3, and the scripts are named serial.py and echo.py respectively.
What I would like to do is to pipe the output of serial to the input of echo (echo will later be replaced by a script that writes to a database), and leave those running indefinitely.
I tried both scripts separately and they work fine, but nothing gets printed when I pipe both commands:
./serial.py --port /dev/ttyACM0 | ./echo.py
It does work when I pipe echo to itself:
awer#napalm:~$ ./echo.py --prefix AAA | ./echo.py --prefix BBB
hi!
BBBAAAhi!
What am I doing wrong?
Thanks for any help on this.
Best regards
This could be an issue with buffered stdout. Try running serial.py with the -u flag of the python3 interpreter, which forces stdout and stderr to be unbuffered, as stated by the docs:
-u Force the binary I/O layers of stdout and stderr to be
unbuffered. stdin is always buffered. The text I/O layer will
still be line-buffered.
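In this case that means invoking the producer (and, if you like, the consumer too) through the interpreter with -u:
python3 -u ./serial.py --port /dev/ttyACM0 | python3 -u ./echo.py
Setting the environment variable PYTHONUNBUFFERED=1 has the same effect without changing the command line.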
I am writing a program to communicate with two programs:
import select
import shlex
from subprocess import Popen, PIPE

output = Popen(shlex.split(query_cmd), stdout=PIPE, stdin=None)
cmd_out = [output.stdout]
while cmd_out:
    readable, writeready, exceptready = select.select(cmd_out, [], [], timeout)
    for f in readable:
        line = f.readline()
        snap_result = Popen(shlex.split("snap %s" % (line)),
                            stdout=PIPE, close_fds=True).communicate()[0]
        print snap_result
query_cmd is supposed to continuously generate lines of results. snap should take each line as an argument, return its results, and terminate. This works on Python 2.4. However, on Python 2.6.6, snap seems to hang when its result is read.
If I change query_cmd to "tail -f file", it works as well.
I am running this inside a csh script where both stdout/stderr are redirected to a log file.
EDIT: Actually, it is weird: in csh I redirected both stdout and stderr to the log file. If I only redirect stdout, it runs fine; if I redirect stderr, it hangs. I think the stderr is somehow getting crossed between the parent Python process and the child process.
It turned out not to be a problem with the script itself; the subprocess was expecting input on stdin. Redirecting stdin to the null device solves this.
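Concretely, that means passing an explicit stdin when creating the process. A sketch in the Python 2 style of the question (subprocess.DEVNULL only exists from Python 3.3 on, so a file object on os.devnull is used instead; query_cmd is as in the question):
import os
import shlex
from subprocess import Popen, PIPE

devnull = open(os.devnull, 'r')          # the null device as stdin
output = Popen(shlex.split(query_cmd),
               stdout=PIPE, stdin=devnull)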
I'm writing some code which involves running a few shell commands from Python and ideally, I would like to integrate the output from these commands into the logger that I'm using. I know I can divert stdout into a file / socket as follows:
call('<a-shell-cmd>', shell=True, stdout=myFile)
but I'd rather not have the hassle of opening a temporary file, looping over the file to write out the output, closing the file, deleting the file, and so on. If there's any way I can send the output directly to the logger, that would seem a lot neater to me. Any ideas?
Use the subprocess module.
Tip: you can get to the documentation for a particular version of Python via http://docs.python.org/release/<major>.<minor>/
From Python 2.7 and above:
output = subprocess.check_output(["command", "arg1"])  # pass an argument list directly; with shell=True the extra args would go to the shell, not the command
In Python 2.4:
process = subprocess.Popen(["command", "arg1"], stdout=subprocess.PIPE)
stdout, stderr = process.communicate()
# not shown: how to use Popen.poll() to wait for process death.
# while filling an output buffer
print stdout
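Where those comments leave off, an incremental read with poll() might look like this (a sketch, not part of the original answer):
process = subprocess.Popen(["command", "arg1"], stdout=subprocess.PIPE)
while True:
    line = process.stdout.readline()
    if not line and process.poll() is not None:
        break  # the process has exited and the pipe is drained
    print line.rstrip()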
Below Python 2.4:
output = os.popen('ls').read()  # os.popen returns a pipe; read it to get the text
Use os.popen:
output = os.popen('ls').read()
You can then log output, or pass it to the logger directly at the call site.
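To tie this back to the logger, a sketch of streaming a command's output into the logging module line by line ('<a-shell-cmd>' is the placeholder from the question; the logger name is illustrative):
import logging
import subprocess

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger('shell')

process = subprocess.Popen('<a-shell-cmd>', shell=True,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in process.stdout:
    logger.info(line.rstrip())   # one log record per output line
process.wait()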