I am trying to log the output of a Python script. When the script is as follows, there is no issue with logging:
print('Test')
But when I enclose it in a while loop with a sleep, it does not log the output for some reason:
import time

while True:
    print('Test')
    time.sleep(3600)
I am using Ubuntu, and the command I run to log is:
python3 script.py > /home/usr/Test.txt
So when you write to a file from bash like this, the output is block-buffered: you will not see anything until the buffer fills or your program finishes. It may be better to let Python write to the file from within its own code.
Here is an example:
import time

while True:
    new_data = 'hello fello!'
    with open('file.txt', 'a+') as f:
        f.write(new_data + '\n')
    time.sleep(2)
Notice I added a newline \n after the new data to prevent everything from ending up on the same line. The 'a' in open() means append to the file; the + means open it for reading as well. Because the file is reopened and closed on every iteration, each line is flushed to disk immediately.
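Equivalently, you can let print handle the newline via its file argument; a minimal variant of the same idea:

import time

while True:
    with open('file.txt', 'a') as f:
        # print appends the newline itself; closing the file flushes it
        print('hello fello!', file=f)
    time.sleep(2)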
You need to flush the stream to make sure it's written immediately.
import sys
import time

while True:
    print('Test')
    sys.stdout.flush()
    time.sleep(3600)
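On Python 3 you can also pass flush=True to print() instead of flushing manually:

import time

while True:
    # flush=True writes the line out immediately instead of buffering it
    print('Test', flush=True)
    time.sleep(3600)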
I wanted to run a test function called test.counttest() that counts up to 10.
import time

def counttest():
    for x in range(0, 3):
        x = x + 1
        print("Number: " + str(x))
        time.sleep(1)
I want to call just the function from the command line OR from subprocess.Popen. Not write the function, just call it. Everything I have googled keeps bringing me back to how I can write a function from the command line, which is NOT what I need.
I need to specifically run a function from subprocess.Popen so I can get the stdout in a for loop that can then be sent to a Flask socket. (This is required.)
Main point: how can I call (not write) a function from the command line or from subprocess?
Not this:
python -c 'import whatever then add code'
But something like this:
python "test.counttest()"
or like this:
subprocess.Popen(['python', ".\test.counttest()"],stdout=subprocess.PIPE, bufsize=1,universal_newlines=True)
EDIT:
This is for @Andrew Holmgren. Consider the following script:
import subprocess

def echo(ws):
    data = ws.receive()
    with subprocess.Popen(['powershell', ".\pingtest.ps1"],
                          stdout=subprocess.PIPE, bufsize=1,
                          universal_newlines=True) as process:
        for line in process.stdout:
            line = line.rstrip()
            print(line)
            try:
                ws.send(line + "\n")
            except Exception:
                pass
This works perfectly for what I need, as it takes the script's stdout and sends it to the ws.send() function, which is a websocket.
However, I need this same concept for a function instead. The only way I know to get the stdout easily is by using subprocess.Popen, but if there is another way, let me know. This is why I am trying to make a hackjob way of running a function through the subprocess module.
The question "Run python function from command line or subprocess popen" relates in that, if I can get a function to run from subprocess, then I know how to get the stdout into a websocket.
Actually, you have quite a few questions rolled into this one.
How can I send the output of a function line by line to another one (and/or a websocket)? Just avoid writing to stdout and communicate directly. yield (and other ways of creating generators) are intended exactly for that.
import time

def counttest():
    for i in range(10):
        yield f'Item {i}'
        time.sleep(1)

def echo(ws):
    # data = ws.receive()
    for row in counttest():
        ws.send(row)
How do I call a function func_name defined in a file (suppose it's test.py) from the command line? Being in the directory with test.py, do:
$ python -c 'from test import func_name; func_name()'
How do I read from sys.stdout? The easiest way is to replace it with an io.StringIO and restore things back later.
from contextlib import redirect_stdout
import io

def echo(ws):
    f = io.StringIO()
    with redirect_stdout(f):
        counttest()
    output = f.getvalue()
    ws.send(output)
It will return only after counttest() finishes, so you cannot monitor printed items in real time.
Regarding
I need the stdout
I can say that I'm fairly sure your question is an X-Y problem, which is why I'm suggesting alternatives. The solution you want will also work, but it's awkward. This will run exactly the function counttest defined in test.py, capture its output, and send it line by line to the websocket. It processes output immediately when a new line arrives. Note the -u flag on the python call (unbuffered); it's important.
import subprocess
import shlex

def echo(ws):
    data = ws.receive()
    with subprocess.Popen(shlex.split("python -u -c 'from test import counttest; counttest()'"),
                          stdout=subprocess.PIPE,
                          bufsize=1,
                          universal_newlines=True) as process:
        for line in iter(process.stdout.readline, ''):
            line = line.rstrip()
            if not line:
                break
            print(line)
            try:
                ws.send(line + "\n")
            except Exception:
                pass
I have a thread which is supposed to close the whole application, so I use os._exit(1). But I also want to redirect the output of my program to a file, and the output file ends up empty. A simple example:
import os
print('something')
os._exit(1)
Running program with:
python myprogram.py > output.txt
Is there any way to do this?
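Since os._exit() terminates the process immediately and skips all cleanup, including flushing stdio buffers, the redirected output is lost. A minimal sketch of one fix is to flush explicitly before exiting:

import os
import sys

print('something')
# os._exit() does not flush buffers, so do it by hand first
sys.stdout.flush()
os._exit(1)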
So I'm trying to create a Python backend script for an Electron app. I want to be able to continually pass system inputs to the Python file and have it run a function whenever the system input changes. I don't want to have to run the Python script each time, as it takes a few seconds to load modules and data and just slows down the app.
I can't find a good description anywhere of how to do this.
import sys

sysArg = []

def sysPrint(sysArgs):
    print(sysArgs[1:])

while True:
    if sys.argv != sysArg:
        sysPrint(sys.argv)
        sysArg = sys.argv
This doesn't work for me, and a "while True" loop doesn't feel very safe CPU-wise either.
I'm also thinking that sys.argv might not be the right choice, as it is perhaps only populated when the Python script is first called?
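(That suspicion is right: sys.argv is populated once at process startup and never changes. For a long-running backend, one common alternative is to read new inputs from stdin; a minimal sketch, assuming the Electron side writes one request per line:)

import sys

# sys.argv is fixed at startup; stdin lets the caller keep
# sending new inputs to the same long-running process
for line in sys.stdin:
    args = line.strip().split()
    if args:
        print(args)  # replace with the real handler function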
If I haven't misunderstood your question, you can just redirect the system output to a file and use nohup to run the program as a background process,
like this:
nohup some_program &> output.log &
And then you can handle the output file like this:
import time

def tail(f):
    # start at the end of the file
    f.seek(0, 2)
    while True:
        line = f.readline()
        if not line:
            time.sleep(0.1)
            continue
        yield line

if __name__ == '__main__':
    output_file = open("output.log", "r")
    for line in tail(output_file):
        print(line)
I have a simple Python script which should read from stdin, so I redirect the stdout of another program to the stdin of my Python script. But the output logged by that program only reaches the Python script when the logging program gets killed. I actually want to handle each logged line as soon as it is available, not when the producing program, which should run 24/7, quits. So how can I make this happen? How can I make stdin not wait for CTRL+D or EOF before handling data?
Example
# accept_stdin.py
import sys
import datetime

for line in sys.stdin:
    print datetime.datetime.now().second, line
# print_data.py
import time
print "1 foo"
time.sleep(3)
print "2 bar"
# bash
python print_data.py | python accept_stdin.py
Like all file objects, the sys.stdin iterator reads input in chunks; even if a line of input is ready, the iterator will try to read up to the chunk size or EOF before outputting anything. You can work around this by using the readline method, which doesn't have this behavior:
while True:
    line = sys.stdin.readline()
    if not line:
        # End of input
        break
    do_whatever_with(line)
You can combine this with the 2-argument form of iter to use a for loop:
for line in iter(sys.stdin.readline, ''):
    do_whatever_with(line)
I recommend leaving a comment in your code explaining why you're not using the regular iterator.
It is also an issue with your producer program, i.e. the one whose stdout you pipe into your Python script. Indeed, since that program only prints and never flushes, the data it prints is kept in its internal stdout buffer and is not flushed to the system.
Add a sys.stdout.flush() call right after your print statement in print_data.py.
You see the data when you quit the program because it automatically flushes on exit. See this question for an explanation.
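For reference, the producer from the example above would then look like this (keeping the Python 2 print statements used there):

# print_data.py
import sys
import time

print "1 foo"
sys.stdout.flush()  # push the buffered line out immediately
time.sleep(3)
print "2 bar"
sys.stdout.flush()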
As said by @user2357112, you need to use:
for line in iter(sys.stdin.readline, ''):
After that, you need to start Python with the -u flag so that output is flushed immediately (unbuffered).
python -u print_data.py | python -u accept_stdin.py
You can also specify the flag in the shebang.
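For example, at the top of the producer script (using the interpreter path directly, since /usr/bin/env with an extra argument is not portable everywhere):

#!/usr/bin/python -u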
I have a Python script that I want to run in IPython. I want to redirect (write) the output to a file, similar to:
python my_script.py > my_output.txt
How do I do this when I run the script in IPython, i.e. with execfile('my_script.py')?
There is an older page describing a function that could be written to do this, but I believe that there is now a built-in way to do this that I just can't find.
IPython has its own context manager for capturing stdout/err, but it doesn't redirect to files; it redirects to an object:
from IPython.utils import io

with io.capture_output() as captured:
    %run my_script.py

print captured.stdout  # prints stdout from your script
And this functionality is exposed in a %%capture cell-magic, as illustrated in the Cell Magics example notebook.
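In a notebook, that looks roughly like this; the resulting captured object has a stdout attribute you can write to a file yourself:

%%capture captured
%run my_script.py

Then, in a later cell:

with open('my_output.txt', 'w') as f:
    f.write(captured.stdout)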
It's a simple context manager, so you can write your own version that would redirect to files:
import sys

class redirect_output(object):
    """context manager for redirecting stdout/err to files"""
    def __init__(self, stdout='', stderr=''):
        self.stdout = stdout
        self.stderr = stderr

    def __enter__(self):
        self.sys_stdout = sys.stdout
        self.sys_stderr = sys.stderr

        if self.stdout:
            sys.stdout = open(self.stdout, 'w')
        if self.stderr:
            if self.stderr == self.stdout:
                sys.stderr = sys.stdout
            else:
                sys.stderr = open(self.stderr, 'w')

    def __exit__(self, exc_type, exc_value, traceback):
        # close any file we opened, then restore the original streams
        if sys.stdout is not self.sys_stdout:
            sys.stdout.close()
        if sys.stderr is not self.sys_stderr and sys.stderr is not sys.stdout:
            sys.stderr.close()
        sys.stdout = self.sys_stdout
        sys.stderr = self.sys_stderr
which you would invoke with:
with redirect_output("my_output.txt"):
    %run my_script.py
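If you also want stderr in its own file, pass both keyword arguments (these are just the parameters defined on the class above):

with redirect_output(stdout="my_output.txt", stderr="my_errors.txt"):
    %run my_script.py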
To quickly store text contained in a variable while working in IPython, use %store with > or >>:
%store VARIABLE >>file.txt (appends)
%store VARIABLE >file.txt (overwrites)
(Make sure there is no space immediately following the > or >>)
For just one script to run, I would do the redirection in bash:
ipython -c "execfile('my_script.py')" > my_output.txt
On Python 3, execfile no longer exists, so use this instead:
ipython -c "exec(open('my_script.py').read())" > my_output.txt
Be careful with the double vs single quotes.
While this is an old question, I found it and the answers here while facing a similar problem.
The solution I found after sifting through the IPython cell magics documentation is actually fairly simple: at its most basic, assign the output of the command to a variable.
This simple two-cell example shows how to do that. In the first notebook cell we define the Python script, with some output to stdout, using the %%writefile cell magic.
%%writefile output.py
print("This is the output that is supposed to go to a file")
Then we run that script as if it were run from a shell, using the ! operator.
output = !python output.py
print(output)
>>> ['This is the output that is supposed to go to a file']
Then you can easily make use of the %store magic to persist the output.
%store output >output.log
Notice, however, that the output of the command is persisted as a list of lines. You might want to join it with "\n".join(output) prior to storing it.
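For example (output_text is just an illustrative variable name; %store persists a variable, not an expression):

output_text = "\n".join(output)
%store output_text >output.log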
Use this code to save the output to a file:
import sys
import time
from threading import Thread

# periodically write the captured stdout to a file
def log():
    # 'run' is a module-level flag used to stop the thread
    global run
    while run:
        try:
            # under %%capture, sys.stdout is a StringIO-like object
            text = str(sys.stdout.getvalue())
            with open("out.txt", 'w') as f:
                f.write(text)
        finally:
            time.sleep(1)
%%capture out
run = True
print("start")

# keep a reference to the thread so we can join it later
process = Thread(target=log)
process.start()

# do some work
for i in range(10, 1000):
    print(i)
    time.sleep(1)

run = False
process.join()
It is useful to use a text editor that tracks changes to the file and suggests reloading it, such as Notepad++.
I wonder why the accepted solution doesn't entirely work in a loop; the following:
for i in range(N):
    with redirect_output("path_to_output_file"):
        %run <python_script> arg1 arg2 arg3
creates N files in the directory with only the output from the first print statement of the <python_script>. Just confirming: when run separately for each iteration of the for loop, the script produces the right result.
There's the hacky way of overwriting sys.stdout and sys.stderr with a file object, but that's really not a good way to go about it. If you want to control where the output goes from inside Python, you should implement some sort of logging and/or output-handling system that you can configure via the command line or function arguments, instead of using print statements.
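A minimal sketch of that idea with the standard logging module (the file name and format here are arbitrary):

import logging

# configure once, e.g. from a command-line option
logging.basicConfig(filename='my_output.txt',
                    level=logging.INFO,
                    format='%(asctime)s %(levelname)s %(message)s')
log = logging.getLogger(__name__)

log.info('this goes to my_output.txt instead of stdout')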
That seems like a lot of code... I want to redirect the output of an IPython script into a CSV or text file, and I wonder whether there is an easy way, like the Oracle SQL*Plus spool command?