I created a Python program, test.py, shown below:
import subprocess
import sys, os
FolderPath = subprocess.getoutput("cd . && pwd")
ProgramName = sys.argv[0]
LogName = ProgramName[:-3]+'_printout.txt'
ProgramFile = FolderPath+'/'+ProgramName
LogFile = FolderPath+'/'+LogName
_stdin = sys.stdin
_stdout = sys.stdout
_stderr = sys.stderr
sys.stdin = open(LogFile, 'w')
sys.stdout = open(LogFile, 'a')
sys.stderr = open(LogFile, 'a')
Prog = open(ProgramFile, 'r')
print(Prog.read())
TEST = str(input("Enter the name: \n TEST_NAME: "))
print(TEST)
sys.stdin = _stdin.flush()
sys.stdout = _stdout.flush()
sys.stderr = _stderr.flush()
After I executed it on Linux with the command python test.py, I got the following error in test_printout.txt:
Enter the name:
TEST_NAME: Traceback (most recent call last):
File "test.py", line 21, in <module>
TEST = str(input("Enter the name: \n TEST_NAME: "))
io.UnsupportedOperation: not readable
I modified the code:
import subprocess
import sys, os
FolderPath = subprocess.getoutput("cd . && pwd")
ProgramName = sys.argv[0]
LogName = ProgramName[:-3]+'_printout.txt'
ProgramFile = FolderPath+'/'+ProgramName
LogFile = FolderPath+'/'+LogName
_stdin = sys.stdin
_stdout = sys.stdout
_stderr = sys.stderr
sys.stdin = open(LogFile, 'w+')
sys.stdout = open(LogFile, 'a')
sys.stderr = open(LogFile, 'a')
Prog = open(ProgramFile, 'r')
print(Prog.read())
TEST = str(input("Enter the name: \n TEST_NAME: "))
print(TEST)
sys.stdin = _stdin.flush()
sys.stdout = _stdout.flush()
sys.stderr = _stderr.flush()
But I got:
Enter the name:
TEST_NAME: import subprocess
It did not let me type anything. What I want is to be able to type a string and have it also saved to test_printout.txt, like this:
Enter the name:
TEST_NAME: This Is What I Type And Save!
Does anyone know how to fix it?
Also, if I use w+ mode instead of w, writing to test_printout.txt takes much longer once I change the program to import pandas and manipulate a DataFrame.
Is there a way to write only the simple printed text to test_printout.txt without reading back the entire file?
UPDATE
I modified the code as below:
import subprocess, sys, os
FolderPath = subprocess.getoutput("cd . && pwd")
ProgramName = sys.argv[0]
LogName = ProgramName[:-3]+'_printout.txt'
ProgramFile = FolderPath+'/'+ProgramName
LogFile = FolderPath+'/'+LogName
_stdin = sys.stdin
_stdout = sys.stdout
_stderr = sys.stderr
class stdout_Logger(object):
    def __init__(self):
        self.stdout = sys.stdout
        self.log = open(LogFile, "a")
    def write(self, message):
        self.stdout.write(message)
        self.log.write(message)
    def flush(self):
        # this flush method is needed for python 3 compatibility.
        # this handles the flush command by doing nothing.
        # you might want to specify some extra behavior here.
        pass
sys.stdout = stdout_Logger()

class stderr_Logger(object):
    def __init__(self):
        self.stderr = sys.stderr
        self.log = open("test_printout.txt", "a")
    def write(self, message):
        self.stderr.write(message)
        self.log.write(message)
    def flush(self):
        # this flush method is needed for python 3 compatibility.
        # this handles the flush command by doing nothing.
        # you might want to specify some extra behavior here.
        pass
sys.stderr = stderr_Logger()
Prog = open(ProgramFile, 'r')
print(Prog.read())
##START Program
TEST = str(input("Enter the name: \n TEST_NAME: "))
print(TEST)
#END Program
sys.stdin = _stdin.flush()
sys.stdout = _stdout.flush()
sys.stderr = _stderr.flush()
This is almost what I want. It saves my program to test_printout.txt at the top and does print(TEST) at the bottom.
However, it also prints the entire program to the Linux terminal console, which is not what I want. I only want it to print "Enter the name: \n TEST_NAME: " in the Linux terminal so that I can type a string, instead of it printing the whole program.
I think the issue came from sys.stdin.
I think I figured it out. The problem is that when you replace sys.stdin with a file handle opened in write mode, you prevent input() from reading it. You get the same error if you try this:
file = open("foo.txt",'w')
content = file.read()
The way around it is to log the streams without redirecting them. So either dump your console output to a file with python test.py > test_printout.txt, or create a logger class that wraps the streams (check out this answer: How to redirect stdout to both file and console with scripting?).
It is probably also worth looking into the logging module, as I believe it handles these issues rather neatly.
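For instance, here is a minimal sketch of the logging-module approach (Python 3; the log file name mirrors your LogFile and is only an assumption here). Messages go to both the console and the file, while stdin is left alone so input() keeps working:
import logging
import sys

logging.basicConfig(
    level=logging.INFO,
    format="%(message)s",
    handlers=[
        logging.FileHandler("test_printout.txt"),   # assumed log path, like your LogFile
        logging.StreamHandler(sys.stdout),          # still show messages on the console
    ],
)

TEST = input("Enter the name: \n TEST_NAME: ")      # stdin is untouched, so typing works
logging.info("TEST_NAME: %s", TEST)                 # recorded in the file and shown on screen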
EDIT:
From what you laid out in the comments, this is what you want:
import subprocess, sys, os
FolderPath = subprocess.getoutput("cd . && pwd")
ProgramName = sys.argv[0]
LogName = ProgramName[:-3]+'_printout.txt'
ProgramFile = FolderPath+'/'+ProgramName
LogFile = FolderPath+'/'+LogName
Prog = open(ProgramFile, 'r')
with open(LogFile, 'w') as logfile:
    logfile.write(Prog.read())
_stdin = sys.stdin
_stdout = sys.stdout
_stderr = sys.stderr
class stdout_Logger(object):
    def __init__(self):
        self.stdout = sys.stdout
        self.log = open(LogFile, "a")
    def write(self, message):
        self.stdout.write(message)
        self.log.write(message)
    def flush(self):
        # this flush method is needed for python 3 compatibility.
        # this handles the flush command by doing nothing.
        # you might want to specify some extra behavior here.
        pass

class stderr_Logger(object):
    def __init__(self):
        self.stderr = sys.stderr
        self.log = open("test_printout.txt", "a")
    def write(self, message):
        self.stderr.write(message)
        self.log.write(message)
    def flush(self):
        # this flush method is needed for python 3 compatibility.
        # this handles the flush command by doing nothing.
        # you might want to specify some extra behavior here.
        pass

sys.stdout = stdout_Logger()
sys.stderr = stderr_Logger()
##START Program
TEST = str(input("Enter the name: \n TEST_NAME: "))
print(TEST)
#END Program
# restore the original stream objects (assign the saved streams back; flush() returns None)
sys.stdin = _stdin
sys.stdout = _stdout
sys.stderr = _stderr
Related
I know that if you want to redirect stdout to a file, you can simply do it like this.
sys.stdout = open(fpath, 'w')
But how can I switch back stdout to write on the terminal?
You can assign it to a variable and later assign it back:
temp = sys.stdout
print('console')
sys.stdout = open('output.txt', 'w')
print('file')
sys.stdout = temp
print('console')
You can also find examples of how to use it with a context manager, so you can switch it using with:
import sys
from contextlib import contextmanager

@contextmanager
def custom_redirection(fileobj):
    old = sys.stdout
    sys.stdout = fileobj
    try:
        yield fileobj
    finally:
        sys.stdout = old

# ---
print('console')
with open('output.txt', 'w') as out:
    with custom_redirection(out):
        print('file')
print('console')
Code from: Python 101: Redirecting stdout
Nowadays you can even find redirect_stdout in contextlib:
import sys
from contextlib import redirect_stdout

print('console')
with open('output.txt', 'w') as out:
    with redirect_stdout(out):
        print('file')
print('console')
BTW: if you want to redirect all output to a file, you can do it from the system shell:
$ python script.py > output.txt
A better bet is to simply write to the file when you want.
with open('samplefile.txt', 'w') as sample:
    print('write to sample file', file=sample)
print('write to console')
Reassigning stdout would mean you need to keep track of the previous stream and assign it back whenever you want to send text to the console.
If you really must reassign, you could do it like this:
holder = sys.stdout
sys.stdout = open(fpath, 'w')
print('write something to file')
sys.stdout = holder
print('write something to console')
I'm having trouble writing the terminal output (all print statements) to a text file and then reading that text file back in the same script. I either get an I/O error when I close the file to finish writing and then re-open it for reading, or I get no output from the final print statement.
Here's my code:
import sys
filename = open("/Users/xxx/documents/python/dump.txt", 'r+')
filename.truncate()
sys.stdout = filename
print('Hello')
print('Testing')
filename.close()
with open("/Users/xxx/documents/python/dump.txt") as file:
data = file.read()
print(file)
Any suggestions would be great! I'm planning to use this to post output from some longer scripts to a Slack channel.
Thanks!
The error you get is:
IOError: [Errno 2] No such file or directory: '/Users/xxx/documents/python/dump.txt'
because the file open mode r+ does not create a file; use mode w instead. You also have to reattach stdout to the console again in order to print to the console:
import sys
filename = open('/Users/xxx/documents/python/dump.txt', 'w')
# filename.truncate() # mode 'w' truncates file
sys.stdout = filename
print('Hello')
print('Testing')
filename.close()
# reattach stdout to console
sys.stdout = sys.__stdout__
with open('/Users/xxx/documents/python/dump.txt') as file:
    data = file.read()
print(data)
will print:
Hello
Testing
The problem is you redirect sys.stdout to filename, and then you close the file. Afterwards you can't print anything anymore, since the file is closed.
sys.stdout = filename
..
..
filename.close()
with open("/Users/xxx/documents/python/dump.txt") as file:
data = file.read()
print(file)
The last print statement tries to print output to sys.stdout, which is a closed file.
If you want to get the old behavior back, you need to keep a reference to sys.stdout. This will solve it:
sys_out = sys.stdout
sys.stdout = filename
..
..
filename.close()
sys.stdout = sys_out
with open("/Users/xxx/documents/python/dump.txt") as file:
data = file.read()
print(file)
import sys
filename = open("/Users/xxx/documents/python/dump.txt", 'w')
sys_out = sys.stdout
sys.stdout = filename
print('Hello')
print('Testing')
print('Test')
filename.close()
sys.stdout = sys_out
with open("/Users/xxx/documents/python/dump.txt", 'r') as file:
data = file.read()
print(data)
My requirement is to print the output of the script to the console as well as to a log file. The following piece of code does it for me, except for a minor hiccup: I am calling a Perl script at the end of the file, whose output is displayed in the console but not printed to the file.
import subprocess
import sys
class Tee(object):
    def __init__(self, *files):
        self.files = files
    def write(self, obj):
        for f in self.files:
            f.write(obj)

f = open('MyFile.txt', 'w')
original = sys.stdout
sys.stdout = Tee(sys.stdout, f)
print "Logging Started"
# My code
print "A"
subprocess.call(['./MyScript'])
sys.stdout = original
print "Logging Stopped"  # Only on stdout
f.close()
Can anyone please advise how that can be achieved? Or is it possible at all?
Use subprocess.check_output:
print subprocess.check_output(['./MyScript'])
In Python 2.6, either use the backport subprocess32, or copy the 2.7 source for check_output.
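If you take the backport route, it is intended as a drop-in replacement; a minimal sketch, assuming the subprocess32 package from PyPI is installed:
import subprocess32 as subprocess   # backport of the Python 3.2 subprocess module for Python 2
print subprocess.check_output(['./MyScript'])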
If you look at how check_output is implemented in Python 2.7, you should be able to work out how to use subprocess.Popen:
def check_output(*popenargs, **kwargs):
    if 'stdout' in kwargs:
        raise ValueError('stdout argument not allowed, it will be overridden.')
    process = Popen(stdout=PIPE, *popenargs, **kwargs)
    output, unused_err = process.communicate()
    retcode = process.poll()
    if retcode:
        cmd = kwargs.get("args")
        if cmd is None:
            cmd = popenargs[0]
        raise CalledProcessError(retcode, cmd, output=output)
    return output
The below code did the trick for me. Thanks everyone for helping.
#!/usr/bin/python
import os
import subprocess
import sys
class Tee(object):
    def __init__(self, *files):
        self.files = files
    def write(self, obj):
        for f in self.files:
            f.write(obj)
f = open('MyFile.txt', 'w')
original = sys.stdout
sys.stdout = Tee(sys.stdout, f)
print "Logging Started"
# My code
print "A"
def check_output(*popenargs, **kwargs):
    process = subprocess.Popen(stdout=subprocess.PIPE, *popenargs, **kwargs)
    output, unused_err = process.communicate()
    retcode = process.poll()
    if retcode:
        cmd = kwargs.get("args")
        if cmd is None:
            cmd = popenargs[0]
        error = subprocess.CalledProcessError(retcode, cmd)
        error.output = output
        raise error
    return output
location = "%s/folder"%(os.environ["Home"])
myoutput = check_output(['./MyFile'])
print myoutput
sys.stdout = original
print "Logging Stopped" # Only on stdout
f.close()
subprocess.check_output was introduced in Python 2.7. If you can only use earlier versions, you can just use Popen and set its stdout to your output stream (but in your case you have already overridden sys.stdout, so I think that's not needed); just change it to:
p = subprocess.Popen(['./MyScript'])
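One way to make the Perl script's output go through the Tee as well (and so end up in MyFile.txt) is to capture it with a pipe and write it to the replaced sys.stdout yourself; a minimal sketch, assuming Python 2 as in the rest of this answer:
p = subprocess.Popen(['./MyScript'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output, _ = p.communicate()     # wait for the script and collect its output
sys.stdout.write(output)        # goes through the Tee: console and MyFile.txt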
I would like to create a function that keeps a record of every print command, storing each command's string into a new line in a file.
def log(line):
    with open('file.txt', "a") as f:
        f.write('\n' + line)
This is what I have, but is there any way to do what I said using Python?
Try replacing stdout with a custom class:
import sys

class LoggedStdout():
    def __init__(self, filename=None):
        self.filename = filename
    def write(self, text):
        sys.__stdout__.write(text)
        if self.filename is not None:
            self.log(text)
    def log(self, line):
        with open(self.filename, "a") as f:
            f.write('\n' + line)

sys.stdout = LoggedStdout('file.txt')
print 'Hello world!'
This affects not only print, but also any other function that writes to stdout, which is often what you want anyway.
For production-grade logging it is much better to use something like the logging module, rather than home-made hooks over the standard IO streams.
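For instance, a minimal sketch with the logging module, assuming a log file named file.txt as in the question (the logger name is just a placeholder):
import logging
import sys

logger = logging.getLogger("myscript")                 # placeholder logger name
logger.setLevel(logging.INFO)
logger.addHandler(logging.FileHandler("file.txt"))     # each message is appended to file.txt
logger.addHandler(logging.StreamHandler(sys.stdout))   # keep echoing to the console, like print

logger.info("Hello world!")                            # use this instead of print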
I've got a Python program that reads from sys.stdin, so I can call it with ./foo.py < bar.png. How do I test this code from within another Python module? That is, how do I set stdin to point to the contents of a file while running the test script? I don't want to do something like ./test.py < test.png. I don't think I can use fileinput, because the input is binary, and I only want to handle a single file. The file is opened using Image.open(sys.stdin) from PIL.
You should generalise your script so that it can be invoked from the test script, in addition to being used as a standalone program. Here's an example script that does this:
#! /usr/bin/python
import sys

def read_input_from(file):
    print file.read(),

if __name__ == "__main__":
    if len(sys.argv) > 1:
        # filename supplied, so read input from that
        filename = sys.argv[1]
        file = open(filename)
    else:
        # no filename supplied, so read from stdin
        file = sys.stdin
    read_input_from(file)
If this is called with a filename, the contents of that file will be displayed. Otherwise, input read from stdin will be displayed. (Being able to pass a filename on the command line might be a useful improvement for your foo.py script.)
In the test script you can now invoke the function in foo.py with a file, for example:
#! /usr/bin/python
import foo
file = open("testfile", "rb")
foo.read_input_from(file)
Your function or class should accept a stream instead of choosing which stream to use.
Your main function will choose sys.stdin.
Your test method will probably choose a StringIO instance or a test file.
The program:
# foo.py
import sys
from PIL import Image

def foo(stream):
    im = Image.open(stream)
    # ...

def main():
    foo(sys.stdin)

if __name__ == "__main__":
    main()
The test:
# test.py
import StringIO, unittest
import foo

class FooTest(unittest.TestCase):
    def test_foo(self):
        input_data = "...."
        input_stream = StringIO.StringIO(input_data)
        # can use a test file instead:
        # input_stream = open("test_file", "rb")
        result = foo.foo(input_stream)
        # asserts on result

if __name__ == "__main__":
    unittest.main()
A comp.lang.python post showed the way: Substitute a StringIO() object for sys.stdout, and then get the output with getvalue():
def setUp(self):
    """Set stdin and stdout."""
    self.stdin_backup = sys.stdin
    self.stdout_backup = sys.stdout
    self.output_stream = StringIO()
    sys.stdout = self.output_stream
    self.output_file = None

def test_standard_file(self):
    sys.stdin = open(EXAMPLE_PATH)
    foo.main()
    self.assertNotEqual(
        self.output_stream.getvalue(),
        '')

def tearDown(self):
    """Restore stdin and stdout."""
    sys.stdin = self.stdin_backup
    sys.stdout = self.stdout_backup
You can always monkey-patch your stdin, but that is quite an ugly way. So it is better to generalize your script as Richard suggested.
import sys
import StringIO
mockin = StringIO.StringIO()
mockin.write("foo")
mockin.flush()
mockin.seek(0)
setattr(sys, 'stdin', mockin)
def read_stdin():
    f = sys.stdin
    result = f.read()
    f.close()
    return result
print read_stdin()
Also, do not forget to restore stdin when tearing down your test.
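A minimal sketch of that restore step with unittest's setUp/tearDown (Python 2 StringIO as in the snippet above; the class and test names are just placeholders):
import sys, unittest
from StringIO import StringIO

class StdinTest(unittest.TestCase):
    def setUp(self):
        self._stdin = sys.stdin          # keep the real stdin
        sys.stdin = StringIO("foo")      # substitute fake input
    def tearDown(self):
        sys.stdin = self._stdin          # always restore it, even if the test fails
    def test_read(self):
        self.assertEqual(sys.stdin.read(), "foo")

if __name__ == "__main__":
    unittest.main()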