I always open and write to files using the with statement:
with open('file_path', 'w') as handle:
    print >>handle, my_stuff
However, there is one case where I need to be more flexible and write to sys.stdout (or another type of stream) if that is provided instead of a file path.
So, my question is this: is there a way to use the with statement both with real files and with sys.stdout?
Note that I can use the following code, but I think this defeats the purpose of using with:
if file_path != None:
    outputHandle = open(file_path, 'w')
else:
    outputHandle = sys.stdout

with outputHandle as handle:
    print >>handle, my_stuff
You can create a context manager and use it like this:

import contextlib, sys

@contextlib.contextmanager
def file_writer(file_name=None):
    # Create the writer object based on file_name
    writer = open(file_name, "w") if file_name is not None else sys.stdout
    # Yield the writer object for the actual use
    yield writer
    # If it is a file, then close the writer object
    if file_name is not None:
        writer.close()

with file_writer("Output.txt") as output:
    print >>output, "Welcome"

with file_writer() as output:
    print >>output, "Welcome"
If you don't pass a file name to file_writer, it will use sys.stdout.
The thing is, you don't need to use a context manager with stdout, because you're not opening or closing it. A less fancy way of abstracting this is:
def do_stuff(file):
    # Your real code goes here. It works both with files and stdout
    return file.readline()

def do_to_stdout():
    return do_stuff(sys.stdout)

def do_to_file(filename):
    with open(filename) as f:
        return do_stuff(f)

print do_to_file(filename) if filename else do_to_stdout()
The simplest way is to use "old school" stream filenames; that way your code doesn't have to change. On Unix this is "/dev/tty" and on Windows this is "con" (although there are other choices for both platforms).
if default_filename is None:
    default_filename = "/dev/tty"

with open(default_filename, 'w') as handle:
    handle.write("%s\n" % my_stuff)
This code was tested in Python 2.7.3 and 3.3.5.
With Python 3, the optional closefd argument is recognized. If it is set to False, the resulting IO object won't close the underlying file descriptor:
if file_path is not None:
    outputHandle = open(file_path, 'w')
else:
    outputHandle = open(sys.stdout.fileno(), 'w', closefd=False)

with outputHandle as handle:
    print(my_stuff, file=handle)
Related
I know that if you want to redirect stdout to a file, you can simply do it like this:
sys.stdout = open(fpath, 'w')
But how can I switch stdout back to writing to the terminal?
You can assign it to a variable and later assign it back:
temp = sys.stdout
print('console')
sys.stdout = open('output.txt', 'w')
print('file')
sys.stdout = temp
print('console')
You can also find examples of how to use it with a context manager, so you can change it using with:
import sys
from contextlib import contextmanager

@contextmanager
def custom_redirection(fileobj):
    old = sys.stdout
    sys.stdout = fileobj
    try:
        yield fileobj
    finally:
        sys.stdout = old

# ---

print('console')

with open('output.txt', 'w') as out:
    with custom_redirection(out):
        print('file')

print('console')
Code from: Python 101: Redirecting stdout
Nowadays you can even find redirect_stdout in contextlib:
import sys
from contextlib import redirect_stdout

print('console')

with open('output.txt', 'w') as out:
    with redirect_stdout(out):
        print('file')

print('console')
BTW: if you want to redirect all output to a file, then you can use the system shell for this:
$ python script.py > output.txt
A better bet is to simply write to the file when you want.
with open('samplefile.txt', 'w') as sample:
    print('write to sample file', file=sample)

print('write to console')
Reassigning stdout means you need to keep track of the previous stream and assign it back whenever you want to send text to the console.
If you really must reassign, you could do it like this:
holder = sys.stdout
sys.stdout = open(fpath, 'w')
print('write something to file')
sys.stdout = holder
print('write something to console')
I'm having trouble writing the terminal output (all print statements) to a text file and then reading that text file in the same script. I keep getting either an I/O error, when I close the file to finish writing and then re-open it to read, or no output from the final print(file_contents) statement.
Here's my code:
import sys

filename = open("/Users/xxx/documents/python/dump.txt", 'r+')
filename.truncate()
sys.stdout = filename
print('Hello')
print('Testing')
filename.close()

with open("/Users/xxx/documents/python/dump.txt") as file:
    data = file.read()
    print(file)
Any suggestions would be great! I'm planning to use this to print outputs from some longer scripts to a Slack channel.
Thanks!
The error you get is:
IOError: [Errno 2] No such file or directory: '/Users/xxx/documents/python/dump.txt'
because file open mode r+ does not create a file. Use mode w instead. You also have to reattach stdout to the console again in order to print to the console:
import sys

filename = open('/Users/xxx/documents/python/dump.txt', 'w')
# filename.truncate()  # not needed, mode 'w' truncates the file
sys.stdout = filename
print('Hello')
print('Testing')
filename.close()

# reattach stdout to the console
sys.stdout = sys.__stdout__

with open('/Users/xxx/documents/python/dump.txt') as file:
    data = file.read()
    print(data)
This will print:
Hello
Testing
The problem is that you redirect sys.stdout to filename and then close the file. Afterwards you can't print anything anymore, since sys.stdout now refers to a closed file.
sys.stdout = filename
..
..
filename.close()

with open("/Users/xxx/documents/python/dump.txt") as file:
    data = file.read()
    print(file)
The last print statement tries to print output to sys.stdout, which is a closed file.
If you want to get the old behavior back, you need to keep a reference to sys.stdout. This will solve it:
sys_out = sys.stdout
sys.stdout = filename
..
..
filename.close()
sys.stdout = sys_out

with open("/Users/xxx/documents/python/dump.txt") as file:
    data = file.read()
    print(file)
A complete working example:

import sys

filename = open("/Users/xxx/documents/python/dump.txt", 'w')
sys_out = sys.stdout
sys.stdout = filename
print('Hello')
print('Testing')
print('Test')
filename.close()
sys.stdout = sys_out

with open("/Users/xxx/documents/python/dump.txt", 'r') as file:
    data = file.read()
    print(data)
I have written a script that parses a web page and saves the data of interest in a CSV file. Before I open the data and use it in a second script, I check whether the file with the data exists, and if not, I run the parser script first. The odd behaviour of the second script is that it is able to detect that there is no file, and the file is then created, but when it is read for the first time it is empty (the part in the else statement). I tried to add some delay using the time.sleep() method, but it does not work. The file explorer clearly shows that the file is not empty, yet on the first run the script sees the file as empty. On subsequent runs the script clearly sees the file and is able to properly read its content.
Maybe you have some explanation for this behaviour.
def open_file():
    # TARGET_DIR and URL are global variables.
    all_lines = []
    try:
        current_file = codecs.open(TARGET_DIR, 'r', 'utf-8')
    except FileNotFoundError:
        procesed_data = parse_site(URL)
        save_parsed(procesed_data)
        compare_parsed()
        open_file()
    else:
        time.sleep(10)
        data = csv.reader(current_file, delimiter=';')
        for row in data:
            all_lines.append(row)
        current_file.close()
    return all_lines
You got some recursion going on there: the result of the inner open_file() call is never returned, so on the first run the outer call just returns the empty all_lines list.
Another way to do it—assuming I understand correctly—is this:
import os

def open_file():
    # TARGET_DIR and URL are global variables.
    all_lines = []
    # If the file is not there, make it.
    if not os.path.isfile(TARGET_DIR):
        procesed_data = parse_site(URL)
        save_parsed(procesed_data)
        compare_parsed()
    # Here I am assuming the file has been created.
    current_file = codecs.open(TARGET_DIR, 'r', 'utf-8')
    data = csv.reader(current_file, delimiter=';')
    for row in data:
        all_lines.append(row)
    current_file.close()
    return all_lines
You should return the result of your internal open_file call, or just open the file in your except block:
def open_file():
    # TARGET_DIR and URL are hopefully constants
    try:
        current_file = codecs.open(TARGET_DIR, 'r', 'utf-8')
    except FileNotFoundError:
        procesed_data = parse_site(URL)
        save_parsed(procesed_data)
        compare_parsed()
        current_file = codecs.open(TARGET_DIR, 'r', 'utf-8')
    data = csv.reader(current_file, delimiter=';')
    all_lines = list(data)
    current_file.close()
    return all_lines
I'm currently making a program that requires a JSON database file. I want the program to check for the file: if it's there, run the rest of the program, but if it doesn't exist, create 'Accounts.json' with {} inside the file and then run the program.
How would I do this? What's the most efficient way?
Note: I use this for checking whether the file exists, but how would I create the file?
def startupCheck():
    if os.path.isfile(PATH) and os.access(PATH, os.R_OK):
        # checks if file exists
        print("File exists and is readable")
    else:
        print("Either file is missing or is not readable")
I believe you could simply do:
import io
import json
import os

def startupCheck():
    if os.path.isfile(PATH) and os.access(PATH, os.R_OK):
        # checks if file exists
        print("File exists and is readable")
    else:
        print("Either file is missing or is not readable, creating file...")
        with io.open(os.path.join(PATH, 'Accounts.json'), 'w') as db_file:
            db_file.write(json.dumps({}))
This is how I did it. I hope it helps.
import json
import os

def where_json(file_name):
    return os.path.exists(file_name)

if where_json('data.json'):
    pass
else:
    data = {
        'user': input('User input: '),
        'pass': input('Pass input: ')
    }
    with open('data.json', 'w') as outfile:
        json.dump(data, outfile)
How about wrapping the file open in a try/except? I'm not a professional Python coder, so feel free to weigh in if this is not a kosher approach.
import json

try:
    with open('Accounts.json', 'r') as fp:
        accounts = json.load(fp)
except IOError:
    print('File not found, will create a new one.')
    accounts = {}

# do stuff with your data...

with open('Accounts.json', 'w') as fp:
    json.dump(accounts, fp, indent=4)
w+ opens the file for writing and reading, and creates the file if one is not found (plain w also creates the file; the + adds read access).
filename = 'jsonDB.json'

def openFile():
    with open(filename, 'w+') as f:
        f.write('{}')
        # no explicit close needed, the with statement closes the file

openFile()
I have a script that accepts a filename as an argument, then opens it and writes some stuff.
I use the with statement:
with open(file_name, 'w') as out_file:
    ...
    out_file.write(...)
Now what if I want to write to sys.stdout if no file_name is provided?
Do I necessarily need to wrap all actions in a function and put a condition in front of it?
if file_name is None:
    do_everything(sys.stdout)
else:
    with open(file_name, 'w') as out_file:
        do_everything(out_file)
import sys
from contextlib import contextmanager

@contextmanager
def file_or_stdout(file_name):
    if file_name is None:
        yield sys.stdout
    else:
        with open(file_name, 'w') as out_file:
            yield out_file
Then you can do
with file_or_stdout(file_name) as wfile:
    do_stuff_writing_to(wfile)
How do you handle command line arguments? If you use argparse you could use the type and default parameters of add_argument to handle this. For example, try something like the following:
import sys
import argparse

def main(argv=None):
    if argv is None:
        argv = sys.argv[1:]
    parser = argparse.ArgumentParser()
    parser.add_argument('infile', nargs='?',
                        type=argparse.FileType('w'),
                        default=sys.stdin)
    args = parser.parse_args(argv)
    print args.infile
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))
If a file name is present as an argument, argparse will automatically open this file and args.infile will be a handle to it. Otherwise args.infile will simply be sys.stdin.
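As a quick illustration, assuming the snippet above is saved as script.py (the script and file names here are just placeholders):

$ python script.py notes.txt    # args.infile is an open handle to notes.txt
$ python script.py              # no argument given, args.infile falls back to the default stream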
You could write your own context manager. I'll post sample code later if no one else does.
if file_name is None:
    fd = sys.stdout
else:
    fd = open(file_name, 'w')

# write to fd

if fd is not sys.stdout:
    fd.close()
Using the with ... as construct is useful to close the file automatically. This means that using it with sys.stdout, as I guess you know, would break your program, because it would attempt to close the system stdout!
This means something like with open(name, 'w') if name else sys.stdout as: would not work.
This makes me say there isn't any simple, nice way to write your snippet better... but there are probably better ways to think about how to construct such code!
The main point to clarify is when you need to open (and, more importantly, close) the file handle for file_name, when file_name is provided.
Personally I would simply drop the with .. as and take care of opening the file (and, more importantly, closing it!) somewhere else. Mileage might vary depending on how your software works.
This means you can simply do:
out_file = open(file_name, 'w') if file_name else sys.stdout
and work with out_file throughout your program.
When you close, remember to check if it's a file or not :)
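For instance, a minimal sketch of that check, using the out_file from above:

if out_file is not sys.stdout:
    out_file.close()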
And have you thought about simply using the logging module? That easily allows you to add different handlers, print to file, print to stdout...
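As a rough sketch of that idea (the logger name and output file name are just placeholders), you can attach both a StreamHandler and a FileHandler, and every log call then goes to both destinations:

import logging
import sys

logger = logging.getLogger('my_script')                  # placeholder name
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler(sys.stdout))     # log to the console
logger.addHandler(logging.FileHandler('output.txt'))     # and to a file

logger.info('Welcome')  # appears on stdout and in output.txt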