Python/Flask - ValueError: I/O operation on closed file

Before anyone says that this is a duplicate, I do not think it is because I have looked at the similar questions and they have not helped me!
I am creating a Flask server in python, and I need to be able to have a url that shows a pdf.
I tried to use the following code:
@app.route('/pdf')
def pdfStuff():
    with open('pdffile.pdf', 'rb') as static_file:
        return send_file(static_file, attachment_filename='pdffile.pdf')
This is supposed to make it so when I go to /pdf it will show the pdf file pdffile.pdf.
However, this does not work because when I run the code I get this error:
ValueError: I/O operation on closed file
How is this the case? My return statement is inside the with statement, therefore shouldn't the file be open?
I tried to use a normal static_file = open(...) and used try and finally statements, like this:
static_file = open('pdffile.pdf', 'rb')
try:
    return send_file(static_file, attachment_filename='pdffile.pdf')
finally:
    static_file.close()
The same error happens with the above code, and I have no idea why. Does anyone know what I could be doing wrong?
Sorry if I am being stupid and there is something simple that I made a mistake with!
Thank you very much in advance !!

Use send_file with the filename, it'll open, serve and close it the way you expect.
@app.route('/pdf')
def pdfStuff():
    return send_file('pdffile.pdf')

Although @iurisilvio's answer solves this specific problem, it is not a useful answer in any other case. I was struggling with this myself.
All of the following examples throw ValueError: I/O operation on closed file. But why?
@app.route('/pdf')
def pdfStuff():
    with open('pdffile.pdf', 'rb') as static_file:
        return send_file(static_file, attachment_filename='pdffile.pdf')
@app.route('/pdf')
def pdfStuff():
    static_file = open('pdffile.pdf', 'rb')
    try:
        return send_file(static_file, attachment_filename='pdffile.pdf')
    finally:
        static_file.close()
I am doing something slightly different. Like this:
@page.route('/file', methods=['GET'])
def build_csv():
    # ... some query ...
    ENCODING = 'utf-8'
    bi = io.BytesIO()
    tw = io.TextIOWrapper(bi, encoding=ENCODING)
    c = csv.writer(tw)
    c.writerow(['col_1', 'col_2'])
    c.writerow(['1', '2'])
    bi.seek(0)
    return send_file(bi,
                     as_attachment=True,
                     attachment_filename='file.csv',
                     mimetype="Content-Type: text/html; charset={0}".format(ENCODING))
In the first two cases, the answer is simple:
You give a stream to send_file; this function will not immediately transmit the file, but rather wrap the stream and return it to Flask for future handling. Your pdfStuff function will already have returned before Flask starts handling your stream, and in both cases (with and finally) the stream is closed before your function returns.
The third case is more tricky (but this answer pointed me in the right direction: Why is TextIOWrapper closing the given BytesIO stream?). In the same fashion as explained above, bi is handled only after build_csv returns. By then, tw has already been abandoned to the garbage collector. When the collector destroys it, tw implicitly closes bi. The solution is to call tw.detach() before returning (this stops the TextIOWrapper from affecting the stream).
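For completeness, here is the fixed stream-building part on its own (just the part relevant to the detach() fix, runnable without Flask):

```python
import csv
import io

ENCODING = 'utf-8'
bi = io.BytesIO()
tw = io.TextIOWrapper(bi, encoding=ENCODING)
c = csv.writer(tw)
c.writerow(['col_1', 'col_2'])
c.writerow(['1', '2'])
tw.flush()   # make sure everything buffered in the text wrapper reaches bi
tw.detach()  # disconnect the wrapper so its destruction no longer closes bi
bi.seek(0)
# bi now stays open and can safely be handed to send_file(...)
```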
Side note (please correct me if I'm wrong):
This behaviour is limiting, unless send_file takes care of closing on its own when it is given a file-like object. It is not clear from the documentation (https://flask.palletsprojects.com/en/0.12.x/api/#flask.send_file) whether closing is handled. I would assume so (there are some .close() calls in the source code, and send_file uses werkzeug.wsgi.FileWrapper, which implements .close() too), in which case your approach can be corrected to:
@app.route('/pdf')
def pdfStuff():
    return send_file(open('pdffile.pdf', 'rb'), attachment_filename='pdffile.pdf')
Of course, in this case it would be straightforward to provide the file name. But in other cases you may need to wrap the file stream in some manipulation pipeline (decoding, zipping, etc.).
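As a hypothetical example of such a pipeline (not from the original answer): compressing data on the fly into an in-memory stream, which could then be passed to send_file instead of a filename:

```python
import gzip
import io

def gzipped_stream(data: bytes) -> io.BytesIO:
    """Compress raw bytes into an in-memory gzip stream, rewound to the start."""
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode='wb') as gz:
        gz.write(data)
    buf.seek(0)  # rewind so the consumer reads from the beginning
    return buf

# In a Flask view you could then do something like (hypothetical):
#     return send_file(gzipped_stream(open('pdffile.pdf', 'rb').read()),
#                      attachment_filename='pdffile.pdf.gz')
```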

Related

Is there a problem in creating a fileobject inside a function, returning it to the parent function, and closing it there?

consider the python code below:
def create_fout_bt(location):
    fout = open(os.path.join(location, 'The Book Thief.txt'), 'w')
    # Added 'w' in Edit 1. Can't believe none of you guys noticed it! :P
    return fout

def main():
    location = r'E:\Books\Fiction'
    fout_bt = create_fout_bt(location)
    fout_bt.write('Author: Markus Zusak\n')
    fout_bt.close()

main()
In this code, the fileobject named fout is created inside the function create_fout_bt, but not closed within the same function. What I understand is that we have to close every fileobject we create; so is this ok? In practice, the code works fine and the output file is generated with the content I wrote to it, but just wondering if a fileobject is dangling somewhere out there.
Thanks for your time.
Edit 1:
Thank you for introducing me to the python with statement. Hopefully I'll use it in the future.
Also, let me clarify that the code I mentioned here is a generic, simple case. Of course it doesn't make sense to define a function just to create a fileobject! In the real scenario, I will be writing to many different files concurrently. For example:
fout1.write('%s: %f' %('Magnetic Field', magnetic_field))
fout2.write('%s: %f' %('Power', power))
fout3.write('%s: %f' %('Cadence', cadence))
Now this requires creating the fileobjects fout1, fout2, fout3:
fout1 = open(os.path.join(rootPath, 'filename1.txt'), 'w')
fout2 = open(os.path.join(rootPath, 'filename2.txt'), 'w')
fout3 = open(os.path.join(rootPath, 'filename3.txt'), 'w')
Since there are many of them, I wanted to put them inside a function to make it look better - now a single function call will get me all the fileobjects:
fout1, fout2, fout3 = create_file_objects(rootPath)
Moreover, in the real scenario, I have to write into a file at multiple locations in the program. From what I have understood, if I'm using 'with', I'll have to open the file in append mode each time I have to write into it (making the code look cluttered), compared to using open(), which keeps the file open until I call close().
Like deceze commented, the problem I'm worried about is spreading the responsibility of the fileobject to multiple functions. In my first example,
'fout' is the variable created inside the function 'create_fout_bt' and 'fout_bt' is the variable to which that value is assigned by the latter. Now, I know 'fout_bt' is taken care of with the statement 'fout_bt.close()', but what about 'fout' inside the function 'create_fout_bt'? Will it be disposed off when the function 'create_fout_bt' returns?
Hope my doubt is more clear. Do let me know if I just missed something obvious. Any comments on how to make my future posts more palatable will also be much appreciated. :)
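To the specific doubt above: fout inside create_fout_bt and fout_bt in main are two names for the same file object, so closing one closes the other; nothing is left dangling. A quick self-contained check (using a temporary directory and a hypothetical file name in place of the originals):

```python
import os
import tempfile

def create_fout_bt(location):
    fout = open(os.path.join(location, 'book.txt'), 'w')  # hypothetical file name
    return fout

location = tempfile.mkdtemp()          # stand-in for r'E:\Books\Fiction'
fout_bt = create_fout_bt(location)     # fout_bt is the very object named fout inside
fout_bt.write('Author: Markus Zusak\n')
fout_bt.close()                        # closing fout_bt closes "fout" too - same object
```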
Your code works fine. I tried @Sujay's suggestion, and it raises an I/O operation on closed file error after fout_bt.close().
If you are worried about your code style, you can use with to do it.
code:
def create_fout_bt(location):
    fout = open(os.path.join(location, 'The Book Thief.txt'), "a")
    return fout

def main():
    location = r'E:\Books\Fiction'
    with create_fout_bt(location) as fout_bt:
        fout_bt.write('Author: Markus Zusak\n')

main()
The only thing is that the code that opens the file (create_fout_bt) cannot guarantee that the file will also be closed. Which isn't an issue per se, but it spreads that responsibility around and may lead to situations in which the file isn't closed, because the caller doesn't handle the returned file handle correctly. It's still fine to do this, you just need to be diligent. One way this could be improved is with this:
with create_fout_bt(location) as fout_bt:
    fout_bt.write('Author: Markus Zusak\n')
Using a with context manager on the file object, regardless of whether directly created with open or "indirectly" via create_fout_bt, guarantees that the file will be closed, regardless of errors happening in your code.
You can use 'with'.
With 'with' you don't need to close your files anymore; they are closed automatically.
Do it like this:
with create_fout_bt(location) as fout_bt:
    fout_bt.write('Author: Markus Zusak\n')

Python file object allows you to close a file that is already closed

Creating code in 2.7 (for a Python class I am taking) and cross-testing it in 3.x (for me). This code is not required. I decided to build objects that might be useful for the future instead of simply answering the original question directly.
Since 3.x has no file object to inherit from, my object uses file instances within it instead of inheriting from it directly. During testing of the code which follows, I encountered an oddity: I can close a file and then close it again, and this does not trigger any kind of error. To sanity-check that this code is built right, two questions:
* Why can I close the same file twice?
* The way I have written this, when I close the file, am I really closing it and freeing up the memory the file object takes up?
Here is the code:
class FileData(object):
    def __init__(self, xfilename, xmode):
        self.filename = xfilename
        self.mode = xmode
        self.f = open(xfilename, xmode)

    def getAllFileData(self):
        self.lines = self.f.readlines()
        self.f.close()

    def getLineFromFile(self):
        if self.f.closed:
            self.f = open(self.filename, self.mode)
        return self.f.readline()

    def fileHead(self, numRows):
        if self.f.closed:
            self.f = open(self.filename, self.mode)
        for i in range(numRows):
            print(self.f.readline())
        self.f.close()
Then I ran these test lines and accidentally re-ran the line to close the file multiple times. To make sure I wasn't missing something, I later organized these lines in a Jupyter cell and ran them together.
chatLog = FileData("script/record.txt", "r")
chatLog.fileHead(15)
chatLog.f.close()
chatLog.f.close()
Note that fileHead() also closes the file when done, so really the above code tried to close the same file three times.
Why no error? And is this closing the file the way I think it is? Learning the language so any input would be appreciated.
f.close() won't do anything if the file is already closed, as you noticed yourself.
You could protect against it, but seeing how complex it is, I wouldn't advise doing it:
import _io

def override_close(f):
    if f.closed:
        raise IOError("already closed error")
    else:
        _io.TextIOWrapper.close(f)

f = open("foo.c")
f.close = lambda: override_close(f)
print("closing")
f.close()
print("protected")
f.close()  # raises IOError
I have overridden the close method on the f object so that it checks for an already-closed file and raises an exception.
For your example, the best way would be to hide f (and other data) from the outside by making them not directly visible from the outside:
self.__f = open(...)
so callers cannot mess (easily :)) with your file handle anymore.
Because calling .close() repeatedly is okay. Internally, it checks whether .closed is already True and does nothing in that case; otherwise it actually closes the file.
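This is easy to confirm (io.IOBase.close is documented to have no effect on an already-closed file):

```python
import tempfile

f = tempfile.TemporaryFile(mode='w+')
f.write('data')
f.close()
f.close()  # second close is a silent no-op - no exception is raised
```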

Deciphering large program flow in Python

I'm in the process of learning how a large (356-file), convoluted Python program is set up. Besides manually reading through and parsing the code, are there any good methods for following program flow?
There are two methods which I think would be useful:
Something similar to Bash's "set -x"
Something that displays which file outputs each line of output
Are there any methods to do the above, or any other ways that you have found useful?
I don't know if this is actually a good idea, but since I actually wrote a hook to display the file and line before each line of output to stdout, I might as well give it to you…
import inspect, sys

class WrapStdout(object):
    _stdout = sys.stdout
    def write(self, buf):
        frame = sys._getframe(1)
        try:
            f = inspect.getsourcefile(frame)
        except TypeError:
            f = 'unknown'
        l = frame.f_lineno
        self._stdout.write('{}:{}:{}'.format(f, l, buf))
    def flush(self):
        self._stdout.flush()

sys.stdout = WrapStdout()
Just save that as a module, and after you import it, every chunk of stdout will be prefixed with file and line number.
Of course this will get pretty ugly if:
Anyone tries to print partial lines (using stdout.write directly, a trailing comma with print in 2.x, or end='' in 3.x).
You mix Unicode and non-Unicode in 2.x.
Any of the source files have long pathnames.
etc.
But all the tricky deep-Python-magic bits are there; you can build on top of it pretty easily.
It could be very tedious, but using a debugger to trace the flow of execution, instruction by instruction, could probably help you to some extent.
import pdb
pdb.set_trace()
You could look for a cross reference program. There is an old program called pyxr that does this. The aim of cross reference is to let you know how classes refer to each other. Some of the IDE's also do this sort of thing.
I'd recommend running the program inside an IDE like pydev or pycharm. Being able to stop the program and inspect its state can be very helpful.

List all currently open file handles? [duplicate]

Possible Duplicate:
check what files are open in Python
Hello,
Is it possible to obtain a list of all currently open file handles? I presume that they are stored somewhere in the environment.
I am interested in this function as I would like to safely handle any files that are open when a fatal error is raised, i.e. close file handles and replace potentially corrupted files with the original files.
I have the handling working but without knowing what file handles are open, I am unable to implement this idea.
As an aside, when a file handle is initialised, can this be inherited by another imported method?
Thank you
lsof, /proc/pid/fd/
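Spelled out as a sketch (Linux-only; /proc does not exist on other platforms): a process can read its own fd table directly:

```python
import os

def open_fds():
    """List (fd, target) pairs for this process via /proc (Linux-only)."""
    fd_dir = '/proc/self/fd'
    result = []
    for fd in os.listdir(fd_dir):
        try:
            result.append((int(fd), os.readlink(os.path.join(fd_dir, fd))))
        except OSError:
            pass  # the fd used for the directory listing itself may vanish
    return result
```

Externally, lsof -p <pid> gives you the same information.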
The nice way of doing this would be to modify your code to keep track of when it opens a file:
def log_open(*args, **kwargs):
    print("Opening a file...")
    print(args, kwargs)  # print() does not accept open()'s keyword arguments, so log them as-is
    return open(*args, **kwargs)
Then, use log_open instead of open to open files. You could even do something more hacky, like modifying the File class to log itself. That's covered in the linked question above.
There's probably a disgusting, filthy hack involving the garbage collector or looking in __dict__ or something, but you don't want to do that unless you absolutely really truly seriously must.
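For the curious, that garbage-collector hack looks roughly like this (a best-effort sketch: it only sees objects the collector tracks, and it is nothing to rely on in production):

```python
import gc
import io

def open_file_objects():
    """Best-effort scan of GC-tracked objects for open file-like objects."""
    return [obj for obj in gc.get_objects()
            if isinstance(obj, io.IOBase) and not obj.closed]
```

Note it finds Python-level file objects only, not raw OS file descriptors opened by C extensions.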
If you're using Python 2.5+ you can use the with keyword (though 2.5 needs `from __future__ import with_statement`):
with open('filename.txt', 'r') as f:
    # do stuff here
    pass
# here f has been closed and disposed of properly - even with raised exceptions
I don't know what kind of catastrophic failure needs to bork the with statement, but I assume it's a really bad one. On WinXP, my quick unscientific test:
import time

with open('test.txt', 'w') as f:
    f.write('testing\n')
    while True:
        time.sleep(1)
and then killing the process with Windows Task Manager still wrote the data to file.

Why doesn't Python release file handles after calling file.close()?

I am on windows with Python 2.5. I have an open file for writing. I write some data. Call file close. When I try to delete the file from the folder using Windows Explorer, it errors, saying that a process still holds a handle to the file.
If I shutdown python, and try again, it succeeds.
It does close them.
Are you sure f.close() is getting called?
I just tested the same scenario and Windows deletes the file for me.
Are you handling any exceptions around the file object? If so, make sure the error handling looks something like this:
f = open("hello.txt")
try:
    for line in f:
        print line
finally:
    f.close()
In considering why you should do this, consider the following lines of code:
f = open('hello.txt')
try:
    perform_an_operation_that_causes_f_to_raise_an_exception()
    f.close()
except IOError:
    pass
As you can see, f.close will never be called in the above code. The problem is that the above code will also cause f not to get garbage collected. The reason is that f will still be referenced by the exception's traceback (sys.exc_traceback in Python 2), in which case the only solution is to manually call close on f in a finally block or clear the traceback (and I strongly recommend the former).
Explained in the tutorial:
with open('/tmp/workfile', 'r') as f:
    read_data = f.read()
It works when you are writing or pickling/unpickling, too.
The try/finally block isn't really necessary: that's the Java way of doing things, not the Python way.
I was looking for this, because the same thing happened to me. The question didn't help me, but I think I figured out what happened.
In the original version of the script I wrote, I neglected to add in a 'finally' clause to the file in case of an exception.
I was testing the script from the interactive prompt and got an exception while the file was open. What I didn't realize was that the file object wasn't immediately garbage-collected. After that, when I ran the script (still from the same interactive session), even though the new file objects were being closed, the first one still hadn't been, and so the file handle was still in use, from the perspective of the operating system.
Once I closed the interactive prompt, the problem went away, at which point I remembered that exception occurring while the file was open and realized what had been going on. (Moral: don't try to program on insufficient sleep. :) )
Naturally, I have no idea if this is what happened in the case of the original poster, and even if the original poster is still around, they may not remember the specific circumstances, but the symptoms are similar, so I thought I'd add this as something to check for, for anyone caught in the same situation and looking for an answer.
I did it using an intermediate file:
import os

f = open("report.tmp", "w")
f.write("{}".format("Hello"))
f.close()
os.system("move report.tmp report.html")  # this line is for Windows users

Categories

Resources