I'm using Python 2.7.8 and working with 30 open .docx files simultaneously.
Is there some way, in Python code, to close all the files at once instead of closing every file separately?
UPDATE:
I'm using different files every day, so the file names change every time. My code must work generally, without hard-coded file names (if that is possible).
I suggest using the with statement when opening files:
with open('file1.txt', 'w') as file1, open('file2.txt', 'w') as file2:
    file1.write('stuff to write')
    file2.write('stuff to write')
    # ...do other stuff...
print "Both files are closed because I'm out of the with statement"
When you leave the with statement, your files are closed. You can even open all of your files on one line, but that's not recommended unless you are actively using all 30 files at once.
You need to find the pid of the Word process and then use os.kill to terminate it, e.g.
import os
import signal

os.kill(pid, signal.SIGTERM)
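Before you can call os.kill you have to discover the pid. The helper below is purely hypothetical: in a real program you would obtain the (pid, name) snapshot from a library such as psutil, but a hard-coded list keeps the sketch self-contained.

```python
# Hypothetical helper: maps a process name to the matching pids.
# A real program would build process_snapshot with e.g. psutil;
# here it is a hard-coded list of (pid, name) pairs.
def find_pids(process_snapshot, name):
    """Return pids whose process name matches name, case-insensitively."""
    return [pid for pid, pname in process_snapshot
            if pname.lower() == name.lower()]

snapshot = [(101, "explorer.exe"), (202, "WINWORD.EXE"), (303, "WINWORD.EXE")]
print(find_pids(snapshot, "winword.exe"))  # [202, 303]
```

Each pid returned could then be passed to os.kill as shown above. Note that killing the whole Word process discards any unsaved changes in every open document.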
First, append every opened file object to a list.
l = []
f1 = open('f1.txt')
# ...do something
l.append(f1)
f2 = open('f2.txt')
# ...do something
l.append(f2)
Now iterate over the list and close each file object:
for f in l:
    f.close()
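Put together, the pattern looks like this. The sketch below uses throwaway temp files so it is self-contained; the file names are placeholders.

```python
import os
import tempfile

# Runnable version of the list-of-handles pattern above,
# using a temp directory so nothing real is touched.
tmpdir = tempfile.mkdtemp()
opened = []
for name in ('f1.txt', 'f2.txt'):
    f = open(os.path.join(tmpdir, name), 'w')
    # ...do something with f...
    opened.append(f)

for f in opened:
    f.close()

print(all(f.closed for f in opened))  # True
```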
If I want to open one file I can do
with open(filename, "w") as f:
    ...
For a fixed number of files:
with open(name1, "w") as f1, \
     open(name2, "w") as f2, \
     open(name3, "w") as f3:
    ...
But that would only work if I know the number of files when writing the code. What would be the correct way to open files if the names were in a list?
My first inclination was to make a list of the file objects as they were opened, then use try...finally, something like
files = []
try:
    for name in namelist:
        files.append(open(name, "w"))
    # ... do stuff with the files list ...
finally:
    for f in files:
        f.close()
If there is a problem opening one of the files, the script tidies up and quits without writing any of them, and that seems good to me.
I'm not sure what the best way is to handle errors when closing: a problem closing one file would prevent the ones after it from being closed, unless I trap everything; but that doesn't seem good either, because the error is lost.
Is there a neater/better/more elegant way? A way to extend with to a list maybe?
The standard library offers a nice way of doing this using contextlib.ExitStack.
from contextlib import ExitStack
with ExitStack() as stack:
    files = [stack.enter_context(open(fname, "w")) for fname in namelist]
    # All opened files will automatically be closed at the end of
    # the with statement, even if attempts to open files later
    # in the list raise an exception
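A self-contained demonstration of the ExitStack pattern, with placeholder file names in a temp directory:

```python
import os
import tempfile
from contextlib import ExitStack

# Demo of ExitStack: open a whole list of files, write to them,
# and rely on the stack to close every handle on exit.
tmpdir = tempfile.mkdtemp()
filenames = [os.path.join(tmpdir, 'f{}.txt'.format(i)) for i in range(3)]

with ExitStack() as stack:
    files = [stack.enter_context(open(name, 'w')) for name in filenames]
    for f in files:
        f.write('hello\n')

# Every handle opened through the stack is closed here
print(all(f.closed for f in files))  # True
```

ExitStack also unwinds correctly on partial failure: if the Nth open raises, the N-1 files already entered are closed before the exception propagates.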
I am reading sensor data and saving in file like this:
with open('serial_data.txt','a') as f:
The problem is that if I run the code five times, it appends to the same file. I want the data from each test in a separate file; for example, if I run the code four times, it should save as "serial_data_1.txt", "serial_data_2.txt", "serial_data_3.txt", "serial_data_4.txt". Is there any way to do this?
I would suggest using CLI parameters
import sys

run = sys.argv[1]
with open('serial_data_{}.txt'.format(run), 'a') as f:
    ...
Then do python app.py 1 for the first run
How to read/process command line arguments?
Otherwise, you need to save the number externally, or write a loop in your code that is processing each of your test conditions
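One way to avoid saving the run number externally is to derive it from the files already on disk. A minimal sketch; the helper name and template are made up for illustration:

```python
import os

def next_free_name(template, directory='.'):
    """Return the first template.format(n) that does not yet exist
    in directory (hypothetical helper; n starts at 1)."""
    n = 1
    while os.path.exists(os.path.join(directory, template.format(n))):
        n += 1
    return template.format(n)

print(next_free_name('serial_data_{}.txt'))
```

On the first run this yields serial_data_1.txt; once that file exists, the next run gets serial_data_2.txt, and so on. Note there is a race if two runs start at the same instant.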
If you want to create new files with your data in them, you can use the 'w+' mode in your open call, like so:
# Looping and creating multiple files
for i in range(1, 4):
    # 'w+' creates the file with this name if
    # it doesn't already exist
    f = open('serial_data_{}.txt'.format(i), 'w+')
    # Now you can write any data to your file
    f.write('{} squared is {}'.format(i, i*i))
    # Close your file
    f.close()
This will produce 3 files with the following content:
serial_data_1.txt: "1 squared is 1"
serial_data_2.txt: "2 squared is 4"
serial_data_3.txt: "3 squared is 9"
Note: You must close the files after writing.
Additionally, 'w+' will overwrite the files every time you run it; use 'a' instead of 'w+' if you want to append to the file's current data.
Hopefully that helped :)
I suggest writing a loop in your code:
for x in range(1, numberOfTest + 1):
    with open("serial_data_{0}.txt".format(x), 'a') as f:
        ...
I have a noob python question... so bear with me.
Can I open multiple files before closing the previous ones?
So... can I run
import os

files = []
for file in os.listdir(os.curdir):
    files.append(open(file, 'w'))
Then edit each file as I want and finish with
for file in files:
    file.close()
Thanks in advance
Seems legit and works fine.
Doing operations this way could be awkward, though: the list files holds file objects, not filenames, so you have to rely on each object's .name attribute (or on the list order) to know which file is which.
It is perfectly fine to open each file using open and later close all of them. However, you will want to make sure all of your files are closed properly.
Normally, you would do this for one file:
with open(filename, 'w') as f:
    do_something_with_the_file(f)
# the file is closed here, regardless of what happens
# i.e. even in case of an exception
You could do the same with multiple files:
with open(filename1, 'w') as f1, open(filename2, 'w') as f2:
    do_something_with_the_file(f1)
    do_something_with_the_file(f2)
# both files are closed here
Now, if you have N files, you could write your own context manager, but that would probably be overkill. Instead, I would suggest:
open_files = []
try:
    for filename in list_of_filenames:
        open_files.append(open(filename, 'w'))
    # do something with the files here
finally:
    for file in open_files:
        file.close()
BTW, your own code deletes the contents of all files in the current directory. I am not sure you wanted that:
for file in os.listdir(os.curdir):
    files.append(open(file, 'w'))  # open(file, 'w') empties the file!!!
Maybe you wanted open(file, 'r') or open(file, 'a') instead? (see https://docs.python.org/2/library/functions.html#open)
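The truncating behaviour of 'w' is easy to demonstrate. A small self-contained sketch using a throwaway temp file (the path is a placeholder):

```python
import os
import tempfile

# Shows that mode 'a' preserves existing contents
# while mode 'w' empties the file on open.
path = os.path.join(tempfile.mkdtemp(), 'demo.txt')

with open(path, 'w') as f:
    f.write('original')
with open(path, 'a') as f:
    f.write('+appended')
with open(path) as f:
    print(f.read())  # original+appended

with open(path, 'w'):
    pass  # merely opening with 'w' truncates the file
print(os.path.getsize(path))  # 0
```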
Your solution will certainly work, but the recommended way is to use a context manager so that the files get handled seamlessly. For example:
for filename in os.listdir(os.curdir):
    with open(filename, 'w') as f:
        ...  # do some actions on the file
The with statement will take care of closing the file for you.
I have a Python script that opens a lot (over 2 million) of small text files in a for loop. However, it stops when I reach approximately 150,000 files, which suggests to me that I reached the default limit of open files in the Linux kernel.
But, I'm closing the files, so I'm not sure why I hit that limit. The interesting part breaks down to that:
import os

files = os.listdir('/var/tmp/files')
for file in files:
    fd = open('/var/tmp/files/{}'.format(file), 'r')
    content = fd.readlines()
    # Doing stuff
    fd.close()
The code works, but apparently it doesn't close the files. At first I tried the supposedly better with open() statement, but that didn't work either.
Why doesn't Python close the files?
Thanks guys. The problem was that my user had no access to one specific file. So, Python did everything as it should.
I expected that it had to do with Linux' max number of open files as the number of processed files were really near to that max value. It was a coincidence, though.
Thanks for all your help, and sorry for the noise.
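For reference, the per-process open-file limit suspected above can be inspected from Python on Unix-like systems with the resource module (a minimal sketch; resource is not available on Windows):

```python
import resource

# The soft limit is what a process actually runs into first;
# the hard limit is the ceiling the soft limit can be raised to.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(soft, hard)
```

On many Linux distributions the default soft limit is 1024, which is why leaking descriptors in a loop over many files fails quickly.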
I don't know if this will solve the problem, but try it. It may be that the program is opening multiple files into the same variable, or that the loop prevents the program from closing the files.
import os

files = os.listdir('/var/tmp/files')
fd = list()
for i, file in enumerate(files):
    if i > 100000:
        break
    fd.append(open('/var/tmp/files/{}'.format(file), 'r'))
    content = fd[i].readlines()
    # Doing stuff
for handle in fd:
    handle.close()
I think you are using multiprocessing or something similar in your "Doing stuff" block. You can expect to run into file-descriptor problems whenever you use multiprocessing, so it needs extra attention.
To avoid this, simply don't open files before you start another process; open them after the other process has started.
There must be something else happening in your code. Check the status of your file object before you reopen the file:
import os

files = os.listdir('/var/tmp/files')
fileClosed = True
for file in files:
    if fileClosed:
        with open('/var/tmp/files/{}'.format(file), 'r') as fd:
            content = fd.readlines()
            ## DO Stuff
    else:
        print "File not closed properly"
        break
    fileClosed = fd.closed
I have written a small program in Python where I need to open many files and close them at a later stage. I have stored all the file handles in a list so that I can refer to them later for closing.
In my program I am storing all the file handles (fout) in the list foutList[]
for cnt in range(count):
    fileName = "file" + str(cnt) + ".txt"
    fullFileName = path + fileName
    print "opening file " + fullFileName
    try:
        fout = open(fullFileName, "r")
        foutList.append(fout)
    except IOError as e:
        print "Cannot open file: %s" % e.strerror
        break
Some people suggested that I not store them in a list, but did not give a reason why. Can anyone explain why it is not recommended, and what another possible way of doing the same thing would be?
I can't think of any reasons why this is really evil, but possible objections to doing this might include:
It's hard to guarantee that every single file handle will be closed when you're done. Using the file handle with a context manager (see the with open(filename) as file_handle: syntax) always guarantees the file handle is closed, even if something goes wrong.
Keeping lots of files open at the same time may be impolite if you're going to have them open for a long time, and another program is trying to access the files.
This said - why do you want to keep a whole bunch of files open for writing? If you're writing intermittently to a bunch of files, a better way to do this is to open the file, write to it, and then close it until you're ready to write again.
All you have to do is open the file in append mode, open(filename, 'a'). This lets you write to the end of an existing file without erasing what's already there (which is what the 'w' mode does).
Edit(1) I slightly misread your question - I thought you wanted to open these files for writing, not reading. Keeping a bunch of files open for reading isn't too bad.
If you have the files open because you want to monitor the files for changes, try using your platform's equivalent of Linux's inotify, which will tell you when a file has changed (without you having to look at it repeatedly.)
If you don't store them at all, they will eventually be garbage collected, which will close them.
If you really want to close them manually, use weak references to hold them, which will not prevent garbage collection: http://docs.python.org/library/weakref.html
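A minimal sketch of the weak-reference idea, assuming CPython (whose file objects support weak references); the temp path is a placeholder:

```python
import os
import tempfile
import weakref

# Keep only weak references to the handles: they let you reach the
# files for cleanup without preventing garbage collection.
path = os.path.join(tempfile.mkdtemp(), 'x.txt')
f = open(path, 'w')

refs = [weakref.ref(f)]      # does not keep the file object alive

for r in refs:
    obj = r()                # dereference; returns None once collected
    if obj is not None:
        obj.close()

print(f.closed)  # True
```

Relying on garbage collection to close files is implementation-dependent timing-wise, so explicit closing (or a with block) remains the safer default.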