Data not getting written in file [Python] - python

final = open("war.txt", "w+")
for line in madList:
    line = line.split('A ')
    dnsreg = line[1]
    print dnsreg
    final.write(dnsreg)
While printing dnsreg I can see the output, but when I write it to a file, nothing is written. There is no syntax error either. Any idea?

Data written to a file is not written immediately; it is kept in a buffer, and larger chunks are written at a time to save the overhead of writing to disk. However, upon closing a file, all the buffered data is flushed to the disk.
So, you can do two things (both sketched below):
Call final.close() when you are done, or
Call final.flush() after final.write() if you don't want to close the file.
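A minimal sketch of both options, reusing the file name from the question (the string written here is just an example):

# Option 1: close the file when you are done; closing flushes the buffer to disk.
final = open("war.txt", "w+")
final.write("example line\n")
final.close()

# Option 2: flush explicitly after writing if you want to keep the file open.
final = open("war.txt", "w+")
final.write("example line\n")
final.flush()  # buffered data is pushed to the file without closing it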
Thanks to @Matt Tanenbaum, a really nice way to handle this in Python is to do the writing inside a with block:
with open("war.txt","w+") as final:
for line in madList:
line=line.split('A ')
dnsreg= line[1]
print dnsreg
final.write(dnsreg)
Doing this, you'll never have to worry about closing the file! But you may need to flush in case of premature termination of the program (e.g. due to exceptions).

You should use the with statement in Python when using resources that have to be set up and torn down, like opening and closing files. Something like:
with open("war.txt","w+") as myFile:
for line in madList:
line=line.split('A ')
dnsreg= line[1]
myFile.write(dnsreg)
If you do not want to use with, you will have to manually close the file. In that case, you can use the try...finally blocks to handle this.
try:
    myFile = open("war.txt", "w+")
    for line in madList:
        line = line.split('A ')
        dnsreg = line[1]
        myFile.write(dnsreg)
finally:
    myFile.close()
The finally block always runs, so your file is closed and the changes are written.

Related

Python saving data inside Memory? (ram)

I am new to Python, so I didn't know about this until now.
I have a basic program inside a for loop that requests data from a site and saves it to a text file.
But when I checked my task manager I saw that the memory usage only increases. This might be a problem when running it for a long time.
Is it standard for Python to do this, or can you change it?
Here is what the program basically looks like:
savefile = open("file.txt", "r+")
for i in savefile:
    # My code goes here
    savefile.write(i)
# end of loop
savefile.close()
Python does not write to the file until you call .close() or .flush(), or until the internal buffer fills up. This question might help you: How often does python flush to a file?
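If you want control over when that buffer gets flushed, open() also accepts a buffering argument; a rough sketch (the file names here are just examples):

import io

print(io.DEFAULT_BUFFER_SIZE)            # the default buffer size, typically 8192 bytes

# line buffering: the buffer is flushed after every '\n' (text mode only)
f = open("file.txt", "w", buffering=1)
f.write("flushed at the end of this line\n")
f.close()

# fully unbuffered writes are only allowed for binary-mode files
raw = open("file.bin", "wb", buffering=0)
raw.write(b"written straight through")
raw.close()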
As @Almog said, Python does not write to the file immediately. Because of this, every line you write to the file gets stored into RAM until you use savefile.close(), which flushes the internal buffer and writes everything to the file. This would explain the extra memory usage.
Try changing the loop to this:
savefile = open('file.txt', 'r+')
for i in savefile:
savefile.write(i)
savefile.flush() #flushes buffer, saving RAM
savefile.close()
There is a better, more Pythonic solution to this:
with open("your_file.txt", "write_mode") as file_variable_name:
    for line in file_variable_name:
        file_variable_name.write(line)
        file_variable_name.flush()
This code flushes the file after each line and, when the block finishes, closes the file thanks to the with statement.

writing output for python not functioning

I am attempting to output a new txt file but it comes up blank. I am doing this:
my_file = open("something.txt","w")
#and then
my_file.write("hello")
Right after this line it just prints 5, and then no text shows up in the file.
What am I doing wrong?
The write is not flushed until you close the file. If I open an interpreter and then enter:
my_file = open('something.txt', 'w')
my_file.write('hello')
and then open the file in a text program, there is no text.
If I then issue:
my_file.close()
Voila! Text!
If you just want to flush once and keep writing, you can do that too:
my_file.flush()
my_file.write('\nhello again') # file still says 'hello'
my_file.flush() # now it says 'hello again' on the next line
By the way, if you happen to read the beautiful, wonderful documentation for file.write, which is only 2 lines long, you would have your answer (emphasis mine):
Write a string to the file. There is no return value. Due to buffering, the string may not actually show up in the file until the flush() or close() method is called.
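As a side note, that quote is from the Python 2 documentation; in Python 3, write() returns the number of characters written, which is the 5 you saw echoed in the interpreter. It says nothing about whether the data has reached the disk yet. Roughly:

my_file = open("something.txt", "w")
n = my_file.write("hello")   # in Python 3, n == 5 (characters handed to the buffer)
print(n)
my_file.close()              # only now is the text guaranteed to be in the file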
If you don't want to worry about closing the file, use with:
with open("something.txt", "w") as f:
    f.write('hello')
Then Python will take care of closing the file for you automatically.
As Two-Bit Alchemist pointed out, the file has to be closed. The python file writer uses a buffer (BufferedIOBase I think), meaning it collects a certain number of bytes before writing them to disk in bulk. This is done to save overhead when a lot of write operations are performed on a single file.
Also: When working with files, try using a with-environment to make sure your file is closed after you are done writing/reading:
with open("somefile.txt", "w") as myfile:
myfile.write("42")
# when you reach this point, i.e. leave the with-environment,
# the file is closed automatically.
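To see that buffering in action, here is a rough sketch (the file name is arbitrary, and exact behaviour can vary by platform, but small writes typically stay in Python's buffer until flushed):

import os

with open("somefile.txt", "w") as myfile:
    myfile.write("42")
    # the two characters are most likely still in Python's internal buffer here,
    # so the file on disk is usually still empty:
    print(os.path.getsize("somefile.txt"))   # usually 0
    myfile.flush()
    print(os.path.getsize("somefile.txt"))   # 2 -- the data has reached the OS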
As @m00am explained above, the Python file writer buffers output before writing it to disk in bulk.
Your code is also okay. Just add a statement to close the file, and it will work correctly.
my_file = open("fin.txt","w")
#and then
my_file.write("hello")
my_file.close()

Prints on stdout but unable to write to file in Python

I'm really frustrated with this strange behaviour of Python all of a sudden. I've been writing all sorts of data to files, but since this morning it just doesn't seem to work. I referred to all of these before posting:
How to redirect 'print' output to a file using python?
Failed to write to file but generates no Error
Unable to write data into a file using python
Unable to write list of elements to a file using python
I have tried all of the following, but nothing gets written to the delete.txt file. What's happening?
fl = open('delete.txt', 'w')
fl.write(msg)                    # doesn't work, tried this also
fl.write('%s' % msg)             # doesn't work, tried this also
fl.write("at least write this")  # doesn't work, tried this also
print (msg)                      # WORKS
Code:
for i in hd_com.comment_message[1:500]:
    fl = open('delete.txt', 'wb')
    try:
        if len(i) > 40:
            mes = processComUni(i)
            proc = nltk.tokenize.word_tokenize(mes)
            #print proc
            pos = nltk.pos_tag(proc)
            for i in pos:
                if ((i[1] == "NN") or (i[1] == "NNP") or (i[1] == "NNS")) and len(i[0]) > 2:
                    #print i[0], i[1]
                    for j in home_depo_inv:
                        if i[0] in j.split() and (i[0] != 'depot' and i[0] != 'home' and i[0] != 'store' and i[0] != 'por' and i[0] != 'get' and i[0] != 'house' and i[0] != 'find' and i[0] != 'part' and i[0] != 'son' and i[0] != 'put' and i[0] != 'lot' and i[0] != 'christmas' and i[0] != 'post'):
                            a = re.findall(i[0], j)
                            fl.write(str(i))    # doesn't work, tried this also
                            fl.write(str(mes))  # doesn't work, tried this also
                            fl.write("\n")      # doesn't work, tried this also
                            fl.write("hello")   # doesn't work, tried this also
                            fl.flush()
                            break
    except:
        continue
fl.close()
More code:
type(mes) = str
mes="omg would love front yard"
Your snippet's indentation was a bit messed up in the post, but anyway: your code starts with:
for i in hd_com.comment_message[1:500]:
    fl = open('delete.txt', 'wb')
which means you reopen the file for writing on each iteration, erasing whatever might have been written by the previous iteration.
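A minimal restructuring along those lines (hd_com, processComUni and the rest are names from the question and are assumed to exist; the point is only that the file is opened once, before the loop):

fl = open('delete.txt', 'w')              # open once, outside the loop
for i in hd_com.comment_message[1:500]:
    mes = processComUni(i)
    # ... the rest of the processing from the question goes here ...
    fl.write(str(mes))
    fl.write("\n")
fl.close()                                # flush and close after the loop finishes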
You need to flush the output stream explicitly when writing to a file handle like that.
f = open("test.txt", "w")
f.write("this is a test\n")
# no text in the output file at this point
f.flush()
# buffers are flushed to the file
f.write("this, too, is a test\n")
# again, line doesn't show in the file
f.close()
# closing the file flushes the buffers, text appears in file
From the documentation of file.write:
Note that due to buffering, flush() or close() may be needed before the file on disk reflects the data written.

What happens if I read a file without closing it afterwards?

I used to read files like this:
f = [i.strip("\n") for i in open("filename.txt")]
which works just fine. I prefer this way because it is cleaner and shorter than the traditional file-reading code samples available on the web (e.g. f = open(...), for line in f.readlines(), f.close()).
However, I wonder if there is any drawback to reading files like this. Since I don't close the file, does the Python interpreter handle this itself? Is there anything I should be careful about with this approach?
This is the recommended way:
with open("filename.txt") as f:
lines = [line.strip("\n") for line in f]
The other way may not close the input file for a long time. This may not matter for your application.
The with statement takes care of closing the file for you. In CPython, just letting the file handle object be garbage-collected should close the file for you, but in other flavors of Python (Jython, IronPython, PyPy) you definitely can't count on this. Also, the with statement makes your intentions very clear, and conforms with common practice.
From the docs:
When you’re done with a file, call f.close() to close it and free up any system resources taken up by the open file.
You should always close a file after working with it. Python will not automatically do it for you. If you want a cleaner and shorter way, use a with statement:
with open("filename.txt") as myfile:
lines = [i.strip("\n") for i in myfile]
This has two advantages:
It automatically closes the file after the with block
If an exception is raised, the file is closed regardless (a short sketch of this follows).
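A small sketch of that second point: even if the body of the with block raises, the file still gets closed (the file name here is arbitrary):

try:
    with open("filename.txt", "w") as myfile:
        myfile.write("partial data\n")
        raise RuntimeError("something went wrong")
except RuntimeError:
    pass

print(myfile.closed)   # True -- the with block closed the file despite the exception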
It might be fine in a limited number of cases, e.g. a quick temporary test.
Python will only close the file handle after the program finishes executing.
Therefore this approach is a no-go for a proper application.
When we write to a file using any of the write functions, Python holds everything to be written in a buffer and pushes it onto the actual file on the storage device either when the program ends or when it encounters a close() call.
So if the program terminates in between, the data is not stored in the file. I would suggest two options:
use with, because as soon as you leave the block or encounter an exception it closes the file,
with open(filename, file_mode) as file_object:
    # do the file manipulations
or you can use the flush() function if you want to force Python to write the contents of the buffer onto storage without closing the file.
file_object.flush()
For Reference: https://lerner.co.il/2015/01/18/dont-use-python-close-files-answer-depends/

Subprocess file output needs to close before reading

I'm trying to use a subprocess to write the output to a data file, and then parse through it in order to check for some data in it. However, when I need to do the reading through the file's lines, I always get a blank file unless I close the file and then reopen it. While it works, I just don't like having to do this and I want to know why it happens. Is it an issue with subprocess, or another intricacy of the file mode?
dumpFile=open(filename,"w+")
dump = subprocess.Popen(dumpPars,stdout=dumpFile)
dump.wait()
At this point, if I try to read the file, I get nothing. However, it works fine by doing these commands after:
dumpFile.close()
dumpFile=open(filename,"r")
The with statement automatically closes the file after the block ends:
with open(filename, "w+") as dumpFile:
    dump = subprocess.Popen(dumpPars, stdout=dumpFile)
    dump.wait()

with open(filename, "r") as dumpFile:
    # dumpFile reading code goes here
You probably need to seek back to the beginning of the file, otherwise the file pointer will be at the end of the file when you try to read it:
dumpFile.seek(0)
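A rough sketch of that rewind without the subprocess part (the file name is arbitrary; the same thing happens when you write yourself and then try to read):

dumpFile = open("output.txt", "w+")
dumpFile.write("line that would normally come from the subprocess\n")
dumpFile.flush()          # make sure Python's buffer is written out

print(dumpFile.read())    # prints nothing: the file pointer is at the end
dumpFile.seek(0)          # rewind to the beginning
print(dumpFile.read())    # now the line is printed
dumpFile.close()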
However, if you don't need to actually store dumpFile, it's probably better to do something like:
dump = subprocess.Popen(dumpPars, stdout=subprocess.PIPE)
stdoutdata, _ = dump.communicate()  # now parse stdoutdata
unless your command produces large volumes of data.
If you want to read what you've already written, either close and reopen the file, or "rewind" it - seek to offset 0.
If you want to read the file while it is being written, you can do so (don't even need to write it to disk), see this other question Capture output from a program
