Python results output to txt file - python

I tried this code posted 2 years ago:
import subprocess
with open("output.txt", "wb") as f:
    subprocess.check_call(["python", "file.py"], stdout=f)

import sys
import os.path
orig = sys.stdout
with open(os.path.join("dir", "output.txt"), "wb") as f:
    sys.stdout = f
    try:
        execfile("file.py", {})
    finally:
        sys.stdout = orig
It hangs the terminal until I press Ctrl-Z, and then it crashes the terminal but prints the output.
I'm new to coding and am not sure how to resolve this. I'm obviously doing something wrong. Thanks for your help.

You can simply open the file and write to it with write():
with open('output.txt', 'w') as f:
    f.write('output text')  # You can use a variable from other data you collect instead if you would like
Since you are new to coding, I'll just let you know that opening a file using with will actually close it automatically after the indented code has run. Good luck with your project!
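For example, here is a minimal sketch just to show that automatic close (same placeholder file name as above):
with open('output.txt', 'w') as f:
    f.write('output text')
print(f.closed)  # True: the with block closed the file for you when it ended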

Related

Having Trouble Loading a Pickle File

I am trying to create a small game for fun, and I want to save and load previous run scores. I started a test file to mess around and try to figure out how pickling works. I have a pickle file with a small set of numbers. How do I add numbers to the pickle file and save it for the next run?
Currently I have it like this:
new_score = 9
filename = "scoreTest.pk"
outfile = open(filename, 'wb')
infile = open(filename, 'rb')
with infile as f:
    scores = pickle.load(f)
scores.add(new_score)
pickle.dump(scores, outfile)
When I run it like this I get this error:
EOFError: Ran out of input
If someone could please tell me what is wrong and how to do it correctly, that would be great. Apologies for any unoptimized code, I'm new to coding.
You are trying to juggle a reader and a writer on the same file at the same time. The open(filename, 'wb') for the write deletes whatever happened to be in the file, so there is no data for the reader. You should only open the file when you really need to use it. And it's better to write to a temporary file and rename it; if something goes wrong, you haven't lost your data.
import pickle
import os

new_score = 9
filename = "scoreTest.pk"
tmp_filename = "scoreTest.tmp"

try:
    with open(filename, 'rb') as infile:
        scores = pickle.load(infile)
except (IOError, EOFError) as e:
    scores = default  # whatever that is
scores.add(new_score)

with open(tmp_filename, 'wb') as outfile:
    pickle.dump(scores, outfile)
os.rename(tmp_filename, filename)
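For reference, here is a runnable variant of that pattern; the empty-set default is an assumption of mine, since the answer above deliberately leaves the default up to you:
import os
import pickle

new_score = 9
filename = "scoreTest.pk"
tmp_filename = "scoreTest.tmp"

try:
    with open(filename, 'rb') as infile:
        scores = pickle.load(infile)
except (IOError, EOFError):
    scores = set()  # assumed default: no saved scores yet, so start with an empty set

scores.add(new_score)

# Dump to a temporary file first, then rename over the real file,
# so a failed dump cannot destroy the previously saved scores.
with open(tmp_filename, 'wb') as outfile:
    pickle.dump(scores, outfile)
os.rename(tmp_filename, filename)

print(sorted(scores))  # the scores accumulated so far across runs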

python readlines not working during incron

I'm trying to call a python script through incron:
/data/alucard-ops/drop IN_CLOSE_WRITE /data/alucard-ops/util/test.py $#/$#
but I can't seem to read from the file passed. Here is the script:
#!/usr/bin/env /usr/bin/python3
import os,sys
logfile = '/data/alucard-ops/log/'
log = open(logfile + 'test.log', 'a')
log.write(sys.argv[1] + "\n")
log.write(str(os.path.exists(sys.argv[1])) + "\n")
datafile = open(sys.argv[1], 'r')
log.write('Open\n')
data = datafile.readlines()
log.write("read\n")
datafile.close()
The output generated by the script:
/data/alucard-ops/drop/nsco-20180219.csv
True
Open
It seems to stop at the readlines() call. I don't see any errors in the syslog.
Update: It seems that I can use a subprocess to cat the file, and that retrieves the contents. But when I decode it with data.decode('utf-8'), I'm back to nothing in the variable.
I ended up using watchdog instead.
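For anyone landing here, this is roughly the shape the watchdog version takes; the handler class and paths below are a sketch of mine, not the poster's actual script:
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

WATCH_DIR = "/data/alucard-ops/drop"  # assumed to match the incron watch path

class DropHandler(FileSystemEventHandler):
    def on_created(self, event):
        if event.is_directory:
            return
        # Read the newly dropped file in this long-running process.
        # Note: on_created fires when the file appears, which may be before the writer finishes.
        with open(event.src_path, "r") as datafile:
            data = datafile.readlines()
        print("read %d lines from %s" % (len(data), event.src_path))

if __name__ == "__main__":
    observer = Observer()
    observer.schedule(DropHandler(), WATCH_DIR, recursive=False)
    observer.start()
    try:
        while True:
            time.sleep(1)
    finally:
        observer.stop()
        observer.join()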

How to execute a python script and write output to txt file?

I'm executing a .py file, which spits out a given string. This command works fine:
execfile ('file.py')
But I want the output (in addition to it being shown in the shell) written into a text file.
I tried this, but it's not working :(
execfile ('file.py') > ('output.txt')
All I get is this:
tugsjs6555
False
I guess "False" is referring to the output file not being successfully written :(
Thanks for your help
What you're doing is comparing the output of execfile('file.py') against the string 'output.txt'.
You can do what you want with subprocess:
#!/usr/bin/env python
import subprocess
with open("output.txt", "w+") as output:
    subprocess.call(["python", "./script.py"], stdout=output)
This will also work, since it redirects standard output to the file output.txt before executing "file.py":
import sys
orig = sys.stdout
with open("output.txt", "wb") as f:
    sys.stdout = f
    try:
        execfile("file.py", {})
    finally:
        sys.stdout = orig
Alternatively, execute the script in a subprocess:
import subprocess
with open("output.txt", "wb") as f:
subprocess.check_call(["python", "file.py"], stdout=f)
If you want to write to a directory, assuming you wish to hardcode the directory path:
import sys
import os.path
orig = sys.stdout
with open(os.path.join("dir", "output.txt"), "wb") as f:
    sys.stdout = f
    try:
        execfile("file.py", {})
    finally:
        sys.stdout = orig
If you are running the file on Windows command prompt:
python filename.py >> textfile.txt
The output will be redirected to textfile.txt in the same folder where filename.py is stored.
This is mainly useful when the results show up on cmd and you want to capture the entire output without it being truncated.
The simplest way to run a script and get the output to a text file is by typing the below in the terminal:
PCname:~/Path/WorkFolderName$ python scriptname.py > output.txt
*output.txt will be created in the work folder if it doesn't already exist, and overwritten if it does.
Use this instead:
text_file = open('output.txt', 'w')
text_file.write('my string i want to put in file')
text_file.close()
Put it into your main file and run it. Replace the string in the 2nd line with your string, or with a variable containing the string you want to output. If you have further questions, post below.
file_open = open("test1.txt", "r")
file_output = open("output.txt", "w")
for line in file_open:
    print("%s" % (line), file=file_output)
file_open.close()
file_output.close()
Using some hints from Remolten in the above posts and some other links, I have written the following:
from os import listdir
from os.path import isfile, join
folderpath = "/Users/nupadhy/Downloads"
filenames = [A for A in listdir(folderpath) if isfile(join(folderpath,A))]
newlistfiles = ("\n".join(filenames))
OuttxtFile = open('listallfiles.txt', 'w')
OuttxtFile.write(newlistfiles)
OuttxtFile.close()
The code above lists all of the files in my Downloads folder and saves the output to listallfiles.txt. If that file is not there it will be created, and it is overwritten each time you run the code. The only thing to be mindful of is that the output file is created in the folder where your py script is saved. See how you go, hope it helps.
You could also do this by opening cmd in the folder where your Python script is saved, then running name.py > filename.txt.
It worked for me on Windows 10.

Reading a file in python via read()

Consider this snippet
from sys import argv
script, input_file = argv
def print_all(f):
    print f.read()
current_file = open(input_file)
print_all(current_file)
Ref. line 4: Why do I have to use "print" along with "f.read()"? When I use just f.read(), it doesn't print anything. Why?
f.read() reads the file from disk into memory; print prints it to the console. You will find more info on input and output in the documentation.
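In other words, something like this (the file name is just an example, keeping the snippet's Python 2 print statement):
with open("example.txt") as f:
    contents = f.read()  # read() pulls the whole file into a string in memory
print contents           # print is what actually sends that string to the console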

Fallback to stdout if no file name provided

I have a script that accepts a filename as an argument, then opens it and writes some stuff.
I use the with statement:
with open(file_name, 'w') as out_file:
    ...
    out_file.write(...)
Now what if I want to write to sys.stdout if no file_name is provided?
Do I necessarily need to wrap all actions in a function and put a condition before?
if file_name is None:
    do_everything(sys.stdout)
else:
    with open(file_name, 'w') as out_file:
        do_everything(out_file)
import sys
from contextlib import contextmanager

@contextmanager
def file_or_stdout(file_name):
    if file_name is None:
        yield sys.stdout
    else:
        with open(file_name, 'w') as out_file:
            yield out_file
Then you can do
with file_or_stdout(file_name) as wfile:
    do_stuff_writing_to(wfile)
How do you handle command line arguments? If you use argparse you could use the type and default parameters of add_argument to handle this. For example, try something like the following:
import sys
import argparse

def main(argv=None):
    if argv is None:
        argv = sys.argv[1:]
    parser = argparse.ArgumentParser()
    parser.add_argument('outfile', nargs='?',
                        type=argparse.FileType('w'),
                        default=sys.stdout)
    args = parser.parse_args(argv)
    print args.outfile
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv[1:]))
If a file name is present as an argument, argparse will automatically open and close this file and args.outfile will be a handle to it. Otherwise args.outfile will simply be sys.stdout.
You could write your own context manager. I'll post sample code later if no one else does.
if file_name is None:
    fd = sys.stdout
else:
    fd = open(file_name, 'w')
# write to fd
if fd != sys.stdout:
    fd.close()
Using the with ... as construct is useful to close the file automatically. This means that using it with sys.stdout would, as I guess you know, break your program, because it would attempt to close the system stdout!
This means something like with open(name, 'w') if name else sys.stdout as ...: would not work.
This makes me say there isn't any simple, nice way to write your snippet better... but there are probably better ways to think about how to structure such code!
The main point to clarify is when you need to open (and, more importantly, close) the file handle for file_name, when file_name exists.
Personally I would simply drop the with .. as and take care of opening the file (and, more importantly, closing it!) somewhere else. Mileage may vary depending on how your software works.
This means you can simply do:
out_file = open(file_name, 'w') if file_name else sys.stdout
and work with out_file throughout your program.
When you close, remember to check if it's a file or not :)
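Putting those two pieces together, a minimal sketch (the file_name value here is only a placeholder):
import sys

file_name = None  # or a path such as 'output.txt'
out_file = open(file_name, 'w') if file_name else sys.stdout

out_file.write('some output\n')

# Only close what we actually opened; never close sys.stdout itself.
if out_file is not sys.stdout:
    out_file.close()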
And have you thought about simply using the logging module? That easily allows you to add different handlers, print to file, print to stdout...
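A minimal sketch of that logging idea (the handler setup and names here are mine, not from the question):
import sys
import logging

file_name = 'output.log'  # or None to log only to stdout

logger = logging.getLogger('myscript')
logger.setLevel(logging.INFO)
logger.addHandler(logging.StreamHandler(sys.stdout))  # always echo to stdout
if file_name:
    logger.addHandler(logging.FileHandler(file_name))  # also write to the file when one is given

logger.info('some output line')  # goes to stdout, and to the file when configured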
