I would like to assign a file to sys.stdin so that I can read the contents of the file with input(). The code below runs as expected as a script, but it is problematic when run in a notebook. After calling input() it shows me an input textbox, which I do not want since I reassigned stdin to a file; I expect the line from the file to be read instead.
import sys
file = open("input.in")
sys.stdin = file
val = input()
print(val)
It seems to me the notebook ignores sys.stdin. I couldn't find out why this happens or how to fix it.
Thanks.
Update
I ended up overriding the input function. It does the job, but I will leave the question open in case someone has a better solution.
file = open("input.in")
input = file.readline
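Note that file.readline, unlike the built-in input(), keeps the trailing newline and returns an empty string at end of file instead of raising EOFError. A minimal sketch (my own variant, assuming the same input.in file) that wraps it to behave a bit more like input():
file = open("input.in")
def input(prompt=""):
    # strip the trailing newline so the result matches what input() would return
    # note: unlike the real input(), this returns '' at end of file instead of raising EOFError
    return file.readline().rstrip("\n")
val = input()
print(val)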
You can read the file into an in-memory buffer and then direct stdin to read from that. For example, to redisplay a file:
import sys
import io # in python2, import StringIO
input_file = open('myfile.txt', 'r')
sys.stdin = io.StringIO(input_file.read())
for line in sys.stdin:
    print(line, end='')
For your purpose, you may want:
import sys
import io # in python2, import StringIO
input_file = open('myfile.txt', 'r')
sys.stdin = io.StringIO(input_file.read())
val = sys.stdin.readline()
# Rest of program using val
Related
I tried this code posted 2 years ago:
import subprocess
with open("output.txt", "wb") as f:
subprocess.check_call(["python", "file.py"], stdout=f)
import sys
import os.path
orig = sys.stdout
with open(os.path.join("dir", "output.txt"), "wb") as f:
    sys.stdout = f
    try:
        execfile("file.py", {})
    finally:
        sys.stdout = orig
It hangs the terminal until I press Ctrl-Z, and then it crashes the terminal but prints the output.
I'm new to coding and am not sure how to resolve this. I'm obviously doing something wrong. Thanks for your help.
You can simply open and write to the file with write.
with open('output.txt', 'w') as f:
    f.write('output text')  # You can use a variable from other data you collect instead if you would like
Since you are new to coding, I'll just let you know that opening a file using with will actually close it automatically after the indented code has run. Good luck with your project!
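If you would rather capture the other script's output in a variable first (so you can inspect or modify it before writing), here is a minimal sketch using subprocess; it assumes Python 3.7+ and reuses the file.py / output.txt names from the question:
import subprocess
# run the script and capture its stdout as text
result = subprocess.run(["python", "file.py"], capture_output=True, text=True)
with open("output.txt", "w") as f:
    f.write(result.stdout)  # the file is closed automatically when the with block ends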
I would like a Python script to prompt me for a string, but I would like to use Vim to enter that string (because the string might be long and I want to use Vim's editing capability while entering it).
You can call vim with a file path of your choice:
from subprocess import call
call(["vim","hello.txt"])
Now you can use this file as your string:
file = open("hello.txt", "r")
aString = file.read()
Solution:
#!/usr/bin/env python
from __future__ import print_function
from os import unlink, close
from tempfile import mkstemp
from subprocess import Popen
def callvim():
    fd, filename = mkstemp()
    close(fd)  # mkstemp returns an already-open descriptor; close it since we reopen the file by name below
    p = Popen(["/usr/bin/vim", filename])
    p.wait()
    try:
        return open(filename, "r").read()
    finally:
        unlink(filename)
data = callvim()
print(data)
Example:
$ python foo.py
This is a big string.
This is another line in the string.
Bye!
Consider the following Python script:
#test.py
import sys
inputfile=sys.argv[1]
with open(inputfile,'r') as f:
    for line in f.readlines():
        print line
with open(inputfile,'r') as f:
    for line in f.readlines():
        print line
Now I want to run test.py with process substitution, e.g.,
python test.py <( cat file | head -10)
It seems the second f.readlines() call returns an empty list. Why is that, and is there a way to do it without having to specify two input files?
Why is that?
Process substitution works by creating a named pipe, so all the data is consumed by the first open/read loop.
Is there a way to do it without having to specify two input files?
How about buffering the data before using it? Here is some sample code:
import sys
import StringIO
inputfile=sys.argv[1]
buffer = StringIO.StringIO()
# buffering
with open(inputfile, 'r') as f:
    buffer.write(f.read())
# use it
buffer.seek(0)
for line in buffer:
    print line
# use it again
buffer.seek(0)
for line in buffer:
    print line
readlines() reads all available lines from the input at once. That is why the second call returns nothing: there is nothing left to read. You can assign the result of readlines() to a local variable and use it as many times as you want:
import sys
inputfile=sys.argv[1]
with open(inputfile,'r') as f:
    lines = f.readlines()
    for line in lines:
        print line
    # use it again
    for line in lines:
        print line
Consider this snippet
from sys import argv
script, input_file = argv
def print_all(f):
    print f.read()
current_file = open(input_file)
print_all(current_file)
Ref. line 4: Why do I have to use "print" along with "f.read()"? When I use just f.read(), it doesn't print anything. Why?
f.read() reads the file from disk into memory. print prints to the console. You will find more info on input and output in the documentation.
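A minimal sketch of the difference (assuming some file data.txt exists; the names are only for illustration):
with open('data.txt') as f:
    contents = f.read()  # reads the whole file into a string in memory; nothing is displayed yet
print(contents)          # print is what actually writes the string to the console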
I want to use os.mkfifo for simple communication between programs. I have a problem with reading from the fifo in a loop.
Consider this toy example, where I have a reader and a writer working with the fifo. I want to be able to run the reader in a loop to read everything that enters the fifo.
# reader.py
import os
import atexit
FIFO = 'json.fifo'
@atexit.register
def cleanup():
    try:
        os.unlink(FIFO)
    except:
        pass

def main():
    os.mkfifo(FIFO)
    with open(FIFO) as fifo:
        # for line in fifo:              # closes after single reading
        # for line in fifo.readlines():  # closes after single reading
        while True:
            line = fifo.read()  # will return empty lines (non-blocking)
            print repr(line)

main()
And the writer:
# writer.py
import sys
FIFO = 'json.fifo'
def main():
    with open(FIFO, 'a') as fifo:
        fifo.write(sys.argv[1])

main()
If I run python reader.py and later python writer.py foo, "foo" will be printed but the fifo will be closed and the reader will exit (or spin inside the while loop). I want reader to stay in the loop, so I can execute the writer many times.
Edit
I use this snippet to handle the issue:
def read_fifo(filename):
    while True:
        with open(filename) as fifo:
            yield fifo.read()
but maybe there is a neater way to handle it, instead of repeatedly opening the file...
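For reference, the generator above could be consumed like this:
# keeps printing whatever each writer sends, reopening the FIFO between writers
for message in read_fifo('json.fifo'):
    print(repr(message))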
Related
Getting readline to block on a FIFO
You do not need to reopen the file repeatedly. You can use select to block until data is available.
import select

with open(FIFO_PATH) as fifo:
    while True:
        # block until there is data to read (or an exceptional condition) on the FIFO
        select.select([fifo], [], [fifo])
        data = fifo.read()
        do_work(data)
In this example you won't read EOF.
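If you do want to keep reading after a writer disconnects, one option (my own sketch, not part of the answer above; FIFO_PATH and do_work are reused from it) is to treat an empty read as end-of-stream and reopen the FIFO for the next writer:
import select
while True:
    with open(FIFO_PATH) as fifo:
        while True:
            select.select([fifo], [], [fifo])
            data = fifo.read()
            if not data:  # an empty read means the writer closed its end
                break     # reopen the FIFO and block until the next writer arrives
            do_work(data)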
A FIFO works (on the reader side) exactly this way: it can be read from, until all writers are gone. Then it signals EOF to the reader.
If you want the reader to continue reading, you'll have to open again and read from there. So your snippet is exactly the way to go.
If you have multiple writers, you'll have to ensure that each data portion they write is smaller than PIPE_BUF in order not to mix up the messages.
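A quick way to look up that limit (a side note of mine, assuming a POSIX system and that the json.fifo file from the question already exists):
import os
import select
# the largest write that POSIX guarantees to be atomic on this FIFO (usually 4096 on Linux)
print(os.pathconf('json.fifo', 'PC_PIPE_BUF'))
# Python also exposes the platform constant directly
print(select.PIPE_BUF)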
The following methods on the standard library's pathlib.Path class are helpful here:
Path.is_fifo()
Path.read_text/Path.read_bytes
Path.write_text/Path.write_bytes
Here is a demo:
# reader.py
import os
from pathlib import Path
fifo_path = Path("fifo")
os.mkfifo(fifo_path)
while True:
    print(fifo_path.read_text())  # blocks until data becomes available
# writer.py
import sys
from pathlib import Path
fifo_path = Path("fifo")
assert fifo_path.is_fifo()
fifo_path.write_text(sys.argv[1])