I have the following code:
import os, subprocess

def cfile():
    p = r'/mypath/abc'
    cmd = ["who am i | awk '{print $1}'"]
    if not os.path.exists(p):
        fh = open(p, 'a')
        try:
            subprocess.Popen(cmd, stdout=fh)
        finally:
            fh.close()

cfile()
The above code is creating the file but not writing anything to it. Can you please help me understand what is wrong here? I am using Python 2.7.
You could call .wait() on each Popen object in order to be sure that it's finished and then call flush(). Maybe something like this:
import os
import subprocess

def cfile():
    p = r'/mypath/abc'
    # With shell=True it is simplest to pass the whole pipeline as one string.
    cmd = "who am i | awk '{print $1}'"
    fh = open(p, 'a+')
    try:
        sb = subprocess.Popen(cmd, shell=True, universal_newlines=True, stdout=fh)
        sb.wait()
        fh.flush()
    finally:
        fh.close()

cfile()
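As a side note, a minimal sketch of an alternative that avoids the shell entirely (not part of the original answer; the path is taken from the question) connects the two commands yourself:

import subprocess

# Sketch: pipe `who am i` into awk without shell=True.
with open(r'/mypath/abc', 'a') as fh:
    who = subprocess.Popen(['who', 'am', 'i'], stdout=subprocess.PIPE)
    awk = subprocess.Popen(['awk', '{print $1}'], stdin=who.stdout, stdout=fh)
    who.stdout.close()  # let `who` receive SIGPIPE if awk exits first
    awk.wait()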
Problem
The code below simulates a real terminal, in this case a CMD terminal. The problem is that "cls" doesn't clear the captured STDOUT of CMD, so the STDOUT string just keeps growing.
Example of problem
Microsoft Windows [versão 10.0.19042.746]
(c) 2020 Microsoft Corporation. Todos os direitos reservados.
C:\Users\Lsy\PycharmProjects\Others>chdir
C:\Users\Lsy\PycharmProjects\Others
C:\Users\Lsy\PycharmProjects\Others>echo test
test
C:\Users\Lsy\PycharmProjects\Others>cls
Type:
Question
How to clear the STDOUT?
Script
import subprocess

f = open('output.txt', 'w')
proc = subprocess.Popen('cmd.exe', stderr=subprocess.STDOUT, stdin=subprocess.PIPE, stdout=f, shell=True)

while True:
    command = input('Type:')
    command = command.encode('utf-8') + b'\n'
    proc.stdin.write(command)
    proc.stdin.flush()
    with open('output.txt', 'r') as ff:
        print(ff.read())
        ff.close()
This is not how I would recommend using subprocesses - but I'm assuming you have some reason for doing things this way...
Given:
You've directed the CMD subprocess's STDOUT to a file called "output.txt".
The cls output is captured in output.txt along with everything else.
Your terminal then displays the contents of the "output.txt" file (which is never cleared), leaving a mess.
Therefore: if you want to "clear" your subprocess terminal, you will have to truncate your "output.txt" file.
You can do this trivially by checking the "command" variable before encoding it and sending it to the subprocess.
e.g:
import subprocess
import os

f = open('output.txt', 'w')
proc = subprocess.Popen('cmd.exe', stderr=subprocess.STDOUT, stdin=subprocess.PIPE, stdout=f, shell=True)

while True:
    command = input('Type:')
    if command == "cls":
        open('output.txt', 'w').close()
        os.system('cls' if os.name == 'nt' else 'clear')
    else:
        command = command.encode('utf-8') + b'\n'
        proc.stdin.write(command)
        proc.stdin.flush()
        with open('output.txt', 'r+') as ff:
            print(ff.read())
You could maybe also not redirect the output to a text file...
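For reference, a rough sketch of that pipe-based alternative might look like the following (the read_output helper and the decoding choices are illustrative, not from the original answer):

import subprocess
import threading

# Sketch: read cmd.exe's output from a pipe in a background thread
# instead of spooling it through output.txt.
proc = subprocess.Popen('cmd.exe', stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

def read_output(pipe):
    # Print each line of subprocess output as it arrives.
    for line in iter(pipe.readline, b''):
        print(line.decode('utf-8', errors='replace'), end='')

threading.Thread(target=read_output, args=(proc.stdout,), daemon=True).start()

while True:
    command = input('Type:')
    proc.stdin.write(command.encode('utf-8') + b'\n')
    proc.stdin.flush()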
I want to redirect the output of my jar to a file with Python. I have tried the following, but it didn't work out:
import sys
import subprocess
cmdargs = sys.argv
fname = str(cmdargs[1])
input = '../res/test/'+fname
output = '../res/res/'+fname
subprocess.Popen(['java', '-jar', '../res/chemTagger2.jar', input, '>', output])
The output is printed to the console instead.
You can redirect the subprocess.Popen stdout and stderr by using the corresponding parameters of the Popen call, as follows:
import sys
import subprocess

cmdargs = sys.argv
fname = str(cmdargs[1])
input = '../res/test/' + fname
output = '../res/res/' + fname

with open(output, 'a') as f_output:
    subprocess.Popen(['java', '-jar', '../res/chemTagger2.jar', input], stdout=f_output)
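The answer mentions stderr as well; if you also want the jar's error output captured in the same file, a small variation (reusing the input and output variables defined above) is to merge stderr into stdout:

# Variation: merge the jar's stderr into the same output file as its stdout.
with open(output, 'a') as f_output:
    subprocess.Popen(['java', '-jar', '../res/chemTagger2.jar', input],
                     stdout=f_output, stderr=subprocess.STDOUT)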
Is there any way to communicate with cmd and at the same time save all of its output to a file?
I mean that the output should be saved after every command, not only when the subprocess ends.
I want it to be something like this:
import subprocess

process = subprocess.Popen('C:\\Windows\\system32\\cmd.exe', stdout=subprocess.PIPE,
                           stdin=subprocess.PIPE)
while True:
    with open("log.txt", "a+") as myfile:
        myfile.write(process.stdout.readlines())
    process.stdin(raw_input())
You have two ways of doing this: either create an iterator from the read or readline function and do:
import subprocess
import sys

with open('test.log', 'w') as f:
    process = subprocess.Popen(your_command, stdout=subprocess.PIPE)
    for c in iter(lambda: process.stdout.read(1), ''):
        sys.stdout.write(c)
        f.write(c)
or
import subprocess
import sys

with open('test.log', 'w') as f:
    process = subprocess.Popen(your_command, stdout=subprocess.PIPE)
    for line in iter(process.stdout.readline, ''):
        sys.stdout.write(line)
        f.write(line)
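One caveat that the answer above does not mention: under Python 3 the pipe yields bytes by default, so the '' sentinel never matches and the loops never terminate. A sketch of a Python 3 friendly variant (your_command is the same placeholder as above):

import subprocess
import sys

with open('test.log', 'w') as f:
    # universal_newlines=True makes stdout a text stream, so iterating
    # over it yields str lines and stops cleanly at EOF.
    process = subprocess.Popen(your_command, stdout=subprocess.PIPE,
                               universal_newlines=True)
    for line in process.stdout:
        sys.stdout.write(line)
        f.write(line)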
I have the following shell script that I would like to write in Python (of course grep . is actually a much more complex command):
#!/bin/bash
(cat somefile 2>/dev/null || (echo 'somefile not found'; cat logfile)) \
| grep .
I tried this (which lacks an equivalent to cat logfile anyway):
#!/usr/bin/env python
import StringIO
import subprocess

try:
    myfile = open('somefile')
except:
    myfile = StringIO.StringIO('somefile not found')

subprocess.call(['grep', '.'], stdin=myfile)
But I get the error AttributeError: StringIO instance has no attribute 'fileno'.
I know I should use subprocess.communicate() instead of StringIO to send strings to the grep process, but I don't know how to mix both strings and files.
p = subprocess.Popen(['grep', '...'], stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE)
output, output_err = p.communicate(myfile.read())
Don't use a bare except; it may catch too much. In Python 3:
#!/usr/bin/env python3
from subprocess import check_output

cmd = ['grep', '.']  # the command to run (assumed from the question)
try:
    file = open('somefile', 'rb', 0)
except FileNotFoundError:
    output = check_output(cmd, input=b'somefile not found')
else:
    with file:
        output = check_output(cmd, stdin=file)
It works for large files too (the file is redirected at the file descriptor level; there is no need to load it into memory).
If you have a file-like object (without a real .fileno()), you could write to the pipe directly using its .write() method:
#!/usr/bin/env python3
import io
from shutil import copyfileobj
from subprocess import Popen, PIPE
from threading import Thread

try:
    file = open('somefile', 'rb', 0)
except FileNotFoundError:
    file = io.BytesIO(b'somefile not found')

def write_input(source, sink):
    with source, sink:
        copyfileobj(source, sink)

cmd = ['grep', 'o']

with Popen(cmd, stdin=PIPE, stdout=PIPE) as process:
    Thread(target=write_input, args=(file, process.stdin), daemon=True).start()
    output = process.stdout.read()
The following answer also uses shutil (which is quite efficient), but it avoids running a separate thread, which would otherwise never end and go zombie when stdin ends (as with the answer from @jfs).
import os
import subprocess
import io
from shutil import copyfileobj

# file: placeholder for the path to the input file
file_exists = os.path.isfile(file)

with open(file) if file_exists else io.StringIO("Some text here ...\n") as string_io:
    with subprocess.Popen("cat", stdin=subprocess.PIPE, stdout=subprocess.PIPE, universal_newlines=True) as process:
        copyfileobj(string_io, process.stdin)
        # the subsequent code is not executed until copyfileobj ends,
        # ... but the subprocess is effectively consuming the input.
        process.stdin.close()  # close it, or the subprocess won't end
        # Do some online processing of process.stdout, for example...
        for line in process.stdout:
            print(line)  # do something
Alternatively to closing stdin and iterating over stdout, if the output is known to fit in memory:
...
stdout_text, stderr_text = process.communicate()
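A self-contained sketch of that communicate() variant, under the same assumptions as the snippet above (the 'somefile' path and the sample text are placeholders):

import io
import os
import subprocess
from shutil import copyfileobj

file = 'somefile'  # placeholder path, as in the snippet above
file_exists = os.path.isfile(file)

with open(file) if file_exists else io.StringIO("Some text here ...\n") as string_io:
    with subprocess.Popen("cat", stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                          universal_newlines=True) as process:
        copyfileobj(string_io, process.stdin)
        # communicate() closes stdin and reads the remaining output into memory;
        # stderr_text is None here because stderr was not redirected to a pipe.
        stdout_text, stderr_text = process.communicate()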
I am using the subprocess module, whose Popen class outputs some results like:
063.245.209.093.00080-128.192.076.180.01039:HTTP/1.1 302 Found
063.245.209.093.00080-128.192.076.180.01040:HTTP/1.1 302 Found
and here is the script I wrote:
import subprocess, shlex, fileinput,filecmp
proc = subprocess.Popen('egrep \'^HTTP/\' *', shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE,)
stdout_value = proc.communicate()[0]
print 'results:'
print stdout_value
My question is: how do I record these results from stdout into a file?
I appreciate all your responses and help!
import subprocess
import glob

def egrep(pattern, *files):
    """ runs egrep on the files and returns the entire result as a string """
    cmd = ['egrep', pattern]
    for filespec in files:
        cmd.extend(glob.glob(filespec))
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    return proc.communicate()[0]

results = egrep(r'^HTTP/', '*')
print 'results:'
print results

# write to file
with open('result_file.txt', 'w') as f:
    f.write(results)
Any of the stdin, stdout, and stderr arguments to subprocess.Popen() can be a file object (or a file descriptor), which causes the program to read from or write to the given file.
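A minimal sketch of that last point (the filenames and the sort command are just illustrative):

import subprocess

# Sketch: pass real file objects; the child reads its stdin from input.txt
# and writes its stdout and stderr to the given log files.
with open('input.txt', 'r') as fin, \
        open('out.log', 'w') as fout, \
        open('err.log', 'w') as ferr:
    subprocess.call(['sort'], stdin=fin, stdout=fout, stderr=ferr)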