I have a program that uses the cmd module and looks like this:
import cmd

class Prog(cmd.Cmd):
    prompt = '>>'

    def do_reverse(self, line):
        print line[::-1]

    def do_exit(self, line):
        return True

if __name__ == '__main__':
    Prog().cmdloop()
I want to write to the program's stdin and read from its stdout programmatically. I'm trying to achieve that as follows:
from subprocess import Popen, PIPE

class ShellDriver(object):
    def __init__(self, process):
        self.process = process
        self.prompt = '>>'
        self.output = ''
        self.read()

    def read(self):
        while not self.output.endswith(self.prompt):
            chars = self.process.stdout.read(1)
            if chars == '':
                break
            self.output += chars
        result = self.output.replace('\n' + self.prompt, '')
        self.output = ''
        return result

    def execute(self, command):
        self.process.stdin.write(command + '\n')
        return self.read()

if __name__ == '__main__':
    p = Popen(['python', 'prog.py'], stdin=PIPE, stdout=PIPE)
    cmd = ShellDriver(p)
    print cmd.execute('reverse abc')
    cmd.execute('exit')
However, when I run this code from PyCharm it works fine, but when I run it from the command line it hangs. As I understood it, there is a conflict between consoles (the console the reader script is run from and the program's console), since they are trying to use the same pipes, and this issue doesn't exist in PyCharm because it redirects I/O to its own console.
Is there a way to get this working in the system console?
I'm on Windows (a cross-platform solution is preferable) and Python 2.7.
Finally found an answer. By default the interpreter runs in buffered mode, sending data to stdout in chunks; to avoid that, it should be run in unbuffered mode with the -u command-line argument:
p = Popen(['python', '-u', 'prog.py'], stdin=PIPE, stdout=PIPE)
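If passing -u is inconvenient, setting PYTHONUNBUFFERED in the child's environment should have the same effect (any non-empty value works); a minimal sketch:

import os
from subprocess import Popen, PIPE

# Any non-empty value of PYTHONUNBUFFERED disables stdout/stderr buffering,
# equivalent to passing -u on the interpreter command line.
env = dict(os.environ, PYTHONUNBUFFERED='1')
p = Popen(['python', 'prog.py'], stdin=PIPE, stdout=PIPE, env=env)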
Minimal example:
import sys
import time

def main():
    for i in range(100):
        print("One line every 2s", end="\n")
        time.sleep(2)

if __name__ == '__main__':
    with open("log.out", 'w') as out, open("log.err", 'w') as err:
        sys.stdout = out
        sys.stderr = err
        main()
I want the print statements to be written to the stdout file after every line. I tried unsuccessfully:
python -u thisfile.py
In the .bashrc (.zshrc):
PYTHONUNBUFFERED=1
open("log.out", 'wb', buffering = 0)
Changing the main function is not an option in the real case. A solution where I run the Python file with bash and redirect errors and output is OK though.
I know there are lots of questions like this, but none seem to work for me.
The solution should ideally work for Python 3.8 and anything newer.
You can use the subprocess module to run the script with the desired options and redirect the output and error streams to the log files:
import subprocess
with open("log.out", "w") as out, open("log.err", "w") as err:
    subprocess.run(["python", "-u", "thisfile.py"], stdout=out, stderr=err)
This will run the script in unbuffered mode (-u option) and redirect the output and error streams to the specified log files.
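As a side note, if the child should run under the same interpreter as the parent, sys.executable can be used instead of the bare "python" name; a minimal variation of the call above:

import subprocess
import sys

with open("log.out", "w") as out, open("log.err", "w") as err:
    # sys.executable is the path of the currently running interpreter.
    subprocess.run([sys.executable, "-u", "thisfile.py"], stdout=out, stderr=err)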
If you use bash you can use:
[...]$ python -u main.py > >(tee -a log.out) 2> >(tee -a log.err >&2)
main.py
import sys
import time
def main():
    for i in range(100):
        print("One line every 2s", end="\n")
        time.sleep(2)

if __name__ == '__main__':
    main()
Explanation: How do I write standard error to a file while using "tee" with a pipe?
You could have a wrapper that automatically flushes stdout after every write. The wrapper is taken from this answer:
import time
import sys

class Unbuffered(object):
    def __init__(self, stream):
        self.stream = stream

    def write(self, data):
        self.stream.write(data)
        self.stream.flush()

    def writelines(self, datas):
        self.stream.writelines(datas)
        self.stream.flush()

    def __getattr__(self, attr):
        return getattr(self.stream, attr)

def main():
    for i in range(100):
        print("One line every 2s", end="\n")
        time.sleep(2)

if __name__ == '__main__':
    with open("log.out", 'w') as out, open("log.err", 'w') as err:
        sys.stdout = Unbuffered(out)
        sys.stderr = Unbuffered(err)
        main()
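If line-buffered output is enough, another small option is to open the log files with buffering=1 (line buffering for text-mode files), so every completed line is flushed without any wrapper; a minimal sketch reusing the main() from above:

import sys

# buffering=1 requests line buffering for text-mode files: each newline
# written by print() is flushed to the file immediately.
with open("log.out", "w", buffering=1) as out, open("log.err", "w", buffering=1) as err:
    sys.stdout = out
    sys.stderr = err
    main()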
I am trying to create a Python script that will allow some interface with Cadence Skill (a command-line interface). I want any output to be directed to the shell. I feel like this should be simple, but I'm not able to get it working yet. With Popen, however, I can't see any output on the command line, and I'm not sure that communicate() is properly sending the command. Here is what I have so far:
import re, array
import sys
from subprocess import call
from subprocess import Popen, PIPE, STDOUT
from threading import Thread
import os

# SET THESE VARIABLES
LibraryPath = 'path_to_library'
skillPath = 'path_to_cadence'
# Cadence Environment path
cadence_env = 'source /mscad/apps/bin/mscad_bash/cadtools --env cadence'

class cd:
    """Context manager for changing the current working directory"""
    def __init__(self, newPath):
        self.newPath = os.path.expanduser(newPath)

    def __enter__(self):
        self.savedPath = os.getcwd()
        os.chdir(self.newPath)

    def __exit__(self, etype, value, traceback):
        os.chdir(self.savedPath)

# Change to proper Cadence Directory
# Debugging Variables
etype = 0; value = 0; traceback = 0

NewPath = cd(LibraryPath)
NewPath.__enter__()

# Open Cadence Virtuoso in Shell Mode
try:
    from Queue import Queue, Empty
except ImportError:
    from queue import Queue, Empty  # python 3.x

ON_POSIX = 'posix' in sys.builtin_module_names

def enqueue_output(out, queue):
    for line in iter(out.readline, b''):
        queue.put(line)
    out.close()

p = Popen(['/bin/bash', '-i', '-c', 'cadence_env cmos12s0; virtuoso -nograph'], stdout=PIPE, bufsize=1, close_fds=ON_POSIX)
q = Queue()
t = Thread(target=enqueue_output, args=(p.stdout, q))
t.daemon = True  # thread dies with the program
t.start()

# read line without blocking
try:
    line = q.get_nowait()  # or q.get(timeout=.1)
except Empty:
    print('no output yet')
else:  # got line
    print(line)

load_command = "load(\"" + skillPath + "\")"
print load_command

p.communicate(input=load_command)
print "Command Sent ..."

NewPath.__exit__(etype, value, traceback)
call(["ls -l"], shell=True)
Thanks in advance for the help.
References
Non-blocking read on a subprocess.PIPE in python
I was making it more complicated than it needed to be for my application; the main problem I had was that I was missing a \n when sending my command to the program. Also, removing stdout=PIPE allowed all the output of the program to go directly to the console.
proc = Popen(['/bin/bash', '-i', '-c', 'Command_To_Open_Program'], stdin=PIPE, stderr=STDOUT)
command = "command_to_program\n"
proc.stdin.write(command)
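One caveat with this pattern: writes to stdin can sit in the pipe's buffer, so flushing after each command (and closing stdin when done, so the child sees EOF) is a good habit; a small extension of the snippet above:

from subprocess import Popen, PIPE, STDOUT

proc = Popen(['/bin/bash', '-i', '-c', 'Command_To_Open_Program'], stdin=PIPE, stderr=STDOUT)

proc.stdin.write("command_to_program\n")
proc.stdin.flush()   # push the command through the pipe right away

proc.stdin.close()   # signal EOF so the program can exit cleanly
proc.wait()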
I've got a program on Windows that calls a bunch of subprocesses, and displays the results in a GUI. I'm using PyQt for the GUI, and the subprocess module to run the programs.
I've got the following WorkerThread, which spawns a subthread for each shell command, devoted to reading the process's stdout and printing the results (later I'll wire it up to the GUI).
This all works, except that proc.stdout.read(1) never returns until after the subprocess has completed. This is a big problem, since some of these subprocesses can take 15-20 minutes to run, and I need to display results as they're running.
What do I need to do to get the pipe working while the subprocess is running?
class WorkerThread(QtCore.QThread):
    def run(self):
        def sh(cmd, cwd=None):
            proc = subprocess.Popen(cmd,
                                    shell=True,
                                    stdout=subprocess.PIPE,
                                    stderr=subprocess.STDOUT,
                                    stdin=subprocess.PIPE,
                                    cwd=cwd,
                                    env=os.environ)
            proc.stdin.close()

            class ReadStdOutThread(QtCore.QThread):
                def run(_self):
                    s = ''
                    while True:
                        if self.request_exit: return
                        b = proc.stdout.read(1)
                        if b == '\n':
                            print s
                            s = ''
                            continue
                        if b:
                            s += b
                            continue
                        if s: print s
                        return

            thread = ReadStdOutThread()
            thread.start()

            retcode = proc.wait()
            if retcode:
                raise subprocess.CalledProcessError(retcode, cmd)
            return 0
FWIW: I rewrote the whole thing using QProcess, and I see the exact same problem. The stdout receives no data until the underlying process has returned. Then I get everything all at once.
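For reference, the usual signal-driven way to consume QProcess output incrementally looks roughly like the sketch below (assuming PyQt4's QtCore.QProcess and a running Qt event loop); note that if the child buffers its own output when attached to a pipe, even this only fires once the child flushes:

from PyQt4 import QtCore

proc = QtCore.QProcess()
proc.setProcessChannelMode(QtCore.QProcess.MergedChannels)  # fold stderr into stdout

def on_ready_read():
    # Fires whenever the child writes (and flushes) data to stdout.
    print str(proc.readAllStandardOutput()),

proc.readyReadStandardOutput.connect(on_ready_read)
proc.start("ping 127.0.0.1")  # placeholder command for illustration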
If you know how long the lines of the command's output will be, you can poll on the stdout PIPE of the process.
An example of what I mean:
import select
import subprocess
import threading
import os

# Some time consuming command.
command = 'while [ 1 ]; do sleep 1; echo "Testing"; done'

# A worker thread, not as complex as yours, just to show my point.
class Worker(threading.Thread):
    def __init__(self):
        super(Worker, self).__init__()
        self.proc = subprocess.Popen(
            command, shell=True,
            stdout=subprocess.PIPE,
            stdin=subprocess.PIPE, stderr=subprocess.STDOUT
        )

    def run(self):
        self.proc.communicate()

    def get_proc(self):
        # The proc is needed to ask it for its
        # output file descriptor later.
        return self.proc

if __name__ == '__main__':
    w = Worker()
    w.start()

    proc = w.get_proc()
    pollin = select.poll()
    pollin.register(proc.stdout, select.POLLIN)

    while True:
        events = pollin.poll()
        for fd, event in events:
            if event == select.POLLIN:
                # This is the main issue with my idea:
                # if you don't know the length of the lines
                # the process outputs, this is a problem.
                # I put 7 since I know the word "Testing" has
                # 7 characters.
                print os.read(fd, 7)
Maybe this is not exactly what you're looking for, but I think it gives you a pretty good idea of what to do to solve your problem.
EDIT: I think I've just found what you need: Streaming stdout from a Python subprocess in Python.
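If the line length is not known in advance, reading whole lines sidesteps the fixed byte count entirely; a minimal sketch in the same spirit, reusing the command from the example above:

import subprocess

command = 'while [ 1 ]; do sleep 1; echo "Testing"; done'
proc = subprocess.Popen(command, shell=True,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT)

# readline() returns as soon as the child emits (and flushes) a newline,
# so no assumption about line length is needed.
for line in iter(proc.stdout.readline, ''):
    print line.rstrip()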
Apart from the script's own console (which does nothing), I want to open two consoles and print the variables con1 and con2 in different consoles. How can I achieve this?
con1 = 'This is Console1'
con2 = 'This is Console2'
I've no idea how to achieve this and have spent several hours trying to do so with modules such as subprocess, but with no luck. I'm on Windows, by the way.
Edit:
Would the threading module do the job? Or is multiprocessing needed?
If you don't want to reconsider your problem and use a GUI such as in @Kevin's answer, then you could use the subprocess module to start two new consoles concurrently and display two given strings in the opened windows:
#!/usr/bin/env python3
import sys
import time
from subprocess import Popen, PIPE, CREATE_NEW_CONSOLE

messages = 'This is Console1', 'This is Console2'

# open new consoles
processes = [Popen([sys.executable, "-c", """import sys
for line in sys.stdin: # poor man's `cat`
    sys.stdout.write(line)
    sys.stdout.flush()
"""],
                   stdin=PIPE, bufsize=1, universal_newlines=True,
                   # assume the parent script is started from a console itself e.g.,
                   # this code is _not_ run as a *.pyw file
                   creationflags=CREATE_NEW_CONSOLE)
             for _ in range(len(messages))]

# display messages
for proc, msg in zip(processes, messages):
    proc.stdin.write(msg + "\n")
    proc.stdin.flush()

time.sleep(10)  # keep the windows open for a while

# close windows
for proc in processes:
    proc.communicate("bye\n")
Here's a simplified version that doesn't rely on CREATE_NEW_CONSOLE:
#!/usr/bin/env python
"""Show messages in two new console windows simultaneously."""
import sys
import platform
from subprocess import Popen

messages = 'This is Console1', 'This is Console2'

# define a command that starts new terminal
if platform.system() == "Windows":
    new_window_command = "cmd.exe /c start".split()
else:  # XXX this can be made more portable
    new_window_command = "x-terminal-emulator -e".split()

# open new consoles, display messages
echo = [sys.executable, "-c",
        "import sys; print(sys.argv[1]); input('Press Enter..')"]
processes = [Popen(new_window_command + echo + [msg]) for msg in messages]

# wait for the windows to be closed
for proc in processes:
    proc.wait()
You can get something like two consoles using two Tkinter Text widgets.
from Tkinter import *
import threading

class FakeConsole(Frame):
    def __init__(self, root, *args, **kargs):
        Frame.__init__(self, root, *args, **kargs)

        # white text on black background,
        # for extra verisimilitude
        self.text = Text(self, bg="black", fg="white")
        self.text.pack()

        # list of things not yet printed
        self.printQueue = []

        # one thread will be adding to the print queue,
        # and another will be iterating through it.
        # better make sure one doesn't interfere with the other.
        self.printQueueLock = threading.Lock()

        self.after(5, self.on_idle)

    # check for new messages every five milliseconds
    def on_idle(self):
        with self.printQueueLock:
            for msg in self.printQueue:
                self.text.insert(END, msg)
                self.text.see(END)
            self.printQueue = []
        self.after(5, self.on_idle)

    # print msg to the console
    def show(self, msg, sep="\n"):
        with self.printQueueLock:
            self.printQueue.append(str(msg) + sep)

# warning! Calling this more than once per program is a bad idea.
# Tkinter throws a fit when two roots each have a mainloop in different threads.
def makeConsoles(amount):
    root = Tk()
    consoles = [FakeConsole(root) for n in range(amount)]
    for c in consoles:
        c.pack()
    threading.Thread(target=root.mainloop).start()
    return consoles

a, b = makeConsoles(2)

a.show("This is Console 1")
b.show("This is Console 2")

a.show("I've got a lovely bunch of coconuts")
a.show("Here they are standing in a row")

b.show("Lorem ipsum dolor sit amet")
b.show("consectetur adipisicing elit")
I don't know if it suits you, but you can open two Python interpreters using the Windows start command:
from subprocess import Popen
p1 = Popen('start c:\python27\python.exe', shell=True)
p2 = Popen('start c:\python27\python.exe', shell=True)
Of course there is the problem that Python now runs in interactive mode, which is not what you want (you can also pass a file as a parameter and that file will be executed).
On Linux I would try to make a named pipe, pass the name of the file to python.exe and write Python commands to that file. 'Maybe' it will work ;)
But I don't have an idea how to create a named pipe on Windows. Windows API ... (fill in yourself).
pymux
pymux gets close to what you want: https://github.com/jonathanslenders/pymux
Unfortunately it is mostly a CLI tool replacement for tmux and does not have a decent programmatic API yet.
But hacking it up to expose that API is likely the most robust option if you are serious about this.
The README says:
Parts of pymux could become a library, so that any prompt_toolkit application can embed a vt100 terminal. (Imagine a terminal emulator embedded in pyvim.)
If you are on Windows you can use the win32console module to open a second console for your thread or subprocess output. This is the simplest and easiest way if you are on Windows.
Here is some sample code:
import win32console
import multiprocessing

def subprocess(queue):
    win32console.FreeConsole()   # Frees subprocess from using main console
    win32console.AllocConsole()  # Creates new console; all input and output of subprocess goes to this new console
    while True:
        print(queue.get())
        # prints any output produced by the main script and passed to the subprocess via the queue,
        # e.g. anything sent with queue.put("...") shows up in the new console

if __name__ == '__main__':  # required on Windows so the spawned process can import this module safely
    queue = multiprocessing.Queue()
    multiprocessing.Process(target=subprocess, args=[queue]).start()
    while True:
        print("Hello World")
        # and whatever else you want to do in your main process
You can also do this with threading. You have to use the queue module if you want the queue functionality, as the threading module doesn't have one.
Here is the win32console module documentation
I used jfs' response. Here is my embellishment/theft of jfs' response.
This is tailored to run on Win10 and also handles Unicode:
# https://stackoverflow.com/questions/19479504/how-can-i-open-two-consoles-from-a-single-script
import sys, time, os, locale
from subprocess import Popen, PIPE, CREATE_NEW_CONSOLE

class console(Popen):
    NumConsoles = 0

    def __init__(self, color=None, title=None):
        console.NumConsoles += 1

        cmd = "import sys, os, locale"
        cmd += "\nos.system(\'color " + color + "\')" if color is not None else ""
        title = title if title is not None else "console #" + str(console.NumConsoles)
        cmd += "\nos.system(\"title " + title + "\")"
        # poor man's `cat`
        cmd += """
print(sys.stdout.encoding, locale.getpreferredencoding())
encoding = locale.getpreferredencoding()
for line in sys.stdin:
    sys.stdout.buffer.write(line.encode(encoding))
    sys.stdout.flush()
"""
        cmd = sys.executable, "-c", cmd
        # print(cmd, end="", flush=True)
        super().__init__(cmd, stdin=PIPE, bufsize=1, universal_newlines=True, creationflags=CREATE_NEW_CONSOLE, encoding='utf-8')

    def write(self, msg):
        self.stdin.write(msg + "\n")

if __name__ == "__main__":
    myConsole = console(color="c0", title="test error console")
    myConsole.write("Thank you jfs. Cool explanation")
    NoTitle = console()
    NoTitle.write("default color and title! This answer uses Windows 10")
    NoTitle.write(u"♥♥♥♥♥♥♥♥")
    NoTitle.write("♥")
    time.sleep(5)
    myConsole.terminate()
    NoTitle.write("some more text. Run this at the python console.")
    time.sleep(4)
    NoTitle.terminate()
    time.sleep(5)
Do you know about screen/tmux?
How about tmuxp? For example, you can try to run cat in split panes and use "sendkeys" to send output (but dig through the docs; there may be an even easier way to achieve this).
As a side bonus, this will work in a text console or a GUI.
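For a concrete starting point, driving plain tmux from Python with subprocess is enough for the two-message case; a rough sketch assuming tmux is installed and a session name of "twoconsoles":

import subprocess

def tmux(*args):
    subprocess.run(["tmux"] + list(args), check=True)

# Create a detached session, split it into two panes,
# then send one echo command to each pane.
tmux("new-session", "-d", "-s", "twoconsoles")
tmux("split-window", "-t", "twoconsoles")
tmux("send-keys", "-t", "twoconsoles:0.0", "echo This is Console1", "Enter")
tmux("send-keys", "-t", "twoconsoles:0.1", "echo This is Console2", "Enter")

# Attach to see both panes (or run `tmux attach -t twoconsoles` manually).
tmux("attach-session", "-t", "twoconsoles")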
I developed a script which builds a project using msbuild. I have a GUI developed using wxPython with a button; when the user clicks it, the script builds a project using msbuild. Now, I want to open a status window when the user clicks that button and show all the output that would normally appear in the command prompt, without showing the command prompt itself, i.e. redirect the command prompt output to the GUI status window. My build script is:
def build(self, projpath):
    arg1 = '/t:Rebuild'
    arg2 = '/p:Configuration=Release'
    arg3 = '/p:Platform=x86'
    p = subprocess.call([self.msbuild, projpath, arg1, arg2, arg3])
    if p == 1:
        return False
    return True
I actually wrote about this a few years ago on my blog where I created a script to redirect ping and traceroute to my wxPython app: http://www.blog.pythonlibrary.org/2010/06/05/python-running-ping-traceroute-and-more/
Basically you create a simple class to redirect stdout to and pass it an instance of a TextCtrl. It ends up looking something like this:
class RedirectText:
    def __init__(self, aWxTextCtrl):
        self.out = aWxTextCtrl

    def write(self, string):
        self.out.WriteText(string)
Then when I wrote my ping command, I did this:
def pingIP(self, ip):
    proc = subprocess.Popen("ping %s" % ip, shell=True,
                            stdout=subprocess.PIPE)
    print
    while True:
        line = proc.stdout.readline()
        wx.Yield()
        if line.strip() == "":
            pass
        else:
            print line.strip()
        if not line: break
    proc.wait()
The main thing to look at is the stdout parameter in your subprocess call, and the wx.Yield() is important too. The Yield allows the text to get "printed" (i.e. redirected) to stdout. Without it, the text won't show up until the command is finished. I hope that all made sense.
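For completeness, the redirect itself is just a reassignment of sys.stdout to the helper class; a minimal sketch, assuming a hypothetical wx frame holding a multiline TextCtrl named logCtrl (the names are illustrative):

import sys
import wx

class BuildFrame(wx.Frame):
    """Hypothetical frame holding the status window."""
    def __init__(self):
        wx.Frame.__init__(self, None, title="Build status")
        self.logCtrl = wx.TextCtrl(self, style=wx.TE_MULTILINE | wx.TE_READONLY)
        # From here on, anything printed goes into the status window.
        sys.stdout = RedirectText(self.logCtrl)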
I made a change like below, and it did work for me.
def build(self, projpath):
    arg1 = '/t:Rebuild'
    arg2 = '/p:Configuration=Release'
    arg3 = '/p:Platform=Win32'
    proc = subprocess.Popen([self.msbuild, projpath, arg1, arg2, arg3], shell=True,
                            stdout=subprocess.PIPE)
    print
    while True:
        line = proc.stdout.readline()
        wx.Yield()
        if line.strip() == "":
            pass
        else:
            print line.strip()
        if not line: break
    proc.wait()
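If msbuild's error output should also land in the status window, merging stderr into the same pipe is a one-argument change; a small variation of the Popen call above:

    # stderr=subprocess.STDOUT folds error output into proc.stdout,
    # so the readline() loop above picks it up as well.
    proc = subprocess.Popen([self.msbuild, projpath, arg1, arg2, arg3], shell=True,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)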