Getting stdout from console - python

I'm trying to make a Python script that clones one USB stick to another (as an ISO image) using dd if=/dev/sda of=/dev/sdb
Here's my problem:
I want to create a progress bar showing how much has been done.
I tried:
Watching the used storage space on the second USB stick, but this doesn't work because the ISO image also copies unused space.
Adding status=progress to the dd command, which shows progress in the terminal, but I can't figure out how to access that output from Python. I tried subprocess.Popen and subprocess.run with stdout=PIPE, with and without shell=True,
and reading process.stdout with .read(), .read(1), .readline() or communicate(). Nothing worked for me. (https://www.endpointdev.com/blog/2015/01/getting-realtime-output-using-python/)
I can see the progress going by in the Python shell, but the .read() call always gets stuck.
The part of the code I am concerned about:

import shlex, subprocess
from subprocess import PIPE

comm = 'sudo dd if=/dev/sda of=/dev/sdb'
cloning = subprocess.Popen(shlex.split(comm), stdout=PIPE, text=True)
while True:
    print(cloning.stdout.read())
I want something that would work like:
while True:
    progress = cloning.stdout.read()
    update_bar(progress)
I'm using Python 3.7 on a Raspberry Pi.
Thanks for the help.

You were on the right track with status=progress, but dd writes that progress to stderr, not stdout. If you pass stderr=PIPE and read from cloning.stderr instead of cloning.stdout, it will work.
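For example, here is a minimal sketch of that approach. The bs= value, the assumed total size, and the regex over dd's progress line are assumptions, and the exact wording of the progress line can vary between coreutils versions:

import re
import shlex
import subprocess
from subprocess import PIPE

def update_bar(fraction):                # stand-in for your own progress-bar update
    print('{:.1%}'.format(fraction))

comm = 'sudo dd if=/dev/sda of=/dev/sdb bs=4M status=progress'
cloning = subprocess.Popen(shlex.split(comm), stderr=PIPE, text=True)

total_bytes = 16 * 1024**3               # assumed stick size; query the device for the real value
for line in cloning.stderr:              # universal newlines also split on the '\r' dd prints
    m = re.search(r'(\d+) bytes', line)  # assumed shape: "123456789 bytes (...) copied, ..."
    if m:
        update_bar(int(m.group(1)) / total_bytes)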

Related

Running subprocess while simultaneously grabbing console output

I'm running an external binary using a subprocess and it takes an awfully long time to finish the job, so I need to get some progress information to put into a progress bar in the GUI. The binary does display progress information in its console output; specifically, I'm interested in the highlighted items:
The problem is that I'm not sure how to grab this information continuously. I tried to use PIPE and threading, but that doesn't seem to work at all; for testing purposes I'm simply trying to grab the output and print it to the console, but nothing happens. I also tried redirecting the output to a file, but that doesn't work either.
def comm(self, process):
    temp = process.stdout
    print(temp)

def _convert(self):
    tmpdir = self.TEMPDIRECTORY.name.split('\\')
    tmpdir = '/'.join(tmpdir)
    subprocstring = "{} {} {}"
    subprocstring = subprocstring.format(self.EXE_PATH, self.BMF_FILE, tmpdir, '--add-block-id')
    print(subprocstring)
    h = subprocess.run(subprocstring, stdout=subprocess.PIPE)
    threading.Thread(target=self.comm, args=h)
Does anyone know what I should do to get this working properly?
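For reference, the usual pattern looks roughly like the sketch below. The paths are placeholders standing in for the question's EXE_PATH, BMF_FILE and temp directory, and this is not a verified fix for this specific binary. The key differences from the code above are using subprocess.Popen rather than subprocess.run (which only returns after the process has finished), passing args as a tuple, and actually starting the thread:

import subprocess
import threading

def reader(pipe):
    # print lines as they arrive, while the process is still running
    for line in iter(pipe.readline, b''):
        print(line.decode('utf-8', 'replace').rstrip())

exe_path = 'converter.exe'        # placeholder for self.EXE_PATH
bmf_file = 'input.bmf'            # placeholder for self.BMF_FILE
tmpdir = 'C:/temp/out'            # placeholder for the temp directory

proc = subprocess.Popen(
    [exe_path, bmf_file, tmpdir, '--add-block-id'],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,     # many tools write progress to stderr, so merge it in
)
t = threading.Thread(target=reader, args=(proc.stdout,), daemon=True)
t.start()
proc.wait()
t.join()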

Spawning a Python Tkinter process through Rust's spawn does not give the stdout continuously

I'm starting an individual process from Rust like this:
let stdout = std::process::Command::new("python3")
.arg(tool_path)
.args(python_params)
.stdout(std::process::Stdio::piped())
.spawn()
.unwrap()
.stdout
.ok_or_else(|| "Could not capture standard output.")
.unwrap();
I then read the Python process's output and print it like so:
let reader = std::io::BufReader::new(stdout);
reader
.lines()
.filter_map(|line| line.ok())
.for_each(|line| println!("{}", line));
This will continuously print out all my prints inside my Python script. If my Python script looks like this:
print("foo.pyyyyyyy")
while True:
    print("asdasd")
I will see first "foo.pyyyyyyy" printed, and then a continuous stream of "asdasd" prints in my terminal.
However, if my Python script consists of a Tkinter update loop like this:
from tkinter import *
window = Tk()
window.title("Welcome to LikeGeeks app")
window.mainloop()
print("foo.pyyyyyyy")
It does not print the "foo.pyyyyyyy" statement until after I terminate the external process.
Is there a way to circumvent this? I would really like to be able to pipe my Tkinter prints to my Rust app.
Thanks to #jmb and #acw1668 for the answer:
My issue had to do with I/O buffering when starting my Python script. Adding the -u flag when calling the script solved the issue:
python3 -u path_to_script parameters
from the docs:
-u
Force the binary layer of the stdout and stderr streams
(which is available as their buffer attribute) to be unbuffered.
The text I/O layer will still be line-buffered if writing to the console,
or block-buffered if redirected to a non-interactive file.
See also PYTHONUNBUFFERED.
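If you control the Python script, an alternative (or complement) to -u is to flush explicitly from inside the script, e.g.:

import sys

print("foo.pyyyyyyy", flush=True)   # flush=True pushes the line out even when stdout is a pipe
# or, after ordinary print() calls:
sys.stdout.flush()

Setting the PYTHONUNBUFFERED environment variable mentioned in the quoted docs when spawning the process has the same effect as -u.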

How to read stdout output of ongoing process in Python?

Hello, I'm really new to the Python programming language and I have encountered a problem while writing a script. I want to save the stdout output of a tcpdump command into a variable in a Python script, but I want the tcpdump command to keep running continuously, because I want to gather the lengths of all the packets that match the filter I wrote.
I tried :
fin, fout = os.popen4(comand)
result = fout.read()
return result
But it just hangs.
I'm guessing that it hangs because fout.read() doesn't return until the child process exits and closes its end of the pipe. You should be using subprocess.Popen instead:
import subprocess
import shlex #just so you don't need to break "comand" into a list yourself ;)
p=subprocess.Popen(shlex.split(comand),stdout=subprocess.PIPE)
first_line_of_output=p.stdout.readline()
second_line_of_output=p.stdout.readline()
...
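From there, a rough sketch of summing up packet lengths as tcpdump reports them might look like this. The interface, the filter, and the "length N" parsing are assumptions you would adapt to your actual command and to tcpdump's output format:

import shlex
import subprocess

comand = 'tcpdump -l -n -i eth0 tcp'     # -l line-buffers tcpdump's output so lines arrive promptly
p = subprocess.Popen(shlex.split(comand), stdout=subprocess.PIPE, universal_newlines=True)

total_length = 0
for line in iter(p.stdout.readline, ''): # blocks until the next line, so this runs as packets come in
    parts = line.rsplit('length', 1)     # many tcpdump lines end with "... length N"
    if len(parts) == 2:
        try:
            total_length += int(parts[1].split()[0].rstrip(':'))
        except (IndexError, ValueError):
            pass
        print(total_length)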

GUI Progress Indicator using command line tools

I would like to know a good way to make a GUI progress bar. I'm using Python and Glade/GTK. I'd like something more than just a pulsing bar, something that gives the user an accurate idea of the progress. The program I'm trying to capture this for is cdparanoia, but I'd also like to know how to do this in general for other programs like wget. It is important to note that both of these programs output their progress info on stderr.
I should mention that doing
output = p.stderr.read(1)
print output
doesn't show the text progress bar at all. It is almost as if it treats a non-interactive shell differently; there are no carriage returns (\r) like I thought there would be.
The code is as follows:

import subprocess, shlex, gtk

command = 'cdparanoia -w -Z 1- - | sox -t wav - "my disk.flac"'
p = subprocess.Popen(command, shell=True, stderr=subprocess.PIPE)  # shell=True because the command contains a pipe
gui = gtk.Builder()
gui.add_from_file("pulsebar.glade")
#do magic here to make a good pulsebar
Gratefully,
Narnie
gtk.ProgressBar? (Note the .set_fraction() method.)
Here's some info about doing non-blocking reads from a subprocess. The suggestion of polling with timeout seems appropriate. Also, this answer.
My dim memory of cdparanoia's progress indicator is that it is very idiosyncratic. But I'm guessing it's just some silly stuff + '\r'; shouldn't be too hard to extract a fraction from it.
Edit: Ok, actually, perhaps the above is incorrect under normal usage situations; but have you tried -e?
-e --stderr-progress
Force output of progress information to stderr (for wrapper
scripts).
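A very rough sketch of wiring that into the GTK main loop follows. The cdparanoia invocation is simplified (no sox pipeline), and the percentage regex is only a placeholder rather than cdparanoia's documented output format, so adapt the parsing to whatever -e actually prints:

import os
import re
import shlex
import subprocess
import gtk
import gobject

p = subprocess.Popen(shlex.split('cdparanoia -e -w 1- ripped.wav'),
                     stderr=subprocess.PIPE)

win = gtk.Window()
bar = gtk.ProgressBar()
win.add(bar)
win.show_all()

def on_stderr(source, condition):
    data = os.read(source.fileno(), 1024)      # read whatever is available right now
    if not data:
        return False                            # pipe closed: remove the watch
    m = re.search(r'(\d{1,3})\s*%', data)       # placeholder: pull a percentage out of the text
    if m:
        bar.set_fraction(min(int(m.group(1)) / 100.0, 1.0))
    return True                                 # keep watching

gobject.io_add_watch(p.stderr, gobject.IO_IN | gobject.IO_HUP, on_stderr)
gtk.main()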

Python: cannot read / write in another commandline application by using subprocess module

I am using Python 3.0 on Windows and trying to automate the testing of a command-line application. The user can type commands into the Application Under Test and it returns the output as two XML packets: one is a Sent packet and the other is a Recv packet. By analyzing these packets I can verify the result. I have the code below:
p = subprocess.Popen(SomeCmdAppl, stdout=subprocess.PIPE,
shell = True, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
p.stdin.write((command + '\r\n').encode())
time.sleep(2.5)
testresult = p.stdout.readline()
testresult = testresult.decode()
print(testresult)
I cannot get any output back. It gets stuck at the point where I try to read the output using readline(). I tried read() and it gets stuck too.
When I run the command-line application manually and type the command, I get the output back correctly as two XML packets, as below:
Sent: <PivotNetMessage>
<MessageId>16f8addf-d366-4031-b3d3-5593efb9f7dd</MessageId>
<ConversationId>373323be-31dd-4858-a7f9-37d97e36eb36</ConversationId>
<SagaId>4e1e7c04-4cea-49b2-8af1-64d0f348e621</SagaId>
<SourcePath>C:\Python30\PyNTEST</SourcePath>
<Command>echo</Command>
<Content>Hello</Content>
<Time>7/4/2009 11:16:41 PM</Time>
<ErrorCode>0</ErrorCode>
<ErrorInfo></ErrorInfo>
</PivotNetMessage>
Recv: <PivotNetMessage>
<MessageId>16f8addf-d366-4031-b3d3-5593efb9f7dd</MessageId>
<ConversationId>373323be-31dd-4858-a7f9-37d97e36eb36</ConversationId>
<SagaId>4e1e7c04-4cea-49b2-8af1-64d0f348e621</SagaId>
<SourcePath>C:\PivotNet\Endpoints\Pipeline\Pipeline_2.0.0.202</SourcePath>
<Command>echo</Command>
<Content>Hello</Content>
<Time>7/4/2009 11:16:41 PM</Time>
<ErrorCode>0</ErrorCode>
<ErrorInfo></ErrorInfo>
</PivotNetMessage>
But when I use communicate() as below, I get the Sent packet but never the Recv packet. Why am I missing the Recv packet? communicate() is supposed to return everything from stdout, right?
p = subprocess.Popen(SomeCmdAppl, stdout=subprocess.PIPE,
shell = True, stdin=subprocess.PIPE, stderr=subprocess.STDOUT)
p.stdin.write((command + '\r\n').encode())
time.sleep(2.5)
result = p.communicate()[0]
print(result)
Can anybody help me with sample code that works? I don't know whether reading and writing need to happen in separate threads. Please help me; I need to do repeated reads and writes. Is there any higher-level module in Python I can use? I think the Pexpect module doesn't work on Windows.
This is a popular problem, e.g. see:
Interact with a Windows console application via Python
How do I get 'real-time' information back from a subprocess.Popen in python (2.5)
how do I read everything currently in a subprocess.stdout pipe and then return?
(Actually, you should have seen these during creation of your question...?!).
I have two things of interest (a sketch combining both follows below):
p.stdin.write((command + '\r\n').encode()) is also buffered, so your child process might not even have seen its input. You can try flushing this pipe.
In one of the other questions it was suggested to do a stdout.read() on the child instead of readline(), with a suitable number of characters to read. You might want to experiment with this.
Post your results.
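A minimal sketch combining the two suggestions above. SomeCmdAppl and the example command are taken from the question, and the loop is illustrative rather than a guaranteed fix:

import subprocess

command = 'echo Hello'                       # the example command from the question
p = subprocess.Popen('SomeCmdAppl', shell=True,
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT)
p.stdin.write((command + '\r\n').encode())
p.stdin.flush()                              # make sure the child actually sees the command

while True:                                  # read whatever arrives, line by line
    line = p.stdout.readline()
    if not line:
        break
    print(line.decode('utf-8', 'replace').rstrip())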
Try sending your input using communicate instead of using write:
result = p.communicate((command + '\r\n').encode())[0]
Have you considered using pexpect instead of subprocess? It handles the details which are probably preventing your code from working (like flushing buffers, etc). It may not be available for Py3k yet, but it works well in 2.x.
See: http://pexpect.sourceforge.net/pexpect.html
