subprocess.run with stdin input doesn't process - python

I'm trying to run a command in python:
from subprocess import run, DEVNULL
run(["./rarcrack",'walks.rar'], text=True, input='nano1 nano2', stdout=DEVNULL)
The command doesn't seem to process the stdin though (it says "no more words", whereas in the pipe example below it says "successfully cracked").
I decided to do this because I'm under the impression that:
the bash pipe redirects stdout to stdin, and
./rarcrack reads its words from stdin, because a command like
echo 'nano1 nano2' | ./rarcrack walks.rar works.
And I don't think I can pass the words in as another argument (I don't know any C).
The program is here

The problem is that you discard any results with stdout=DEVNULL. You only see the error output, not the successes.
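For example, a minimal sketch that captures stdout instead of discarding it (same walks.rar and word list as in the question), so the success message is actually visible:
from subprocess import run, PIPE
# Capture stdout instead of sending it to DEVNULL, so the success message shows up.
result = run(["./rarcrack", "walks.rar"],
             text=True, input="nano1 nano2", stdout=PIPE)
print(result.stdout)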

Related

subprocess Popen stdin will only input and run after the script has finished

Description:
I was trying to make an interactive shell on top of chat software, so I need cmd.exe as a subprocess and a way to pass strings into the process.
I have this:
from subprocess import Popen
from subprocess import PIPE as p
proc = Popen("cmd",stdout=p,stdin=p,shell=True)
Usually, when we need to pass input to the process, we use proc.stdin.write(),
but it seems the string is only passed in and run after the Python script has finished.
for example, I have
#same thing above
proc.stdin.write("ping 127.0.0.1".encode())
time.sleep(10)
the script waits for 10 seconds and only runs the ping command after that,
which means it's impossible to get the result from stdout.read() because there is nothing there yet.
I have tried to use subprocess.Popen.communicate() but it closes the pipe after one input.
Is there any way to solve the "only run the command after script finish" thing, or make communicate() not close the pipe?
Writes to pipes are buffered; you need to flush the buffer.
proc.stdin.write("ping 127.0.0.1\n".encode())  # the trailing newline is needed so cmd runs the line
proc.stdin.flush()
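As a more complete sketch (using a small Python echo loop as a stand-in for cmd.exe, so the example is self-contained), the pattern is: write a full line, flush, then read the reply:
import subprocess
import sys

# A tiny interactive child that echoes back each line it receives;
# it stands in for cmd.exe here.
child_code = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    print('got:', line.strip(), flush=True)\n"
)
proc = subprocess.Popen([sys.executable, "-c", child_code],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        text=True)

proc.stdin.write("ping 127.0.0.1\n")    # the trailing newline ends the command line
proc.stdin.flush()                      # push it through the pipe right away
print(proc.stdout.readline().rstrip())  # -> got: ping 127.0.0.1

proc.stdin.close()
proc.wait()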

Need help to read out the output of a subprocess

My python script (python 3.4.3) calls a bash script via subprocess.
OutPST = subprocess.check_output(cmd,shell=True)
It works, but the problem is that I only get half of the data. The subprocess I call in turn calls another subprocess, and my guess is that when the "sub-subprocess" sends EOF, my program thinks that's it and ends the check_output.
Does anyone have an idea how to get all the data?
You should use subprocess.run() unless you really need fine-grained control over talking to the process via its stdin (or want to do something else while the process is running instead of blocking until it finishes). It makes capturing output super easy:
from subprocess import run, PIPE
result = run(cmd, stdout=PIPE, stderr=PIPE)
print(result.stdout)
print(result.stderr)
If you want to merge stdout and stderr (like how you'd see it in your terminal if you didn't do any redirection), you can use the special destination STDOUT for stderr:
from subprocess import STDOUT
result = run(cmd, stdout=PIPE, stderr=STDOUT)
print(result.stdout)
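If you want decoded strings rather than bytes, you can also pass text=True (universal_newlines=True on Python versions before 3.7); a small sketch, where cmd stands for whatever command you are running:
from subprocess import run, PIPE, STDOUT
# cmd is a placeholder for your command, e.g. a list like ["ls", "-l"]
result = run(cmd, stdout=PIPE, stderr=STDOUT, text=True)
print(result.stdout)  # already a str, no .decode() needed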

Is there a way to use data in a string in place of a file path argument in subprocess?

I have an executable I need to run via a python script and the executable takes a path to a file as an argument, like so:
./myExecutable /pathToFileToProcess
I'm hoping I can run my executable with Popen and pass a string representing my "/pathToFileToProcess" data without having to write it to disk.
I believe what I'm trying to do is very similar to the question asked here:
Python string as file argument to subprocess
However, the solution suggested isn't working for me. Here is how I've tried to implement it:
from subprocess import Popen, STDOUT, PIPE
stringDataFromFile = "the data to be processed"
p = Popen(['./myExecutable'], shell = False, stdout = PIPE, stderr = PIPE, stdin = PIPE)
stdout, stderr = p.communicate(input = stringDataFromFile)
print(stdout)
print(stderr)
When I run this I get the generic usage message for ./myExecutable in stdout, which is what happens if you don't supply a file to process. Note: I need to capture stdout, so I have implemented it slightly differently from the suggestion.
I'm thinking the difference must be that ./myExecutable won't accept its input via stdin, whereas the program in the linked example will. Or I'm just doing it wrong.
Any thoughts on how to do this, or if it is even possible?
I suspect your suggestion about the executable not accepting input via stdin is correct. You can test this by piping the contents of an example input file into your executable, e.g.:
cat path_to_file | my_executable
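The same test from Python might look roughly like this (path_to_file and my_executable are the placeholder names from the shell command above):
from subprocess import run, PIPE

# Feed the file's contents to the executable over stdin,
# mirroring `cat path_to_file | my_executable`.
with open("path_to_file", "rb") as f:
    data = f.read()

result = run(["./my_executable"], input=data, stdout=PIPE, stderr=PIPE)
print(result.stdout)
print(result.stderr)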

Python Subprocess can not get the output of oracle command "imp"

For example:
import subprocess
p=subprocess.Popen("imp -help",stdout=subprocess.PIPE,stdin=subprocess.PIPE)
out,err=p.communicate
out is empty,
but other Oracle commands like "sqlplus -help" and "rman -help" work fine.
There could be two reasons why you are not getting any output on stdout:
The process is dumping all its output to stderr.
The system does not know how to execute "imp -help".
The solution for the first problem is easy: capture stderr using the argument stderr = subprocess.PIPE.
The solution to the second is also easy, but the explanation is a bit longer: subprocess does not guess much; it will just try to execute the whole string as one command. That means, in your case, it will try to execute "imp -help" as one command. It does not try to execute the command "imp" with the argument "-help". You have to explicitly tell subprocess the command and the arguments separately.
From the python documentation on subprocess:
args should be a string, or a sequence of program arguments. The program to execute is normally the first item in the args sequence or the string if a string is given, ...
That means you have to separate the command and the arguments and pack them together in a sequence. This: "imp -help" should look like this: ["imp", "-help"]. Read the documentation on subprocess for more details on the intricacies of splitting the command and arguments.
Here is how the code should look:
import subprocess
p = subprocess.Popen(["imp", "-help"],
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     stdin=subprocess.PIPE)
out, err = p.communicate()
Note: you also typed p.communicate instead of p.communicate(). I assume that was a typo in your question, not in your code.
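If you are starting from a single command string, shlex.split from the standard library does the splitting for you; a small sketch:
import shlex
import subprocess

# shlex.split turns a shell-style command string into the argument list
# that Popen expects when shell=False.
args = shlex.split("imp -help")   # -> ['imp', '-help']
p = subprocess.Popen(args,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
out, err = p.communicate()
print(out)
print(err)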

Python: How do I redirect this output?

I'm calling rtmpdump via subprocess and trying to redirect its output to a file. The problem is that I simply can't redirect it.
I tried first setting up the sys.stdout to the opened file. This works for, say, ls, but not for rtmpdump. I also tried setting the sys.stderr just to make sure and it also didn't work.
I then tried adding ">> file" to the command-line arguments, but again it doesn't seem to work.
Also for the record, for some reason, Eclipse prints rtmpdump's output even if I use subprocess.call instead of subprocess.check_output, and without having to call the print method. This is black magic!
Any suggestions?
Edit: Here's some sample code.
# /!\ note: need to use os.chdir first to get to the folder with rtmpdump!
command = './rtmpdump -r rtmp://oxy.videolectures.net/video/ -y 2007/pascal/bootcamp07_vilanova/keller_mikaela/bootcamp07_keller_bss_01 -a video -s http://media.videolectures.net/jw-player/player.swf -w ffa4f0c469cfbe1f449ec42462e8c3ba16600f5a4b311980bb626893ca81f388 -x 53910 -o test.flv'
split_command = shlex.split(command)
subprocess.call(split_command)
sys.stdout is Python's idea of the parent's output stream.
In any case you want to change the child's output stream.
subprocess.call and subprocess.Popen take named parameters for the output streams.
So open the file you want to output to and then pass that as the appropriate argument to subprocess.
f = open("outputFile","wb")
subprocess.call(argsArray,stdout=f)
Your talk of using >> suggests you are using shell=True, or think you are passing your arguments to the shell. In any case it is better to use the array form of subprocess, which avoids an unnecessary process and any weirdness from the shell.
EDIT:
So I downloaded RTMPDump and tried it out, it would appear the messages are appearing on stderr.
So with the following program, nothing appears on the program's output, and the rtmpdump log goes into the stderr.txt file:
#!/usr/bin/env python
import os
import subprocess
RTMPDUMP="./rtmpdump"
assert os.path.isfile(RTMPDUMP)
command = [RTMPDUMP, '-r', 'rtmp://oxy.videolectures.net/video/',
           '-y', '2007/pascal/bootcamp07_vilanova/keller_mikaela/bootcamp07_keller_bss_01',
           '-a', 'video',
           '-s', 'http://media.videolectures.net/jw-player/player.swf',
           '-w', 'ffa4f0c469cfbe1f449ec42462e8c3ba16600f5a4b311980bb626893ca81f388',
           '-x', '53910', '-o', 'test.flv']
stdout = open("stdout.txt","wb")
stderr = open("stderr.txt","wb")
subprocess.call(command,stdout=stdout,stderr=stderr)
See the link on getting the output from subprocess on SO
Getting the entire output from subprocess.Popen
https://stackoverflow.com/questions/tagged/subprocess
I guess the way would be to collect the output and write it to a file directly, or provide file descriptors to which your output can be written.
Something like this:
f = open('dump.txt', 'wb')
# args is the command list (e.g. split_command from the question); no shell=True needed
p = subprocess.Popen(args, stdout=f, stderr=subprocess.STDOUT)
