Python subprocess cannot get the output of the Oracle command "imp" - python

For example:
import subprocess
p=subprocess.Popen("imp -help",stdout=subprocess.PIPE,stdin=subprocess.PIPE)
out,err=p.communicate
out is empty, but other Oracle commands such as "sqlplus -help" and "rman -help" work fine

There are two possible reasons why you are not getting any output on stdout:
The process is dumping all its output to stderr.
The system does not know how to execute "imp -help".
The solution to the first problem is easy: capture stderr as well, using the argument stderr=subprocess.PIPE.
The solution to the second is also easy, but the explanation is a bit longer: subprocess does not guess much, it just tries to execute the whole string as one command. That means, in your case, it tries to execute "imp -help" as a single command. It does not try to execute the command "imp" with the argument "-help". You have to tell subprocess the command and the arguments separately.
From the python documentation on subprocess:
args should be a string, or a sequence of program arguments. The program to execute is normally the first item in the args sequence or the string if a string is given, ...
That means you have to separate the command and the arguments and pack them together in a sequence: "imp -help" becomes ["imp", "-help"]. Read the documentation on subprocess for more details on the intricacies of splitting the command and arguments.
Here is how the code should look:
import subprocess
p = subprocess.Popen(["imp", "-help"],
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE,
                     stdin=subprocess.PIPE)
out, err = p.communicate()
Note: you also typed p.communicate instead of p.communicate(). I assume that was a typo in your question, not in your code.
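On Python 3.5+ the same capture can be written more compactly with subprocess.run. A minimal sketch; since "imp" is only available where Oracle is installed, the Python interpreter stands in for it here so the example runs anywhere (with Oracle installed you would pass ["imp", "-help"] instead):

```python
import subprocess
import sys

# Run a command given as a list (program + arguments) and capture both streams.
# sys.executable is a portable stand-in for the "imp" binary.
result = subprocess.run(
    [sys.executable, "-c", "print('usage: imp -help')"],
    capture_output=True,
    text=True,
)
print(result.stdout.strip())
print(result.returncode)
```

result.stdout and result.stderr hold whatever the program wrote to each stream, so a tool that prints its help to stderr is covered too.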

Related

subprocess.run with stdin input doesn't process

I'm trying to run a command in python:
from subprocess import run, DEVNULL
run(["./rarcrack",'walks.rar'], text=True, input='nano1 nano2', stdout=DEVNULL)
The command doesn't seem to process the stdin though (It says no more words, whereas in the example below it says successfully cracked).
I decided to do this because I'm under the impression that:
The bash pipe redirects stdout to stdin and
./rarcrack takes an argument from stdin because a command like
echo 'nano1 nano2' | ./rarcrack walks.rar works.
And I don't think I can pass in the words as another argument (I don't know any C).
The program is here
The problem is that you discard the program's results with stdout=DEVNULL: you only see the error output, not the successes.
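To actually see what the program prints, capture stdout instead of discarding it. A minimal sketch; tr stands in for ./rarcrack here (like rarcrack in the pipe example, it reads stdin and writes its result to stdout), so the snippet runs without the cracker installed:

```python
from subprocess import run

# input= feeds the words on stdin (the equivalent of echo '...' | ...),
# capture_output=True keeps stdout/stderr instead of throwing them away.
result = run(["tr", "a-z", "A-Z"], text=True,
             input="nano1 nano2", capture_output=True)
print(result.stdout)
```

With the real program you would inspect result.stdout for the "successfully cracked" line.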

Subprocess call failed to parse argument (kill function) Python [duplicate]

import os
import subprocess
proc = subprocess.Popen(['ls','*.bc'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out,err = proc.communicate()
print out
This script should print all the files with the .bc suffix, however it returns an empty list. If I do ls *.bc manually in the command line it works. Doing ['ls','test.bc'] inside the script works as well, but for some reason the star symbol doesn't work. Any ideas?
You need to supply shell=True to execute the command through a shell interpreter.
If you do that however, you can no longer supply a list as the first argument, because the arguments will get quoted then. Instead, specify the raw commandline as you want it to be passed to the shell:
proc = subprocess.Popen('ls *.bc', shell=True,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
Expanding the * glob is part of the shell, but by default subprocess does not send your commands via a shell, so the command (first argument, ls) is executed, then a literal * is used as an argument.
This is a good thing, see the warning block in the "Frequently Used Arguments" section of the subprocess docs. It mainly discusses security implications, but it can also help avoid silly programming errors (as there are no magic shell characters to worry about).
My main complaint with shell=True is it usually implies there is a better way to go about the problem - with your example, you should use the glob module:
import glob
files = glob.glob("*.bc")
print files # ['file1.bc', 'file2.bc']
This will be quicker (no process startup overhead), more reliable, and cross-platform (not dependent on the platform having an ls command).
Besides passing shell=True, also make sure that your path is not quoted; otherwise it will not be expanded by the shell.
If your path may contain special characters, you will have to escape them manually.
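If you do build a shell command string around a variable path, the standard library can do that escaping for you: shlex.quote wraps an argument so the shell treats it as a single word, while anything left unquoted (such as the glob) is still expanded. A small sketch with a made-up directory name:

```python
import shlex

directory = "my dir with spaces"
# shlex.quote protects the variable part; the *.bc glob stays
# unquoted so the shell can still expand it.
command = "ls " + shlex.quote(directory) + "/*.bc"
print(command)
```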

Difference between whole string command and list of strings in popen

I found most of the programmers suggest use list of strings to represent the command in popen. However, in my own project, I found a whole string works in more cases.
For example, the following works
subprocess.Popen('pgrep -f "\./run"', stdout=subprocess.PIPE, shell=True).wait()
while
subprocess.Popen(['pgrep', '-f', '"\./run"'], stdout=subprocess.PIPE, shell=True).wait()
does not.
May I know what's the difference between these two ways of implementation and why the second one does not work as expected?
The second should not have a shell=True parameter. With a list of arguments there is also no shell to strip the inner double quotes, so drop them too:
subprocess.Popen(['pgrep', '-f', r'\./run'], stdout=subprocess.PIPE).wait()
The shell parameter sets whether or not to execute the command through a shell, that is, whether a new shell should be spawned just to interpret and run the command string.
When you provide a list of strings, no extra shell is spawned, which is (marginally) faster. It is also safer for variable input, because nothing gets re-interpreted by a shell.
See: https://stackoverflow.com/a/15109975/1730261
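A convenient way to see how a shell would split a command string into the equivalent argument list is shlex.split, which applies shell-style tokenisation and quote removal. A quick sketch using the command string from the question:

```python
import shlex

# The double quotes are removed (the shell's quote removal), but the
# backslash inside them is preserved, exactly as a POSIX shell would do.
args = shlex.split(r'pgrep -f "\./run"')
print(args)
```

Note how the result is the list form you would pass with shell=False, quotes already stripped.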

How to call a series of bash commands in python and store output

I am trying to run the following bash script in Python and store the readlist output. The readlist that I want to be stored as a python list, is a list of all files in the current directory ending in *concat_001.fastq.
I know it may be easier to do this in python (i.e.
import os
readlist = [f for f in os.listdir(os.getcwd()) if f.endswith("concat_001.fastq")]
readlist = sorted(readlist)
However, this is problematic, as I need Python to sort the list in EXACTLY the same way as bash, and I found that bash and Python sort certain things in different orders (e.g. Python and bash treat capitalised and uncapitalised names differently). When I tried
readlist = np.asarray(sorted(flist, key=str.lower))
I still found that two files starting with ML_ and M_ were sorted in different orders by bash and Python. Hence I am trying to run my exact bash script through Python, then use the bash-sorted list in my subsequent Python code.
input_suffix="concat_001.fastq"
ender=`echo $input_suffix | sed "s/concat_001.fastq/\*concat_001.fastq/g" `
readlist="$(echo $ender)"
I have tried
proc = subprocess.call(command1, shell=True, stdout=subprocess.PIPE)
proc = subprocess.call(command2, shell=True, stdout=subprocess.PIPE)
proc = subprocess.Popen(command3, shell=True, stdout=subprocess.PIPE)
But I just get: <subprocess.Popen object at 0x7f31cfcd9190>
Also - I don't understand the difference between subprocess.call and subprocess.Popen. I have tried both.
Thanks,
Ruth
So your question is a little confusing and does not exactly explain what you want. However, I'll try to give some suggestions to help you update it, or in my effort, answer it.
I will assume the following: your python script is passing to the command line 'input_suffix' and that you want your python program to receive the contents of 'readlist' when the external script finishes.
To keep things simple, while still allowing more complicated commands later, I would put your commands in the following bash script:
script.sh
#!/bin/bash
input_suffix=$1
ender=`echo $input_suffix | sed "s/concat_001.fastq/\*concat_001.fastq/g"`
readlist="$(echo $ender)"
echo $readlist
You would execute this as ./script.sh concat_001.fastq, where $1 takes in the first argument passed on the command line.
To use python to execute external scripts, as you quite rightly found, you can use subprocess (or as noted by another response, os.system - although subprocess is recommended).
The docs tell you that subprocess.call:
"Wait for command to complete, then return the returncode attribute."
and that
"For more advanced use cases when these do not meet your needs, use the underlying Popen interface."
Given that you want to pipe the output from the bash script into your Python script, let's use Popen as suggested by the docs. As in the other Stack Overflow answer I posted, it could look like the following:
import subprocess

# Execute our script and pipe its output back to us
process = subprocess.Popen(['./script.sh', 'concat_001.fastq'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)
# Obtain the standard out and the standard error
stdout, stderr = process.communicate()
and then:
>>> print stdout
*concat_001.fastq
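Back in Python, the captured stdout is a single string; to get readlist as an actual Python list, split it on whitespace. A minimal sketch (the script's output is simulated here with hypothetical filenames so the snippet is self-contained):

```python
# stdout as it would come back from process.communicate(); simulated
# here so the example runs without the bash script being present.
stdout = "A_concat_001.fastq B_concat_001.fastq\n"
readlist = stdout.split()
print(readlist)
```

Since the bash script already expanded and ordered the glob, this preserves bash's ordering rather than re-sorting in Python.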

redirecting the output of shell script executing through python

Hi, I am trying to execute a shell script from Python using the following command:
os.system("sh myscript.sh")
In my shell script I have written some SOPs; now how do I get the SOPs back in my Python program so that I can log them to a file?
I know I can do it using subprocess.Popen, but for some reason I cannot use it.
p = subprocess.Popen(
    'DMEARAntRunner \"' + mount_path + '\"',
    shell=True,
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT
)
while 1:
    line = p.stdout.readline()[:-1]
    if not line:
        break
    write_to_log('INFO', line)
p.communicate()
If I understand your question correctly, you want something like this:
import subprocess
find_txt_command = ['find', '-maxdepth', '2', '-name', '*.txt']
with open('mylog.log', 'w') as logfile:
    subprocess.call(find_txt_command, stdout=logfile, shell=False)
You can use Popen instead of call if you need to, the syntax is very similar. Notice that command is a list with the process you want to run and the arguments. In general you want to use Popen/call with shell=False, it prevents unexpected behavior that can be hard to debug and it is more portable.
Kindly check this official documentation which uses the subprocess module in python. It is currently the recommended way over os.system calls to execute system functions and retrieve the results. The link above gives examples very close to what you need.
I personally would advise you to leave the shell argument at its default value of False. In that case, the first argument isn't a string as you'd type into a terminal, but a list of "words", the first being the program, the ones after that being arguments. This means that there is no need to quote arguments, making your program more resilient to whitespace arguments and injection attacks.
This should do the trick:
p = subprocess.Popen(['DMEARAntRunner', mount_path],
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
As always with executing shell commands the question remains whether it's the easiest/best way to solve a problem, but that's another discussion altogether.
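For completeness, reading the merged output line by line and collecting it for a log could be sketched as follows. DMEARAntRunner is not available here, so a tiny Python one-liner stands in for it; with the real tool you would pass ['DMEARAntRunner', mount_path] instead:

```python
import subprocess
import sys

# A short stand-in program that prints two lines, in place of DMEARAntRunner.
p = subprocess.Popen(
    [sys.executable, "-c", "print('line one'); print('line two')"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)
logged = []
for line in p.stdout:              # iterate the stream instead of readline()/break
    logged.append(line.rstrip("\n"))
p.wait()
print(logged)
```

Iterating over p.stdout avoids the readline()/break pattern, which stops early if the program ever emits an empty line.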
