Subprocess.call for cmd - python

I have the following command that works in the shell:
$ pv itunes20140910.tbz | sudo tar xpj -C /tmp
However, when I try to do it in Python, it doesn't work:
>>> import subprocess
>>> import shlex
>>> cmd=shlex.split('pv itunes20140910.tbz | sudo tar xpj -C /tmp')
>>> subprocess.call(cmd)
pv: invalid option -- 'C'
Try `pv --help' for more information.
1
What am I doing wrong here, and what would be the correct command to run in python?

The other answers didn't produce the net effect I was looking for (the progress bar), though the command ran without error. Here is what worked for me:
>>> import shlex, subprocess
>>> p1 = subprocess.Popen(shlex.split('pv /tmp/itunes20140910.tbz'), stdout=subprocess.PIPE) # set up pv and direct its output to a pipe
>>> p2 = subprocess.Popen(shlex.split('sudo tar xpj -C /tmp'), stdin=p1.stdout) # send p1's output to p2
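The same two-process pipeline can be sketched with harmless stand-in commands (`seq` and `wc` here, chosen only so the example runs anywhere), including the step of closing our copy of `p1.stdout` so the first process gets SIGPIPE if the reader exits early:

```python
import shlex
import subprocess

# First stage of the pipeline: its stdout feeds the next process.
p1 = subprocess.Popen(shlex.split('seq 1 100'), stdout=subprocess.PIPE)
# Second stage reads from p1's stdout; we also capture its output for illustration.
p2 = subprocess.Popen(shlex.split('wc -l'), stdin=p1.stdout, stdout=subprocess.PIPE)
# Close our handle on p1's stdout so p1 receives SIGPIPE if p2 exits first.
p1.stdout.close()
out, _ = p2.communicate()
print(out.decode().strip())  # number of lines seq produced
```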

Use the shell=True argument. Otherwise the pipe (|) cannot be interpreted, because no shell is involved.
subprocess.call('pv itunes20140910.tbz | sudo tar xpj -C /tmp', shell=True)
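A minimal illustration of the same idea, with harmless commands standing in for pv and tar so it runs anywhere:

```python
import subprocess

# With shell=True the string is handed to the shell, so the pipe works.
rc = subprocess.call('seq 1 5 | wc -l', shell=True)
print(rc)  # 0 on success; the pipeline's output goes straight to the terminal
```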

Related

How to pass Python variable into bash commands?

I've seen some answers about subprocess.call and Popen, but I have a list of commands, and I don't think it's a good idea to make multiple separate calls. I also don't want a separate script.sh with these commands.
My code looks like
bash_code=r'''
echo "/common_home/{context['nickname']} /tmp/back/{context['nickname']} none bind 0 0" | sudo tee --append /etc/fstab
sudo mkdir /tmp/{context['nickname']} /tmp/back/{context['nickname']}
'''
subprocess.run(['bash', '-c', bash_code], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
But my script has many more lines containing {context['nickname']}, and I don't know the best way to get this variable into the bash commands.
You can use a so-called "f-string" to replace the variable references with their values.
context = {'nickname': 'foobar'}
bash_code = f'''
echo "/common_home/{context['nickname']} /tmp/back/{context['nickname']} none bind 0 0" | sudo tee --append /etc/fstab
sudo mkdir /tmp/{context['nickname']} /tmp/back/{context['nickname']}
'''
print(bash_code)
subprocess.run(['bash', '-c', bash_code], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
Value printed:
echo "/common_home/foobar /tmp/back/foobar none bind 0 0" | sudo tee --append /etc/fstab
sudo mkdir /tmp/foobar /tmp/back/foobar
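One caveat worth noting: if the nickname ever comes from user input, splicing it raw into shell code is an injection risk. The standard-library helper shlex.quote can hedge against that (a sketch, using a deliberately awkward value):

```python
import shlex

context = {'nickname': "foo'bar"}  # a value containing a shell-hostile character
nick = shlex.quote(context['nickname'])  # now safe to splice into shell code
bash_code = f'sudo mkdir /tmp/{nick} /tmp/back/{nick}'
print(bash_code)
```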

Why do we lose stdout from the terminal after running subprocess.check_output(xyz, shell=True)?

I have this bash line:
$ printf ' Number of xml files: %s\n' `find . -name '*.xml' | wc -l`
Number of xml files: 4
$
When I run it from Python in this way, the interpreter stops and my terminal no longer shows stdout:
$ ls
input aa bb
$ python
Python 3.6.7 (default, Oct 22 2018, 11:32:17)
>>>
>>> import subprocess
>>> cmd = "printf 'xml files: %s\n' `find . -name '*.xml' | wc -l`"
>>> subprocess.check_output(['/bin/bash', cmd], shell=True)
$ ls # stdout is not seen any more I have to kill this terminal
$
Obviously the question here is not how to make this bash line work from Python:
>>> import subprocess
>>> cmd = "printf 'xml files: %s\n' `find . -name '*.xml' | wc -l`"
>>> out = subprocess.run(cmd, shell=True, stdout=subprocess.PIPE)
>>> print(str(out.stdout, 'utf8'))
xml files: 4
>>>
The two following questions, No output from subprocess.check_output() and Why is terminal blank after running python executable?, do not answer the question.
The short version is that check_output is buffering all the output to return. When you run ls, its standard output is going to check_output's buffer, not the terminal. When you exit the shell you are currently in, you'll get all the output at once as a single Python string.
This leads to the question, why are you getting a sub shell in the first place, instead of executing the contents of cmd? First, you are using bash wrong; its argument is a file to run, not an arbitrary command line. A more correct version of what you are doing would be
cmd = "printf 'xml files: %s\n' `find . -name '*.xml' | wc -l`"
subprocess.check_output(['/bin/bash', '-c', cmd])
or, if you want subprocess to run a shell for you, instead of explicitly executing it,
subprocess.check_output(cmd, shell=True)
Combining the list argument with shell=True is almost never what you want.
Second, given your original code, shell=True makes check_output prepend sh -c to your argument list, so the first element becomes the shell's command string and the rest become extra arguments to the shell. That is, you try to execute something like
sh -c /bin/bash "printf 'xml files: %s\n' `find . -name '*.xml' | wc -l`"
sh runs /bin/bash, and your command string is just used as an additional argument to sh which, for the purposes of this question, we can assume is ignored. So you are in an interactive shell whose standard output is buffered instead of displayed, as described in the first part of this answer.
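A self-contained illustration of the corrected form, with a harmless printf standing in for the original find pipeline:

```python
import subprocess

# With '-c', bash treats the next argument as a command string
# rather than as the name of a script file to run.
out = subprocess.check_output(['/bin/bash', '-c', "printf 'xml files: %s\\n' 4"])
print(out.decode())  # → xml files: 4
```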

Using PIPE in long subprocess call (python) doesn't work

I am trying to do the following command (to download Calibre through a python script):
sudo -v && wget -nv -O- https://raw.githubusercontent.com/kovidgoyal/calibre/master/setup/linux-installer.py | sudo python -c "import sys; main=lambda:sys.stderr.write('Download failed\n'); exec(sys.stdin.read()); main()"
Following some of the answers on here on how to PIPE, I have been doing this:
import subprocess
from subprocess import Popen, PIPE
wget = subprocess.Popen(["sudo -v wget -nv -O- https://raw.githubusercontent.com/kovidgoyal/calibre/master/setup/linux-installer.py"], stdout=PIPE)
run = subprocess.Popen(["sudo python -c "import sys; exec(sys.stdin.read()); main()""], stdin=wget.stdout)
I have tried changing many things but NOTHING is working. There are too many errors to put here. Can anyone correct this? Many thanks in advance.
When you have parameters, you need to break the command into a list, e.g.
wget = subprocess.Popen(["wget -nv -O- https://raw.githubusercontent.com/kovidgoyal/calibre/master/setup/linux-installer.py"])
(this raises a **No such file or directory** error, because the whole string is looked up as a single command/file)
Needs to become:
subprocess.Popen(['wget', '-nv', '-O-', 'https://raw.githubusercontent.com/kovidgoyal/calibre/master/setup/linux-installer.py'])
<subprocess.Popen object at 0x10e203950>
You can also use shlex.split() to split your command for you, e.g.
>>> import shlex
>>> shlex.split('wget -nv -O- https://raw.githubusercontent.com/kovidgoyal/calibre/master/setup/linux-installer.py')
['wget', '-nv', '-O-', 'https://raw.githubusercontent.com/kovidgoyal/calibre/master/setup/linux-installer.py']
Ref. https://docs.python.org/2/library/subprocess.html#popen-constructor
You also have "sudo python -c "import sys; exec(sys.stdin.read()); main()"". That isn't valid syntax, because the inner double quotes close out the outer ones. Try 'sudo python -c "import sys; exec(sys.stdin.read()); main()"' with single quotes on the outside instead; that way you don't have to escape the quotes inside!
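shlex.split shows how the single-outside/double-inside quoting breaks up (the embedded command is shortened here for readability):

```python
import shlex

# Single quotes outside, double quotes inside: the inner command string
# survives as one argument, which is exactly what `python -c` needs.
args = shlex.split('sudo python -c "import sys; main()"')
print(args)  # → ['sudo', 'python', '-c', 'import sys; main()']
```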
To avoid escaping quotes inside the command, you could use triple quotes:
from subprocess import check_call
check_call(r'''# http://calibre-ebook.com/download_linux
sudo -v &&
wget -nv -O- https://raw.githubusercontent.com/kovidgoyal/calibre/master/setup/linux-installer.py |
sudo python -c "import sys; main=lambda:sys.stderr.write('Download failed\n'); exec(sys.stdin.read()); main()"
''', shell=True)
shell=True causes the first parameter to be interpreted as a shell command.
In principle, you could simplify the command:
#!/usr/bin/env python3
import sys
from subprocess import check_call
from urllib.request import urlretrieve
# http://calibre-ebook.com/download_linux
path, _ = urlretrieve('https://raw.githubusercontent.com/kovidgoyal/calibre/master/setup/linux-installer.py')
check_call(['sudo', sys.executable, path])
but it may have subtle security implications e.g., urlretrieve() may skip server ssl certificate validation.

How to do this Python subprocess call without using shell=True?

For example, in /tmp I have files ending in .txt, .doc, and .jpg that I'd like to delete in one step using shred and subprocess.
The following does the job:
subprocess.call('bash -c "shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}"', shell=True)
How would I do this command without using shell=True. I've tried the following:
subprocess.call(['bash', '-c', '"shred -n 10 -uz /tmp/{*.txt,*.pdf,*.doc}"'])
subprocess.call(['bash', '-c', 'shred', '-n 10', '-uz', '/tmp/{*.txt,*.pdf,*.doc}'])
Any suggestions?
I believe that other guy is spot on (haven't tried it myself though). However, if you ever run into similar issues again, shlex.split(s) might be helpful: it takes the string s and splits it "using shell-like syntax".
In [2]: s = 'bash -c "shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}"'
In [3]: shlex.split(s)
Out[3]: ['bash', '-c', 'shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}']
subprocess.call(['bash', '-c', 'shred -n 10 -uz /tmp/{*.txt,*.pdf,*.doc}'])
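Another way to drop the shell entirely is to do the glob expansion in Python itself. A sketch (the call at the bottom is commented out so nothing is destroyed by accident):

```python
import glob
import subprocess

def shred_matching(patterns):
    """Expand glob patterns in Python, then run shred on the matches."""
    files = [f for pattern in patterns for f in glob.glob(pattern)]
    if files:  # shred errors out on an empty argument list
        return subprocess.call(['shred', '-n', '10', '-uz'] + files)
    return 0  # nothing matched, nothing to do

# shred_matching(['/tmp/*.txt', '/tmp/*.pdf', '/tmp/*.doc'])
```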
You can tell how a command is expanded and split up with:
$ printf "Argument: %s\n" bash -c "shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}"
Argument: bash
Argument: -c
Argument: shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}
In the more general case (but overkill here), if you're ever in doubt of what's executed by something with which parameters, you can use strace:
$ cat script
import subprocess
subprocess.call('bash -c "shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}"', shell=True)
$ strace -s 1000 -fe execve python script
...
execve("/bin/bash", ["bash", "-c", "shred -n 5 -uz /tmp/{*.txt,*.pdf,*.doc}"], [/* 49 vars */]) = 0
...
$
If the command comes from a trusted source, e.g. it is hardcoded, then there is nothing wrong with using shell=True:
#!/usr/bin/env python
from subprocess import check_call
check_call("shred -n 10 -uz /tmp/{*.txt,*.pdf,*.doc}",
shell=True, executable='/bin/bash')
/bin/bash is used to support the {} brace expansion inside the command; this way the command doesn't run under the default /bin/sh.

Subprocess call providing incorrect parameters compared to ' '.join()

When I run the command
subprocess.call(['intersectBed','-u','-a',out_snv_filter,'-b',cds,'>',out_cds],shell=True)
I get the help menu for intersectBed reported back in the interpreter.
But when I run
>>> ' '.join(['intersectBed','-u','-a',out_snv_filter,'-b',cds,'>',out_cds])
'intersectBed -u -a test/test.out.snv.filter -b gencode7.cds.bed > test/test.out.cds'
$ intersectBed -u -a test/test.out.snv.filter -b gencode7.cds.bed > test/test.out.cds
The program runs normally. What is the difference here?
With shell=True and a list argument, only the first element ('intersectBed') is treated as the shell's command string; the remaining elements become arguments to the shell itself, so intersectBed runs with no options and prints its help. Also, the > redirection is a shell feature; without a shell, open the output file in Python and pass it as stdout:
from subprocess import check_call
args = ['intersectBed', '-u', '-a', out_snv_filter, '-b', cds]
with open(out_cds, 'wb') as outfile:
    check_call(args, stdout=outfile)
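The same stdout-to-file pattern with a generic command (`seq` here), to make clear the redirection happens in Python, not in a shell:

```python
import subprocess
import tempfile

# A scratch file to stand in for out_cds.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    path = tmp.name

# Open the destination ourselves and hand it to the child as its stdout;
# this is what the shell's `> outfile` does under the hood.
with open(path, 'wb') as outfile:
    subprocess.check_call(['seq', '1', '3'], stdout=outfile)

with open(path) as f:
    print(f.read())  # the numbers 1, 2, 3, one per line
```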

Categories

Resources