Python subprocess sends a backslash before a quote

I have a string containing a complete command that should be executed from the command line:
cmdToExecute = "TRAPTOOL -a STRING \"ABC\" -o STRING 'XYZ'"
The string holds the entire command that should be triggered from the command prompt. If you take a closer look at cmdToExecute, you can see the option -o with the value XYZ enclosed in SINGLE QUOTES. There is a reason this needs to be given in single quotes: otherwise my tool TRAPTOOL will not be able to process the command.
I am using subprocess.Popen to execute the entire command. Before executing my command in a shell, I am printing the content
print "Cmd to be exectued: %r" % cmdToExecute
myProcess = subprocess.Popen(cmdToExecute, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=False)
(stdOut, stdErr) = myProcess.communicate()
The output of the above command is,
Cmd to be executed: TRAPTOOL -a STRING "ABC" -o \'XYZ\'.
You can see that the output shows a BACKSLASH added automatically when printing. The \ is not actually in the string, which I verified with a regex. But when the script is run on my box, TRAPTOOL truncates part of the string XYZ on the receiving server. I manually copy-pasted the printed output and tried sending it, and saw the same error on the receiving server. However, when I removed the backslashes, the trap was sent without any truncation.
Can anyone say why this happens?
Is there any way to see what command is actually executed by subprocess.Popen?
Is there any other way to execute my command, other than subprocess.Popen, that might solve this problem?

Try using shlex to split your command string:
>>> import shlex
>>> argv = shlex.split("TRAPTOOL -a STRING \"ABC\" -o STRING 'XYZ'")
>>> myProcess = subprocess.Popen(argv, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=False)
>>> (stdOut, stdErr) = myProcess.communicate()
The first parameter to the Popen constructor can be an argument list for your shell command or a string, but an argument list might be easier to work with because of all the quotes involved. (See the Python subprocess documentation.)
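Printing the result of shlex.split shows exactly which argument strings the process will receive. Note that the quotes themselves are consumed as quoting syntax rather than being passed to the program (illustrative interpreter session):
>>> import shlex
>>> shlex.split("TRAPTOOL -a STRING \"ABC\" -o STRING 'XYZ'")
['TRAPTOOL', '-a', 'STRING', 'ABC', '-o', 'STRING', 'XYZ']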
If you want to see the commands being written, you could probably do something like:
>>> argv = shlex.split("bash -x -c 'TRAPTOOL -a STRING \"ABC\" -o STRING \'XYZ\''")
This makes bash echo the commands to the shell by means of the -x option.

You asked for the repr representation of the string, not the str representation. Basically, what would you have to type at the Python interactive interpreter to get the same output? That's what %r displays. Change that to %s to see the value as it's actually stored:
print "Cmd to be exectued: %s" % cmdToExecute

Related

How to run an advanced Linux command from a Python script

I want to get the string output of the following Linux command:
systemctl show node_exporter |grep LoadState| awk '{split($0,a,"="); print a[2]}'
I tried with
import subprocess
output = subprocess.check_output("systemctl show node_exporter |grep LoadState| awk '{split($0,a,"="); print a[2]}'", shell=True)
but the output is,
output = subprocess.check_output("systemctl show node_exporter |grep LoadState| awk '{split($0,a,"="); print a[2]}'", shell=True)
SyntaxError: keyword can't be an expression
Well, first of all, without shell=True the function takes the command as a list of strings, not as a single string. E.g.:
"ls -a -l" - wrong
["ls", "-a", "-l"] - good
Secondly, if the Linux command is very complex or contains many lines, it makes sense to create a separate bash file, e.g. command.sh, put your Linux commands there, and run the script from Python with:
import subprocess
output = subprocess.check_output(["./command.sh"], shell=True)
You need to escape the double quotes (because they delimit the Python string):
import subprocess
output = subprocess.check_output("systemctl show node_exporter |grep LoadState| awk '{split($0,a,\"=\"); print a[2]}'", shell=True)

How can I pass a `sed` command to `popen` without using a raw string?

How can I pass a sed command to popen without using a raw string?
When I pass a sed command to popen in list form, I get the error unterminated address regex (see the first example):
>>> COMMAND = ['sed', '-i', '-e', "\$amystring", '/home/map/myfile']
>>> subprocess.Popen(COMMAND).communicate(input=None)
sed: -e expression #1, char 11: unterminated address regex
using the raw string form it works as expected:
>>> COMMAND = r"""sed -i -e "\$amystring" /home/map/myfile"""
>>> subprocess.Popen(COMMAND, shell=True).communicate(input=None)
I'm really interested in passing "\$amystring" as an element of the list. Please avoid answers like
>>> COMMAND = r" ".join(['sed', '-i', '-e', "\$amystring", '/home/map/myfile']
>>> subprocess.Popen(COMMAND, shell=True).communicate(input=None)
The difference between the two forms is that with shell=True, the string is passed as-is to the shell, which then interprets it: the double quotes and the \$ escape are stripped. With bash, this results in the equivalent of
sed -i -e '$amystring' /home/map/myfile
being run.
With the list args and the default shell=False, Python calls the executable directly with the arguments in the list. In this case the literal string, backslash included, is passed to sed, the equivalent of
sed -i -e '\$amystring' /home/map/myfile
and \$amystring is not a valid sed expression (the \$ opens a $-delimited address regex that is never closed). In this case, you'd need to call
>>> COMMAND = ['sed', '-i', '-e', "$amystring", '/home/map/myfile']
>>> subprocess.Popen(COMMAND).communicate(input=None)
since the string does not need to be escaped for the shell.
There is no such thing as a raw string. There are only raw string literals.
A literal is something that you type in the Python source code.
r'\$amystring' and '\\$amystring' are the same string despite being written using different string literals.
As @Jonathan Villemaire-Krajden said: if there is no shell=True, then you don't need to escape the $ shell metacharacter. You only need the escape if you run the command in a shell:
$ python -c 'import sys; print(sys.argv)' "\$amystring"
['-c', '$amystring']
Note: there is no backslash in the output.
Don't use .communicate() unless you redirect standard streams using PIPE; you could use call() or check_call() instead:
import subprocess
rc = subprocess.call(['sed', '-i', '-e', '$amystring', '/home/map/myfile'])
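check_call() is the same idea but raises CalledProcessError on a non-zero exit status instead of returning it:
import subprocess
subprocess.check_call(['sed', '-i', '-e', '$amystring', '/home/map/myfile'])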
To emulate the $amystring sed command in Python (± newlines):
from __future__ import print_function  # needed on Python 2 for the file= keyword

with open('/home/map/myfile', 'a') as file:
    print('mystring', file=file)

Python subprocess argument with quotes [duplicate]

This question already has answers here: Using subprocess.run with arguments containing quotes (3 answers). Closed 1 year ago.
I am trying to run the http://mediaarea.net/en/MediaInfo command-line utility from Python.
It accepts arguments like this.
Simple Usage:
# verbose all info
MediaInfo.exe test.mp4
Template Usage:
# verbose selected info from csv
MediaInfo.exe --inform="file://D:\path\to\csv\template.csv" test.mp4
I am trying to run it with the template argument. I can use the above command successfully from CMD: it works and I can see my selected output fine in the DOS window.
But when I try to run it from Python, it outputs all info, ignoring the CSV which I give as an argument.
Can anyone explain why? Is it because of the quotes?
NOTE: If the path to the CSV is incorrect or the CSV is invalid, MediaInfo outputs all info, which is exactly what is happening here.
#App variable is full path to MediaInfo.exe
#filename variable is full path to media file
proc = subprocess.Popen([App ,'--inform="file://D:\path\to\csv\template.csv"',filename],shell=True,stderr=subprocess.PIPE, stdout=subprocess.PIPE)
return_code = proc.wait()
for line in proc.stdout:
    print line
On Windows, you could pass the command as a string, i.e., as is:
from subprocess import check_output
cmd = r'MediaInfo.exe --inform="file://D:\path\to\csv\template.csv" test.mp4'
out = check_output(cmd)
Notice the r'' raw-string literal: it is used so that '\t' in the path is not interpreted as a single tab character; in r'\t' it stays as two characters (a backslash and a t).
Unrelated: if you have specified stdout=PIPE and stderr=PIPE, then you should read both streams concurrently, and before p.wait() is called; otherwise a deadlock is possible if the command generates enough output.
If passing the command as a string works, then you could try a list argument:
from subprocess import check_output
from urllib import pathname2url
cmd = [app, '--inform']
cmd += ['file:' + pathname2url(r'D:\path\to\csv\template.csv')]
cmd += [filename]
out = check_output(cmd)
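To inspect the exact command line that Windows builds from the cmd list above, you can print subprocess.list2cmdline(), which is the conversion Popen applies internally on Windows when given a sequence:
from subprocess import list2cmdline
print(list2cmdline(cmd))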
Also, can you write an example of the p.wait() deadlock you mentioned?
It is easy. Just produce large output in the child process:
import sys
from subprocess import Popen, PIPE
#XXX DO NOT USE, IT DEADLOCKS
p = Popen([sys.executable, "-c", "print('.' * (1 << 23))"], stdout=PIPE)
p.wait() # <-- this never returns unless the pipe buffer is larger than (1<<23)
assert 0 # unreachable
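The fix is to let communicate() drain the pipe instead of calling wait() while the pipe is still full:
import sys
from subprocess import Popen, PIPE

p = Popen([sys.executable, "-c", "print('.' * (1 << 23))"], stdout=PIPE)
out, _ = p.communicate()  # reads all output first, then reaps the child -- no deadlock
print(len(out))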
If you print your arguments, you might see what is going wrong:
>>> print '--inform="file://D:\path\to\csv\template.csv"'
--inform="file://D:\path o\csv emplate.csv"
The problem is that \ introduces escape sequences. If you put the r prefix in front of your string literal, these escapes are not interpreted:
>>> print r'--inform="file://D:\path\to\csv\template.csv"'
--inform="file://D:\path\to\csv\template.csv"

Passing double quote shell commands in python to subprocess.Popen()?

I've been trying to pass a command to ffmpeg that works only with literal double quotes on the command line around the "concat:file1|file2" argument.
I can't, however, make this work from Python with subprocess.Popen(). Does anyone have an idea how to pass quotes into subprocess.Popen?
Here is the code:
command = "ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4"
output, error = subprocess.Popen(command, universal_newlines=True,
                                 stdout=subprocess.PIPE,
                                 stderr=subprocess.PIPE).communicate()
When I do this, ffmpeg won't accept the argument any way other than with quotes around the concat segment. Is there a way to successfully pass this line to subprocess.Popen?
I'd suggest using the list form of invocation rather than the quoted string version:
command = ["ffmpeg", "-i", "concat:1.ts|2.ts", "-vcodec", "copy",
"-acodec", "copy", "temp.mp4"]
output,error = subprocess.Popen(
command, universal_newlines=True,
stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
This more accurately represents the exact set of parameters that are going to be passed to the end process and eliminates the need to mess around with shell quoting.
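As an illustrative sanity check, you can hand the same tricky argument to a small Python child process and print what it actually receives:
import subprocess, sys
subprocess.call([sys.executable, "-c", "import sys; print(sys.argv[1:])",
                 "-i", "concat:1.ts|2.ts"])
# prints ['-i', 'concat:1.ts|2.ts'] -- the pipe character arrives intact, no quoting needed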
That said, if you absolutely want to use the plain string version, just use different quotes (and shell=True):
command = 'ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4'
output, error = subprocess.Popen(
    command, universal_newlines=True, shell=True,
    stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
Either use single quotes 'around the "whole pattern"' to automatically escape the doubles or explicitly "escape the \"double quotes\"". Your problem has nothing to do with Popen as such.
Just for the record, I had a problem particularly with a list-based command passed to Popen that would not preserve proper double quotes around a glob pattern (i.e. what was suggested in the accepted answer) under Windows. Joining the list into a string with ' '.join(cmd) before passing it to Popen solved the problem.
This works with Python 2.7.3. The way to pipe stderr to stdout has changed since older versions of Python.
Put this in a file called test.py:
#!/usr/bin/python
import subprocess
command = 'php -r "echo gethostname();"'
p = subprocess.Popen(command, universal_newlines=True, shell=True,
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
text = p.stdout.read()
retcode = p.wait()
print text
Invoke it:
python test.py
It prints my hostname, which is apollo:
apollo
Read up on the manual for subprocess: http://docs.python.org/2/library/subprocess.html
I have been working with a similar issue, running a relatively complex command over ssh. It also had multiple double quotes and single quotes, because I was piping the command through Python, ssh, PowerShell, etc.
If you can instead just convert the command into a shell script, and run the
shell script through subprocess.call/Popen/run, these issues will go away.
So depending on whether you are on Windows, Linux or Mac, put the following in a shell script file, either script.sh or script.bat:
ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4
Then you can run
import subprocess; subprocess.call("./script.sh", shell=True)
Without having to worry about single quotes, etc.
This line of code in your question isn't valid Python syntax:
command = "ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4"
If you had a Python file with just this line in it, you would get a syntax error. A string literal surrounded with double quotes can't have double quotes in them unless they are escaped with a backslash. So you could fix that line by replacing it with:
command = "ffmpeg -i \"concat:1.ts|2.ts\" -vcodec copy -acodec copy temp.mp4"
Another way to fix this line is to use single quotes for the string literal in Python, that way Python is not confused when the string itself contains a double quote:
command = 'ffmpeg -i "concat:1.ts|2.ts" -vcodec copy -acodec copy temp.mp4'
Once you have fixed the syntax error, you can then tackle the issue with using subprocess, as explained in this answer. I also wrote this answer to explain a helpful mental model for subprocess in general.
I was also struggling with a string argument containing spaces, while not wanting to use shell=True.
The solution was to use double quotes for the inside strings.
import subprocess

args = ['salt', '-G', 'environment:DEV', 'grains.setvals', '{"man_version": "man-dev-2.3"}']
try:
    p = subprocess.Popen(args, stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE)
    (stdout, stderr) = p.communicate()
except (subprocess.CalledProcessError, OSError) as err:
    exit(1)
if p.returncode != 0:
    print("Failure in returncode of command:")
For anybody still suffering from this pain: joining the parameters into a single string and using shell=True also works when the params are enclosed in quotation marks.
params = ["ls", "-la"]
subprocess.check_output(" ".join(params), shell=True)

subprocess.Popen not escaping command line arguments properly?

I am trying to call the following curl command with python:
curl -k -F file=#something.zip -F "data={\\"title\\":\\"Another App\\"}" -Lu usr:pwd https://build.phonegap.com/api/v0/apps
For it to work, I've found that the json I'm passing in data needs to be escaped with backslashes.
I can call this command with...
os.system('curl -k -F file=#something.zip -F "data={\\"title\\":\\"Another App\\"}" -Lu usr:pwd https://build.phonegap.com/api/v0/apps')
and it works.
However, when I try to use the subprocess module like this...
s = 'curl -k -F file=#something.zip -F "data={\\"title\\":\\"Another App\\"}" -Lu usr:pwd https://build.phonegap.com/api/v0/apps'
push = subprocess.Popen(s.split(), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output, errors = push.communicate()
print output
...the curl doesn't work and I get an error from the api I'm using that I'm using invalid parameters, which I've gotten in the past when I've used improperly escaped json.
What is going on here? Why can I call this command with os.system and not subprocess.Popen? So far my hypothesis is that the split is messing up something in the string, but I didn't find anything that looked wrong when I checked the output of s.split().
Perhaps try using shell=True:
push = subprocess.Popen(s, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
Instead of doing
s.split()
try using shlex from the standard library
import shlex
shlex.split(s)
shlex allows you to configure the escaping behavior (see the link for details; the defaults might be sufficient, though).
Specifically, where you are going wrong is that .split() splits on whitespace by default, so the quoted JSON is broken apart into separate arguments such as:
\"Another
App\"}"
You'll need to change the split behaviour, as others have said.
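For completeness, here is a sketch of the explicit list form for this curl call. Each list element reaches curl verbatim, so the JSON travels as a single argument and the shell-level backslash escaping is no longer needed (file field, credentials and URL kept exactly as in the question; adjust the data string if the API really expects literal backslashes):
import subprocess

cmd = ["curl", "-k",
       "-F", "file=#something.zip",
       "-F", 'data={"title":"Another App"}',
       "-Lu", "usr:pwd",
       "https://build.phonegap.com/api/v0/apps"]
push = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output, errors = push.communicate()
print(output)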
