String parameter using subprocess module - python

I am using Python to simplify some commands in Maven. I have this script which calls mvn test in debug mode.
from subprocess import call
commands = []
commands.append("mvn")
commands.append("test")
commands.append("-Dmaven.surefire.debug=\"-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE\"")
call(commands)
The problem is the -Dmaven.surefire.debug argument, which takes a parameter that has to be in quotes, and I don't know how to pass that correctly. The list looks fine when I print it, but when I run the script I get Error translating CommandLine and the debug line is never executed.

The quotes are only required by the shell that parses the command line.
If you ran the command directly from a shell, you would probably type
mvn test -Dmaven.surefire.debug="-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE"
The " characters simply tell the shell to treat everything between them, spaces included, as a single argument.
The program is then called with the arguments
mvn
test
-Dmaven.surefire.debug=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE
so
from subprocess import call
commands = []
commands.append("mvn")
commands.append("test")
commands.append("-Dmaven.surefire.debug=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000 -Xnoagent -Djava.compiler=NONE")
call(commands)
should be the way to go.
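To see that each list element reaches the program as exactly one argument, with no shell quoting layer in between, here is a small sketch that substitutes a throwaway Python child process for mvn:

```python
import subprocess
import sys

# Each list element becomes exactly one argv entry in the child process;
# no shell is involved, so there is no quoting layer to strip anything.
arg = "-Dmaven.surefire.debug=-Xdebug -Xnoagent"  # spaces survive as-is
out = subprocess.check_output(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", arg],
    text=True,
)
print(out.strip())  # the child received the whole string as one argument
```

If you had wrapped the value in literal " characters, the child would receive those quotes as part of the argument, which is exactly what confused Maven above.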

Related

How to get full command executed using sh module?

I ran into an error while executing one of our devops scripts. The script uses the sh package (for executing common unix commands, pypi link). However, the commands that are executed are truncated in the messages printed by sh. How can I see the whole command that was executed?
example:
import sh
sh.ssh(host,
       'rsync -av {src} {dst}'.format(src=src, dst=dst),
       _out=sys.stdout)
Produces output like:
INFO:sh.command:<Command '/bin/ssh dbw#ny...(77 more)' call_args {'bg': False, 'timeo...(522 more)>: starting process
I'd like to see the full command executed, and all of the call_args.
sh.ssh returns an sh.RunningCommand object, which you can query to find the call args and the cmd:
import sh
a = sh.ssh(host,
           'rsync -av {src} {dst}'.format(src=src, dst=dst),
           _out=sys.stdout)
print(a.cmd)
print(a.call_args)
After peeking into the source code, it looks like this is controlled by the max_len parameter of the friendly_truncate function, so one option may be to edit the sh.py code directly and set a higher int value:
https://github.com/amoffat/sh/blob/master/sh.py#L424
https://github.com/amoffat/sh/blob/master/sh.py#L425
Or, possibly, just remove the calls to that function.
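As an alternative to patching sh.py, you could log the full command yourself. Assuming you have the argv list (sh exposes it as RunningCommand.cmd), shlex.join from the standard library (Python 3.8+) re-quotes it into a single shell-safe string with no truncation; the list below is a made-up example:

```python
import shlex

# A hypothetical argv list like the one sh exposes via RunningCommand.cmd
cmd = ["/bin/ssh", "host", "rsync -av /src /dst"]

# shlex.join re-quotes the argv list into one shell-safe string;
# tokens containing spaces get wrapped in single quotes.
full = shlex.join(cmd)
print(full)  # /bin/ssh host 'rsync -av /src /dst'
```

This also has the advantage of producing a line you can paste back into a shell to reproduce the call by hand.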

Run shell script from python

I am trying to run a shell script from a python script using the following:
from subprocess import call
call(['bash run.sh'])
This gives me an error, but I can successfully run other commands like:
call(['ls'])
You should separate arguments:
call(['bash', 'run.sh'])
call(['ls','-l'])
from subprocess import call
import shlex
call(shlex.split('bash run.sh'))
You want to properly tokenize your command arguments. shlex.split() will do that for you.
Source: https://docs.python.org/2/library/subprocess.html#popen-constructor
Note shlex.split() can be useful when determining the correct tokenization for args, especially in complex cases.
When you call call() with a list, it expects every element of that list to correspond to a command line argument.
In this case it is looking for an executable literally named bash run.sh, spaces and all, as a single string.
Try one of these:
call("bash run.sh".split())
call(["bash", "run.sh"])
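As an example of one of those "complex cases": shlex.split() honors shell-style quoting, so arguments that contain spaces stay together as single tokens where a plain str.split() would break them apart:

```python
import shlex

# shlex.split tokenizes like a POSIX shell would: the quoted
# "hello world" stays together as one argument.
args = shlex.split('bash run.sh --message "hello world"')
print(args)  # ['bash', 'run.sh', '--message', 'hello world']
```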

How to handle ':' in command using os.popen

I am trying to call a program with:
os.popen("program -s:'*' -c:'A;B;C;'")
However, it seems that it is interpreted as the shell command
program -s '*' -c 'A;B;C;'
which results in incorrect behavior.
Can somebody help me with handling situations like this, where ':' appears inside a shell command line?
Don't use os.popen(), use the subprocess module instead:
import subprocess
result = subprocess.check_output(['program', "-s:'*'", "-c:'A;B;C;'"])
This returns the output of the program without running it through a shell, passing in the arguments directly without any additional parsing.
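A quick way to convince yourself that the ':' and quote characters pass through untouched is to substitute a small Python child for program and print what it receives; this is just an illustration, not the real program:

```python
import subprocess
import sys

# Stand-in for "program": a child process that echoes its argv.
# With a list and no shell, ':' , '*' and ';' need no escaping at all.
out = subprocess.check_output(
    [sys.executable, "-c", "import sys; print(sys.argv[1:])",
     "-s:'*'", "-c:'A;B;C;'"],
    text=True,
)
print(out.strip())
```

Note that the single quotes in the original command were only there for the shell; when passing a list you may not want them in the argument values at all.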

Using Subprocess or os.system to run shell commands in python

I need to write code in Python using functions that a friend of mine developed in shell. Is that possible? Can I do something like
output = subprocess.call('friends_developed_function', shell = True)
You need to make sure your friend's function is defined before you can call it. You cannot call a function which was defined in a parent process [Note 1]. So you could execute the following:
output = subprocess.check_output(
    'source /path/to/function/definition; the_function args if needed',
    shell = True)
Note that I changed subprocess.call to subprocess.check_output so that the call will return the output of the shell function, instead of its exit code.
It's a little awkward fixing the path to the script file with the function definitions. You could instead just define the function directly before calling it, using a string literal:
output = subprocess.check_output(
    """my_func() { echo The argument is "$1"; }
my_func the_argument
""",
    shell = True)
Notes:
Unless you are using bash, but that probably won't work for os.system or subprocess.call(..., shell=True) because those will use the basic shell /bin/sh, which often is not bash. Even if you forced the use of bash, and you had properly exported the function definitions, it would still be a bad idea because your python script would only work if the environment were set up correctly.
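That said, if you do need bash specifically, you can invoke it explicitly with an argument list instead of relying on shell=True and /bin/sh. A minimal sketch (the function name and body are made up):

```python
import subprocess

# Run bash explicitly so /bin/sh compatibility is not a concern:
# define a shell function and call it inside one bash -c invocation.
out = subprocess.check_output(
    ["bash", "-c", 'greet() { echo "hello $1"; }; greet world'],
    text=True,
)
print(out.strip())  # hello world
```

In a real script you would replace the inline definition with `source /path/to/definitions; the_function ...`, as in the answer above.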
There are a couple of ways to do this; I'm posting the one I am familiar with.
with open(r"*file location", 'wb', 0) as file:
    subprocess.check_call(*command*, stdout=file)
Now the output is in the text file at that location. I used check_call so the command is validated (it raises an error on a nonzero exit code); subprocess.call() just executes the command and returns the exit code.
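Here is a self-contained version of that pattern; the file path and the command are placeholders (a temp file and a trivial Python child process):

```python
import os
import subprocess
import sys
import tempfile

# Redirect a command's stdout into a file by passing an open file
# object as stdout. check_call raises CalledProcessError on failure.
path = os.path.join(tempfile.mkdtemp(), "out.txt")
with open(path, "w") as f:
    subprocess.check_call([sys.executable, "-c", "print('done')"], stdout=f)

print(open(path).read().strip())  # done
```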

Run commands sequential from Python

I'm trying to build a LaTeX document using Python but am having problems getting the commands to run in sequence. For those familiar with LaTeX, you'll know that you usually have to run four commands, each completing before running the next, e.g.
pdflatex file
bibtex file
pdflatex file
pdflatex file
In Python, I'm therefore doing this to define the commands
commands = ['pdflatex','bibtex','pdflatex','pdflatex']
commands = [(element + ' ' + src_file) for element in commands]
but the problem is then running them.
I've tried to suss things out from this thread – e.g. using os.system() in a loop, subprocess stuff like map(call, commands) or Popen, and collapsing the list to a single string separated by & – but it seems like the commands all run as separate processes, without waiting for the previous one to complete.
For the record, I'm on Windows but would like a cross-platform solution.
EDIT
The problem was a bug in specifying the src_file variable; it shouldn't include the ".tex" extension. The following code now works:
test.py
import subprocess
commands = ['pdflatex','bibtex','pdflatex','pdflatex']
for command in commands:
    subprocess.call((command, 'test'))
test.tex
\documentclass{article}
\usepackage{natbib}
\begin{document}
This is a test \citep{Body2000}.
\bibliographystyle{plainnat}
\bibliography{refs}
\end{document}
refs.bib
@book{Body2000,
  author={N.E. Body},
  title={Introductory Widgets},
  publisher={Widgets International},
  year={2000}
}
os.system shouldn't cause this, but subprocess.Popen would: Popen returns immediately without waiting for the child process to finish.
I think subprocess.call, which does wait for each command to complete, is the best choice:
commands = ['pdflatex','bibtex','pdflatex','pdflatex']
for command in commands:
    subprocess.call((command, src_file))
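A sketch of that loop with stand-in commands in place of pdflatex/bibtex; using check_call instead of call additionally raises CalledProcessError if a step fails, so a broken compile stops the pipeline instead of running the remaining passes:

```python
import subprocess
import sys

# subprocess.call/check_call block until each child exits, so the
# commands run strictly one after another, never in parallel.
steps = [
    [sys.executable, "-c", "print('pass 1')"],
    [sys.executable, "-c", "print('pass 2')"],
]
for step in steps:
    subprocess.check_call(step)  # raises on a nonzero exit code
```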
