Python call to system program not working

I wrote a little Python script, intending to automate non-default options for gcc (on Kubuntu 14.04). The Python runs without error now, and inserting a debug print statement (or changing the system command to 'echo') verifies that the correct information is being passed, but I get an error from gcc:
$ python gccm prog16
gcc: fatal error: no input files
compilation terminated.
Here's the script I wrote:
#!/usr/bin/python
from sys import argv   # get incoming argument
import subprocess      # to call an OS program

script, target = argv

# massage received argument into the form needed for math.h linkage
target = "-o " + target + " " + target + ".c -lm"
subprocess.call(['gcc', target], shell=False)
There are other additions I'd make to the gcc call (compile-version options, stricter code checking, etc.) once I get this working correctly. Based on the error message, gcc is being invoked, but the target source file isn't being found. Could this not be running in the directory from which I invoke it? If so, how can I get it to run from the correct directory (where I keep my C source files)? If not, what else could cause this?

If you're using shell=False, your arguments to the subprocess shouldn't be concatenated into one string. Instead, each should be its own element in the args list:
subprocess.call(['gcc', '-o', target, target+'.c', '-lm'], shell=False)
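If you'd rather keep building the command as one string, the standard-library shlex module can split it into the argv list that subprocess expects with shell=False. A minimal sketch, using the prog16 target from the question:

```python
import shlex

# the single string the original script was building up
target = "prog16"
cmd = "gcc -o " + target + " " + target + ".c -lm"

# shlex.split breaks it into separate argv elements, honoring shell-style quoting
args = shlex.split(cmd)
print(args)  # ['gcc', '-o', 'prog16', 'prog16.c', '-lm']

# subprocess.call(args, shell=False) would then invoke gcc correctly
```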
On a related note, any reason why you're writing something like this yourself? If you're looking to use a Python-based build system, have a look at SCons.

If you have shell=False, then you must pass each argument as a separate list element to subprocess.call.
Try this instead:
subprocess.call (['gcc', '-o', target, target + '.c', '-lm'], shell=False)

Execute batch file in different directory

I have a file structure like the following (on Windows):
D:\
  dir_1\
    batch_1.bat
    dir_1a\
      batch_2.bat
  dir_2\
    main.py
For the sake of this question, batch_1.bat simply calls batch_2.bat, and looks like:
cd dir_1a
start batch_2.bat %*
Opening batch_1.bat from a command prompt indeed opens batch_2.bat as it's supposed to, and from there on, everything is golden.
Now I want my Python file, D:\dir_2\main.py, to spawn a new process which starts batch_1.bat, which in turn should start batch_2.bat. So I figured the following Python code should work:
import subprocess
subprocess.Popen(['cd "D:/dir_1"', "start batch_1.bat"], shell=True)
This results in "The system cannot find the path specified" being printed to my Python console. (No error is raised, of course.) This is due to the first command. I get the same result even if I cut it down to:
subprocess.Popen(['cd "D:/"'], shell=True)
I also tried starting the batch file directly, like so:
subprocess.Popen("start D:/dir_1/batch_1.bat", shell=True)
For reasons that I don't entirely get, this seems to just open a Windows command prompt, in dir_2.
If I forego the start part of this command, then my Python process is going to end up waiting for batch_1 to finish, which I don't want. But it does get a little further:
subprocess.Popen("D:/dir_1/batch_1.bat", shell=True)
This results in batch_1.bat successfully executing... in dir_2, the directory of the Python script, rather than the directory of batch_1.bat, which results in it not being able to find dir_1a\ and hence, batch_2.bat is not executed at all.
I am left highly confused. What am I doing wrong, and what should I be doing instead?
Your question is answered here: Python specify popen working directory via argument
In a nutshell, just pass an optional cwd argument to Popen:
subprocess.Popen(["batch_1.bat"], shell=True, cwd=r'd:\<your path>\dir1')
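If you'd rather not hard-code the directory, it can be derived from the batch file's own path. A small sketch using ntpath (the Windows flavour of os.path, so it behaves the same on any platform); the path below is taken from the question's layout:

```python
import ntpath

# path matching the question's layout
batch = r"D:\dir_1\batch_1.bat"

workdir = ntpath.dirname(batch)   # directory to pass as cwd
name = ntpath.basename(batch)     # bare script name to run
print(workdir, name)  # D:\dir_1 batch_1.bat

# subprocess.Popen([name], shell=True, cwd=workdir) would then start the
# batch file with dir_1 as its working directory, so "cd dir_1a" resolves
```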

Python function subprocess.check_output returns CalledProcessError: command returns non-zero exit status

As a follow-up to my previous question, which got solved quickly, I'm running the Python code below in WinPython:
import os, subprocess
os.chdir("C:/Users/Mohammad/Google Drive/PhD/Spyder workspace/production-consumption/logtool-examples/")
logtoolDir="C:/Users/Mohammad/Google Drive/PhD/Spyder workspace/production-consumption/logtool-examples/ "
#processEnv = {'JAVA_HOME': 'C:/Program Files/Java/jdk1.8.0_66/'}
args = r'"org.powertac.logtool.example.ProductionConsumption D:/PowerTAC/Logs/2015/log/powertac-sim-1.state testrunoutput.data"'
subprocess.check_output(['mvn', 'exec:exec', '-Dexec.args=' + args],
                        shell=True, cwd=logtoolDir)
And get the following error:
CalledProcessError: Command '['mvn', 'exec:exec', '-Dexec.args="org.powertac.logtool.example.ProductionConsumption D:/PowerTAC/Logs/2015/log/powertac-sim-1.state testrunoutput.data"']' returned non-zero exit status 1
The Apache Maven executable does not seem to run. My guess is that the arguments are being passed on to the program incorrectly. I couldn't find any typos in the args or the logtoolDir arguments, but maybe I'm missing something there? Any ideas?
UPDATE:
The mvn exec:exec was not running because check_output has somehow been unable to access Windows' environment variables. I added the Path variable to processEnv, and now 'mvn', '--version' in the check_output args confirms that Maven runs. The code still doesn't run, but I imagine it's probably an issue with how I've defined the directories.
Cheers.
Problem solved. Basically: a) subprocess.check_output could not read Windows' environment variables (e.g. PATH, JAVA_HOME), so I had to redefine the ones I was using in processEnv and pass them along in the function's arguments. Also, b) the args variable was defined incorrectly: I needed to remove one set of quotation marks and make the string raw with the r prefix.
The corrected code:
logtoolDir = 'C:/Users/Mohammad/Google Drive/PhD/Spyder workspace/production-consumption/logtool-examples/'
processEnv = {'JAVA_HOME': 'C:/Program Files/Java/jdk1.8.0_66/jre/',
              'Path': 'C:/Program Files/apache-maven-3.3.3/bin/'}
args = r"org.powertac.logtool.example.ProductionConsumption D:/PowerTAC/Logs/2015/log/powertac-sim-1.state testrunoutput2.data"
print(subprocess.check_output(['mvn', 'exec:exec', '-Dexec.args=' + args],
                              shell=True, env=processEnv, cwd=logtoolDir))
Unfortunately I can't find a way to avoid the shell=True argument, but that probably won't be a problem since this will only be used for data analysis.
Cheers.
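Rather than rebuilding a minimal environment by hand (which is what hid PATH from mvn in the first place), another option is to copy the parent environment and override only the entries the build needs. A sketch, reusing the same paths as above:

```python
import os

# start from a copy of the parent environment so nothing is lost
processEnv = os.environ.copy()

# override/add just the variables the Maven build needs
processEnv['JAVA_HOME'] = 'C:/Program Files/Java/jdk1.8.0_66/jre/'
processEnv['Path'] = ('C:/Program Files/apache-maven-3.3.3/bin/;'
                      + processEnv.get('Path', ''))

# then pass it along exactly as before:
# subprocess.check_output([...], shell=True, env=processEnv, cwd=logtoolDir)
```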

Python rsync error in reading remote root-level files

I am trying to set up a cron job to rsync remote files (including root-level files) to my local server. If I run the command in a shell, it works, but if I run it from Python, I get a strange 'command not found' error.
This works when run in a shell:
rsync -ave ssh --rsync-path='sudo rsync' --delete root@192.168.1.100:/tmp/test2 ./test
But this Python script doesn't:
#!/usr/bin/python
from subprocess import call
....
for src_dir in backup_list:
    call(["rsync", "-ave", "ssh", "--rsync-path='sudo rsync'", "--delete", src_host + src_dir, dst_dir])
It fails with:
local server:$ backup.py
bash: sudo rsync: command not found
rsync: connection unexpectedly closed (0 bytes received so far) [Receiver]
rsync error: remote command not found (code 127) at io.c(226) [Receiver=3.1.0]
...
It is most likely a spacing error or something small; the way I debug commands is to print them out first. os.system is an easier alternative, although subprocess is better. I am not at my computer to test it, but you can either set up your subprocess call like that, or use this example. This assumes you're on Linux or Mac.
import os

# note: -e takes ssh as its argument, so it can't be folded into -ave --delete
cmd = ('rsync -av -e ssh --delete ' + str(src_host) + str(src_dir)
       + ' ' + str(dst_dir))   # variable you can inspect or reuse
print cmd        # check the assembled command first
os.system(cmd)   # actually performs the command
Quotes around an argument with spaces like you have in "--rsync-path='sudo rsync'" are needed when the shell splits up a long string into arguments, to avoid treating rsync as a separate argument. In your call(), you're providing the individual arguments, so that splitting of a string into arguments is not performed. With your code as-is, the quotes end up as part of the argument passed to rsync. Just drop them. Here's a working example of the list passed to a call() for a very similar rsync invocation:
['rsync',
 '-arvz',
 '--delete',
 '-e',
 'ssh',
 '--rsync-path=sudo rsync',
 '192.168.0.17:/remote/directory/',
 '/local/directory/']
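The difference is easy to see with shlex, which splits a string the way a POSIX shell would: the quotes are consumed by the split, so they never reach rsync. (The src/ and dst/ paths below are placeholders for the demonstration.)

```python
import shlex

# what the shell hands rsync: the quotes around 'sudo rsync' are consumed,
# leaving one clean argument
shell_args = shlex.split("rsync --rsync-path='sudo rsync' src/ dst/")
print(shell_args)  # ['rsync', '--rsync-path=sudo rsync', 'src/', 'dst/']

# what call() hands rsync if you write the quotes inside the list yourself:
# they survive verbatim, and the remote end tries to run "'sudo rsync'"
manual_args = ["rsync", "--rsync-path='sudo rsync'", "src/", "dst/"]
print(manual_args[1])  # --rsync-path='sudo rsync'
```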
I have been facing the same issue. This piece of code works for me: join the command into a single string when passing it to call or Popen, and add shell=True.
from subprocess import call

for src_dir in backup_list:
    call(" ".join(["rsync", "-ave", "ssh", "--rsync-path='sudo rsync'", "--delete", src_host + src_dir, dst_dir]), shell=True)

SSH: A javac command that works in terminal doesn't work when executed over SSH

I'm using Python code to run a Hadoop program on a Linux (Cloudera) machine using SSH.
I'm having some trouble with compiling java files to class files. When I'm executing the command:
javac -cp /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/* remote_hadoop/javasrc/* from the Linux terminal, all the files get compiled successfully.
When I'm executing the same command through my Python SSH client I'm receiving an 'invalid flag' error:
spur.results.RunProcessError: return code: 2
output: b''
stderr output: b'javac: invalid flag: remote_hadoop/javasrc\nUsage: javac <options> <source files>\nuse -help for a list of possible options\n'
The python code:
list_of_commands = ["javac", "-cp", r"/usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/*", input_folder + r"/*"]
print ' '.join(list_of_commands)
self.shell.run(list_of_commands)
The command is getting rendered correctly, since what is getting printed is javac -cp /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/* remote_hadoop/javasrc/*.
UPDATE: It's pretty weird. I can compile one file at a time over ssh, but not all of them. Seems like something happens to the "*" over ssh.
You're passing a list of arguments, not a list of commands. It's not even an accurate list of arguments.
If your underlying tool expects a list of arguments, then pass:
['sh', '-c', 'javac -cp /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/* remote_hadoop/javasrc/*']
If it expects a list of commands:
['javac -cp /usr/lib/hadoop/*:/usr/lib/hadoop/client-0.20/* remote_hadoop/javasrc/*']
If it expects something else -- read the documentation and determine what that something is!
Note that SSH doesn't provide a way to pass a literal argv array when running an arbitrary command; rather, it expects -- at the protocol level -- a string ready for parsing by the remote shell. If your self.shell.run code is doing shell quoting before joining the argument list given, then it would be passing the last argument as the literal string remote_hadoop/javasrc/* -- not expanding it into a list of filenames as a shell would.
Using the sh -c form forces the remote shell to perform expansion on its end, assuming that contents are being given to it in a form which doesn't have remote expansion performed already.
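The difference is easy to reproduce locally, with echo standing in for javac (the scratch files below are made up for the demonstration):

```python
import os
import subprocess
import tempfile

# scratch directory with two files for the glob to match
d = tempfile.mkdtemp()
for name in ("A.java", "B.java"):
    open(os.path.join(d, name), "w").close()

# glob passed as a literal argv element: no shell, no expansion
literal = subprocess.check_output(["echo", os.path.join(d, "*.java")])
print(literal)   # the raw "*.java" pattern, unexpanded

# glob handed to a shell via sh -c: expanded into both file names
expanded = subprocess.check_output(["sh", "-c", "echo " + d + "/*.java"])
print(expanded)  # .../A.java .../B.java
```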
The problem is the way that spur builds the command list into a command string. It takes every command token and encloses it in single quotes (["ls", "*.txt"] becomes 'ls' '*.txt'). There is no shell expansion of * inside single quotes, so the command doesn't work.
You can see the problem in spur's ssh.py on line 323:
def escape_sh(value):
    return "'" + value.replace("'", "'\\''") + "'"
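Running that helper over the example list shows exactly what the remote shell receives; the glob ends up quoted, so it is never expanded:

```python
def escape_sh(value):
    # spur's quoting, reproduced from its ssh.py: wrap each token in
    # single quotes, escaping any embedded single quotes
    return "'" + value.replace("'", "'\\''") + "'"

command = " ".join(escape_sh(token) for token in ["ls", "*.txt"])
print(command)  # 'ls' '*.txt'  -- the quoted glob will not match anything
```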
I don't use spur, but it looks like it just doesn't allow you to do such things. The problem with "simplifiers" like spur is that if they simplify in a way you don't want, you can't use them.

python subprocess module to call shell (Bash) script

I am trying to call a shell (Bash) script from Python. The script is in my /home/user/bin directory, with execute permission for group and user, i.e. -rwxr-xr--. I am using
subprocess.check_call(["/home/user/bin/script.sh %s %s" % (subj, -6)], shell=True)
and this generates an exit status 127. Adding stderr=subprocess.STDOUT to the command does nothing to elucidate. Here is the exact output:
CalledProcessError: Command
'['/home/.../bin/MNE_setup_source_space.sh kubi_td104 -6']'
returned non-zero exit status 127
I believe this might be a PATH related issue, is that correct? I don't know how to resolve this. If I am already passing in the absolute path to the executable how can there be a PATH issue?
Thanks in advance
Do not use shell=True. Do not pass arguments as part of argv[0]. Pass your argument vector as a vector -- which is to say, in Python, a list:
subprocess.check_call(["/home/user/bin/script.sh", str(subj), "-6"])
If you were going to use shell=True, you would do it like so:
subprocess.check_call("/home/user/bin/script.sh %s %s" % (subj,-6), shell=True)
...which is to say, you wouldn't use a list form at all.
To clarify why what you're currently trying is failing: because you're using shell=True, only the first list element is passed to the shell as the command string; the additional elements become extra arguments to that shell, which are only read or interpreted if the command string chooses to look at them (by referring to "$0", "$1", or the like).
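On a POSIX system this is easy to demonstrate: with shell=True, the extra list elements become the shell's $0, $1, ... and are otherwise ignored.

```python
import subprocess

# the first element is the command string; "a" and "b" become $0 and $1
out = subprocess.check_output(["echo $0 $1", "a", "b"], shell=True)
print(out)  # b'a b\n'

# the same call with plain echo shows the extras are simply dropped
out2 = subprocess.check_output(["echo hello", "a", "b"], shell=True)
print(out2)  # b'hello\n'
```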
shell=True is only needed in very rare circumstances where you need a shell to perform redirections or logic before starting the program you're trying to run, and comes with serious security concerns if any unvetted input is incorporated into the command being run. Do not use it unless you're very, very sure you need to.
