Python subprocess.call() apparently not working with psexec

I am having an issue executing remote processes with subprocess.call() and psexec. I am using the following syntax for executing the process remotely with subprocess.call():
def execute(hosts):
    ''' Using psexec, execute the script on the list of hosts '''
    successes = []
    wd = r'c:\\'
    file = r'c:\\script.exe'
    for host in hosts:
        res = subprocess.call(shlex.split(r'psexec \\%s -e -s -d -w %s %s ' % (host,wd,file)), stdin=None, stdout=None, stderr=None)
        if res == 0:
            successes.append(host)
        else:
            logging.warning("Error executing script on host %s with error code %d" % (host, res))
    print shlex.split(r'psexec \\%s -e -s -d -w %s %s ' % (hosts[0],wd,file))
    return successes
As you can see, as part of my troubleshooting, I am printing the shlex.split() output to ensure that it is what I want. This print statement gives:
['psexec', '\\HOSTNAME', '-e', '-s', '-d', '-w', 'c:\\', 'c:\\script.exe']
Which is what I would expect. Unfortunately, when I run it, I get an error saying:
PsExec could not start \GN-WRK-02:
The system cannot find the file specified.
Directly after this, I run the psexec command with the exact syntax that the program should be running it with (judging by the shlex.split() output) and it works completely fine. My syntax is:
psexec \\HOSTNAME -e -s -d -w c:\\ c:\\script.exe
Any ideas why this wouldn't be working? If it matters, the execute function is being called through multiprocessing's map() function on two or three host lists.
Any help would be great! Thanks!

Your \\ double backslash in front of the hostname is really just one backslash; it is doubled in the output because the repr escapes the backslash.
You can see this in the shlex.split() output:
['psexec', '\\HOSTNAME', '-e', '-s', '-d', '-w', 'c:\\', 'c:\\script.exe']
Note that the \\ before the hostname is shown the same way as the doubled backslash in the c:\\ filename value. If you print just that value, you see that the backslash at the start is really one character:
>>> print '\\HOSTNAME'
\HOSTNAME
>>> '\\HOSTNAME'[0]
'\\'
>>> '\\HOSTNAME'[1]
'H'
That's because shlex.split() is a POSIX tool, not a Windows tool, and it interprets the \\ in the raw string as an escape sequence too; if you are using that tool, double the backslashes again:
shlex.split(r'psexec \\\\%s -e -s -d -w %s %s ' % (host,wd,file))
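For instance, the four backslashes in the raw string collapse back to two (shown doubled once more by the repr):
>>> shlex.split(r'psexec \\\\HOSTNAME -e')
['psexec', '\\\\HOSTNAME', '-e']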
An alternative may be to disable POSIX mode, but I am not entirely certain how that would interplay with Windows:
shlex.split(r'psexec \\%s -e -s -d -w %s %s ' % (host,wd,file), posix=False)
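Simpler still, you can skip shlex.split() entirely and build the argument list yourself; subprocess.call() accepts a list directly, so no extra layer of backslash escaping is involved. A minimal sketch of the loop body, reusing the question's variables:
import subprocess

# Build argv directly; the raw string keeps both leading backslashes intact.
args = ['psexec', r'\\%s' % host, '-e', '-s', '-d', '-w', wd, file]
res = subprocess.call(args)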

Related

Passing variables to a script over ssh using gcloud command -- all variables treated as a single string?

I'm trying to set up a system to run some commands on VMs in Google Cloud; in my case we want to run a tcpdump at a certain time using the 'at' command. Right now I'm just trying to execute any command successfully. When I have to pass arguments along with the command, I get confusing behaviour: the command and the arguments appear to be executed as a single long command instead of as separate arguments.
I first tried in bash, and thinking my issue was one of quoting, I moved to using python to hopefully make things easier to understand, but I appear to be hitting the same issue and figure I must be doing something wrong.
I have the following functions defined in Python, which I then call:
def execute(cmd):
    popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, universal_newlines=True)
    for stdout_line in iter(popen.stdout.readline, ""):
        yield stdout_line
    popen.stdout.close()
    return_code = popen.wait()
    if return_code:
        raise subprocess.CalledProcessError(return_code, cmd)
def runCapture(project, instance, zone, time, duration):
    ## Run capture against server
    print ("Running capture against Project: " + project + ", Instance: " + instance + ", Zone: " + zone, "at: " + time, "for " + str(duration) + " minutes")
    ## First connect, schedule capture
    ## Connect again, schedule upload of capture at capture time + duration time + some overrun.
    ## gcloud compute ssh --project=${PROJECT} ${INSTANCE} --zone="${ZONE}" --command="...do stuff..." --tunnel-through-iap
    ## CMD=\${1:-"/usr/sbin/tcpdump -nn -i ens4 -G \$(( ${DURATION}*60 )) -W 1 -w ./\$(uname -n)-%Y-%m-%d_%H.%M.%S.pcap"}
    total_time = str(duration*60)
    command = "/bin/bash -c 'echo \"hello world\"'"
    for path in execute(["/usr/bin/gcloud", "compute", "ssh", instance, "--project="+project, "--zone="+zone, "--tunnel-through-iap", "--command=\""+command+"\"", ]):
        print(path, end="")
The resulting errors are as follows:
bash: /bin/bash -c 'echo hello: No such file or directory
Traceback (most recent call last):
  File "./ingressCapture.py", line 79, in <module>
    results = runCapture(project, instance, zone, time, duration)
  File "./ingressCapture.py", line 33, in runCapture
    for path in execute(["/usr/bin/gcloud", "compute", "ssh", instance, "--project="+project, "--zone="+zone, "--tunnel-through-iap", "--command=\""+command+"\"", ]):
  File "./ingressCapture.py", line 17, in execute
    raise subprocess.CalledProcessError(return_code, cmd)
subprocess.CalledProcessError: Command '['/usr/bin/gcloud', 'compute', 'ssh', 'tbtst-test3-app-egress-nztw', '--project=devops-tb-sandbox-250222', '--zone=europe-west1-b', '--tunnel-through-iap', '--command="/bin/bash -c \'echo "hello world"\'"']' returned non-zero exit status 127.
It appears to me that instead of invoking the bash shell and running the echo command, it is invoking a single command whose name includes the bash shell and all the arguments too. I get a bash shell when I log in normally via SSH, and can run the commands manually (and they work). Why is the argument from --command="....." being treated like this, and how do I prevent it?
I'm pretty sure your problem is that you have too many quotes.
When you write --command="bash -c 'echo \"Hello World\"'" on the command line, the shell internally marks all the stuff inside the quotes as being in a quoted state and then removes the quotes. The actual argument that ends up going to the program is --command=bash -c 'echo "Hello World"' as a single string in argv (or your language's equivalent).
Try putting import sys ; print(sys.argv[1]) inside a small python script and calling it with ./test.py --command="bash -c 'echo \"Hello World\"'" to see for yourself.
However, in your arglist to subprocess, you're forming this string: --command="/bin/bash -c 'echo "hello world"'", presumably because you thought you needed to match what you'd normally type on the command line. You can see this in the stack trace (the escaped single quotes there are just Python's repr of the string, not characters in the data). Since Python does not perform quote removal, those quotes go through to the other side of your ssh connection, where the login shell reparses them as a shell command. The first "word" on the other end of the connection is /bin/bash -c 'echo hello because of those extra quotes, so the shell attempts to find a command with that name on the path, and it clearly doesn't exist.
What you need to put into your arglist for subprocess is simply "--command="+command.
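In other words, the call would look like this (a sketch reusing the question's variables; the only change is dropping the extra double quotes):
command = "/bin/bash -c 'echo \"hello world\"'"
for path in execute(["/usr/bin/gcloud", "compute", "ssh", instance,
                     "--project=" + project, "--zone=" + zone,
                     "--tunnel-through-iap", "--command=" + command]):
    print(path, end="")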

ssh + here-document syntax with Python

I'm trying to run a set of commands through ssh from a Python script. I came upon the here-document concept and thought: cool, let me implement something like this:
command = ( ( 'ssh user@host /usr/bin/bash <<EOF\n'
            + 'cd %s \n'
            + 'qsub %s\n'
            + 'EOF' ) % (test_dir, jobfile) )
try:
    p = subprocess.Popen( command.split(), stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
except :
    print ('from subprocess.Popen( %s )' % command.split() )
    raise Exception
#endtry
Unfortunately, here is what I get:
bash: warning: here-document at line 0 delimited by end-of-file (wanted `EOF')
Not sure how I can code up that end-of-file statement (I'm guessing the newline chars get in the way here?)
I've done a search on the website but there seem to be no Python examples of this sort...
Here is a minimal working example; the key is that everything after << EOF must remain a single argument and not be split. Note that command.split() is only called on the ssh part, before the here-document is appended.
import subprocess

# My bash is at /usr/local/bin/bash; your mileage may vary.
command = 'ssh user@host /usr/local/bin/bash'
heredoc = ('<< EOF \n'
           'cd Downloads \n'
           'touch test.txt \n'
           'EOF')
command = command.split()
command.append(heredoc)
print command
try:
    p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
except Exception as e:
    print e
Verify by checking that the created file test.txt shows up in the Downloads directory on the host you ssh'd into.
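As an alternative sketch (not required for the fix above), you can skip the here-document syntax altogether and feed the commands to the remote shell through stdin, which is all a here-document does anyway:
import subprocess

# Feed the script over stdin instead of using a here-document.
# (Python 2 shown to match the answer; on Python 3 pass universal_newlines=True.)
p = subprocess.Popen(['ssh', 'user@host', '/usr/local/bin/bash'],
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
out, _ = p.communicate('cd Downloads\ntouch test.txt\n')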

output the command line called by subprocess?

I'm using the subprocess.Popen call, and in another question I found out that I had been misunderstanding how Python was generating arguments for the command line.
My Question
Is there a way to find out what the actual command line was?
Example code:
proc = subprocess.Popen(....)
print "the commandline is %s" % proc.getCommandLine()
How would you write getCommandLine?
It depends on the version of Python you are using. In Python 3.3+, the args are saved in proc.args:
proc = subprocess.Popen(....)
print("the commandline is {}".format(proc.args))
In Python 2.7, the args are not saved; they are just passed on to internal functions like _execute_child. So, in that case, the best way to get the command line is to save it when you have it:
proc = subprocess.Popen(shlex.split(cmd))
print "the commandline is %s" % cmd
Note that if you have the list of arguments (such as the list returned by shlex.split(cmd)), then you can recover the command-line string cmd using the undocumented function subprocess.list2cmdline:
In [14]: import subprocess
In [15]: import shlex
In [16]: cmd = 'foo -a -b --bar baz'
In [17]: shlex.split(cmd)
Out[17]: ['foo', '-a', '-b', '--bar', 'baz']
In [18]: subprocess.list2cmdline(['foo', '-a', '-b', '--bar', 'baz'])
Out[18]: 'foo -a -b --bar baz'
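Note that list2cmdline quotes by Windows (MS C runtime) rules. On POSIX systems a closer round-trip is to quote each argument with shlex.quote() (Python 3.3+; pipes.quote() on Python 2). A hypothetical continuation of the session above:
In [20]: ' '.join(shlex.quote(a) for a in ['foo', '-a', '--bar', 'baz qux'])
Out[20]: "foo -a --bar 'baz qux'"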
The correct answer to my question is actually that there IS no command line: subprocess passes the argument list to the new process directly, rather than building a shell command line. list2cmdline gets as close as can be expected, but in reality the best thing to do is look at the "args" list, and just know that it will be argv in the called program.
Beautiful and scalable method
I have been using something like this:
#!/usr/bin/env python3
import os
import shlex
import subprocess
import sys
def run_cmd(cmd, cwd=None, extra_env=None, extra_paths=None, dry_run=False):
    if extra_env is None:
        extra_env = {}
    newline_separator = ' \\\n'
    out = []
    kwargs = {}
    env = os.environ.copy()
    # cwd
    if cwd is not None:
        kwargs['cwd'] = cwd
    # extra_env
    env.update(extra_env)
    for key in extra_env:
        out.append('{}={}'.format(shlex.quote(key), shlex.quote(extra_env[key])) + newline_separator)
    # extra_paths
    if extra_paths is not None:
        path = ':'.join(extra_paths)
        if 'PATH' in env:
            path += ':' + env['PATH']
        env['PATH'] = path
        out.append('PATH="{}:${{PATH}}"'.format(':'.join(extra_paths)) + newline_separator)
    # Command itself.
    for arg in cmd:
        out.append(shlex.quote(arg) + newline_separator)
    # Print and run.
    kwargs['env'] = env
    print('+ ' + ' '.join(out) + ';')
    if not dry_run:
        subprocess.check_call(cmd, **kwargs)

run_cmd(
    sys.argv[1:],
    cwd='/bin',
    extra_env={'ASDF': 'QW ER'},
    extra_paths=['/some/path1', '/some/path2']
)
Sample run:
./a.py echo 'a b' 'c d'
Output:
+ ASDF='QW ER' \
PATH="/some/path1:/some/path2:${PATH}" \
echo \
'a b' \
'c d' \
;
a b c d
Feature summary:
makes huge command lines readable, with one option per line
adds a + before commands, as sh -x does, so users can easily differentiate commands from their output
shows cd and extra environment variables if they were given to the command; these are only printed when present, generating a minimal shell command
All of this allows users to easily copy the commands manually to run them if something fails, or to see what is going on.
Tested on Python 3.5.2, Ubuntu 16.04. GitHub upstream.
You can see it by passing the process id to the ps command, if you are on a POSIX OS:
import subprocess
proc = subprocess.Popen(["ls", "-la"])
subprocess.Popen(["ps", "-p", str(proc.pid)])
Output (see the CMD column):
PID TTY TIME CMD
7778 ttys004 0:00.01 ls -la
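On Linux specifically, you can also read the child's /proc/<pid>/cmdline, where the arguments are stored separated by NUL bytes (a small sketch):
import subprocess

proc = subprocess.Popen(["sleep", "5"])
# /proc/<pid>/cmdline holds argv joined by NUL bytes, with a trailing NUL.
with open("/proc/%d/cmdline" % proc.pid) as f:
    print(f.read().rstrip("\0").split("\0"))  # ['sleep', '5']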
On Windows, I used @catwith's trick (thanks, btw):
wmic process where "name like '%mycmd%'" get processid,commandline
where "mycmd" is a part of the cmd unique to your command (used to filter irrelevant system commands)
That's how I revealed another bug in the subprocess-vs-Windows saga. One of the arguments I had had its double quotes escaped à la Unix! \"asdasd\"

calling rsync from python subprocess.call

I'm trying to execute rsync over ssh from a subprocess in a python script to copy images from one server to another. I have a function defined as:
def rsyncBookContent(bookIds, serverEnv):
    bookPaths = ""
    if len(bookIds) > 1:
        bookPaths = "{" + ",".join(("book_"+str(x)) for x in bookIds) + "}"
    else:
        bookPaths = "book_" + str(bookIds[0])
    for host in serverEnv['content.hosts']:
        args = ["rsync", "-avz", "--include='*/'", "--include='*.jpg'", "--exclude='*'", "-e", "ssh", options.bookDestDir + "/" + bookPaths, "jill@" + host + ":/home/jill/web/public/static/"]
        print "executing " + ' '.join(args)
        subprocess.call(args)
What I'm ultimately trying to do is have Python execute this (which works from a bash shell):
rsync -avz --include='*/' --include='*.jpg' --exclude='*' -e ssh /shared/books/{book_482,book_347} jill@10.12.27.20:/home/jill/web/public/static/
And indeed my print statement outputs:
executing rsync -avz --include='*/' --include='*.jpg' --exclude='*' -e ssh /shared/books/{book_482,book_347} jill@10.12.27.20:/home/jill/web/public/static/
But when executed from within this python script, there are two problems:
if len(bookIds) > 1, the list of sub-directories under /shared/books/ is somehow misinterpreted by bash or rsync. The error message is:
rsync: link_stat "/shared/books/{book_482,book_347}" failed: No such file or directory (2))
if len(bookIds) == 1, all files under the source directory are rsynced (not just *.jpg, as is my intention)
Seems as if the subprocess.call function requires some characters to be escaped or something, no?
Figured out my issues. They were the result of my misunderstanding of how the subprocess.call function executes and of bash's expansion of lists inside curly braces.
When I was issuing the rsync command in a bash shell with subdirectories in curly braces, bash was really expanding that into multiple arguments which were being passed to rsync (/shared/books/book_1 /shared/books/book_2, etc.). When passing the same string with curly braces "/shared/books/{book_1, book_2}" to the subprocess.call function, the expansion wasn't happening, since it wasn't going through bash, so my argument to rsync was really "/shared/books/{book_1, book_2}".
Similarly, the single quotes around the file patterns ('*', '*.jpg', etc.) work on the bash command line (only the values inside the single quotes are passed to rsync), but inside subprocess.call, the single quotes are passed to rsync as the file pattern ("'*.jpg'").
New (working) code looks like this:
def rsyncBookContent(bookIds, serverEnv):
    bookPaths = []
    for b in bookIds:
        bookPaths.append(options.bookDestDir + "/book_" + str(b))
    args = []
    for host in serverEnv['content.hosts']:
        # copy all *.jpg files via ssh
        args = ["rsync", "-avz", "--include", "*/", "--include", "*.jpg", "--exclude", "*", "-e", "ssh"]
        args.extend(bookPaths)
        args.append("jill@" + host + ":/home/jill/web/public/static/")
        print "executing " + ' '.join(args)
        subprocess.call(args)
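The underlying rule: an argument list bypasses the shell entirely, so shell-only features (brace expansion, quote removal, globbing) simply never happen. A quick way to convince yourself:
import subprocess

# Each element reaches the child program verbatim; no shell rewrites it.
subprocess.call(["echo", "{book_482,book_347}"])  # prints: {book_482,book_347}
subprocess.call(["echo", "'*.jpg'"])              # prints: '*.jpg' (quotes and all)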

python sub-process

I usually execute a Fortran file in Linux (manually) as:
Connect to the server
Go to the specific folder
ifort xxx.for -o xxx && ./xxx (where 'xxx.for' is my Fortran file and 'xxx' is the resulting executable)
But I need to call my Fortran file (xxx.for) from Python (I'm a beginner), so I used subprocess with the following command:
cmd = ["ssh", sshConnect, "cd %s;"%(workDir), Fortrancmd %s jobname "%s -o %s" exeFilename "%s && %s ./ %s%s"%(exeFilename)]
But I get an error, and I'm not sure what's wrong. Here's the full code:
import string
import subprocess as subProc
from subprocess import Popen as ProcOpen
from subprocess import PIPE
import numpy
import subprocess
userID = "pear"
serverName = "say4"
workDir = "/home/pear/2/W/fortran/"
Fortrancmd = "ifort"
jobname = "rad.for"
exeFilename = "rad"
sshConnect = userID + "@" + serverName
cmd=["ssh", sshConnect, "cd %s;"%(workDir), Fortrancmd %s jobname "%s -o %s" exeFilename "%s && %s ./ %s%s"%(exeFilename)]
# command to execute Fortran files in Linux:
# ifort <filename>.for -o <filename> && ./<filename> (press enter)
# example: ifort xxx.for -o xxx && ./xxx (press enter)
print cmd
How can I write a python program that performs all 3 steps described above and avoids the error I'm getting?
there are some syntax errors...
original:
cmd=["ssh", sshConnect, "cd %s;"%(workDir), Fortrancmd %s jobname "%s -o %s" exeFilename "%s && %s ./ %s%s"%(exeFilename)]
I think you mean:
cmd = [
    "ssh",
    sshConnect,
    "cd %s;" % (workDir,),
    "%s %s -o %s && ./%s" % (Fortrancmd, jobname, exeFilename, exeFilename)
]
A few notes:
a tuple with one element requires a trailing comma, as in (workDir,), to be interpreted as a tuple (vs. simple order-of-operations parens)
it is probably easier to construct your Fortran command with a single string-format operation
PS - For readability it is often a good idea to break long lists into multiple lines :)
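Putting it together (a sketch; ssh joins the remote arguments with spaces before the remote login shell re-parses them):
import subprocess

# The remote shell ends up seeing:
#   cd /home/pear/2/W/fortran/; ifort rad.for -o rad && ./rad
subprocess.call(cmd)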
my advice
I would recommend looking at this stackoverflow thread for ssh instead of using subprocess
For the manual part you may want to look into pexpect or, for Windows, wexpect. These allow you to run subprocesses and pass them input under interactive conditions.
However most of what you're doing sounds like it would work well in a shell script. For simplicity, you could make a shell script on the server side for your server side operations, and then plug in the path in the ssh statement:
ssh user@host "/path/to/script.sh"
one error:
you have an unquoted %s in your list of args, so your string formatting will fail.
Here is a complete example of using the subprocess module to run a remote command via ssh (a simple echo in this case) and grab the results, hope it helps:
>>> import subprocess
>>> proc = subprocess.Popen(("ssh", "remoteuser@host", "echo", "1"), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
>>> stdout, stderr = proc.communicate()
Which in this case returns: ('1\n', '')
Note that to get this to work without requiring a password you will likely have to add your local user's public key to ~remoteuser/.ssh/authorized_keys on the remote machine.
You could use fabric for steps 1 and 2.
This is the basic idea:
from fabric.api import *

env.hosts = ['host']
dir = '/home/...'

def compile(file):
    with cd(dir):
        run("ifort %s.for -o %s" % (file, file))
        run("./%s > stdout.txt" % file)
Create a fabfile.py with the above, then run it with fab compile:filename.
do you have to use python?
ssh user@host "command"
