ssh + here-document syntax with Python

I'm trying to run a set of commands through ssh from a Python script. I came upon the here-document concept and thought: cool, let me implement something like this:
command = ( ( 'ssh user@host /usr/bin/bash <<EOF\n'
              + 'cd %s \n'
              + 'qsub %s\n'
              + 'EOF' ) % (test_dir, jobfile) )
try:
    p = subprocess.Popen( command.split(), stdout=subprocess.PIPE, stderr=subprocess.STDOUT )
except:
    print ('from subprocess.Popen( %s )' % command.split() )
    raise Exception
#endtry
Unfortunately, here is what I get:
bash: warning: here-document at line 0 delimited by end-of-file (wanted `EOF')
Not sure how I can code up that end-of-file statement (I'm guessing the newline chars get in the way here?)
I've done a search on the website but there seem to be no Python examples of this sort...

Here is a minimal working example. The key is that after << EOF the remaining string should not be split; note that command.split() is only called on the ssh part, and the whole heredoc is appended as a single argument.
import subprocess

# My bash is at /usr/local/bin/bash; your mileage may vary.
command = 'ssh user@host /usr/local/bin/bash'
heredoc = ('<< EOF \n'
           'cd Downloads \n'
           'touch test.txt \n'
           'EOF')
command = command.split()
command.append(heredoc)
print command
try:
    p = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
except Exception as e:
    print e
Verify by checking that the created file test.txt shows up in the Downloads directory on the host that you sshed into.
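A related sketch, for comparison: you can avoid the here-document quoting problem entirely by feeding the script to the remote shell over stdin. Here a local bash stands in for ssh user@host bash, which is an assumption for the sake of a runnable example:

```python
import subprocess

# A local `bash -s` stands in for `ssh user@host bash`; the script is sent
# over stdin instead of being embedded in the argument list as a heredoc.
script = "cd /tmp\necho hello from $PWD\n"
p = subprocess.Popen(["bash", "-s"], stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE, universal_newlines=True)
out, _ = p.communicate(script)
print(out)
```

With ssh the idea is identical: Popen(["ssh", "user@host", "bash", "-s"], ...) and write the commands to stdin.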

Related

Failed to execute command line argument from python script

I am trying to run a command line argument through python script. Script triggers the .exe but it throws an error as System.IO.IOException: The handle is invalid..
Following is my code:
import os, sys, os.path
from subprocess import call
import subprocess, shlex

def execute(cmd):
    """
    Purpose : To execute a command and return exit status
    """
    process = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    (result, error) = process.communicate()
    rc = process.wait()
    if rc != 0:
        print "Error: failed to execute command:", cmd
        print error
    return result

found_alf = r"C:\AniteSAS\ResultData\20170515\Run01\1733200515.alf"
filter_alvf = r"C:\Users\sshaique\Desktop\ALF\AniteLogFilter.alvf"
command = str(r'ALVConsole.exe -e -t -i ' + '\"'+found_alf+'\"' + ' --ffile ' + '\"'+filter_alvf+'\"')
print command
os.chdir(r'C:\Program Files\Anite\LogViewer\ALV2')
print os.getcwd()
print "This process detail: \n", execute(command)
Output is as follows:
ALVConsole.exe -e -t -i "C:\AniteSAS\ResultData\20170515\Run01\1733200515.alf" --ffile "C:\Users\sshaique\Desktop\ALF\AniteLogFilter.alvf"
C:\Program Files\Anite\LogViewer\ALV2
This process detail:
Error: failed to execute command: ALVConsole.exe -e -t -i "C:\AniteSAS\ResultData\20170515\Run01\1733200515.alf" --ffile "C:\Users\sshaique\Desktop\ALF\AniteLogFilter.alvf"
Unhandled Exception: System.IO.IOException: The handle is invalid.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.Console.GetBufferInfo(Boolean throwOnNoConsole, Boolean& succeeded)
at ALV.Console.CommandLineParametersHandler.ConsoleWriteLine(String message, Boolean isError)
at ALV.Console.CommandLineParametersHandler.InvokeActions()
at ALV.Console.Program.Main(String[] args)
When I copy the command line argument from the above output and run manually from cmd it works fine.
ALVConsole.exe -e -t -i "C:\AniteSAS\ResultData\20170515\Run01\1733200515.alf" --ffile "C:\Users\sshaique\Desktop\ALF\AniteLogFilter.alvf"
I am using Windows 7 and Python 2.7.13. Please suggest how to overcome this issue.
EDIT:
I have also tried to pass the command as a list s, as per the code below, but the issue remains the same.
command = str(r'ALVConsole.exe -e --csv -i ' + '\"'+found_alf+'\"' + ' --ffile ' + '\"'+filter_alvf+'\"')
s=shlex.split(command)
print s
print "This process detail: \n", execute(s)
Based on your error messages I think that this problem is with ALVConsole.exe, not your Python script.
When you redirect the output, ALVConsole.exe tries to do something to the console (like setting cursor position, or getting the size of the terminal) but fails like this.
Is there a flag to ALVConsole.exe that modifies the output to a machine-readable version? I wasn't able to find the documentation for this program.
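If no such flag exists, one hedged workaround (an assumption, not something verified against ALVConsole.exe) is to hand the child a real file handle instead of a pipe, since some console programs fail with "The handle is invalid" only when stdout is redirected to a pipe. A minimal sketch, with a python -c command standing in for ALVConsole.exe:

```python
import subprocess
import sys
import tempfile

# Redirect the child's output into a real temporary file rather than a pipe.
# The `python -c` invocation below is only a stand-in for ALVConsole.exe.
with tempfile.TemporaryFile() as out:
    rc = subprocess.call([sys.executable, "-c", "print('done')"],
                         stdout=out, stderr=subprocess.STDOUT)
    out.seek(0)
    print(rc, out.read())
```

If the program genuinely requires a console buffer, running it with shell=True and output redirected to a file (cmd + " > out.txt 2>&1") is another avenue to try.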

Python subprocess.call() apparently not working with psexec

I am having an issue executing remote processes with subprocess.call() and psexec. I am using the following syntax for executing the process remotely with subprocess.call():
def execute(hosts):
    ''' Using psexec, execute the script on the list of hosts '''
    successes = []
    wd = r'c:\\'
    file = r'c:\\script.exe'
    for host in hosts:
        res = subprocess.call(shlex.split(r'psexec \\%s -e -s -d -w %s %s ' % (host, wd, file)), stdin=None, stdout=None, stderr=None)
        if res == 0:
            successes.append(host)
        else:
            logging.warning("Error executing script on host %s with error code %d" % (host, res))
    print shlex.split(r'psexec \\%s -e -s -d -w %s %s ' % (hosts[0], wd, file))
    return successes
As you can see, as part of my troubleshooting, I am printing the shlex.split() output to ensure that it is what I want. This print statement gives:
['psexec', '\\HOSTNAME', '-e', '-s', '-d', '-w', 'c:\\', 'c:\\script.exe']
Which is what I would expect. Unfortunately, when I run it, I get an error saying:
PsExec could not start \GN-WRK-02:
The system cannot find the file specified.
Directly after this, I run the psexec command with the exact syntax that the program should be running it with (judging by the shlex.split() output) and it works completely fine. My syntax is:
psexec \\HOSTNAME -e -s -d -w c:\\ c:\\script.exe
Any ideas why this wouldn't be working? If it matters the execute function is being called through multiprocessing's map() function on two or 3 host lists.
Any help would be great! Thanks!
Your \\ double slash in front of the hostname is just one slash; it is doubled to escape the slash.
You can see this in the shlex.split() output:
['psexec', '\\HOSTNAME', '-e', '-s', '-d', '-w', 'c:\\', 'c:\\script.exe']
Note that the \\ before the hostname is just one backslash (the repr doubles it), just like in the c:\\ filename value. If you print just that value you see that the backslash at the start is just one character:
>>> print '\\HOSTNAME'
\HOSTNAME
>>> '\\HOSTNAME'[0]
'\\'
>>> '\\HOSTNAME'[1]
'H'
That's because shlex.split() is a POSIX tool, not a Windows tool, and it interprets the \\ in the raw string as an escape too; if you are using that tool, double the slashes again:
shlex.split(r'psexec \\\\%s -e -s -d -w %s %s ' % (host,wd,file))
An alternative may be to disable POSIX mode, but I am not entirely certain how that would interplay with Windows:
shlex.split(r'psexec \\%s -e -s -d -w %s %s ' % (host,wd,file), posix=False)
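The collapsing behaviour described above can be checked directly; this sketch shows both the POSIX-mode escaping and the posix=False alternative:

```python
import shlex

# In POSIX mode each backslash pair collapses to a single backslash,
# so the leading \\ needed by psexec must be written quadrupled.
print(shlex.split(r'psexec \\HOST'))                # one backslash survives
print(shlex.split(r'psexec \\\\HOST'))              # quadrupled: two survive
print(shlex.split(r'psexec \\HOST', posix=False))   # non-POSIX: left untouched
```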

subprocess ssh command fails for some commands but not others (command works in terminal)

As part of a python script, I am hoping to capture the output of a shell command executed via ssh, namely
ssh User@999 screen -list
If I execute the above command directly in terminal, I get the results I need. However, when executing through subprocess.check_output as below, I get a non-zero exit status 1 error.
I am able to execute other commands via ssh and capture the output without problem.
Is there something specific about screen -list that does not like being called in this fashion?
import subprocess
srvr = 'User@999.99.999.9'
print("CMD 1: ===============")
cmd1 = "ssh " + srvr + " ls -l"
print ("COMMAND IS ..... " + cmd1 + "\n")
out1 = subprocess.check_output(cmd1, shell=True)
print(out1 + "\n")
print("CMD 2: ===============")
cmd2 = "ssh " + srvr + " screen -list"
print ("COMMAND IS ..... " + cmd2 + "\n")
out2 = subprocess.check_output(cmd2, shell=True)
print(out2 + "\n")
Error:
subprocess.CalledProcessError: Command '['ssh User@999.99.999.9 screen', '-list']' returned non-zero exit status 1
subprocess.check_output checks the exit code of the subprocess, and raises an exception if the exit code is not zero.
If you don't care about the exit code, use subprocess.Popen.communicate:
out1, err1 = subprocess.Popen(cmd1,
                              stdout=subprocess.PIPE,
                              stderr=subprocess.PIPE).communicate()
That's how subprocess.check_output() is supposed to work. See: http://docs.python.org/2/library/subprocess.html
The command on your server is returning a non-zero return code and thus raises the appropriate exception, CalledProcessError.
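Worth noting: screen -list is documented to exit non-zero when it finds no sessions, so a non-zero status here may not indicate an actual failure. If you want the output even on a non-zero exit, you can catch the exception, which carries the captured output. A sketch, with false standing in for the remote command:

```python
import subprocess

# `false` stands in for `ssh user@host screen -list`: a command whose
# non-zero exit status does not necessarily mean the output is useless.
try:
    out = subprocess.check_output(["false"])
except subprocess.CalledProcessError as e:
    print("exited with status", e.returncode)
    out = e.output  # check_output attaches whatever the command printed
print(out)
```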

Python subprocess - run multiple shell commands over SSH

I am trying to open an SSH pipe from one Linux box to another, run a few shell commands, and then close the SSH.
I don't have control over the packages on either box, so something like fabric or paramiko is out of the question.
I have had luck using the following code to run one bash command, in this case "uptime", but am not sure how to issue one command after another. I'm expecting something like:
sshProcess = subprocess.call('ssh ' + <remote client>, <subprocess stuff>)
lsProcess = subprocess.call('ls', <subprocess stuff>)
lsProcess.close()
uptimeProcess = subprocess.call('uptime', <subprocess stuff>)
uptimeProcess.close()
sshProcess.close()
What part of the subprocess module am I missing?
Thanks
pingtest = subprocess.call("ping -c 1 %s" % <remote client>, shell=True, stdout=open('/dev/null', 'w'), stderr=subprocess.STDOUT)
if pingtest == 0:
    print '%s: is alive' % <remote client>
    # Uptime + CPU Load averages
    print 'Attempting to get uptime...'
    sshProcess = subprocess.Popen('ssh ' + <remote client>, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    sshOutput, stderr = sshProcess.communicate()
    print sshOutput
    uptimeProcess = subprocess.Popen('uptime', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    uptimeOutput, stderr = uptimeProcess.communicate()
    print 'Uptime : ' + uptimeOutput.split('up ')[1].split(',')[0]
else:
    print "%s: did not respond" % <remote client>
Basically, if you call subprocess it creates a local subprocess, not a remote one, so you should interact with the ssh process itself. Something along these lines works. Be aware that if you dynamically construct the commands they are susceptible to shell injection, and that the END marker line must be a unique identifier. To avoid the END-line uniqueness problem altogether, the easiest way would be to use a separate ssh command per remote command.
from __future__ import print_function, unicode_literals
import subprocess

sshProcess = subprocess.Popen(['ssh',
                               '-tt',
                               <remote client>],
                              stdin=subprocess.PIPE,
                              stdout=subprocess.PIPE,
                              universal_newlines=True,
                              bufsize=0)
sshProcess.stdin.write("ls .\n")
sshProcess.stdin.write("echo END\n")
sshProcess.stdin.write("uptime\n")
sshProcess.stdin.write("logout\n")
sshProcess.stdin.close()

for line in sshProcess.stdout:
    if line == "END\n":
        break
    print(line, end="")

# to catch the lines up to logout
for line in sshProcess.stdout:
    print(line, end="")
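The sentinel technique can be exercised locally without any remote host; in this sketch a local bash stands in for ssh -tt <remote client>, and a marker of my own invention (__END_1__) separates one command's output from the next:

```python
import subprocess

# A local `bash` stands in for `ssh -tt <remote client>`; the unique marker
# __END_1__ tells us where the first command's output ends.
proc = subprocess.Popen(["bash"], stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE, universal_newlines=True)
proc.stdin.write("echo first\n")
proc.stdin.write("echo __END_1__\n")
proc.stdin.write("echo second\n")
proc.stdin.write("exit\n")
proc.stdin.close()

first_cmd_output = []
stdout = iter(proc.stdout)
for line in stdout:
    if line == "__END_1__\n":
        break
    first_cmd_output.append(line)
rest = list(stdout)  # everything up to exit
print(first_cmd_output, rest)
```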

How to log errors in python when using the os module

I'm trying to incorporate a simple way to keep track of a periodic mysqldump command I want to run using the os module in python. I've written this, but in testing it doesn't raise the exception, even when the mysqldump command completes with an error. I'm pretty new to python, so I might be approaching this terribly, but I thought I would try to get pointed in the right direction.
db_dump = "mysqldump -u %s -p%s --socket=source_socket --databases %s | mysql -u %s -p%s --socket=dest_socket" % (db_user, db_pass, ' '.join(db_list), db_user, db_pass)
try:
    os.system(db_dump)
except:
    logging.error("databases did not dump")
else:
    logging.info("database dump complete")
os.system is not a very robust or powerful way to call system commands; I'd recommend using subprocess.check_output() or subprocess.check_call. For example:
>>> cmd = 'ls -l'
>>> badcmd = 'ls /foobar'
>>> subprocess.check_call(cmd.split())
0
>>> subprocess.check_call(badcmd.split())
ls: /foobar: No such file or directory
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 511, in check_call
raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['ls', '/foobar']' returned non-zero exit status 1
os.system() returns an integer result code. When it returns 0, the command ran successfully; when it returns a nonzero value, that indicates an error.
db_dump = "mysqldump -u %s -p%s --socket=source_socket --databases %s | mysql -u %s -p%s --socket=dest_socket" % (db_user, db_pass, ' '.join(db_list), db_user, db_pass)
result = os.system(db_dump)
if result == 0:
    logging.info("database dump complete")
else:
    logging.error("databases did not dump; result code: %d" % result)
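One caveat on POSIX systems: the integer os.system() returns is an encoded wait status, not the raw exit code, so the raw code has to be extracted. A quick sketch, with exit 3 standing in for a failing mysqldump pipeline:

```python
import os

# os.system() never raises on command failure; on POSIX it returns an
# encoded wait status. `exit 3` stands in for the failing dump command.
status = os.system("exit 3")
print(os.WEXITSTATUS(status))  # the actual exit code, 3
```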
Like @COpython, I recommend the use of subprocess. It is a bit more complicated than os.system(), but it is tremendously more flexible. With os.system() the output is sent to the terminal, but with subprocess you can collect the output so you can search it for error messages or whatever; or you can just discard the output.
Here is what I would do.
import logging
import subprocess

log = logging.getLogger(__name__)

cmd = "mysqldump -u %s -p%s --socket=source_socket --databases %s | mysql -u %s -p%s " \
      "--socket=dest_socket" % (db_user, db_pass, ' '.join(db_list), db_user, db_pass)
process = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = process.communicate()
stdout = [x for x in stdout.split("\n") if x != ""]
stderr = [x for x in stderr.split("\n") if x != ""]
if process.returncode != 0 or len(stderr):
    for error in stderr:
        log.error(error)
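A shorter alternative sketch along the same lines is to let check_call do the status checking and log the exception it raises; here false stands in for the mysqldump | mysql pipeline:

```python
import logging
import subprocess

logging.basicConfig(level=logging.INFO)
log = logging.getLogger(__name__)

# `false` stands in for the mysqldump | mysql pipeline; check_call raises
# CalledProcessError on any non-zero exit, which we log.
try:
    subprocess.check_call("false", shell=True)
    log.info("database dump complete")
except subprocess.CalledProcessError as e:
    log.error("databases did not dump; exit status %d", e.returncode)
```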
