Python subprocess + scp - can't read all output

I'm trying to SCP a file between machines and I need to fail when the user hasn't set up a public/private key pair for passwordless logins. Unfortunately, using subprocess.Popen I can't figure out how to capture the following output:
The authenticity of host '***' can't be established.
RSA key fingerprint is ***.
Are you sure you want to continue connecting (yes/no)
It always shows up on the console, and I can't capture it in my program to detect it.
Here's some example code:
import subprocess

proc = subprocess.Popen(['scp', 'user@server:/location/file.txt', '/someplace/file.txt'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
proc.wait()
print 'result: %s' % repr(proc.stderr.readline())
I've tried many other permutations. This one still prompts me at the console, rather than letting Python see the prompt, to enter yes/no. At least when I type no, though, I get:
result: 'Host key verification failed.\r\n'

'The authenticity of host '***' can't be established' means the machine you're connecting from hasn't been told to save the other end's (the server's) identity to the known_hosts file, and it is asking whether you trust the machine. You can configure the ssh client to add it automatically without prompting you.
Try this:
proc = subprocess.Popen(['scp', '-o', 'BatchMode=yes',
                         'user@server:/location/file.txt',
                         '/someplace/file.txt'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
proc.wait()
print 'result: %s' % repr(proc.stderr.readline())
With the above code I get:
me@myMachine:~$ python tmp.py
result: 'Host key verification failed.\r\n'
me@myMachine:~$
If I disable StrictHostKeyChecking I get:
me@myMachine:~$ python tmp.py
result: 'Permission denied (publickey,password,keyboard-interactive).\r\n'
me@myMachine:~$
So it looks like it is printing the first line from stderr with BatchMode turned on :)
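If you want scp to fail cleanly on both problems (unknown host key and no passwordless login), a minimal sketch along these lines should work; the host and paths are placeholders, and the exact error strings can vary by OpenSSH version:
import subprocess

proc = subprocess.Popen(
    ['scp',
     '-o', 'BatchMode=yes',              # never prompt for a password or passphrase
     '-o', 'StrictHostKeyChecking=yes',  # never prompt to accept an unknown host key
     'user@server:/location/file.txt',
     '/someplace/file.txt'],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE)
out, err = proc.communicate()
if proc.returncode != 0:
    # Typical messages: 'Host key verification failed.' or
    # 'Permission denied (publickey,...)'.
    print('scp failed: %r' % err)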

I've run into something similar before, though in my case it was actually helpful. I believe ssh and friends don't actually read stdin or print on stdout or stderr for these prompts; they do funky things to hook up directly with the terminal you're running in.
I believe the reasoning is that they're supposed to be able to talk to the user directly, even when run through wrapper shell scripts, because the user knows the password, not the calling script (and they probably deliberately don't want calling scripts to have the opportunity to intercept a password).
[Edit to add]: According to the man page on my system, scp does have a flag that might do what you want:
-B Selects batch mode (prevents asking for passwords or passphrases).
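If you do need to see and answer those prompts rather than just suppress them, one way is to run scp under a pseudo-terminal, for example with the third-party pexpect module. A hedged sketch, where the host, paths, and prompt strings are assumptions:
import pexpect

# pexpect runs scp under a pty, so prompts that bypass stdout/stderr are visible.
child = pexpect.spawn('scp user@server:/location/file.txt /someplace/file.txt')
i = child.expect(['Are you sure you want to continue connecting',  # unknown host key
                  'password:',                                     # no key-based login
                  pexpect.EOF])
if i == 0:
    child.sendline('no')       # refuse the unknown host key and let scp fail
    child.expect(pexpect.EOF)
elif i == 1:
    child.close()              # passwordless login is not set up -> treat as failure
print('scp said: %r' % child.before)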

Related

Paramiko: Can't send command on its standard input [duplicate]

I use a friend's server that allows only one user to be logged in via SSH, so normally I just log in as that user and then do su -l myuser to change accounts. I wanted to automate some boring stuff using Python, but I ran into problems. Apparently the Paramiko module, which I tried first, invokes a new shell for every command, so that was out of the question. Later I tried using invoke_shell() to overcome that, but it still failed (I assume because changing the user changes the shell as well).
After that I found out about the Fabric module, but the best I could do was open an SSH shell with the proper user logged in, with no option to run any commands from code.
Is there any way to accomplish that? Final goal would probably look something like this:
ssh.login(temp_user, pass)
ssh.command("su -l myuser")
expect("Password: ", ssh.send("mypass\n")
ssh.command("somescript.sh > datadump.txt")
Using sudo is impossible, as well as adding passwordless login.
As suggested, here is the code that I tried with Paramiko:
import paramiko

host = "hostip"
user = "user"
user_to_log = "myuser"
password = "pass"
password_to_log = "mypass"
login_command = "su -l " + user_to_log

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(host, username=user, password=password)

transport = ssh.get_transport()
session = transport.open_session()
session.set_combine_stderr(True)
session.get_pty()

session.exec_command("su -l " + user_to_log)
stdin = session.makefile('wb', -1)
stdin.write(password_to_log + '\n')
stdin.flush()

session.exec_command("whoami")
stdout = session.makefile('rb', -1)
for line in stdout.read().splitlines():
    print('host: %s: %s' % (host, line))
su -c command won't work either, since the server's system doesn't support that option.
General disclaimers first (to others who stumble upon this question):
Using su is not the right solution. su is a tool intended for interactive use, not for automation. The correct solution is to log in with the correct account directly.
Or at least use a password-less sudo.
Or you can create a root-owned script with setuid right.
See also Allowing automatic command execution as root on Linux using SSH.
If you are stuck with su, on most systems you can use the -c switch of su to specify a command:
su -c "whoami" user
See also How to run sudo with Paramiko? (Python)
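For illustration, a hedged sketch of that su -c variant, reusing the ssh connection and the user_to_log / password_to_log variables from the question; get_pty=True is an assumption, made so that su can present its password prompt over the channel:
# Run a single command as the target user via `su -c`.
stdin, stdout, stderr = ssh.exec_command('su -c "whoami" ' + user_to_log,
                                         get_pty=True)
stdin.write(password_to_log + '\n')   # answer su's "Password:" prompt
stdin.flush()
print(stdout.read())                  # output should end with the target user name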
If none of the above is feasible (and you really tried hard to make the admin enable some of the options above):
As a last resort, you can write the command to the standard input of su, the same way you already write the password (another thing not to do):
stdin, stdout, stderr = ssh.exec_command("su -l " + user_to_log)
stdin.write(password_to_log + '\n')
stdin.flush()

command = 'whoami'
stdin.write(command + '\n')
stdin.flush()
(also note that it's redundant to call makefile, as exec_command already returns that)
See Execute (sub)commands in secondary shell/command on SSH server in Python Paramiko.
Note that your question is not really about which SSH client library to use. It does not matter whether you use Paramiko or something else; this is actually a generic SSH/Linux/shell question.
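Pulling the pieces above together, a self-contained sketch of that last-resort approach; the host, account names, and passwords are placeholders, error handling is omitted, and get_pty=True is assumed so su can prompt for the password:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('hostip', username='user', password='pass')

# Start an interactive `su` on the remote side; a pty is needed for its prompt.
stdin, stdout, stderr = ssh.exec_command('su -l myuser', get_pty=True)
stdin.write('mypass\n')      # answer su's password prompt
stdin.flush()

stdin.write('whoami\n')      # run the real command inside the su shell
stdin.write('exit\n')        # let the shell exit so read() below can return
stdin.flush()

print(stdout.read())
ssh.close()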

SSHClient.exec_command() reported "command not found" [duplicate]

This question already has an answer here: Some Unix commands fail with "<command> not found", when executed using Python Paramiko exec_command
I've written a script to help others run simple day-to-day commands on our storage system here at work. The script works fine with commands that return short, simple output (for example, ls). However, when the script runs a command with a large output, nothing comes back. It's almost as if it times out, but there's no feedback at all; I expected to at least see part of the command's output. I've done some research on this and found other people with the same problem. The answer they got was to use:
stdin, stdout, stderr = client.exec_command(command)
Which I was already using in my code.
I'm wondering if it's something to do with the buffer size, but annoyingly I don't know how to change that in my code. I've tried adding a time delay using:
time.sleep(10)
But no joy from that. I have also tried using:
print stdout.channel.recv_exit_status()
However, I got a return of 127 so I think I'm way off the mark there!
My code is:
import paramiko

def ssh_command(ip, user, passwd, command):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(ip, username=user, password=passwd)
    stdin, stdout, stderr = client.exec_command(command)
    print stdout.read()
    print stderr.read()
    return

if __name__ == '__main__':
    ssh_command(ip, user, passwd, command)
I've omitted the first few blocks of code, where a few variables are defined from raw user input. It's rather long, so I thought it best to leave out, but naturally I can post it if need be.
For those interested in the command I'm trying to run, it's an IBM command unique to their GPFS (Spectrum Scale) storage system. The command is:
mmdf mmfs1 --block-size auto
The command returns the storage space on all the disk pools on the storage system.
UPDATE:
The stderr.read() output states the command isn't recognised (bash: mmdf: command not found), despite the command working when I SSH into the storage controller manually.
Based on your latest comments you should use the absolute path to mmdf when running the command:
client.exec_command("/the/path/to/mmdf mmfs1 --block-size auto")
To find out where mmdf is, log in to the server manually and run:
which mmdf
# or
type -P mmdf
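Putting that together, a hedged sketch that uses the absolute path (the path below is a placeholder for whatever which mmdf reports) and checks the exit status; 127 from recv_exit_status() is the shell's code for "command not found":
stdin, stdout, stderr = client.exec_command(
    '/the/path/to/mmdf mmfs1 --block-size auto')
exit_status = stdout.channel.recv_exit_status()   # waits for the command to finish
if exit_status != 0:                              # 127 means "command not found"
    print(stderr.read())
else:
    print(stdout.read())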

Querying database on different Windows credentials in Python/cmd [duplicate]

I've managed to get cmd opened by Python. However, using runas as Administrator brings up a password check before cmd.exe is executed.
I'm using this to open cmd...
import subprocess
subprocess.call(["runas", "/user:Administrator", "cmd.exe"])
I'm looking for a way to automatically enter the password into the runas.exe prompt that opens when I run the code. Say I were to create var = "test" and add it after import subprocess; how would I make it so that this variable is passed to, and seen as input by, runas.exe?
The solution should require only Python modules that are available in version 3.4 or higher.
Update
I have found some code which appears to send input straight to runas.exe. However, the apparent input is \x00\r\n, when in the code the input is supposed to be test. I am fairly certain that if I can get the input to be test then the code will be successful.
The code is as follows:
import subprocess

args = ['runas', '/user:Administrator', 'cmd.exe']
proc = subprocess.Popen(args,
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
proc.stdin.write(b'test\n')
proc.stdin.flush()
stdout, stderr = proc.communicate()
print(stdout)
print(stderr)
Although not an answer to your question, this can be a solution to your problem. Use psexec instead of runas. You can run it like this:
psexec -u user -p password cmd
(or run it from Python using subprocess.Popen or something else)
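For example, a minimal sketch of driving psexec from Python; it assumes psexec is installed and on PATH, the user name and password are placeholders, and 'cmd /c whoami' is just an illustrative command:
import subprocess

# Run a single command as the other user instead of an interactive cmd.exe,
# so communicate() returns instead of waiting for the shell to exit.
proc = subprocess.Popen(['psexec', '-u', 'Administrator', '-p', 'password',
                         'cmd', '/c', 'whoami'],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE)
out, err = proc.communicate()
print(out)
print(err)   # psexec writes its own banner and status messages to stderr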
This piece of code actually works (tested on a Windows 2008 server). I've used it to call runas for a different user and pass their password. A new command prompt opened in the new user's context, without needing to enter the password manually.
Note that you have to install pywin32 to have access to the win32 API.
The idea is:
to Popen the runas command, without any input redirection, but redirecting output;
to read the output character by character until we encounter ":" (the last character of the password prompt);
to send key events to the console using the win32 packages, with a final \r to end the password input.
(adapted from this code):
import win32console, win32con, time
import subprocess

username = "me"
domain = "my_domain"
password = "xxx"

free_console = True
try:
    win32console.AllocConsole()
except win32console.error as exc:
    if exc.winerror != 5:
        raise
    ## only free console if one was created successfully
    free_console = False

stdin = win32console.GetStdHandle(win32console.STD_INPUT_HANDLE)
p = subprocess.Popen(["runas", r"/user:{}\{}".format(domain, username), "cmd.exe"],
                     stdout=subprocess.PIPE)
while True:
    if p.stdout.read(1) == b":":
        for c in "{}\r".format(password):  # end by CR to send "RETURN"
            ## write some records to the input queue
            x = win32console.PyINPUT_RECORDType(win32console.KEY_EVENT)
            x.Char = unicode(c)  # remove unicode for python 3
            x.KeyDown = True
            x.RepeatCount = 1
            x.VirtualKeyCode = 0x0
            x.ControlKeyState = win32con.SHIFT_PRESSED
            stdin.WriteConsoleInput([x])
        p.wait()
        break

Diverting stdin when SSH-ing to another machine from Python

I am trying to use SSH as a SOCKS proxy to another machine, and then ask the user if they want to proceed.
So I use:
proxy_cmd = "ssh -o 'StrictHostKeyChecking no' -i " + key_filename + ' -D 9998 ubuntu@' + ip_address
subprocess.Popen(proxy_cmd, shell=True, stdout=subprocess.PIPE)
if not raw_input('would you like to proceed?(y)') == 'y':
    sys.exit()
and I get:
IOError: [Errno 11] Resource temporarily unavailable
I assume that's because the ssh process is open and capturing stdin or something. I just don't know how to bypass this (I have no need to send input to the ssh process; I just want it open for Selenium to use later).
How can I do this?
If you want to keep stdin available to your Python program, then you'll have to redirect stdin for the ssh process even if you have no intention of using it, with something like...
subprocess.Popen(proxy_cmd,
                 shell=True,
                 stdin=subprocess.PIPE,
                 stdout=subprocess.PIPE)
Note that the subprocess module will retain a reference to the Popen object in the subprocess._active list, but you may also want to bind the resulting Popen object to a variable so you can perform operations on it later.
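For illustration, a sketch built on the question's proxy_cmd that binds the Popen object so the tunnel can be shut down when you're finished; the terminate()/wait() cleanup is one reasonable choice, not the only one:
import subprocess
import sys

tunnel = subprocess.Popen(proxy_cmd,
                          shell=True,
                          stdin=subprocess.PIPE,
                          stdout=subprocess.PIPE)
try:
    if not raw_input('would you like to proceed?(y)') == 'y':
        sys.exit()
    # ... hand the SOCKS proxy on localhost:9998 to Selenium here ...
finally:
    tunnel.terminate()   # close the ssh tunnel once we're done with it
    tunnel.wait()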

Supplying password to wrapped-up MySQL

Greetings.
I have written a little python script that calls MySQL in a subprocess. [Yes, I know that the right approach is to use MySQLdb, but compiling it under OS X Leopard is a pain, and likely more painful if I wanted to use the script on computers of different architectures.] The subprocess technique works, provided that I supply the password in the command that starts the process; however, that means that other users on the machine could see the password.
The original code I wrote can be seen here.
The variant below is very similar, although I will omit the test routine to keep it shorter:
#!/usr/bin/env python
from subprocess import Popen, PIPE

# Set the command you need to connect to your database
mysql_cmd_line = "/Applications/MAMP/Library/bin/mysql -u root -p"
mysql_password = "root"

def RunSqlCommand(sql_statement, database=None):
    """Pass in the SQL statement that you would like executed.
    Optionally, specify a database to operate on. Returns the result."""
    command_list = mysql_cmd_line.split()
    if database:
        command_list.append(database)

    # Run mysql in a subprocess
    process = Popen(command_list, stdin=PIPE, stdout=PIPE,
                    stderr=PIPE, close_fds=True)

    #print "Asking for output"
    #needs_pw = process.stdout.readline()
    #print "Got: " + needs_pw

    # pass it in the password
    process.stdin.write(mysql_password + "\n")

    # pass it our commands, and get the results
    #(stdout, stderr) = process.communicate( mysql_password + "\n" + sql_statement)
    (stdout, stderr) = process.communicate(sql_statement)
    return stdout
I am suspicious that the MySQL password prompt is not actually on stdout (or stderr), although I don't know how that could be or if it means I could trap it.
I did try reading output first, before supplying a password, but it didn't work. I also tried passing the password
Again, if I supply the password on the command line (and thus have no code between the "Popen" and "communicate" functions) my wrapped function works.
Two new thoughts, months later:
Using pexpect would let me supply a password (see the sketch after these two points). It simulates a tty and gets all output, even output that bypasses stdout and stderr.
There is a project called MySQL Connector/Python, in early alpha, that will provide a pure Python library for accessing MySQL, without requiring you to compile any C code.
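A minimal sketch of the pexpect idea, assuming the MAMP mysql path from above and that the client prints its usual "Enter password:" and "mysql>" prompts (the query is just an example):
import pexpect

child = pexpect.spawn('/Applications/MAMP/Library/bin/mysql -u root -p')
child.expect('Enter password:')
child.sendline('root')              # the password never appears on the command line
child.expect('mysql>')
child.sendline('SHOW DATABASES;')
child.expect('mysql>')
print(child.before)                 # output of the query
child.sendline('quit')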
You could simply build a my.cnf file and point to it on the mysql command line. Obviously you'll want to protect that file with permissions/ACLs. But it shouldn't really be any more or less secure than having the password in your Python script, or in the config for your Python script.
So you would do something like
mysql_cmd_line = "/Applications/MAMP/Library/bin/mysql --defaults-file=credentials.cnf"
and your config would look something like this:
[client]
host = localhost
user = root
password = password
socket = /var/run/mysqld/mysqld.sock
The only secure method is to use a MySQL cnf file as one of the other posters mentions. You can also pass a MYSQL_PWD env variable, but that is insecure as well: http://dev.mysql.com/doc/refman/5.0/en/password-security.html
Alternatively, you can communicate with the database using a Unix socket file and with a little bit of tweaking you can control permissions at the user id level.
Even better, you can use the free BitNami DjangoStack, which has Python and MySQLdb precompiled for OS X (and Windows and Linux): http://bitnami.org/stacks
This may be a Windows / SQL Server feature, but could you use a Trusted Connection (i.e. use your OS login/password to access the DB)? There may be an OS X equivalent for MySQL.
Or you may just need to set up your DB to use the OS login and password so that you don't need to keep it in your code.
Anyway, just an idea.
Try this:
process.stdin.write(mysql_password + "\n")
process.stdin.flush()
(stdout, stderr) = process.communicate(sql_statement)
return stdout
Flush stdin to force the password you just wrote to actually be sent. Note that communicate() can only be called once: it writes the remaining input, closes stdin for you, and collects the output until the process exits.
