I'm using a simple pexpect script to ssh to a remote machine and grab a value returned by a command.
Is there any way, pexpect or sshwise I can use to ignore the unix greeting?
That is, from
child = pexpect.spawn('/usr/bin/ssh %s@%s' % (rem_user, host))
child.expect('[pP]assword: ', timeout=5)
child.sendline(spass)
child.expect([pexpect.TIMEOUT, prompt])
child.before = '0'
child.sendline ('%s' % cmd2exec)
child.expect([pexpect.EOF, prompt])
# Collected data processing
result = child.before
# logon to the machine returns a lot of garbage, the returned executed command is at the 57th position
print result.split('\r\n') [57]
result = result.split('\r\n') [57]
How can I simply get the returned value, ignoring the "Last successful login" and "(c) Copyright" messages, without having to worry about the value's exact position?
Thanks !
If you have access to the server to which you are logging in, you can try creating a file named .hushlogin in the home directory. The presence of this file silences the standard MOTD greeting and similar stuff.
Alternatively, try ssh -T, which will disable terminal allocation entirely; you won't get a shell prompt, but you may still issue commands and read the response.
There is also a similar thread on ServerFault which may be of some use to you.
If the command isn't interactive, you can just run ssh HOST COMMAND to run the command without all the login excitement happening at all. If the command is interactive, you can frequently use the ssh -t option (ssh -t HOST COMMAND) to force pseudo-tty allocation and trick the remote process to think that it's running attached to a TTY.
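Applied to the original pexpect snippet, that means passing the command on the ssh command line instead of waiting for a shell. A minimal sketch, reusing rem_user, host, spass and cmd2exec from the question (use spawnu instead of spawn on Python 3):
import pexpect

# Run the command directly; no interactive shell, so no MOTD / "Last login" banner.
child = pexpect.spawn('/usr/bin/ssh %s@%s %s' % (rem_user, host, cmd2exec))
i = child.expect(['[pP]assword: ', pexpect.EOF], timeout=10)
if i == 0:                       # only prompted if key-based auth isn't set up
    child.sendline(spass)
    child.expect(pexpect.EOF)
result = child.before.strip()    # just the command's output
print(result)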
I have used paramiko to automate ssh connection and I have found it useful. It can deal with greetings and silent execution.
http://www.lag.net/paramiko/
Hey there, you can kill all that noise by using the sys and logging modules and a small class:
import logging
import sys

class StreamToLogger(object):
    """
    Fake file-like stream object that redirects writes to a logger instance.
    """
    def __init__(self, logger, log_level=logging.INFO):
        self.logger = logger
        self.log_level = log_level
        self.linebuf = ''

    def write(self, buf):
        for line in buf.rstrip().splitlines():
            self.logger.log(self.log_level, line.rstrip())

# Redirect stdout and stderr into loggers so the login noise ends up in the log
stdout_logger = logging.getLogger('STDOUT')
sl = StreamToLogger(stdout_logger, logging.INFO)
sys.stdout = sl

stderr_logger = logging.getLogger('STDERR')
sl = StreamToLogger(stderr_logger, logging.ERROR)
sys.stderr = sl
Can't remember where I found that snippet, but it works for me :)
Related
I'm a software tester, trying to verify that the log on a remote QNX (a Unix-like RTOS) machine will contain the correct entries after specific actions are taken. I am able to list the contents of the directory in which the log resides, and to use that information in the command that reads the file (I really want to use tail -n XX <file>). So far, I always get "(No such file or directory)" when trying to read the file.
We are using Froglogic Squish for automated testing, because the Windows UI (that interacts with the server piece on QNX) is built using Qt extensions for standard Windows elements. Squish uses Python 2.7, so I am using Python 2.7.
I am using paramiko for the SSH connection to the QNX server. This has worked great for sending commands to the simulator piece that also runs on the QNX server.
So, here's the code. Some descriptive names have been changed to avoid upsetting my employer.
import sys
import time
import select
sys.path.append(r"C:\Python27\Lib\site-packages")
sys.path.append(r"C:\Python27\Lib\site-packages\pip\_vendor")
import paramiko
# Import SSH configuration variables
ssh_host = 'vvv.xxx.yyy.zzz'
thelog_dir = "/logs/the/"
ssh_user = 'un'
ssh_pw = 'pw'
def execute_Command(fullCmd):
    outptLines = []
    #
    # Try to connect to the host.
    # Retry a few times if it fails.
    #
    i = 1
    while True:
        try:
            ssh = paramiko.SSHClient()
            ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            ssh.connect(ssh_host, 22, ssh_user, ssh_pw)
            break
        except paramiko.AuthenticationException:
            log("Authentication failed when connecting to %s" % ssh_host)
            return 1
        except:
            log("Could not SSH to %s, waiting for it to start" % ssh_host)
            i += 1
            time.sleep(2)
        # If we could not connect within time limit
        if i == 30:
            log("Could not connect to %s. Giving up" % ssh_host)
            return 1
    # Send the command (non-blocking?)
    stdin, stdout, stderr = ssh.exec_command(fullCmd, get_pty=True)
    for line in iter(stdout.readline, ""):
        outptLines.append(line)
    #
    # Disconnect from the host
    #
    ssh.close()
    return outptLines

def get_Latest_Log():
    fullCmd = "ls -1 %s | grep the_2" % thelog_dir
    files = execute_Command(fullCmd)
    theFile = files[-1]
    return theFile

def main():
    numLines = 20
    theLog = get_Latest_Log()
    print("\n\nThe latest log is %s\n\n" % theLog)
    fullCmd = "cd /logs/the; tail -n 20 /logs/the/%s" % theLog
    #fullCmd = "tail -n 20 /logs/the/%s" % theLog
    print fullCmd
    logLines = execute_Command(fullCmd)
    for line in logLines:
        print line

if __name__ == "__main__":
    # execute only if run as a script
    main()
I have tried to read the file using both tail and cat. I have also tried to get and open the file using Paramiko's SFTP client.
In all cases, the response of trying to read the file fails -- despite the fact that listing the contents of the directory works fine. (?!) And BTW, the log file is supposed to be readable by 'world'. Permissions are -rw-rw-r--.
The output I get is:
"C:\Users\xsat086\Documents\paramikoTest>python SSH_THE_MsgChk.py
The latest log is the_20210628_115455_205.log
cd /logs/the; tail -n 20 /logs/the/the_20210628_115455_205.log
(No such file or directory)the/the_20210628_115455_205.log"
The file name is correct. If I copy and paste the tail command into an interactive SSH session with the QNX server, it works fine.
Is it something to do with the 'non-interactive' nature of this method of sending commands? I read that some implementations of SSH are built upon a command that offers a very limited environment. I don't see how that would impact this tail command.
Or am I doing something stupid in this code?
I cannot fully explain why you get exactly the results you get.
But in general, corrupted output like this is the result of enabling, and then not handling, terminal emulation. You enable terminal emulation with get_pty=True. Remove it; you should not use terminal emulation when automating command execution.
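For reference, a minimal sketch of what the core of execute_Command would look like without the pty (same ssh_host, ssh_user, ssh_pw and fullCmd as in the question); stdout then yields plain '\n'-terminated lines with no terminal escapes or stray '\r' characters:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(ssh_host, 22, ssh_user, ssh_pw)
stdin, stdout, stderr = ssh.exec_command(fullCmd)   # no get_pty=True
outptLines = [line.rstrip('\n') for line in stdout]
ssh.close()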
Related question:
Is there a simple way to get rid of junk values that come when you SSH using Python's Paramiko library and fetch output from CLI of a remote machine?
I am using this code for executing command on remote server.
import subprocess
import sys
COMMAND="ls"
ssh = subprocess.Popen(["ssh", "%s" % HOST, COMMAND],
                       shell=False,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print >>sys.stderr, "ERROR: %s" % error
else:
    print result
When I try to execute this script, I get prompt for password. Is there any way I could avoid it, for example, can I enter password in script somehow? Also, password should be encrypted somehow so that people who have access to the script cannot see it.
Why make it so complicated? Here's what I suggest:
1) Create an ssh config section in your ~/.ssh/config file:
Host myserver
    HostName 50.50.50.12 (fill in with your server's ip)
    Port xxxx (optional)
    User me (your username for the server)
2) If you haven't generated your ssh keypair yet, do it now (with ssh-keygen). Then upload the public key with:
$ ssh-copy-id myserver
3) Now you can use subprocess with ssh. For example, to capture output, I call:
result = subprocess.check_output(['ssh', 'myserver', 'cat', 'somefile'])
Simple, robust, and the only time a password is needed is when you copy the public key to the server.
BTW, your code will probably work just fine as well using these steps.
One way is to create a key pair, put the public key on the server, and do ssh -i /path/to/private/key user@host, or use paramiko like this:
import paramiko
import getpass
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
p = getpass.getpass()
ssh.connect('hostname', username='user', password=p)
stdin, stdout, stderr = ssh.exec_command('ls')
print stdout.readlines()
ssh.close()
You could also use pexpect or paramiko to connect to the remote machine, spawn a child process (or open a channel), and then drive it to run what you want.
Here's what I did when encountering this issue before:
Set up your ssh keys for access to the server.
Set up an alias for the server you're accessing. Below I'll call it remote_server.
Put the following two lines at the end of ~/.bash_profile.
eval $(ssh-agent -s)
ssh-add
Now every time you start your shell, you will be prompted for a passphrase. By entering it, you will authenticate your ssh keys and put them 'in hand' at the start of your bash session. For the remainder of your session you will be able to run commands like
ssh remote_server ls
without being prompted for a passphrase. Here ls will run on the remote server and return the results to you. Likewise your python script should run without password prompt interruption if you execute it from the shell.
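A small sketch of the script side, assuming the remote_server alias and the agent setup above are already in place; the call completes without any password or passphrase prompt:
import subprocess

output = subprocess.check_output(['ssh', 'remote_server', 'ls'])
print(output.decode())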
You'll also be able to ssh to the server just by typing ssh remote_server without having to enter your username or password every time.
The upside to doing it this way is that you should be doing this anyway to avoid password annoyances and remembering funky server names :) Also you don't have to worry about having passwords saved anywhere in your script. The only potential downside is that if you want to share the python script with others, they'll have to do this configuring as well (which they should anyway).
You don't really need something like pexpect to handle this. SSH keys already provide a very good and secure solution to this sort of issue.
The simplest way to get the results you want would probably be to generate an ssh key and place it in the .ssh folder of your device. I believe github has a pretty good guide to doing that, if you look into it. Once you set up the keys correctly on both systems, you won't actually have to add a single line to your code. When you don't specify a password it will automatically use the key to authenticate you.
While subprocess.Popen might work for wrapping ssh access, this is not the preferred way to do so.
I recommend using paramiko.
import paramiko
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(server, username=user,password=password)
...
ssh_client.close()
And if you want to simulate a terminal, as if a user were typing:
import time

chan = ssh_client.invoke_shell()

def exec_cmd(cmd):
    """Gets ssh command(s), executes them, and returns the output"""
    prompt = 'bash $'  # the command line prompt in the ssh terminal
    buff = ''
    chan.send(str(cmd) + '\n')
    while not chan.recv_ready():
        time.sleep(1)
    while not buff.endswith(prompt):
        buff += chan.recv(1024)
    return buff[:-len(prompt)]
Example usage: exec_cmd('pwd')
If you don't know the prompt in advance, you can set it with:
chan.send('PS1="python-ssh:"\n')
You could use the following.
import subprocess
import sys
COMMAND="ls"
ssh = subprocess.Popen("powershell putty.exe user#HOST -pw "password", stdout=PIPE, stdin=PIPE, stderr=STDOUT)
result = ssh.stdout.readlines()
if result == []:
error = ssh.stderr.readlines()
print >>sys.stderr, "ERROR: %s" % error
else:
print result
I have a class that creates the connection. I can connect and execute one command before the channel is closed. On another system I have, I can execute multiple commands and the channel does not close. Obviously it's a config issue with the systems I am trying to connect to.
import sys
import paramiko

class connect:
    newconnection = ''

    def __init__(self, username, password):
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        try:
            ssh.connect('somehost', username=username, password=password, port=2222, timeout=5)
        except:
            print "Could not connect"
            sys.exit()
        self.newconnection = ssh

    def con(self):
        return self.newconnection
Then I use the 'ls' command just to print some output
sshconnection = connect('someuser','somepassword').con()
stdin, stdout, stderr = sshconnection.exec_command("ls -lsa")
print stdout.readlines()
print stdout
stdin, stdout, stderr = sshconnection.exec_command("ls -lsa")
print stdout.readlines()
print stdout
sshconnection.close()
sys.exit()
After the first exec_command runs, it prints the expected output of the dir listing. When I print stdout after the first exec_command, it looks like the channel is closed:
<paramiko.ChannelFile from <paramiko.Channel 1 (closed) -> <paramiko.Transport at 0x2400f10L (cipher aes128-ctr, 128 bits) (active; 0 open channel(s))>>>
Like I said, on another system I am able to keep running commands and the connection doesn't close. Is there a way I can keep this one open? Or a better way to see the reason why it closes?
edit: So it looks like you can only run one command per SSHClient.exec_command... so I decided to get_transport().open_session() and then run a command. The first one always works. The second one always fails and the script just hangs.
With just paramiko, after exec_command executes the channel is closed and ssh returns an auth prompt.
It seems it's not possible with just paramiko; try fabric or another tool.
** fabric did not work out either.
Please see the following reference, as it provides a way to do this in Paramiko:
How do you execute multiple commands in a single session in Paramiko? (Python)
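Not the author's code, just a sketch of the two approaches that thread describes, reusing the sshconnection object from the question: either join the commands so one exec_command runs them in a single remote shell, or keep one interactive channel open with invoke_shell and feed commands to it.
# One channel, several commands chained in a single remote shell:
stdin, stdout, stderr = sshconnection.exec_command("ls -lsa; pwd; whoami")
print(stdout.read().decode())

# Or keep an interactive channel open and send commands one by one,
# reading the replies back with chan.recv():
chan = sshconnection.invoke_shell()
chan.send("ls -lsa\n")
chan.send("pwd\n")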
It's possible with netmiko (tested on Windows).
This example is written for connecting to Cisco devices, but the principle is adaptable to others as well.
import netmiko
from netmiko import ConnectHandler
import json
def connect_enable_silent(ip_address, ios_command):
    with open("credentials.txt") as line:
        line_1 = json.load(line)
        for k, v in line_1.items():
            router = (k, v)
            try:
                ssh = ConnectHandler(**router[1], device_type="cisco_ios", ip=ip_address)
                ssh.enable()
            except netmiko.ssh_exception.NetMikoAuthenticationException:
                # incorrect credentials
                continue
            except netmiko.ssh_exception.NetMikoTimeoutException:
                # oddly enough, if it can log in but is not able to authenticate to enable mode,
                # the ssh.enable() command does not give an authentication error
                # but a time-out error instead
                try:
                    ssh = ConnectHandler(username=router[1]['username'], password=router[1]['password'],
                                         device_type="cisco_ios", ip=ip_address)
                except netmiko.ssh_exception.NetMikoTimeoutException:
                    # connection timed out (ssh not enabled on device, try telnet)
                    continue
                except Exception:
                    continue
                else:
                    output = ssh.send_command(ios_command)
                    ssh.disconnect()
                    if "at '^' marker." in output:
                        # trying to run a command that requires enable mode while not authenticated to enable mode
                        continue
                    return output
            except Exception:
                continue
            else:
                output = ssh.send_command(ios_command)
                ssh.disconnect()
                return output

output = connect_enable_silent(ip_address, ios_command)
for line in output.split('\n'):
    print(line)
Credentials text is meant to store different credentials in case you are planning to call this function to access multiple devices and not all of them using the same credentials. It is in the format:
{"credentials_1":{"username":"username_1","password":"password_1","secret":"secret_1"},
"credentials_2":{"username":"username_2","password":"password_2","secret":"secret_2"},
"credentials_3": {"username": "username_3", "password": "password_3"}
}
The exceptions can be changed to do different things; in my case I just needed it to not return an error and continue trying the next set, which is why most exceptions are silenced.
I'm trying to run an scp (secure copy) command using subprocess.Popen. The login requires that I send a password:
from subprocess import Popen, PIPE
proc = Popen(['scp', "user@10.0.1.12:/foo/bar/somefile.txt", "."], stdin=PIPE)
proc.stdin.write(b'mypassword')
proc.stdin.flush()
This immediately returns an error:
user@10.0.1.12's password:
Permission denied, please try again.
I'm certain the password is correct. I easily verify it by manually invoking scp on the shell. So why doesn't this work?
Note, there are many similar questions to this, asking about subprocess.Popen and sending a password for automated SSH or FTP login:
How can I set a users password in linux from a python script?
Use subprocess to send a password
The answer(s) to these questions don't work and/or don't apply because I am using Python 3.
Here's a function to ssh with a password using pexpect:
import pexpect
import tempfile
def ssh(host, cmd, user, password, timeout=30, bg_run=False):
    """SSH'es to a host using the supplied credentials and executes a command.
    Throws an exception if the command doesn't return 0.
    bgrun: run command in the background"""

    fname = tempfile.mktemp()
    fout = open(fname, 'w')

    options = '-q -oStrictHostKeyChecking=no -oUserKnownHostsFile=/dev/null -oPubkeyAuthentication=no'
    if bg_run:
        options += ' -f'
    ssh_cmd = 'ssh %s@%s %s "%s"' % (user, host, options, cmd)

    child = pexpect.spawn(ssh_cmd, timeout=timeout)  # spawnu for Python 3
    child.expect(['[pP]assword: '])
    child.sendline(password)
    child.logfile = fout
    child.expect(pexpect.EOF)
    child.close()
    fout.close()

    fin = open(fname, 'r')
    stdout = fin.read()
    fin.close()

    if 0 != child.exitstatus:
        raise Exception(stdout)

    return stdout
Something similar should be possible using scp.
The OpenSSH scp utility invokes the ssh program to make the SSH connection to the remote host, and the ssh process handles authentication. The ssh utility doesn't accept a password on the command line or on its standard input. I believe this is a deliberate decision on the part of the OpenSSH developers, because they feel that people should be using more secure mechanisms like key-based authentication. Any solution for invoking ssh is going to follow one of these approaches:
Use an SSH key for authentication, instead of a password.
Use sshpass, expect, or a similar tool to automate responding to the password prompt.
Use (abuse) the SSH_ASKPASS feature to get ssh to get the password by invoking another command, described here or here, or in some of the answers here.
Get the SSH server administrator to enable host-based authentication and use that. Note that host-based authentication is only suitable for certain network environments. See additional notes here and here.
Write your own ssh client using perl, python, java, or your favorite language. There are ssh client libraries available for most modern programming languages, and you'd have full control over how the client gets the password.
Download the ssh source code and build a modified version of ssh that works the way you want.
Use a different ssh client. There are other ssh clients available, both free and commercial. One of them might suit your needs better than the OpenSSH client.
In this particular case, given that you're already invoking scp from a python script, it seems that one of these would be the most reasonable approach:
Use pexpect, the python expect module, to invoke scp and feed the password to it.
Use paramiko, the python ssh implementation, to do this ssh task instead of invoking an outside program (a sketch follows below).
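A hedged sketch of the paramiko route for this particular case (the host, path and credentials are the question's placeholders); SFTP replaces the external scp call entirely, so there is no password prompt to script:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('10.0.1.12', username='user', password='mypassword')
sftp = ssh.open_sftp()
sftp.get('/foo/bar/somefile.txt', 'somefile.txt')   # remote path, local destination
sftp.close()
ssh.close()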
The second answer you linked suggests you use Pexpect (which is usually the right way to go about interacting with command line programs that expect input).
Pexpect has a library for exactly this: pxssh
http://pexpect.readthedocs.org/en/stable/api/pxssh.html
import pxssh
import getpass
try:
    s = pxssh.pxssh()
    hostname = raw_input('hostname: ')
    username = raw_input('username: ')
    password = getpass.getpass('password: ')
    s.login(hostname, username, password)
    s.sendline('uptime')   # run a command
    s.prompt()             # match the prompt
    print(s.before)        # print everything before the prompt.
    s.logout()
except pxssh.ExceptionPxssh as e:
    print("pxssh failed on login.")
    print(e)
I guess some applications read input from stdin while others read from the terminal. When we write the password to a PIPE, we are writing to stdin, but scp reads the password from the terminal. Since subprocess can only interact via stdin and not the terminal, we cannot use the subprocess module here and must use pexpect to copy the file with scp.
Feel free to correct me.
Here is my scp function based on pexpect. It can handle wildcards (i.e. multiple file transfer), in addition to the password.
To handle multiple file transfer (i.e. wildcards), we need to issue a command via a shell. Refer to pexpect FAQ.
import pexpect
def scp(src, user2, host2, tgt, pwd, opts='', timeout=30):
    ''' Performs the scp command. Transfers file(s) from local host to remote host '''
    cmd = f'''/bin/bash -c "scp {opts} {src} {user2}@{host2}:{tgt}"'''
    print("Executing the following cmd:", cmd, sep='\n')

    tmpFl = '/tmp/scp.log'
    fp = open(tmpFl, 'wb')
    childP = pexpect.spawn(cmd, timeout=timeout)
    try:
        childP.sendline(cmd)
        childP.expect([f"{user2}@{host2}'s password:"])
        childP.sendline(pwd)
        childP.logfile = fp
        childP.expect(pexpect.EOF)
        childP.close()
        fp.close()

        fp = open(tmpFl, 'r')
        stdout = fp.read()
        fp.close()

        if childP.exitstatus != 0:
            raise Exception(stdout)
    except KeyboardInterrupt:
        childP.close()
        fp.close()
        return

    print(stdout)
It can be used this way:
params = {
    'src': '/home/src/*.txt',
    'user2': 'userName',
    'host2': '192.168.1.300',
    'tgt': '/home/userName/',
    'pwd': myPwd(),
    'opts': '',
}
scp(**params)
This is a rewrite I did of the code posted by @Kobayashi and @sjbx, but for the purposes of doing scp requests, so credit to those two.
import pexpect
import tempfile

def scp(host, user, password, from_dir, to_dir, timeout=300, recursive=False):
    fname = tempfile.mktemp()
    fout = open(fname, 'w')

    scp_cmd = 'scp'
    if recursive:
        scp_cmd += ' -r'
    scp_cmd += f' {user}@{host}:{from_dir} {to_dir}'

    child = pexpect.spawnu(scp_cmd, timeout=timeout)
    child.expect(['[pP]assword: '])
    child.sendline(str(password))
    child.logfile = fout
    child.expect(pexpect.EOF)
    child.close()
    fout.close()

    fin = open(fname, 'r')
    stdout = fin.read()
    fin.close()

    if 0 != child.exitstatus:
        raise Exception(stdout)

    return stdout
I'm attempting to use the python subprocess module to log in to a secure ftp site and then grab a file. However I keep getting hung up on just trying to send the password when it is requested. I so far have the following code:
from subprocess import Popen, PIPE
proc = Popen(['sftp', 'user@server', 'stop'], stdin=PIPE)
proc.communicate('password')
This still stops at the password prompt. If I enter the password manually it then goes to the ftp site and then enters the password on the command line. I've seen people suggest using pexpect but long story short I need a standard library solution. Is there anyway with subprocess and/or any other stdlib? What am I forgetting above?
Try
proc.stdin.write('yourPassword\n')
proc.stdin.flush()
That should work.
What you describe sounds like stdin=None where the child process inherits the stdin of the parent (your Python program).
Perhaps you should use an expect-like library instead?
For instance Pexpect (example). There are other, similar python libraries as well.
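For the sftp case in this question, a minimal Pexpect sketch might look like the following (not standard library, so only if that constraint can be relaxed; 'user@server', the password and the get command are the question's placeholders):
import pexpect

child = pexpect.spawn('sftp user@server')
child.expect('[pP]assword:')
child.sendline('password')
child.expect('sftp> ')
child.sendline('get somefile.txt')   # hypothetical transfer command
child.expect('sftp> ')
child.sendline('bye')
child.expect(pexpect.EOF)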
from subprocess import Popen, PIPE
proc = Popen(['sftp', 'user@server', 'stop'], stdin=PIPE)
proc.communicate(input='password')
Try with input='password' in communicate(); that worked for me.
Use Paramiko for SFTP. For anything else, this works:
import subprocess
args = ['command-that-requires-password', '-user', 'me']
proc = subprocess.Popen(args,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
proc.stdin.write('mypassword\n')
proc.stdin.flush()
stdout, stderr = proc.communicate()
print stdout
print stderr
For some reason, I couldn't get any of the standard library answers here to work for me - getting very strange problems with all of them. Someone else here: unable to provide password to a process with subprocess [python] had the same problem, and concluded that ultimately you just have to go with pexpect to be able to send a password.
I wanted to add my final code here to just save the time of anyone having a similar problem, since I wasted so much time on this (Python 3, 2020):
ssh_password = getpass("user's password: ")
ssh_password = (ssh_password + "\n").encode()
scp_command = 'scp xx.xx.xx.xx:/path/to/file.log /local/save/path/'
child = pexpect.spawn(scp_command)
# make output visible for debugging / progress watching
child.logfile = sys.stdout.buffer
i = child.expect([pexpect.TIMEOUT, "password:"])
if i == 0:
    print("Got unexpected output: {} {}".format(child.before, child.after))
    return
else:
    child.sendline(ssh_password)
    child.read()
The above code runs an SCP command to pull a file from the remote server onto your local computer - alter the server IP and paths as necessary.
Key things to remember:
Have to have a pexpect.TIMEOUT in the child.expect call
Have to encode to bytes whatever strings you pass in, using the default encoding
Write pexpect output to sys.stdout.buffer so that you can actually see what is going on
Have to have a child.read() at the end
I would recommend scrapping the subprocess approach and using the paramiko package for sftp access.
This same problem plagued me for a week. I had to submit a password from user input through subprocess securely because I was trying to avoid introducing a command injection vulnerability. Here is how I solved the problem with a little help from a colleague.
import subprocess
command = ['command', 'option1', '--password']
subprocess.Popen(command, stdin=subprocess.PIPE).wait(timeout=60)
The .wait(timeout=int) was the most important component because it allows the user to feed input to stdin. Otherwise, the timeout is defaulted to 0 and leaves the user no time to enter input, which consequently results in a None or null string. Took me FOREVER to figure this out.
For repeat use cases where you know you'll have to do this multiple times, you can wrap the Popen call in a private helper. I was told by the same programmer that this is best practice if you anticipate someone else maintaining the code later and you don't want them to mess with it.
import os

def _popen(cmd):
    proc_h = subprocess.Popen(cmd, stdin=subprocess.PIPE)
    proc_h.wait(timeout=60)
    return proc_h.poll() == os.EX_OK
It is important to remove stdout=subprocess.PIPE if the user is going to be prompted for input. Otherwise, the process appears to hang for 60 seconds, and the user doesn't get a prompt, nor do they realize they are expected to give a password. The stdout will naturally go to the shell window and allow the user to pass input to popen().
Also, to explain the return proc_h.poll() == os.EX_OK: poll() returns 0 if the command succeeded. This is just C-style best practice for when you want to return system error codes in case the function fails, while accounting for the fact that a return value of 0 would otherwise be treated as "false" by the interpreter.
This is a pure Python solution using expect - not pexpect.
If on Ubuntu you first need to install expect with:
sudo apt install expect
Python 3.6 or later:
import subprocess

def sftp_rename(from_name, to_name):
    sftp_password = 'abigsecret'
    sftp_username = 'foo'
    destination_hostname = 'some_hostname'
    # example values (normally taken from the parameters):
    # from_name = 'oldfilename.txt'
    # to_name = 'newfilename.txt'
    commands = f"""
    spawn sftp -o "StrictHostKeyChecking no" {sftp_username}@{destination_hostname}
    expect "password:"
    send "{sftp_password}\r"
    expect "sftp>"
    send "rename {from_name} {to_name}\r"
    expect "sftp>"
    send "bye\r"
    expect "#"
    """
    sp = subprocess.Popen(['expect', '-c', commands],
                          stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    sp.communicate()   # wait for the expect script to finish
Since what you want is just to grab a file, I tried to use subprocess, but it did not work for me. So now I am using paramiko; here is my code:
Here is one tutorial I found online:
Transfer a file from local server to remote server and vice versa using paramiko of python
"https://www.youtube.com/watch?v=dtvV2xKaVjw"
Below is my code for transferring all the files in one folder from Linux to Windows:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname='11.11.11.1111', username='root', password='********', port=22)
sftp_client = ssh.open_sftp()
source_folder = '/var/ftp/file_pass'
local_folder = 'C:/temp/file_pass'
inbound_files = sftp_client.listdir(source_folder)
print(inbound_files)
for ele in inbound_files:
    try:
        path_from = source_folder + '/' + ele
        path_to = local_folder + '/' + ele
        sftp_client.get(path_from, path_to)
    except:
        print(ele)
sftp_client.close()
ssh.close()
Python has a built-in library called ftplib that can be used for FTP processes without any hassle (assuming the remote server has an FTP service running).
from ftplib import FTP
ftp = FTP('ftp.us.debian.org') # connect to host, default port
ftp.login() # user anonymous, passwd anonymous#
##'230 Login successful.'
ftp.cwd('debian') # change into "debian" directory
##'250 Directory successfully changed.'
ftp.retrlines('LIST')
Otherwise, you can use the scp command, which is a command-line tool. The password problem can be avoided by setting up passwordless (key-based) login for the remote host.
import os
os.system('scp remoteuser#remotehost:/remote/location/remotefile.txt /client/location/')
To set up passwordless login on Linux systems, follow the steps below (see also this SO answer):
> ssh-keyscan remotehost >> known_hosts
> ssh-keygen -t rsa    # press ENTER for every field (one time)
> ssh-copy-id remoteuser@remotehost
The safest way to do this is to prompt for the password beforehand and then pipe it into the command. Prompting for the password will avoid having the password saved anywhere in your code. Here's an example:
from getpass import getpass
from subprocess import Popen, PIPE
password = getpass("Please enter your password: ")
proc = Popen("sftp user#server stop".split(), stdin=PIPE)
# Popen only accepts byte-arrays so you must encode the string
proc.communicate(password.encode())
import subprocess
args = ['command', 'arg1', 'arg2']
proc = subprocess.Popen(args, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
proc.stdin.write(b'password') ##The b prefix is necessary because it needs a byte type
proc.stdin.flush()
stdout, stderr = proc.communicate()
print(stdout)
print(stderr)
You just forgot the line return (aka user pressing Enter) in your password.
from subprocess import Popen, PIPE
proc = Popen(['sftp', 'user@server', 'stop'], stdin=PIPE)
proc.communicate('password\n'.encode())
Also .encode(), because by default proc.communicate() accepts a bytes-like object.