It seems very simple, but I have searched multiple resources and could not find an answer on how to change a remote Linux system's password using Python and SFTP.
def changepwd():
    sftp_client = ssh.open_sftp()
    # change password of root on remote server
Are there any built-in modules that I can use to change the password?
Thanks in advance.
Thanks for all your help. This is how I changed the password for 'root'.
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, username=Name, password=Pwd)
print("Connection successfully established with %s" % hostname)
# send the new password twice (new password + confirmation) to passwd's stdin
stdin, stdout, stderr = ssh.exec_command('echo -e "newpasswd\\nnewpasswd" | passwd')
print("stderr: ", stderr.readlines())
print("pwd: ", stdout.readlines())
ssh.close()
You cannot change a password with the SFTP protocol.
You can change a password with the SSH protocol. But the SSH protocol's API for changing a password is not supported by the most widespread SSH server – OpenSSH. Nor is it supported by the most widespread Python SSH library – Paramiko. So this most likely won't work for you anyway.
So in the end the only viable option is to execute a relevant shell command (passwd or chpasswd) via SSH (e.g. using Paramiko).
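A minimal sketch of that approach, assuming Paramiko, a root (or suitably privileged) login, and a hypothetical new_password value (chpasswd reads user:password pairs from its standard input):
import paramiko

new_password = 's3cret'  # hypothetical value

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect('hostname', username='root', password='current_password')

# feed "user:newpassword" to chpasswd on its standard input
stdin, stdout, stderr = ssh.exec_command('chpasswd')
stdin.write('root:%s\n' % new_password)
stdin.channel.shutdown_write()   # signal EOF so chpasswd processes the input
print(stderr.read().decode())
ssh.close()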
Related
I want to list all the files on my server, but it returns an empty list like this: []. What can I do, and what is wrong with my code? I am using a module called paramiko.
command = "ls"
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(host, port, username, password)
stdin, stdout, stderr = ssh.exec_command(command)
lines = stdout.readlines()
print(lines)
Your code has nothing to do with SFTP. You are executing the ls command in a remote shell.
To list files using SFTP, use SFTPClient.listdir_attr:
sftp = ssh.open_sftp()
for entry in sftp.listdir_attr():
print(entry.filename)
Obligatory warning: Do not use AutoAddPolicy – You are losing a protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server".
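A minimal sketch of the safer pattern, assuming the server's key is already recorded in your known_hosts file (paths are the usual defaults):
import paramiko

ssh = paramiko.SSHClient()
# trust only keys already recorded in ~/.ssh/known_hosts (and the system file)
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(paramiko.RejectPolicy())  # refuse unknown hosts
ssh.connect(host, port, username, password)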
I am using this code for executing a command on a remote server.
import subprocess
import sys

HOST = "user@hostname"   # placeholder: the remote host to connect to
COMMAND = "ls"

ssh = subprocess.Popen(["ssh", "%s" % HOST, COMMAND],
                       shell=False,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print("ERROR: %s" % error, file=sys.stderr)
else:
    print(result)
When I try to execute this script, I get a prompt for the password. Is there any way I could avoid it; for example, can I enter the password in the script somehow? Also, the password should be encrypted somehow so that people who have access to the script cannot see it.
Why make it so complicated? Here's what I suggest:
1) Create an ssh config section in your ~/.ssh/config file:
Host myserver
HostName 50.50.50.12 (fill in with your server's ip)
Port xxxx (optional)
User me (your username for server)
2) If you haven't generated your ssh keypair yet, do it now (with ssh-keygen). Then upload it with:
$ ssh-copy-id myserver
3) Now you can use subprocess with ssh. For example, to capture output, I call:
result = subprocess.check_output(['ssh', 'myserver', 'cat', 'somefile'])
Simple, robust, and the only time a password is needed is when you copy the public key to the server.
BTW, your code will probably work just fine as well using these steps.
One way is to create a key pair, put the public key on the server, and do ssh -i /path/to/private/key user@host, or use paramiko like this:
import paramiko
import getpass
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
p = getpass.getpass()
ssh.connect('hostname', username='user', password=p)
stdin, stdout, stderr = ssh.exec_command('ls')
print(stdout.readlines())
ssh.close()
You should use pexpect or paramiko to connect to the remote machine, then spawn a child process, and then run the subprocess to achieve what you want.
Here's what I did when encountering this issue before:
Set up your ssh keys for access to the server.
Set up an alias for the server you're accessing. Below I'll call it remote_server.
Put the following two lines at the end of ~/.bash_profile.
eval $(ssh-agent -s)
ssh-add
Now every time you start your shell, you will be prompted for a passphrase. By entering it, you will authenticate your ssh keys and put them 'in hand' at the start of your bash session. For the remainder of your session you will be able to run commands like
ssh remote_server ls
without being prompted for a passphrase. Here ls will run on the remote server and return the results to you. Likewise your python script should run without password prompt interruption if you execute it from the shell.
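For instance, a minimal sketch of what such a script could look like (the remote_server alias and the ls command are just the examples from above):
import subprocess

# the ssh-agent set up in ~/.bash_profile supplies the key,
# so no password prompt interrupts the script
output = subprocess.check_output(['ssh', 'remote_server', 'ls'])
print(output.decode())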
You'll also be able to ssh to the server just by typing ssh remote_server without having to enter your username or password every time.
The upside to doing it this way is that you should be doing this anyway to avoid password annoyances and remembering funky server names :) Also you don't have to worry about having passwords saved anywhere in your script. The only potential downside is that if you want to share the python script with others, they'll have to do this configuring as well (which they should anyway).
You don't really need something like pexpect to handle this. SSH keys already provide a very good and secure solution to this sort of issue.
The simplest way to get the results you want would probably be to generate an ssh key and place it in the .ssh folder of your device. I believe github has a pretty good guide to doing that, if you look into it. Once you set up the keys correctly on both systems, you won't actually have to add a single line to your code. When you don't specify a password it will automatically use the key to authenticate you.
While subprocess.Popen might work for wrapping ssh access, this is not the preferred way to do so.
I recommend using paramiko.
import paramiko
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(server, username=user, password=password)
...
ssh_client.close()
And if you want to simulate a terminal, as if a user were typing:
import time

chan = ssh_client.invoke_shell()

def exec_cmd(cmd):
    """Gets ssh command(s), executes them, and returns the output"""
    prompt = 'bash $'  # the command-line prompt in the ssh terminal
    buff = ''
    chan.send(str(cmd) + '\n')
    while not chan.recv_ready():
        time.sleep(1)
    while not buff.endswith(prompt):
        buff += chan.recv(1024).decode()
    return buff[:-len(prompt)]  # strip the trailing prompt
Example usage: exec_cmd('pwd')
If you don't know the prompt in advance, you can set it with:
chan.send('PS1="python-ssh:"\n')
You could use the following.
import subprocess
import sys

COMMAND = "ls"
ssh = subprocess.Popen('powershell putty.exe user@HOST -pw "password"',
                       stdout=subprocess.PIPE,
                       stdin=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print("ERROR: %s" % error, file=sys.stderr)
else:
    print(result)
I use Paramiko for establishing an SSH connection with some target device, and I want to execute a reboot command.
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(zip_hostname, username=username, password=password, timeout=1)
try:
    stdin, stdout, stderr = ssh.exec_command("/sbin/reboot -f")
    # .........
    # some code
    # .........
except paramiko.AuthenticationException as e:
    print('')
finally:
    ssh.close()
But after executing ssh.exec_command("/sbin/reboot -f"), "some code" does not execute because the program is stuck in exec_command (a disconnection takes place, caused by the reboot). What should I do to solve my problem?
Try this:
ssh.exec_command("/sbin/reboot -f > /dev/null 2>&1 &")
All the output of reboot is redirected to /dev/null so that it produces no output, and it is started in the background thanks to the '&' sign at the end. Hopefully the program won't hang on that line this way, because the remote shell gives the prompt back immediately.
Get the transport from the ssh and set the keepalive using:
transport = ssh.get_transport()
transport.set_keepalive(5)
This sets the keepalive to 5 seconds; mind you I would have expected the timeout=1 to have achieved the same thing.
All you need to do is to call channel.exec_command() instead of the high-level interface client.exec_command()
import socket

# exec: fire and forget
timeout = 0.5
transport = ssh.get_transport()
chan = transport.open_session(timeout=timeout)
chan.settimeout(timeout)
try:
    chan.exec_command(command)
except socket.timeout:
    pass
I was having this issue and managed to avoid it by switching to this command:
/sbin/shutdown -r now
Note that this command does not produce any STDOUT or STDERR output.
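A minimal sketch of how that looks with the Paramiko connection from the question (everything else stays the same):
# same try/finally structure as above, only the command differs;
# the command prints nothing, so there is no output to wait for
stdin, stdout, stderr = ssh.exec_command("/sbin/shutdown -r now")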
In case you or anyone else gets stuck trying to reboot a host with sudo using agent forwarding (ssh keys), or in my case a YubiKey:
If you look at this as bash, you would reboot a host as a non-root user like this:
ssh -t -A user@hostname sudo /sbin/reboot
For the -A flag, from the ssh man page:
Enables forwarding of the authentication agent connection. This can also be specified on a per-host basis in a
configuration file.
Agent forwarding should be enabled with caution. Users with the ability to bypass file permissions on the
remote host (for the agent’s Unix-domain socket) can access the local agent through the forwarded connection.
An attacker cannot obtain key material from the agent, however they can perform operations on the keys that
enable them to authenticate using the identities loaded into the agent.
For the -t flag, from the ssh man page:
Force pseudo-tty allocation. This can be used to execute arbitrary screen-based programs on a remote machine,
which can be very useful, e.g. when implementing menu services. Multiple -t options force tty allocation, even
if ssh has no local tty.
So let's break this down into how you would do this in Paramiko:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=host, username=username)
s = ssh.get_transport().open_session()
paramiko.agent.AgentRequestHandler(s)
ssh.exec_command("sudo /sbin/reboot", get_pty=True)
For authentication agent forwarding (the -A flag in the ssh command) in Paramiko:
ssh = paramiko.SSHClient() #'ssh' is client variable
s = ssh.get_transport().open_session() #get 'ssh' transport and open sessions assigned to 's' variable
paramiko.agent.AgentRequestHandler(s) #call in 's' to the forwarding agent for current ssh session
Now for forcing pseudo-tty allocation (the -t flag in the ssh command) in Paramiko:
ssh.exec_command("sudo /sbin/reboot", get_pty=True)
Adding get_pty=True to exec_command will allow you to execute sudo /sbin/reboot.
Hope this helps. Everyone's environment is different, but this should work, as it is the exact same thing as if you ran it in bash.
I know that Paramiko supports Pageant under Windows, but it doesn't work by default.
I am looking for an example of connecting using the key that is loaded in Pageant.
This is what I am using to connect and do an automated login, using Pageant to store my key and connecting to it from within my Python script. It counts on Pageant already being loaded (I haven't found a good, reliable way to launch it and load the key, prompting for the key's passphrase), but the below works for now.
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
host = 'somehost.com'
port = 22
ssh.connect(host, port=port, username='user', allow_agent=True)
stdin,stdout,stderr = ssh.exec_command("ps -ef")
print(stdout.read())
print(stderr.read())
What's the most pythonic way to scp a file in Python? The only route I'm aware of is
os.system('scp "%s" "%s:%s"' % (localfile, remotehost, remotefile) )
which is a hack, doesn't work outside Linux-like systems, and needs help from the Pexpect module to avoid password prompts unless you already have passwordless SSH set up to the remote host.
I'm aware of Twisted's conch, but I'd prefer to avoid implementing scp myself via low-level ssh modules.
I'm aware of paramiko, a Python module that supports SSH and SFTP; but it doesn't support SCP.
Background: I'm connecting to a router which doesn't support SFTP but does support SSH/SCP, so SFTP isn't an option.
EDIT:
This is a duplicate of How to copy a file to a remote server in Python using SCP or SSH?. However, that question doesn't give an scp-specific answer that deals with keys from within Python. I'm hoping for a way to run code kind of like
import scp
client = scp.Client(host=host, user=user, keyfile=keyfile)
# or
client = scp.Client(host=host, user=user)
client.use_system_keys()
# or
client = scp.Client(host=host, user=user, password=password)
# and then
client.transfer('/etc/local/filename', '/etc/remote/filename')
Try the Python scp module for Paramiko. It's very easy to use. See the following example:
import paramiko
from scp import SCPClient
def createSSHClient(server, port, user, password):
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(server, port, user, password)
return client
ssh = createSSHClient(server, port, user, password)
scp = SCPClient(ssh.get_transport())
Then call scp.get() or scp.put() to do SCP operations.
(SCPClient code)
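For instance, a hedged example of what those calls might look like (the file paths are just illustrative):
# upload a local file to the remote machine
scp.put('local_file.txt', '/tmp/remote_file.txt')

# download it back under a different name
scp.get('/tmp/remote_file.txt', 'local_copy.txt')

scp.close()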
You might be interested in trying Pexpect (source code). This would allow you to deal with interactive prompts for your password.
Here's a snippet of example usage (for ftp) from the main website:
# This connects to the openbsd ftp site and
# downloads the recursive directory listing.
import pexpect
child = pexpect.spawn ('ftp ftp.openbsd.org')
child.expect ('Name .*: ')
child.sendline ('anonymous')
child.expect ('Password:')
child.sendline ('noah@example.com')
child.expect ('ftp> ')
child.sendline ('cd pub')
child.expect('ftp> ')
child.sendline ('get ls-lR.gz')
child.expect('ftp> ')
child.sendline ('bye')
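Applied to the scp case from the question, a rough sketch might look like this (the file paths, host, and password variable are assumptions, not something from the pexpect docs):
import pexpect

# spawn scp and answer its password prompt programmatically
child = pexpect.spawn('scp /etc/local/filename user@remotehost:/etc/remote/filename')
child.expect('password:')
child.sendline(password)       # password obtained elsewhere, e.g. via getpass
child.expect(pexpect.EOF)      # wait for the transfer to finish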
Couldn't find a straight answer, and this "scp.Client" module doesn't exist.
Instead, this suits me:
from paramiko import SSHClient
from scp import SCPClient
ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect('example.com')
with SCPClient(ssh.get_transport()) as scp:
scp.put('test.txt', 'test2.txt')
scp.get('test2.txt')
You could also check out paramiko. There's no scp module (yet), but it fully supports sftp.
[EDIT]
Sorry, missed the line where you mentioned paramiko.
The following module is simply an implementation of the scp protocol for paramiko.
If you don't want to use paramiko or conch (the only ssh implementations I know of for python), you could rework this to run over a regular ssh session using pipes.
scp.py for paramiko
import paramiko
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('<IP Address>', username='<User Name>', password='', key_filename='<.PEM file path>')
#Setup sftp connection and transmit this script
print ("copying")
sftp = client.open_sftp()
sftp.put(<Source>, <Destination>)
sftp.close()
If you install PuTTY on Win32, you get pscp (PuTTY scp),
so you can use the os.system hack on Win32 too
(and you can use the PuTTY agent, Pageant, for key management).
Sorry, it is only a hack
(but you can wrap it in a Python class).
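A hedged sketch of that hack wrapped in Python (pscp must be on PATH; the host and paths are placeholders; keys can be served by Pageant):
import subprocess

def pscp_upload(local_path, remote_path, host='user@hostname'):
    """Copy a file to the remote host using PuTTY's pscp."""
    subprocess.check_call(['pscp', local_path, '%s:%s' % (host, remote_path)])

pscp_upload('test.txt', '/tmp/test.txt')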
As of today, the best solution is probably AsyncSSH
https://asyncssh.readthedocs.io/en/latest/#scp-client
import asyncssh  # and run the block below inside an "async def" coroutine

async with asyncssh.connect('host.tld') as conn:
    await asyncssh.scp((conn, 'example.txt'), '.', recurse=True)
You can use the subprocess package and its call function to run the scp command from the shell.
from subprocess import call
cmd = "scp user1#host1:files user2#host2:files"
call(cmd.split(" "))
Have a look at fabric.transfer.
from fabric import Connection
with Connection(host="hostname",
user="admin",
connect_kwargs={"key_filename": "/home/myuser/.ssh/private.key"}
) as c:
c.get('/foo/bar/file.txt', '/tmp/')
It has been quite a while since this question was asked, and in the meantime, another library that can handle this has cropped up:
You can use the copy function included in the Plumbum library:
import plumbum
r = plumbum.machines.SshMachine("example.net")
# this will use your ssh config as `ssh` from shell
# depending on your config, you might also need additional
# params, eg: `user="username", keyfile=".ssh/some_key"`
fro = plumbum.local.path("some_file")
to = r.path("/path/to/destination/")
plumbum.path.utils.copy(fro, to)
If you are on *nix you can use sshpass
sshpass -p password scp -o User=username -o StrictHostKeyChecking=no src dst:/path
Hmmm, perhaps another option would be to use something like sshfs (there's an sshfs for Mac too). Once your router is mounted, you can just copy the files outright. I'm not sure if that works for your particular application, but it's a nice solution to keep handy.
A while ago I put together a Python SCP copy script that depends on paramiko. It includes code to handle connections with a private key or SSH key agent, with a fallback to password authentication.
http://code.activestate.com/recipes/576810-copy-files-over-ssh-using-paramiko/
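The fallback logic is roughly along these lines (a hedged sketch, not the recipe verbatim; the host and username are placeholders):
import getpass
import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
try:
    # first try the ssh agent and any keys found in ~/.ssh
    ssh.connect('hostname', username='user', allow_agent=True, look_for_keys=True)
except (paramiko.AuthenticationException, paramiko.SSHException):
    # fall back to interactive password authentication
    ssh.connect('hostname', username='user',
                password=getpass.getpass('Password: '),
                allow_agent=False, look_for_keys=False)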