I need to create an SSH tunnel to read information from a database. I use Paramiko, but I have not worked with tunneling yet. Please provide a simple code example that creates and closes a tunnel.
At work we usually create SSH tunnels that forward ports. The way we do that is by running the standard command ssh -L port:addr:port addr with subprocess in a separate thread.
I found this useful link: https://github.com/paramiko/paramiko/blob/master/demos/forward.py with an example of doing port forwarding with paramiko.
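For reference, a minimal sketch of that subprocess approach (Popen is already non-blocking, so no separate thread is shown; the host, ports and sleep-based wait are placeholders, not tested against your setup):

import subprocess
import time

# forward local port 3307 to port 3306 on the remote side; -N means "run no command"
tunnel = subprocess.Popen(
    ['ssh', '-N', '-L', '3307:127.0.0.1:3306', 'user@db-gateway.example.com']
)
time.sleep(2)  # crude wait until the forward is up; a real script should poll the port
try:
    pass  # talk to the database through 127.0.0.1:3307 here
finally:
    tunnel.terminate()  # close the tunnel
    tunnel.wait()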
I used sshtunnel for my projects. An example of forwarding a remote MySQL port to a local port on the host:
pip install sshtunnel
python -m sshtunnel -U root -P password -L :3306 -R 127.0.0.1:3306 -p 2222 localhost
Even though this does not use paramiko, I believe it's a very clean solution to implement (similar to @dario's answer but without managing the thread in Python).
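If you would rather drive sshtunnel from Python instead of the command line, a minimal sketch (hosts, credentials and ports are placeholders) might look like this:

from sshtunnel import SSHTunnelForwarder

server = SSHTunnelForwarder(
    ('ssh.example.com', 2222),                 # SSH server and port
    ssh_username='root',
    ssh_password='password',
    remote_bind_address=('127.0.0.1', 3306),   # MySQL on the remote side
)
server.start()
print(server.local_bind_port)  # point your MySQL client at this local port
# ... query the database here ...
server.stop()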
There's this little-mentioned feature in the openssh client that allows us to control an ssh process through a unix socket, quoting man ssh:
-M      Places the ssh client into “master” mode for connection sharing. Multiple -M options places ssh into “master” mode with confirmation required before slave connections are accepted. Refer to the description of ControlMaster in ssh_config(5) for details.

-S ctl_path
        Specifies the location of a control socket for connection sharing, or the string “none” to disable connection sharing. Refer to the description of ControlPath and ControlMaster in ssh_config(5) for details.
So you can start a background ssh process (with -Nf) and then check (or terminate) it with another ssh call.
I use this in a project that requires a reverse tunnel to be established:
from subprocess import call, STDOUT
import os

DEVNULL = open(os.devnull, 'wb')

CONFIG = dict(
    SSH_SERVER='ssh.server.com',
    SSH_PORT=2222,
    SSH_USER='myuser',
    SSH_KEY='/path/to/user.key',
    REMOTE_PORT=62222,
    UNIX_SOCKET='/tmp/ssh_tunnel.sock',
    KNOWN_HOSTS='/path/to/specific_known_host_to_conflicts',
)

def start():
    # Start a background master ssh (-Nf, -M) with a reverse tunnel (-R),
    # controllable later through the unix socket (-S).
    return call(
        [
            'ssh', CONFIG['SSH_SERVER'],
            '-Nfi', CONFIG['SSH_KEY'],
            '-MS', CONFIG['UNIX_SOCKET'],
            '-o', 'UserKnownHostsFile=%s' % CONFIG['KNOWN_HOSTS'],
            '-o', 'ExitOnForwardFailure=yes',
            '-p', str(CONFIG['SSH_PORT']),
            '-l', CONFIG['SSH_USER'],
            '-R', '%d:localhost:22' % CONFIG['REMOTE_PORT']
        ],
        stdout=DEVNULL,
        stderr=STDOUT
    ) == 0

def stop():
    # Ask the master process to exit, tearing down the tunnel.
    return __control_ssh('exit') == 0

def status():
    # Check whether the master process (and thus the tunnel) is still alive.
    return __control_ssh('check') == 0

def __control_ssh(command):
    return call(
        ['ssh', '-S', CONFIG['UNIX_SOCKET'], '-O', command, 'x'],
        stdout=DEVNULL,
        stderr=STDOUT
    )
-o ExitOnForwardFailure=yes makes sure the ssh command fails if the tunnel cannot be established; without it, ssh would keep running even though the forwarding was never set up.
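A quick usage sketch of the helpers above:

if start():
    print('tunnel established, alive:', status())
    # ... work over the tunnel ...
    stop()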
Might I suggest trying something like pyngrok to programmatically manage an ngrok tunnel for you? Full disclosure, I am the developer of it. SSH example here, but it's as easy as installing pyngrok:
pip install pyngrok
and using it:
from pyngrok import ngrok
# <NgrokTunnel: "tcp://0.tcp.ngrok.io:12345" -> "localhost:22">
ssh_tunnel = ngrok.connect(22, "tcp")
I used paramiko for a project a year ago; here is the part of my code where I connected to another computer/server and executed a simple Python file:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname='...', username='...', password='...')
stdin, stdout, stderr = ssh.exec_command('python hello.py')
ssh.close()
stdin, stdout and stderr contain the input/output streams of the command you executed.
From here, I think you can make the connection with the database.
Here is some good information about paramiko.
ssh -R 80:localhost:8080 nokey@localhost.run is equivalent to what in Paramiko?
I searched a lot but with no success...
Code I tried
import paramiko

command = "df"

# Update the next three lines with your
# server's information
username = "localhost:9876"
host = "nokey@localhost.run"

client = paramiko.client.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(host, port=80, username=username)
_stdin, _stdout, _stderr = client.exec_command("df")
print(_stdout.read().decode())
client.close()
A partial equivalent is Transport.request_port_forward.
But that will only set up the server-side part of the forwarding. The local-side implementation that establishes the local connection from the SSH client (Paramiko) to the local server is up to you.
An example how to implement it is given in demos/rforward.py.
An equivalent of your command will be:
python3 rforward.py -u nokey -p 80 -r 127.0.0.1:8080 localhost.run
Note that the rforward.py script does not start a shell; it just does the forwarding, so it's actually an equivalent of ssh with the -N switch.
If you need both forwarding and a shell, you will need to modify the script to additionally call SSHClient.invoke_shell, as demos/demo.py does.
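For reference, a stripped-down sketch of what rforward.py does (my own condensation, not the demo itself; the host, user and ports mirror the question, and the relay loop is the local-side part you have to supply yourself):

import select
import socket
import threading

import paramiko

def relay(chan, host, port):
    # connect to the local server and shuffle bytes in both directions
    sock = socket.create_connection((host, port))
    while True:
        r, _, _ = select.select([sock, chan], [], [])
        if sock in r:
            data = sock.recv(1024)
            if not data:
                break
            chan.send(data)
        if chan in r:
            data = chan.recv(1024)
            if not data:
                break
            sock.send(data)
    chan.close()
    sock.close()

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('localhost.run', username='nokey')

transport = client.get_transport()
transport.request_port_forward('', 80)  # the server-side half of -R 80:localhost:8080
while True:
    chan = transport.accept(60)
    if chan is None:
        continue
    threading.Thread(target=relay, args=(chan, '127.0.0.1', 8080), daemon=True).start()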
I am new to Python. I need to log in to a server daily (Desktop -> 1.32 -> 0.20 -> 3.26). For this I open PuTTY and log in over an SSH connection. I want to write a Python script that does all of this.
From googling I thought subprocess.Popen would do that, but it's not working.
1st trial:
import subprocess
pid = subprocess.Popen("putty.exe user@xxx.xx.x.32 -pw password").pid
It's working fine (it opens a window and logs into .32), but I can't give it any input. I came to know that to give input to the same process we need to use pipes.
2nd trial:
from subprocess import Popen, PIPE, STDOUT
p = Popen("putty.exe user#xxx.xx.x.32 -pw password", stdout=PIPE, stdin=PIPE, stderr=STDOUT)
grep_stdout = p.communicate(input=b'ssh xx.xx.x.20\n')[0]
print(grep_stdout.decode())
With this I can't even log in to the first server. After logging in to all the servers I need the terminal to stay alive. How do I do this?
Edit
I need to do this in a new PuTTY window. After logging in, the window should not be closed; I have some manual work to do.
Use PowerShell to call PuTTY in order to open a new window:
from subprocess import Popen
Popen("powershell putty.exe user#host -pw mypassword")
Use the paramiko library for Python.
Establish an SSH connection using:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, username=username, password=password)
Check whether the connection is alive using:
status = ssh.get_transport().is_active()
#returns True if connection is alive/active
ssh.exec_command() is basically a single session. Use exec_command('command1; command2') to execute multiple commands in one session.
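For example (the commands are illustrative):

# both commands run in the same exec session, separated by ';'
stdin, stdout, stderr = ssh.exec_command('cd /tmp; ls -l')
print(stdout.read().decode())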
Also, you can use this to execute multiple commands in a single session:
channel = ssh.invoke_shell()
stdin = channel.makefile('wb')
stdout = channel.makefile('rb')
# the trailing 'exit' closes the shell so that stdout.read() returns
stdin.write(b'''
command 1
command 2
exit
''')
print(stdout.read().decode())
There is an SSHv2 protocol implementation for Python: http://www.paramiko.org/. You can easily install it with pip:
pip install paramiko
Then you can create an SSH client, connect to your host and execute commands:
import paramiko
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect('hostname', username='login', password='pwd')
stdin, stdout, stderr = ssh_client.exec_command('command')
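To read the command's output afterwards you can, for instance, do:

print(stdout.read().decode())   # the command's standard output
print(stderr.read().decode())   # anything written to standard error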
I created a .bat file on Windows that references PuTTY and session-specific info. This .bat file can run by itself on Windows. To call it from Python, I used subprocess.run() (Python 3.5+).
Example of bat file named putty.bat:
start c:\app\PuTTy\putty.exe -load 192.168.1.230-node1-logs -l <logon user> -pw <logon user password for putty session>
Breaking down the bat file:
It begins with the Windows command "start".
c:\app\PuTTy\putty.exe --> is the putty directory on Windows containing putty.exe.
-load --> tells putty to load a putty profile. The profile is the name you see on the putty client, under "Saved Sessions".
192.168.1.230-node1-logs --> my putty session specific profile.
-l for logon --> followed by the putty logon user.
-pw is the logon password --> followed by the putty logon password.
That concludes the contents of "putty.bat".
From within Python, I used the subprocess.run() command.
Example:
import subprocess
...
...
try:
    process = subprocess.run(["putty.bat"], check=True, stdout=subprocess.PIPE, universal_newlines=True)
    print(process.stdout)
except Exception as e:
    print("subprocess call error in open putty command")
    print(str(e))
I hope you find this helpful
I am using this code for executing a command on a remote server.
import subprocess
import sys

HOST = "user@example.com"  # placeholder remote host
COMMAND = "ls"

ssh = subprocess.Popen(["ssh", "%s" % HOST, COMMAND],
                       shell=False,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print("ERROR: %s" % error, file=sys.stderr)
else:
    print(result)
When I try to execute this script, I get a prompt for the password. Is there any way I could avoid it; for example, can I enter the password in the script somehow? Also, the password should be encrypted somehow so that people who have access to the script cannot see it.
Why make it so complicated? Here's what I suggest:
1) Create an ssh config section in your ~/.ssh/config file:
Host myserver
    HostName 50.50.50.12 (fill in with your server's ip)
    Port xxxx (optional)
    User me (your username for server)
2) If you haven't generated your ssh keypair yet, do it now (with ssh-keygen). Then upload it with:
$ ssh-copy-id myserver
3) Now you can use subprocess with ssh. For example, to capture output, I call:
result = subprocess.check_output(['ssh', 'myserver', 'cat', 'somefile'])
Simple, robust, and the only time a password is needed is when you copy the public key to the server.
BTW, your code will probably work just fine as well using these steps.
One way is to create an SSH key pair, put the public key on the server, and do ssh -i /path/to/private/key user@host, or use paramiko like this:
import paramiko
import getpass
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
p = getpass.getpass()
ssh.connect('hostname', username='user', password=p)
stdin, stdout, stderr = ssh.exec_command('ls')
print(stdout.readlines())
ssh.close()
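If you go the key route instead, the paramiko connect call changes to something like this (the key path is a placeholder):

ssh.connect('hostname', username='user', key_filename='/home/user/.ssh/id_rsa')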
You should use pexpect or paramiko to connect to the remote machine, then spawn a child, and then run subprocess to achieve what you want.
Here's what I did when encountering this issue before:
Set up your ssh keys for access to the server.
Set up an alias for the server you're accessing. Below I'll call it remote_server.
Put the following two lines at the end of ~/.bash_profile.
eval $(ssh-agent -s)
ssh-add
Now every time you start your shell, you will be prompted for a passphrase. By entering it, you will authenticate your ssh keys and put them 'in hand' at the start of your bash session. For the remainder of your session you will be able to run commands like
ssh remote_server ls
without being prompted for a passphrase. Here ls will run on the remote server and return the results to you. Likewise your python script should run without password prompt interruption if you execute it from the shell.
You'll also be able to ssh to the server just by typing ssh remote_server without having to enter your username or password every time.
The upside to doing it this way is that you should be doing this anyway to avoid password annoyances and remembering funky server names :) Also you don't have to worry about having passwords saved anywhere in your script. The only potential downside is that if you want to share the python script with others, they'll have to do this configuring as well (which they should anyway).
You don't really need something like pexpect to handle this. SSH keys already provide a very good and secure solution to this sort of issue.
The simplest way to get the results you want would probably be to generate an ssh key and place it in the .ssh folder of your device. I believe github has a pretty good guide to doing that, if you look into it. Once you set up the keys correctly on both systems, you won't actually have to add a single line to your code. When you don't specify a password it will automatically use the key to authenticate you.
While subprocess.Popen might work for wrapping ssh access, this is not the preferred way to do so.
I recommend using paramiko.
import paramiko
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(server, username=user,password=password)
...
ssh_client.close()
And if you want to simulate a terminal, as if a user were typing:
import time

chan = ssh_client.invoke_shell()

def exec_cmd(cmd):
    """Send an ssh command, wait for the prompt, and return the output."""
    prompt = 'bash $'  # the command-line prompt in the ssh terminal
    buff = ''
    chan.send(str(cmd) + '\n')
    while not chan.recv_ready():
        time.sleep(1)
    while not buff.endswith(prompt):
        buff += chan.recv(1024).decode()
    return buff[:-len(prompt)]
Example usage: exec_cmd('pwd')
If you don't know the prompt in advance, you can set it with:
chan.send('PS1="python-ssh:"\n')
You could use the following.
import sys
from subprocess import Popen, PIPE

COMMAND = "ls"

ssh = Popen('powershell putty.exe user@HOST -pw password',
            stdout=PIPE, stdin=PIPE, stderr=PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print("ERROR: %s" % error, file=sys.stderr)
else:
    print(result)
I use Paramiko for establishing an SSH connection with a target device, and I want to execute the reboot command.
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(zip_hostname, username=username, password=password, timeout=1)
try:
    stdin, stdout, stderr = ssh.exec_command("/sbin/reboot -f")
    # .........
    # some code
    # .........
except paramiko.AuthenticationException as e:
    print('')
finally:
    ssh.close()
But after executing ssh.exec_command("/sbin/reboot -f"), "some code" does not execute, because the program gets stuck in exec_command (the disconnection caused by rebooting takes place). What should I do to solve my problem?
Try this:
ssh.exec_command("/sbin/reboot -f > /dev/null 2>&1 &")
All the output of reboot is redirected to /dev/null to make it produce no output, and it is started in the background thanks to the '&' sign at the end. Hopefully the program won't hang on that line this way, because the remote shell gives the prompt back.
Get the transport from the ssh and set the keepalive using:
transport = ssh.get_transport()
transport.set_keepalive(5)
This sets the keepalive to 5 seconds; mind you I would have expected the timeout=1 to have achieved the same thing.
All you need to do is to call channel.exec_command() instead of the high-level interface client.exec_command()
import socket

# exec fire and forget
timeout = 0.5
transport = ssh.get_transport()
chan = transport.open_session(timeout=timeout)
chan.settimeout(timeout)
try:
    chan.exec_command(command)
except socket.timeout:
    pass
I was having this issue and managed to avoid it by switching to this command:
/sbin/shutdown -r now
Note this command does not result in any STDOUT or STDERR output
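Applied to the paramiko code from the question, that is simply (a sketch, using the same ssh client object):

# produces no output, so there is nothing for exec_command to hang on
stdin, stdout, stderr = ssh.exec_command('/sbin/shutdown -r now')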
In case you or anyone else gets stuck trying to reboot a host with sudo while using forwarding agents (ssh keys) or, in my case, a YubiKey:
If you look at this as bash, you would reboot a host as a non-root user like this:
ssh -t -A user@hostname sudo /sbin/reboot
For the -A flag, from ssh man page
Enables forwarding of the authentication agent connection. This can also be specified on a per-host basis in a configuration file.
Agent forwarding should be enabled with caution. Users with the ability to bypass file permissions on the remote host (for the agent's Unix-domain socket) can access the local agent through the forwarded connection. An attacker cannot obtain key material from the agent, however they can perform operations on the keys that enable them to authenticate using the identities loaded into the agent.
For the -t flag, from ssh man page
Force pseudo-tty allocation. This can be used to execute arbitrary screen-based programs on a remote machine, which can be very useful, e.g. when implementing menu services. Multiple -t options force tty allocation, even if ssh has no local tty.
So let's break this down into how you would do this in paramiko:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=host, username=username)
s = ssh.get_transport().open_session()
paramiko.agent.AgentRequestHandler(s)
ssh.exec_command("sudo /sbin/reboot", get_pty=True)
For authentication forwarding (the -A flag in the bash ssh command) in paramiko:
ssh = paramiko.SSHClient() #'ssh' is client variable
s = ssh.get_transport().open_session() #get 'ssh' transport and open sessions assigned to 's' variable
paramiko.agent.AgentRequestHandler(s) #call in 's' to the forwarding agent for current ssh session
Now for forcing pseudo-tty allocation (the -t flag in the bash ssh command) in paramiko:
ssh.exec_command("sudo /sbin/reboot", get_pty=True)
Adding 'get_pty=True' to exec_command will allow you to execute sudo /sbin/reboot.
Hope this helps; everyone's environment is different, but this should work as it is the exact same thing as if you ran it from bash.
What's the most pythonic way to scp a file in Python? The only route I'm aware of is
os.system('scp "%s" "%s:%s"' % (localfile, remotehost, remotefile) )
which is a hack, and which doesn't work outside Linux-like systems, and which needs help from the Pexpect module to avoid password prompts unless you already have passwordless SSH set up to the remote host.
I'm aware of Twisted's conch, but I'd prefer to avoid implementing scp myself via low-level ssh modules.
I'm aware of paramiko, a Python module that supports SSH and SFTP; but it doesn't support SCP.
Background: I'm connecting to a router which doesn't support SFTP but does support SSH/SCP, so SFTP isn't an option.
EDIT:
This is a duplicate of How to copy a file to a remote server in Python using SCP or SSH?. However, that question doesn't give an scp-specific answer that deals with keys from within Python. I'm hoping for a way to run code kind of like
import scp
client = scp.Client(host=host, user=user, keyfile=keyfile)
# or
client = scp.Client(host=host, user=user)
client.use_system_keys()
# or
client = scp.Client(host=host, user=user, password=password)
# and then
client.transfer('/etc/local/filename', '/etc/remote/filename')
Try the Python scp module for Paramiko. It's very easy to use. See the following example:
import paramiko
from scp import SCPClient

def createSSHClient(server, port, user, password):
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(server, port, user, password)
    return client

ssh = createSSHClient(server, port, user, password)
scp = SCPClient(ssh.get_transport())
Then call scp.get() or scp.put() to do SCP operations.
(SCPClient code)
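A quick usage sketch (the remote and local paths are made up for illustration):

scp.put('local_file.txt', '/tmp/remote_file.txt')   # upload
scp.get('/tmp/remote_file.txt', 'local_copy.txt')   # download
scp.close()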
You might be interested in trying Pexpect (source code). This would allow you to deal with interactive prompts for your password.
Here's a snippet of example usage (for FTP) from the main website:
# This connects to the openbsd ftp site and
# downloads the recursive directory listing.
import pexpect
child = pexpect.spawn ('ftp ftp.openbsd.org')
child.expect ('Name .*: ')
child.sendline ('anonymous')
child.expect ('Password:')
child.sendline ('noah@example.com')
child.expect ('ftp> ')
child.sendline ('cd pub')
child.expect('ftp> ')
child.sendline ('get ls-lR.gz')
child.expect('ftp> ')
child.sendline ('bye')
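Adapted to scp, the same idea looks roughly like this (host, paths and password are placeholders):

import pexpect

child = pexpect.spawn('scp /etc/local/filename user@remotehost:/etc/remote/filename')
child.expect('password:')
child.sendline('mypassword')
child.expect(pexpect.EOF)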
Couldn't find a straight answer, and this "scp.Client" module doesn't exist.
Instead, this suits me:
from paramiko import SSHClient
from scp import SCPClient
ssh = SSHClient()
ssh.load_system_host_keys()
ssh.connect('example.com')
with SCPClient(ssh.get_transport()) as scp:
    scp.put('test.txt', 'test2.txt')
    scp.get('test2.txt')
You could also check out paramiko. There's no scp module (yet), but it fully supports sftp.
[EDIT]
Sorry, missed the line where you mentioned paramiko.
The following module is simply an implementation of the scp protocol for paramiko.
If you don't want to use paramiko or conch (the only ssh implementations I know of for python), you could rework this to run over a regular ssh session using pipes.
scp.py for paramiko
import paramiko
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('<IP Address>', username='<User Name>', password='', key_filename='<.PEM file path>')
# Set up sftp connection and transmit this script
print("copying")
sftp = client.open_sftp()
sftp.put('<source path>', '<destination path>')
sftp.close()
If you install PuTTY on win32 you get pscp (PuTTY scp),
so you can use the os.system hack on win32 too
(and you can use the PuTTY agent for key management).
Sorry, it is only a hack
(but you can wrap it in a Python class).
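Wrapped from Python, that hack could look something like this (the paths and the .ppk key file are placeholders):

import os

localfile = r'C:\data\report.txt'   # placeholder paths
remotehost = 'user@router'
remotefile = '/tmp/report.txt'

# pscp.exe (installed with PuTTY) must be on PATH; -i points at a PuTTY .ppk private key
os.system('pscp -i mykey.ppk "%s" "%s:%s"' % (localfile, remotehost, remotefile))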
As of today, the best solution is probably AsyncSSH
https://asyncssh.readthedocs.io/en/latest/#scp-client
async with asyncssh.connect('host.tld') as conn:
    await asyncssh.scp((conn, 'example.txt'), '.', recurse=True)
You can use the subprocess package and its call function to run the scp command from the shell.
from subprocess import call
cmd = "scp user1#host1:files user2#host2:files"
call(cmd.split(" "))
Have a look at fabric.transfer.
from fabric import Connection

with Connection(host="hostname",
                user="admin",
                connect_kwargs={"key_filename": "/home/myuser/.ssh/private.key"}
                ) as c:
    c.get('/foo/bar/file.txt', '/tmp/')
It has been quite a while since this question was asked, and in the meantime, another library that can handle this has cropped up:
You can use the copy function included in the Plumbum library:
import plumbum
r = plumbum.machines.SshMachine("example.net")
# this will use your ssh config as `ssh` from shell
# depending on your config, you might also need additional
# params, eg: `user="username", keyfile=".ssh/some_key"`
fro = plumbum.local.path("some_file")
to = r.path("/path/to/destination/")
plumbum.path.utils.copy(fro, to)
If you are on *nix you can use sshpass
sshpass -p password scp -o User=username -o StrictHostKeyChecking=no src dst:/path
Hmmm, perhaps another option would be to use something like sshfs (there's an sshfs for Mac too). Once your router is mounted you can just copy the files outright. I'm not sure if that works for your particular application, but it's a nice solution to keep handy.
A while ago I put together a Python SCP copy script that depends on paramiko. It includes code to handle connections with a private key or SSH key agent, with a fallback to password authentication.
http://code.activestate.com/recipes/576810-copy-files-over-ssh-using-paramiko/