Connect to a database through a double SSH tunnel using Python Paramiko

Here is my ssh command that establishes a tunnel to a Postgres database by proxying through a bastion host:
ssh -i C:/public/keys/my_key.pem -o "ProxyCommand ssh -W %h:%p username1@bastion_host.com" username2@ssh_host_ip -N -L 12345:postgres_host.com:5432 ssh_host_ip
I want to convert it into a Python script using the sshtunnel utility, but I am having a hard time figuring out what to pass where in the utility, as depicted:
in example 4 at this link: https://github.com/pahaz/sshtunnel#example-4
or in this one: Double SSH tunnel within Python
I went through a few posts on Stack Overflow but did not see a straightforward way of doing it. Developers are using agent forwarding as a solution to ProxyCommand. Any straightforward conversion of the above command to sshtunnel, Paramiko, or any other Pythonic way would be really helpful.

Based on Connecting to PostgreSQL database through SSH tunneling in Python, the following should do:
from sshtunnel import SSHTunnelForwarder
from sqlalchemy import create_engine

# First hop: local machine -> bastion host, forwarding to the SSH host's port 22.
with SSHTunnelForwarder(
        'bastion_host',
        ssh_username="username1", ssh_password="password1",
        remote_bind_address=('ssh_host_ip', 22)) as bastion:
    # Second hop: through the first tunnel to the SSH host, forwarding to Postgres.
    with SSHTunnelForwarder(
            ('127.0.0.1', bastion.local_bind_port),
            ssh_username="username2", ssh_pkey="C:/public/keys/my_key.pem",
            remote_bind_address=('postgres_host', 5432)) as ssh:
        engine = create_engine(
            'postgresql://<db_username>:<db_password>@127.0.0.1:' +
            str(ssh.local_bind_port) + '/database_name')
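If you would rather stay close to the original ProxyCommand invocation, Paramiko can consume one directly. A minimal sketch, reusing the question's placeholder hosts, user names, and key path (AutoAddPolicy is used here only for brevity, not for production):
import paramiko

# Equivalent of -o "ProxyCommand ssh -W %h:%p username1@bastion_host.com"
proxy = paramiko.ProxyCommand(
    'ssh -W ssh_host_ip:22 username1@bastion_host.com')

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('ssh_host_ip', username='username2',
               key_filename='C:/public/keys/my_key.pem', sock=proxy)

# Equivalent of -L 12345:postgres_host.com:5432: a direct-tcpip channel to
# Postgres, which a DB driver that accepts a socket-like object can use.
channel = client.get_transport().open_channel(
    'direct-tcpip', ('postgres_host.com', 5432), ('127.0.0.1', 0))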

Related

How to use Python to set up a tunnel matching an ssh command (for bastion)?

How would I set up a tunnel using Python code that could replace this command?
ssh -N -L 3307:xxxxxx.rds.amazonaws.com:3306 ec2-user@XX.XXX.XX.XX -i ~/.ssh/bastion_key.pem
You can use the sshtunnel library.
For example:
from sshtunnel import SSHTunnelForwarder

with SSHTunnelForwarder(
        ('XX.XXX.XX.XX', 22),
        ssh_username='ec2-user',
        ssh_pkey='~/.ssh/bastion_key.pem',
        remote_bind_address=('xxxxxx.rds.amazonaws.com', 3306),
        local_bind_address=('0.0.0.0', 3307)
) as tunnel:
    # do stuff with tunnel
    pass
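While the with block is open, the database is reachable on the local bind port. For instance, a sketch using PyMySQL (the driver choice and the credentials are assumptions, not part of the original answer), placed inside the block:
import pymysql

# Inside the 'with ... as tunnel:' block above:
conn = pymysql.connect(host='127.0.0.1', port=tunnel.local_bind_port,
                       user='db_user', password='db_password',
                       database='database_name')
with conn.cursor() as cur:
    cur.execute('SELECT 1')   # sanity check through the tunnel
    print(cur.fetchone())
conn.close()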

Problems establishing connection with DB

Platform: Linux.
I am a beginner with MongoDB and pymongo. After installing pymongo, here is a simple test I tried in IPython:
import pymongo

client = pymongo.MongoClient()  # also tried specifying the local host and port explicitly
db = client['myDB']
collections = db['temptables']
collections.insert({'a': '1'})
At this point it chokes, and in the end it spits out an "Error 111: Connection refused" error. So I tried invoking MongoDB straight from the terminal and still got the error below [look at the far end]. I searched a bit and tried:
removing the lock (sudo rm /var/lib/mongodb/mongod.lock); it turned out there was no lock in the first place
sudo mongod --repair
commenting out the host and port number in the config file, as suggested in one post; that didn't work either
None of the above worked.
This is the error I see when I try to invoke MongoDB from the command line:
2017-08-17T15:25:30.265-0700 W NETWORK [thread1] Failed to connect to 127.0.0.1:27017, in(checking socket for error after poll), reason: Connection refused
2017-08-17T15:25:30.265-0700 E QUERY [thread1] Error: couldn't connect to server 127.0.0.1:27017, connection attempt failed :
connect#src/mongo/shell/mongo.js:237:13
#(connect):1:6
exception: connect failed
Help, please.
Your mongo server isn't running.
You can confirm this by executing sudo ps -ef | grep mongod
If you have mongo installed and in your path, you can execute:
cd && mkdir -p ~/temp_mongo_db && mongod --dbpath=./temp_mongo_db
This will launch mongo and place all database files in your home directory under 'temp_mongo_db'.
Finally, in a new terminal window, execute sudo ps -ef | grep mongod again. You'll now see mongod running.
If you want to run mongo in production, you should configure it to be managed by systemd or some other init system.
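Once mongod is running, a quick way to verify connectivity from Python is a ping with a short server-selection timeout (a sketch; the two-second timeout is just a convenient choice):
import pymongo

client = pymongo.MongoClient('mongodb://127.0.0.1:27017',
                             serverSelectionTimeoutMS=2000)
try:
    client.admin.command('ping')   # raises if no server is reachable
    print('mongod is up')
except pymongo.errors.ServerSelectionTimeoutError as exc:
    print('mongod is not reachable:', exc)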

How to do multi-hop SSH with Fabric

I have a NAT host, and behind it various servers.
From my local machine I want to go to the NAT, and from the NAT I have to SSH to the other machines:
Local --> NAT (abcuser@publicIP with key 1) --> server1 (xyzuser@localIP with key 2)
The NAT has a different SSH key, and each of the servers has a different SSH key as well.
How can I accomplish this type of multi-hop SSH using Fabric?
I tried using the env.roledefs feature, but it doesn't seem to work. I am also not sure how to define two SSH keys. I know we can define a list of keys with env.key_filename, but will it check each key against each server? How can I be more specific and match a key with one server only?
I have tried using this command from my local machine:
fab deploy -g 'ec2-user@54.251.151.39' -i '/home/aman/Downloads/aws_oms.pem'
and my script is:
from __future__ import with_statement
from fabric.api import local, run, cd, env, execute

env.hosts = ['ubuntu@10.0.0.77']
env.key_filename = ['/home/ec2-user/varnish_cache.pem']

def deploy():
    run("uname -a")
It's possible with the modern Fabric (2.x) Connection API: a double hop to 10.0.0.2 (listing files there) via the gateway hop 10.0.0.1. Basically, you simply nest the connections with the gateway parameter.
# coding: utf-8
from fabric import Connection
path = '/'
conn1 = Connection(host='user1#10.0.0.1', connect_kwargs={'password': '***'})
conn2 = Connection(host='user2#10.0.0.2', connect_kwargs={'password': '***'}, gateway=conn1)
result = conn2.run(f'''cd {path} && ls -al''', hide=True)
conn2.close()
conn1.close()
msg = "Ran {0.command!r} on {0.connection.host}, got stdout:\n{0.stdout}"
print(msg.format(result))
Please remember to make each SSH connection manually once first, so that the servers are introduced to each other (i.e. the host keys end up in known_hosts)!
Install via
pip3 install --upgrade fabric
pip3 install cryptography==2.4.2 # optional to hide some annoying warnings
http://docs.fabfile.org/en/latest/concepts/networking.html
Requires Python 3.6+ (the example uses an f-string).
In order to connect to remote hosts via an intermediate server with Fabric 1, you can use the --gateway command-line option:
http://docs.fabfile.org/en/latest/usage/fab.html#cmdoption-g
Or, alternatively, set the env.gateway variable inside your fabfile (see the sketch below):
http://docs.fabfile.org/en/latest/usage/env.html#gateway
For more detailed information, see:
http://docs.fabfile.org/en/stable/concepts/networking.html#ssh-gateways
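A minimal fabfile sketch of that env.gateway approach, reusing the question's placeholder users and hosts; note that when env.key_filename is a list, each key is tried against each host in turn:
from fabric.api import env, run

env.gateway = 'abcuser@publicIP'    # the NAT host from the question
env.hosts = ['xyzuser@localIP']     # the server behind the NAT
env.key_filename = ['/path/to/key1.pem', '/path/to/key2.pem']  # tried in order

def deploy():
    run('uname -a')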

SSH through two hops with Python Fabric

I don't actually understand how to do this.
I have access to a local computer, for example 192.168.1.101, as some_user. From that computer I have access (via VPN) to another computer, 10.0.132.17, and only from there can I reach computer 10.0.132.15, where I need to deploy my script.
So I need to:
$ ssh some_user@192.168.1.101 -> ssh another_user@10.0.132.17 -> ssh another_user@10.0.132.15
Could I somehow do ssh some_user@192.168.1.101 -p 2222 and get access to another_user@10.0.132.15?
Or can this be expressed somehow with an env variable in Python Fabric?
Another option, rather than using an explicit tunnel, is to set up SSH to transparently forward through your proxies. Put something like the following in your ~/.ssh/config:
Host proxy_midstage
    User another_user
    HostName 10.0.132.17
    ProxyCommand ssh -q some_user@192.168.1.101 nc %h %p

Host proxy_final
    User another_user
    HostName 10.0.132.15
    ProxyCommand ssh -q proxy_midstage nc %h %p
Then the command ssh proxy_final will jump you straight to the deployment server. Presumably Fabric can use that, though I'm not positive (see the sketch below).
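A hedged sketch of how Fabric 1 could pick up that alias: env.use_ssh_config makes it honor ~/.ssh/config, including the ProxyCommand entries above.
from fabric.api import env, run

env.use_ssh_config = True     # honor ~/.ssh/config, including ProxyCommand
env.hosts = ['proxy_final']   # the alias defined in the config above

def deploy():
    run('uname -a')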

Python SSH Paramiko issue - SSH from inside an SSH session

import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
ip = '192.168.100.6'
client.connect(ip, username='root', password='mima')
i, o, e = client.exec_command('apt-get install sl -y --force-yes')
print(o.read(), e.read())
client.close()
I used this example, and it works fine, but after logging in to server1 I want to log in to server2.
I mean nested SSH.
Can't you call the ssh command from inside of your client.exec_command?
Like:
client.exec_command('ssh user@host2 "apt-get install sl -y --force-yes"')
That execs the command ssh in the client, and not apt-get.
You can't really start a Paramiko session on the client as long as your Python program isn't there; the software you start over SSH must live on that machine.
Perhaps first scp a copy of your software over, and start that using a parameter like -recursive_lvl = 1?
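Alternatively, Paramiko can nest the second connection itself by tunnelling it through a direct-tcpip channel of the first hop. A sketch, with server2's address and credentials as placeholders (AutoAddPolicy only for brevity):
import paramiko

# First hop, as in the question.
first = paramiko.SSHClient()
first.load_system_host_keys()
first.connect('192.168.100.6', username='root', password='mima')

# Open a channel from server1 to server2's SSH port, then use it as the
# transport socket for a second SSHClient.
channel = first.get_transport().open_channel(
    'direct-tcpip', ('host2', 22), ('127.0.0.1', 0))

second = paramiko.SSHClient()
second.set_missing_host_key_policy(paramiko.AutoAddPolicy())
second.connect('host2', username='user', password='password', sock=channel)

i, o, e = second.exec_command('apt-get install sl -y --force-yes')
print(o.read(), e.read())
second.close()
first.close()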
