SSH twice with Python [duplicate]

This question already has an answer here:
Connecting to a server via another server using Paramiko
(1 answer)
Closed 1 year ago.
I would like to SSH to host1 first and then SSH from host1 to host2 to get some files. SSHing to host1 with Paramiko works, but when I do the same for host2, it cannot connect. It shows 'Unable to establish SSH connection: Server 'host2' not found in known_hosts'.
import paramiko
from paramiko.ssh_exception import AuthenticationException, SSHException, BadHostKeyException

try:
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.connect('host1', username='user1', password='pass1', timeout=5)
    print("Accessed host1 already")
    try:
        client2 = paramiko.SSHClient()
        client2.load_system_host_keys()
        client2.connect('host2', username='user2', password='pass2', timeout=5)
        print("Accessed host2 already")
    except AuthenticationException as authException:
        print("Authentication failed, please verify your credentials: %s" % authException)
    # BadHostKeyException subclasses SSHException, so it must be caught first
    except BadHostKeyException as badHostKeyException:
        print("Unable to verify server's host key: %s" % badHostKeyException)
    except SSHException as sshException:
        print("Unable to establish SSH connection: %s" % sshException)
    except Exception as e:
        print("Operation error: %s" % e)
except Exception:
    print("SSH to host1 failed!!!")
Also, I tried running a command to get into host2, but the session stays on host1 the whole time. I'm not sure this is the right way to do it. Please recommend how I can do this. Thank you.
stdin1, stdout1, stderr1 = client.exec_command('ssh user2@host2;pass2;cd /;ls')
rawd = stdout1.read().decode('ascii').strip("\n")
print(rawd)

For an initial connection, SSH asks whether you trust the remote computer. When you type yes, the server's key gets stored in ~/.ssh/known_hosts.
On the system where you run the script, try making an SSH connection manually in a console first, let SSH store the server's key in that file, and then start your program.
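The known_hosts fix above addresses the error message, but note that the second connection in the question goes straight from the local machine rather than through host1. To actually chain through host1, the approach from the linked duplicate applies: open a direct-tcpip channel on the first connection and pass it as sock to the second client. A minimal sketch, reusing the hostnames and credentials from the question (host2's key must still be in the local known_hosts file):

import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('host1', username='user1', password='pass1', timeout=5)

# Tunnel from host1 to host2:22 over the first connection.
transport = client.get_transport()
channel = transport.open_channel('direct-tcpip', dest_addr=('host2', 22), src_addr=('localhost', 0))

# The sock parameter routes the second connection through the tunnel.
client2 = paramiko.SSHClient()
client2.load_system_host_keys()
client2.connect('host2', username='user2', password='pass2', sock=channel, timeout=5)

stdin, stdout, stderr = client2.exec_command('cd / && ls')
print(stdout.read().decode('ascii'))

client2.close()
client.close()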

Related

Oops, unhandled type 3 ('unimplemented') error while connecting SFTP with Databricks

I'm trying to connect to an SFTP server from a Databricks notebook in Python. My notebook looks like this:
import pysftp
import paramiko

# set parameters
host_name = 'xx.xxx.xxx.xxx'
username = 'FTP_USERNAME'
file_path_to_rsa_key = "/path/key_rsa"

cnopts = pysftp.CnOpts()
cnopts.hostkeys = None

# connect to SFTP
sftp = pysftp.Connection(host_name, username=username, private_key=file_path_to_rsa_key, cnopts=cnopts)

data = sftp.listdir()
sftp.close()

# Prints out the directories and files, line by line
for i in data:
    print(i)
I have the following error:
Oops, unhandled type 3 ('unimplemented')
when running the following block:
try:
    conn = pysftp.Connection(host_name, username=username, private_key=file_path_to_rsa_key, cnopts=cnopts)
    print("connection established successfully")
except:
    print('failed to establish connection to targeted server')
It prints connection established successfully.
What does it mean? What should I do? Is the issue with listdir()?
It seems that the issue was mostly down to pysftp. I ended up switching entirely to Paramiko and everything is working fine. The SFTP server modified the login response, and pysftp was too strict about the acceptable responses to take it without trouble. I noticed something similar when I tried to use a non-RSA private key that worked fine for ssh/sftp directly, but not with pysftp.
There is also some interesting information here: pysftp vs. Paramiko
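For reference, a minimal sketch of the same directory listing done with plain Paramiko, assuming the host, username, and key path placeholders from the question:

import paramiko

host_name = 'xx.xxx.xxx.xxx'           # placeholders from the question
username = 'FTP_USERNAME'
file_path_to_rsa_key = '/path/key_rsa'

ssh = paramiko.SSHClient()
# The question disabled host-key checking entirely; AutoAddPolicy is the
# rough Paramiko equivalent, at the cost of MITM protection.
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(host_name, username=username, key_filename=file_path_to_rsa_key)

sftp = ssh.open_sftp()
for entry in sftp.listdir():
    print(entry)

sftp.close()
ssh.close()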

Run a command (to backup a configuration) on multiple servers (routers) in Python

Is it possible to have a script iterate through each IP and back up the running configuration to a local TFTP server?
import paramiko
import sys
import time

USER = "root"
PASS = "cisco"
HOST = ["10.10.10.10", "11.11.11.11", "12.12.12.12"]

i = 0
while i < len(HOST)
    def fn():
        client1 = paramiko.SSHClient()
        #Add missing client key
        client1.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        #connect to switch
        client1.connect(HOST, username=USER, password=PASS)
        print "SSH connection to %s established" % HOST
        show run | redirect tftp://10.10.10.20/HOST.cfg
        print "Configuration has been backed up" for %HOST
        i + 1
show run | redirect tftp://10.10.10.20/HOST.cfg --- can I use the variable name as the file name here?
Use for h in HOST to iterate over your HOST array;
Use SSHClient.exec_command to execute the command;
Use str.format to format your messages and commands.
for h in HOST:
    client = paramiko.SSHClient()
    # Add missing client key
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    # connect to switch
    client.connect(h, username=USER, password=PASS)
    print("SSH connection to {0} established".format(h))
    command = "show run | redirect tftp://10.10.10.20/{0}.cfg".format(h)
    (stdin, stdout, stderr) = client.exec_command(command)
    for line in stdout.readlines():
        print(line)
    client.close()
    print("Configuration has been backed up for {0}".format(h))
Obligatory warning: Do not use AutoAddPolicy - you lose protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server". A sketch of the safer setup follows.
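A minimal sketch of that safer setup, assuming the routers' keys were added to known_hosts beforehand (for example by one manual ssh login per device, as the first answer on this page describes):

import paramiko

client = paramiko.SSHClient()
# Load the standard OpenSSH known_hosts file; with Paramiko's default
# RejectPolicy, any host missing from it is refused instead of being
# trusted blindly.
client.load_system_host_keys()
client.connect('10.10.10.10', username='root', password='cisco')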

Paramiko does not connect

I'm trying to upload a file to a server via SFTP using Paramiko.
def send_file(server_, port_, user_, passwd_, file_, dir_):
    """
    :return:
    """
    try:
        transport = paramiko.Transport((server_, int(port_)))
        transport.connect(username=user_, password=passwd_)
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.put(file_, dir_)
        sftp.close()
    except RuntimeError, err:
        print(str(err))
If I execute this function it just hangs (no response, no error messages), until the socket times out.
The credentials are correct, I tried them with sftp and ssh clients from the same machine and the same network.
I also passed the Transport and connect values directly, no change.
The logs on the server_ don't show any connections when I use this function.
The host key is in my known_hosts file.
The first statement in the try block succeeds (when I passed a string instead of an int for port_, it threw an exception, so that line definitely executes); the second line seems to be where the problem is.
What's the problem here?
Thanks in advance!
UPDATE 1:
I tried this in ipython2 and it works. The function above is in a PyQt program and executed via
self.connect(self.b_upload, QtCore.SIGNAL("clicked()"), self.onUpload)
Function onUpload:
def onUpload(self):
    file_, ok = QtGui.QInputDialog.getText(self, 'Input Dialog', 'Datei inklusive Pfad angeben: ')
    server_, port_, user_, passwd_, dir_ = ftpmod.read_config()
    ftpmod.send_file(server_, port_, user_, passwd_, file_, dir_)
You can send via SFTP, using ssh.open_sftp (note that username and password must be passed as keyword arguments, since the second positional parameter of connect is the port):
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(host, username=username, password=password)

ftp = ssh.open_sftp()
ftp.put(localpath, remotepath)
ftp.close()
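A hedged sketch folding that approach into the question's send_file signature; SSHClient manages the underlying Transport itself, which also tends to surface clearer errors than a bare Transport:

import paramiko

def send_file(server_, port_, user_, passwd_, file_, dir_):
    ssh = paramiko.SSHClient()
    ssh.load_system_host_keys()  # the poster's host key is already in known_hosts
    ssh.connect(server_, port=int(port_), username=user_, password=passwd_, timeout=10)
    sftp = ssh.open_sftp()
    try:
        sftp.put(file_, dir_)
    finally:
        sftp.close()
        ssh.close()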

stuck with status check of amazon ec2 instance

I am using Python to launch an EC2 instance; after the instance reaches the "running" state, I try to SCP a shell script to it and run the script via SSH.
I am getting the following error
"ssh: connect to host ec2-xx-xxx-xxx-xxx.compute-1.amazonaws.com port 22: Connection refused"
When I check in the console, the status check is "Initializing"; once it changes to "2/2 checks passed", I am able to SSH in or run any script.
Is there any way I can get the "status check" via python boto API?
I am using Python 2.7.5+,
boto 2.19.0
Thanks in advance.
A simple way is to check whether port 22 of the newly created instance is reachable, using the socket module:
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.connect(('hostname', 22))
    print "Port 22 reachable"
except socket.error as e:
    print "Error on connect: %s" % e
s.close()
Once you can reach port 22, you can invoke SSH against the instance; a polling version of this check is sketched below.
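A minimal sketch (hostname is a placeholder) that wraps the socket check above in a retry loop and returns once port 22 accepts connections:

import socket
import time

def wait_for_ssh(host, port=22, timeout=300, interval=10):
    # Poll until the port accepts a TCP connection or the deadline passes.
    deadline = time.time() + timeout
    while time.time() < deadline:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(5)
        try:
            s.connect((host, port))
            return True
        except socket.error:
            time.sleep(interval)
        finally:
            s.close()
    return False

# e.g. wait_for_ssh('ec2-xx-xxx-xxx-xxx.compute-1.amazonaws.com'), then SCP and run.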
Lazy way
import boto.ec2

for region in boto.ec2.regions():
    connection = boto.ec2.connect_to_region(region.name,
        aws_access_key_id='<aws access key>',
        aws_secret_access_key='<aws secret key>')
    existing_instances = connection.get_all_instance_status()
    print 'Listing instances from region ' + region.name
    for instance in existing_instances:
        print instance.system_status.status + '/' + instance.instance_status.status
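To answer the "status check via the boto API" part directly: a hedged sketch that polls get_all_instance_status for one instance until both checks report ok, using the same boto 2 calls as above (instance_id and region_name are placeholders):

import time
import boto.ec2

def wait_for_status_checks(instance_id, region_name, interval=15):
    # Credentials come from the usual boto configuration, or pass
    # aws_access_key_id / aws_secret_access_key as above.
    connection = boto.ec2.connect_to_region(region_name)
    while True:
        statuses = connection.get_all_instance_status(instance_ids=[instance_id])
        if statuses:
            status = statuses[0]
            # "2/2 checks passed" in the console means both report 'ok'.
            if (status.system_status.status == 'ok' and
                    status.instance_status.status == 'ok'):
                return
        time.sleep(interval)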

Amazon Web Service/Boto: Upload and execute remote python/bash script via SSH on localhost

I am able to fire up an AWS Ubuntu EC2 instance with boto. Has anyone tried uploading a script to remote Ubuntu EC2 instances (more than one) and executing the script via SSH from the local machine?
The main objective is to automate the whole process using a Python script written on localhost. Is there an alternative way, or Amazon API tools, to make this possible?
I'd recommend Fabric, it's made for this kind of thing.
Use the Paramiko API.
Here is Paramiko code that opens an SFTP session to a remote AWS EC2 instance in Python:
import paramiko

def create_sftp_client(contype, host, port, username, keyfilepath, keyfiletype, isprint=True):
    # The original snippet referenced these names as globals; they are
    # parameters here so the fragment stands alone.
    sftp, transport = None, None
    try:
        if keyfiletype == 'DSA':
            key = paramiko.DSSKey.from_private_key_file(keyfilepath)
        else:
            key = paramiko.RSAKey.from_private_key_file(keyfilepath)
        if contype == 'sftp':
            transport = paramiko.Transport((host, port))
            transport.connect(None, username, pkey=key)
            sftp = paramiko.SFTPClient.from_transport(transport)
            if isprint:
                print('Root Directory :\n ', sftp.listdir())
        return sftp
    except Exception as e:
        print('An error occurred creating client: %s: %s' % (e.__class__, e))
        if sftp is not None:
            sftp.close()
        if transport is not None:
            transport.close()
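To round out the upload-and-execute part of the question, a hedged sketch combining an SFTP upload with exec_command; the host, username, key path, and script names are placeholders, not from the original posts:

import paramiko

def upload_and_run(host, username, key_path, local_script, remote_script):
    ssh = paramiko.SSHClient()
    ssh.load_system_host_keys()
    ssh.connect(host, username=username, key_filename=key_path)
    try:
        # Copy the script up, then run it in one shot.
        sftp = ssh.open_sftp()
        sftp.put(local_script, remote_script)
        sftp.close()
        stdin, stdout, stderr = ssh.exec_command('bash %s' % remote_script)
        print(stdout.read().decode())
        print(stderr.read().decode())
    finally:
        ssh.close()

# e.g. for each instance launched via boto:
# upload_and_run('ec2-xx-xxx-xxx-xxx.compute-1.amazonaws.com', 'ubuntu',
#                '/path/key.pem', 'setup.sh', '/home/ubuntu/setup.sh')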
