Python telnetlib telnet.write() dropping messages

Given the following code:
import telnetlib
import sys

def func1(IP, user, passw):
    t = telnetlib.Telnet(IP)
    t.write(user.encode('ascii') + b'\n')
    t.write(passw.encode('ascii') + b'\n')
    return t

def func2(t, command):
    t.write(command.encode('ascii') + b'\n')
    print(command)

user = sys.argv[1]
passw = sys.argv[2]
IP = sys.argv[3]

t = func1(IP, user, passw)
for i in range(6):
    func2(t, "message " + str(i))
By looking at the server and also by Wiresharking it, only messages 1 and 2 get through.
Now, if I change the func2(t,command) function as follows:
def func2(t, command):
    t.write(command.encode('ascii') + b'\n')
    t.read_eager()  # This is the new line.
    print(command)
It all works fine and all messages are transmitted.
Any idea?
Python 3.3, Windows XP

You need to read the text coming back to prevent the socket from blocking; that is why your modified method works.
In a real-world session you would always be reading back the results of logging in and of each command.
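For example, a minimal sketch of that approach using Telnet.read_until (the login: and Password: prompt strings are assumptions; match whatever your server actually sends):
import telnetlib

def func1(IP, user, passw):
    t = telnetlib.Telnet(IP)
    t.read_until(b"login: ", timeout=5)     # consume the banner and login prompt
    t.write(user.encode('ascii') + b'\n')
    t.read_until(b"Password: ", timeout=5)
    t.write(passw.encode('ascii') + b'\n')
    return t

def func2(t, command):
    t.write(command.encode('ascii') + b'\n')
    t.read_until(b"\n", timeout=2)          # drain the response so the connection keeps flowing
    print(command)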

Related

Subprocess to open python file and return data

I am trying to use Python to open another file. This file is going to start up a socket and create threads for listening for additional connections, as well as threads for sending/receiving data. The main thread will not return.
However, if the socket setup fails, I want to return an error code to the other Python script that executed the subprocess.
main.py
import subprocess
py3output = subprocess.check_output(['python3', 'py3.py'])
print('py3 said:' + str(py3output))
py3.py
def returnme():
    return 10

returnme()
When I run this, it prints:
py3 said:b''
I am just trying to figure out how to get the return value back to the main calling program.
To return an exit code n back to the OS, you need sys.exit(n). But it seems you do not want to check the exit code but the stdout output. So your program might need to be rewritten to:
def returnme():
    return 10

print(returnme())
You should write the value to standard output as a string, using the following code:
sample.py
import sys

def returnme():
    sys.stdout.write(str(10))
    sys.stdout.flush()

returnme()
main.py
from subprocess import check_output

output = check_output(['python', 'sample.py'])
print('sample.py says: ' + output.decode())
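If you do want the exit-code route instead, a minimal sketch (subprocess.run needs Python 3.5+; on older versions use subprocess.call):
sample.py
import sys
sys.exit(10)  # the process exit status, visible to the caller

main.py
import subprocess
result = subprocess.run(['python', 'sample.py'])
print('exit code:', result.returncode)  # prints: exit code: 10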

Running into a Multithreading issue connecting to multiple devices at the same time

I am defining the main function with def get_info(). This function doesn't take arguments. The program uses ArgumentParser to parse the command line; the CSV file is provided with the --csv option. It picks up the CSV file from the current directory, reads the lines (each containing an IP address), logs into the devices serially, runs a few commands, and appends the output to a text file. When the code runs it removes the old text file from the directory and creates a new output text file.
Problem: I want to achieve this using the threading module so that it handles 5 devices in parallel and writes the output to a file. The problem I am running into is lock issues, as the same object is being used by the same process at the same time. Here is the sample code I have written. The threading concept is very new to me, so please bear with me.
import getpass
import csv
import time
import os
import netmiko
import paramiko
from argparse import ArgumentParser
from multiprocessing import Process, Queue

def get_ip(device_ip, output_q):
    try:
        ssh_session = netmiko.ConnectHandler(device_type='cisco_ios', ip=device_row['device_ip'],
                                             username=ssh_username, password=ssh_password)
        time.sleep(2)
        ssh_session.clear_buffer()
    except (netmiko.ssh_exception.NetMikoTimeoutException,
            netmiko.ssh_exception.NetMikoAuthenticationException,
            paramiko.ssh_exception.SSHException) as s_error:
        print(s_error)

def main():
    show_vlanfile = "pool.txt"
    if os.path.isfile(show_vlanfile):
        try:
            os.remove(show_vlanfile)
        except OSError as e:
            print("Error: %s - %s." % (e.filename, e.strerror))
    parser = ArgumentParser(description='Arguments for running oneLiner.py')
    parser.add_argument('-c', '--csv', required=True, action='store', help='Location of CSV file')
    args = parser.parse_args()
    ssh_username = input("SSH username: ")
    ssh_password = getpass.getpass('SSH Password: ')
    with open(args.csv, "r") as file:
        reader = csv.DictReader(file)
        output_q = Queue(maxsize=5)
        procs = []
        for device_row in reader:
            # print("+++++ {0} +++++".format(device_row['device_ip']))
            my_proc = Process(target=show_version_queue, args=(device_row, output_q))
            my_proc.start()
            procs.append(my_proc)
        # Make sure all processes have finished
        for a_proc in procs:
            a_proc.join()
    commands = ["terminal length 0", "terminal width 511", "show run | inc hostname",
                "show ip int brief | ex una", "show vlan brief", "terminal length 70"]
    output = ''
    for cmd in commands:
        output += "\n"
        output += ssh_session.send_command(cmd)
        output += "\n"
    with open("pool.txt", 'a') as outputfile:
        while not output_q.empty():
            output_queue = output_q.get()
            for x in output_queue:
                outputfile.write(x)

if __name__ == "__main__":
    main()
Somewhat different take...
I effectively run a main task, and then just fire up a (limited) number of threads; they communicate via two data queues, basically "requests" and "responses".
The main task:
- dumps the requests into the request queue;
- fires up a number (say, 10 or so) of worker tasks;
- sits on the "response" queue waiting for results. The results can be simple user-info messages about status, error messages, or DATA responses to be written out to files.
When all the threads finish, the program shuts down.
The workers basically:
- get a request; if there is none, just shut down;
- connect to the device;
- send a log message to the response queue saying they have started;
- do what they have to do;
- put the result as DATA on the response queue;
- close the connection to the device;
- loop back to the start.
This way you don't inadvertently flood the processing host, as you have a limited number of concurrent threads going, all doing exactly the same thing in their own sweet time, until there's nothing left to do. A sketch of the pattern follows this description.
Note that you DON'T do any screen/file I/O in the threads, as it would get jumbled with the different tasks running at the same time. Each thread essentially only sees the input queue, the output queue, and the Netmiko sessions that get cycled through.
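A minimal sketch of that request/response-queue pattern, with the per-device work stubbed out (the queue names and the LOG/DATA tags are just illustrative; real Netmiko calls would replace the placeholder):
import queue
import threading

NUM_WORKERS = 5
request_q = queue.Queue()
response_q = queue.Queue()

def worker():
    while True:
        try:
            device_ip = request_q.get_nowait()   # get a request; if none, shut down
        except queue.Empty:
            return
        response_q.put(('LOG', 'started ' + device_ip))
        # connect to the device and run commands here (Netmiko session)
        output = 'output from ' + device_ip      # placeholder for the real output
        response_q.put(('DATA', output))

for ip in ['10.0.0.1', '10.0.0.2', '10.0.0.3']:  # e.g. the rows of your CSV
    request_q.put(ip)

threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

# Main task: drain the response queue while the workers run;
# all screen/file I/O stays here, out of the threads.
with open('pool.txt', 'a') as outputfile:
    while any(t.is_alive() for t in threads) or not response_q.empty():
        try:
            kind, payload = response_q.get(timeout=1)
        except queue.Empty:
            continue
        if kind == 'DATA':
            outputfile.write(payload + '\n')
        else:
            print(payload)

for t in threads:
    t.join()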
It looks like you have code that is from a Django example:
def main():
    '''
    Use threads and Netmiko to connect to each of the devices in the database.
    '''
    devices = NetworkDevice.objects.all()
You should move the argument parsing into the main thread. You should read the CSV file in the main thread. You should have each child thread be a Netmiko SSH connection.
I say this because your current solution has all of the SSH connections happening in one thread, which is not what you intended.
At a high level, main() should handle the argument parsing, delete the old output file, obtain the username/password (assuming these are the same for all the devices), and loop over the CSV file obtaining the IP address of each device.
Once you have the IP address, you create a thread; the thread uses Netmiko to SSH to the device and retrieve your output. I would then use a Queue to pass the output from each device back to the main thread.
Then, back in the main thread, you would write all of the output to a single file.
It would look a bit like this:
https://github.com/ktbyers/netmiko/blob/develop/examples/use_cases/case16_concurrency/threads_netmiko.py
Here is an example using a queue (with multiprocessing) though you can probably adapt this using a thread-Queue pretty easily.
https://github.com/ktbyers/netmiko/blob/develop/examples/use_cases/case16_concurrency/processes_netmiko_queue.py
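In outline, that thread-per-device shape looks something like the sketch below (run_device is a hypothetical stand-in for the Netmiko connect-and-collect logic in those examples):
import threading
from queue import Queue

def run_device(ip, username, password, out_q):
    # Netmiko ConnectHandler + send_command calls would go here
    out_q.put((ip, 'output for ' + ip))     # placeholder output

out_q = Queue()
ips = ['10.0.0.1', '10.0.0.2']              # read from the CSV in the main thread
threads = [threading.Thread(target=run_device, args=(ip, 'user', 'pass', out_q))
           for ip in ips]
for t in threads:
    t.start()
for t in threads:
    t.join()

with open('pool.txt', 'a') as f:            # all file writing back in the main thread
    while not out_q.empty():
        ip, output = out_q.get()
        f.write(output + '\n')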

System not responding to pexpect commands

I'm trying to write a very simple program which controls a remote machine using pexpect, but the remote system does not react to the commands sent.
Here is source code:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import pexpect
import sys
child = pexpect.spawn('telnet 192.168.2.81 24')
res = child.expect('/ # ')
print(res)
res = child.sendline('touch foo')
print(res)
Here is output:
0
10
So, as far as I understand, the commands are executed successfully, but there is no result on the target system, i.e. the foo file is not created.
Could anybody help me?
Add the following line after pexpect.spawn(), or you will see nothing:
# for Python 2
child.logfile_read = sys.stdout
# for Python 3
child.logfile_read = sys.stdout.buffer
You also need the following statements at the end (otherwise the script exits immediately after sendline('touch foo'), so touch foo does not get a chance to run):
child.sendline('exit')
child.expect(pexpect.EOF)
According to the manual:
The logfile_read and logfile_send members can be used to separately log the input from the child and output sent to the child. Sometimes you don’t want to see everything you write to the child. You only want to log what the child sends back. For example:
child = pexpect.spawn('some_command')
child.logfile_read = sys.stdout
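Putting both fixes together, the whole script would look something like this (a sketch reusing the address, port, and '/ # ' prompt from the question):
#!/usr/bin/env python3
import sys
import pexpect

child = pexpect.spawn('telnet 192.168.2.81 24')
child.logfile_read = sys.stdout.buffer  # echo everything the child sends back

child.expect('/ # ')                    # wait for the shell prompt
child.sendline('touch foo')
child.expect('/ # ')                    # wait until the command has finished

child.sendline('exit')
child.expect(pexpect.EOF)               # let the session terminate cleanly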

refresh a shell subprocess in python

I have webpy code that sends ps aux data to a webpage using a subprocess.
import subprocess
ps = subprocess.Popen(('ps', 'aux'), stdout=subprocess.PIPE)
out = ps.communicate()[0]
(bunch of webpy stuff)
class index:
    def GET(self):
        return out
(more webpy to start the web server)
It sends the ps aux data across no problem. However, it does not refresh the data, so I only get one static snapshot rather than the changing data I need.
How do I refresh the subprocess so it sends new data every time I reload the webpage?
Put the Popen call inside the GET method. By the way, if you're using Python 2.7 or newer, you can use check_output to simplify the actual subprocess call:
def GET(self):
    return subprocess.check_output(['ps', 'aux'])
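A minimal self-contained sketch of the whole app with that change (the URL mapping is standard web.py boilerplate; the decode() is needed on Python 3, where check_output returns bytes):
import subprocess
import web

urls = ('/', 'index')

class index:
    def GET(self):
        # run ps on every request, so each page reload gets fresh data
        return subprocess.check_output(['ps', 'aux']).decode()

if __name__ == '__main__':
    app = web.application(urls, globals())
    app.run()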

how to read command output from serial device using python

I have an embedded linux device and here's what I would like to do using python:
Get the device console over serial port. I can do it like this:
>>> ser = serial.Serial('/dev/ttyUSB-17', 115200, timeout=1)
Now I want to run a tail command on the embedded device command line, like this:
# tail -f /var/log/messages
and capture the output and display it on my Python >>> console.
How do I do that ?
Just open the file inside Python and keep reading from it, if need be in another thread:
>>> ser = serial.Serial('/dev/ttyUSB-17', 115200, timeout=1)
>>> output = open("/var/log/messages", "rb")
And inside any program loop, just do:
data = output.read()
print(data)
If you want it to just keep printing on the console while you do other stuff, type in something like:
from time import sleep
from threading import Thread

class Display(Thread):
    def run(self):
        while True:
            data = self.output.read()
            if data:
                print(data)
            sleep(1)

t = Display()
t.output = output
t.start()
First you need to log in to the device; then you can run the specified command on that device.
Note: the command you are going to run must be supported by that device.
Now, after opening the serial port, you need to find the login prompt using read() and then write the username using write(); repeat the same for the password.
Once you have logged in, you can run the commands you need to execute.
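A sketch of that login-then-run sequence (this uses pySerial 3.x's read_until; the prompt strings and credentials are assumptions about the device):
import serial

ser = serial.Serial('/dev/ttyUSB-17', 115200, timeout=1)

ser.read_until(b'login: ')       # wait for the login prompt
ser.write(b'root\n')             # send the username
ser.read_until(b'Password: ')    # wait for the password prompt
ser.write(b'password\n')
ser.read_until(b'# ')            # wait for the shell prompt

ser.write(b'tail -f /var/log/messages\n')
while True:                      # stream the command's output to the console
    data = ser.read(1024)
    if data:
        print(data.decode(errors='replace'), end='')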
