I am trying to read lines from a TCP server that I launch in the same script. I am able to send one command and read its output, but at the end (after reading all the output) the program appears to hang in readline. I have tried almost all the solutions here and here, but it still hangs.
Most of those solutions propose checking whether the output of readline is empty, but in my case the program never returns from the last read and just hangs there.
The TCP server is not under my control (I only have to test the server script), so I cannot modify it. Also, is it possible to send commands to the running server from Python without using subprocess? Is there a better alternative?
import subprocess

def subprocess_cmd(command):
    # start the server with unbuffered output and keep handles to its pipes
    return subprocess.Popen(command, stdin=subprocess.PIPE,
                            stderr=subprocess.STDOUT, stdout=subprocess.PIPE,
                            shell=True, text=True)

for cmd in ['python3 -u tcp_server.py 123 port1']:
    process = subprocess_cmd(cmd)

process.stdin.write('command like print_list\n')
process.stdin.flush()
while True:
    line = process.stdout.readline()
    if line == '':   # EOF: only returned once the pipe is closed
        break
readline hangs because your connection is still open and readline expects more data to come in. You must close the connection from the server side to notify readline that there is nothing more to read. Usually this is done by closing the socket on the client side, which tells the server that there will be no more requests; when the server finishes processing all your commands, it closes its socket too, and that is the signal that you have received all the data the server sent to you.
Alternatively, if you do not want to close the connection, you must agree on a delimiter that marks the end of a response, so the client stops calling readline once that delimiter is read.
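Applied to the subprocess pipes in the question, the same two options look roughly like this: close process.stdin after the last command so the child can exit and close its end of the pipe, or read until an agreed sentinel line. A sketch of the delimiter approach, assuming the server already ends each response with some recognizable last line (the END below is purely hypothetical):

process.stdin.write('print_list\n')
process.stdin.flush()
response = []
while True:
    line = process.stdout.readline()
    if line.strip() == 'END':   # hypothetical sentinel marking the end of one response
        break
    response.append(line)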
Related
I am using pySerial to communicate with a microcontroller over USB. Most of the communication is initiated by the desktop Python script, which sends a command packet and waits for a reply.
But there is also an alert packet that may be sent by the microcontroller without a command from the Python script, so I need to monitor the read stream for alerts.
To handle alerts, I dedicate a separate process to calling readline() in a loop, like so:
def serialMonitor(self):
    while not self.stopMonitor:
        self.lock.acquire()
        message = self.stream.readline()
        self.lock.release()
        self.callback(message)
inside a class. The function is then started in a separate process with
self.monitor = multiprocessing.Process(target=SerialManager.serialMonitor, args=[self])
Whenever a command packet is sent, the command function needs to take back control of the stream, which means interrupting the blocking readline() call. How do I interrupt the readline() call? Is there any way to terminate a process safely?
You can terminate a multiprocessing process with .terminate(). Is this safe? It is probably alright in a readline case.
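For the monitor process created in the question, that would look roughly like this (a sketch, assuming self.monitor is the Process object shown above):

self.monitor.terminate()   # SIGTERM on Unix; the blocking readline() is simply abandoned
self.monitor.join()        # wait for the process to finish shutting down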
However, this is not how I would handle things here. As I read your scenario, there are two possibilities:
MCU initiates an alert packet
Computer sends data to MCU (and MCU perhaps responds)
I assume the MCU will not send an alert packet while an exchange initiated by the computer is going on.
So I would just create the serial object with a small timeout and leave it reading in a loop when I'm not using it. My overall flow would go something like this:
from serial import Serial, SerialTimeoutException

ser = Serial(PORT, timeout=1)
response = None
command_to_send = None
running = True

while running:  # event loop
    line = None
    while running and not command_to_send and not line:
        try:
            line = ser.readline()  # returns b'' if the 1 s timeout expires
        except SerialTimeoutException:
            pass
    if not command_to_send:
        process_mcu_alert(line)
    else:
        send_command(command_to_send)
        command_to_send = None
        response = ser.readline()
This is only a sketch: it would need to run in a thread or subprocess, since readline() is indeed blocking, so you need some thread-safe way of setting command_to_send and running (used to exit gracefully) and of getting response, and you will likely want to wrap all this state up in a class. The precise implementation depends on what you are doing, but the principle is the same: have one loop that handles all reading and writing on the serial port, give it a timeout so it responds reasonably quickly (you can set a smaller timeout if you need to), and have it expose an interface you can work with.
Sadly, to my knowledge Python has no asyncio-compatible serial library, otherwise that approach would seem neater.
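One way such a wrapper could look, purely as a sketch (the class name, the queue hand-off, and the alert_callback parameter are all my own illustration, not part of the answer above):

import queue
import threading

from serial import Serial

class SerialManager:
    def __init__(self, port, alert_callback):
        self.ser = Serial(port, timeout=1)
        self.alert_callback = alert_callback   # called with each unsolicited line
        self.commands = queue.Queue()
        self.responses = queue.Queue()
        self.running = True
        self.thread = threading.Thread(target=self._loop, daemon=True)
        self.thread.start()

    def _loop(self):
        while self.running:
            try:
                command = self.commands.get_nowait()
            except queue.Empty:
                command = None
            if command is None:
                line = self.ser.readline()        # b'' once the 1 s timeout expires
                if line:
                    self.alert_callback(line)     # unsolicited alert packet
            else:
                self.ser.write(command)           # send the command packet
                self.responses.put(self.ser.readline())  # and queue its reply

    def send(self, command):
        self.commands.put(command)                # thread-safe hand-off to the loop

    def stop(self):
        self.running = False
        self.thread.join()

The main thread calls send() and pulls replies off responses, while alerts keep arriving through the callback in the background.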
A lot of resources, including the example in the official telnetlib documentation, suggest that before you call read_all() at the end, you need to write exit after the command:
tn.write(b"ls\n")
tn.write(b"exit\n")
Can someone please help me understand why this is needed?
If I try it without the exit, the telnet connection hangs (or at least looks hung), as the output of the executed command never shows up on the terminal.
Also, another way of making it work, which I found in some resources, is to use 'exec' to fire up the command, and then you don't need the exit anymore.
Please help me understand this as well.
read_all() reads output until EOF. In other words, it waits until the remote server closes the connection and then returns all the data the server has sent. If you have not previously told the server with an "exit" command that you have no more commands for it, it will keep waiting for them, and a deadlock occurs: you hold the connection open because you are waiting for the server to tell you it has sent everything it intended to say, while the server waits for new orders from you and is ready to add more data to its output.
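Putting both lines from the question together, a minimal sketch (the host address is a placeholder, and the sketch assumes the server drops you straight into a shell with no login prompt):

import telnetlib

tn = telnetlib.Telnet("192.0.2.10")    # placeholder address
tn.write(b"ls\n")
tn.write(b"exit\n")                    # the remote shell closes the session, producing EOF
print(tn.read_all().decode("ascii"))   # now returns instead of blocking forever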
If I am using Python telnetlib, is there a way to close the telnet session when the device does not support anything to terminate it, i.e. no Ctrl+something, no quit, nothing like that?
I need this so that I can use read_all().
Network sockets let you shut down the write and/or read channel to tell the other side that you have finished that part of the conversation. To a telnet server, shutting down the write channel amounts to an exit: it should finish sending whatever is in its pipeline and then close the connection completely. That close is the EOF, and read_all() then returns. So, assuming you already have a connection called tn:
import socket

tn.get_socket().shutdown(socket.SHUT_WR)
data = tn.read_all()
tn.close()
I have a program with two threads. Each thread sends different commands to a remote host and redirects the output to a file; the threads use different remote hosts. I create a connection with pxssh and try to send commands to the remote hosts with sendline:
s = pxssh.pxssh()
try:
    s.login(ip, user, pswd)
except:
    logging.error("login: error")
    return
logging.debug("login: success")
s.sendline("ls / >> tmpfile.log")
s.prompt()
I can send a fixed number of commands (about 500 per host), and after that sendline stops working. The connection is fine, but the commands no longer reach the remote hosts. It looks like some resource runs out... what could it be?
Reposting as an answer, since it solved the issue:
Are you reading in between each write? If the host is producing output and you're not reading it, sooner or later a buffer will fill up and it will block until there's room to write some more. Make sure that before each write, you read any data that's available in the terminal, even if you don't want to do anything with it.
If you really don't care about the output at all, you could create a thread that constantly reads in a loop, so that your main thread can skip reading altogether. If your code needs to do anything with any part of the output, though, don't do this.
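With pxssh specifically, the easiest way to read between writes is to let prompt() consume each command's output and to touch the before buffer even when you do not need it; a sketch based on the question's snippet:

s.sendline("ls / >> tmpfile.log")
s.prompt()          # wait for the shell prompt, draining the command's output
_ = s.before        # everything printed before the prompt; read it even if unused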
The server is at https://github.com/EmeraldHaze/Socketd/blob/master/Serv.py ; the process is at https://github.com/EmeraldHaze/QFTSOM/blob/master/main.py
A client to test this is at http://www.kongregate.com/games/EmeraldHaze/this-is-why-we-have-maps ; port forwarding and the like is set up correctly.
The intent is that someone connecting to the server sends something like {"IP":"123.456.789.012"}, then a process is spawned for them, and the I/O streams of the process and the user are connected. In reality, the process outputs something, the user sees it, the user gives some input, the server receives it (and logs it), and then nothing happens. Any ideas why? The buffers should be flushed.
Uh, I solved this. It was because sys.stdin.readline() only stops blocking when it sees a \n, but either Twisted or the client strips it off, so readline() blocks indefinitely even though input has arrived.
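A minimal illustration of the resulting fix, assuming (this is my guess at the shape of Serv.py, not its actual code) that the server uses a Twisted LineReceiver, which strips the delimiter, and forwards each line to the child process's stdin:

from twisted.protocols.basic import LineReceiver

class ClientToChild(LineReceiver):
    def lineReceived(self, line):
        # LineReceiver has already stripped the trailing delimiter, so put a
        # newline back before writing to the child, or its sys.stdin.readline()
        # will keep blocking.
        self.child_transport.write(line + b"\n")   # child_transport: hypothetical IProcessTransport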