PySerial fails to read full line after sending data - Python

I'm developing a script with PySerial to send data to a microcontroller; the microcontroller then reads the data, processes it, and sends back some debug information to my Python script.
My Python script was working without any issue when I was just reading data from the microcontroller. Now that I need to send data to the microcontroller and start reading right after, the data I'm reading is incomplete.
I should receive something like [TAG1],10,11,12,[TAG1],13,14,15\n, but sometimes I don't receive the beginning of the data, only the end, like 1,12,[TAG1],13,14,15\n.
I'm basically doing:
serial.write(dataOut)
dataIn = serial.read(sizeOfTheDataToReceive)
I'm sure the issue does not come from the microcontroller: if I use PuTTY to send/receive the data, I always see the full data.
I tried adding a delay in my microcontroller code so it sends the data 10 s after receiving the data from Python, but it still doesn't work every time.
Do you have any idea what can cause this? The COM port is opened when the Python script starts and is closed at the end of the script.

You need to clear your read and write buffers:
serial.flushInput()
serial.flushOutput()
Then read the data byte-wise:
import time

serial.write(dataOut)
time.sleep(0.3)              # give the microcontroller time to start replying
s = b''
while serial.inWaiting() > 0:
    b = serial.read(1)       # read the reply one byte at a time
    # time.sleep(0.1)
    s += b
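A more robust variant (a sketch, not the original answer; the port name, baud rate and trailing newline are assumptions) is to let PySerial read until the terminator the microcontroller sends, with a timeout so the call can never block forever:

import serial

ser = serial.Serial('COM3', 115200, timeout=2)   # assumed port, baud rate and timeout
ser.reset_input_buffer()      # current names for flushInput()/flushOutput()
ser.reset_output_buffer()

dataOut = b'request\n'        # placeholder for whatever the protocol actually sends
ser.write(dataOut)
line = ser.read_until(b'\n')  # may return early (incomplete) on timeout, so check it
print(line)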

Related

PySerial seems to have a read() limit

I'm trying to send a large stream of data via serial, and using the PySerial library for my purposes. However, the program freezes whenever I need to read more than somewhere between 10,000 and 15,000 bytes.
s = ser.read(10000) works, however s = ser.read(15000) does not.
I've tried flushing the buffer, I've tried reading one byte at a time with a loop, and I've even tried opening and closing the port after calls for fewer bytes to try to receive all the data I'm sending - but no luck.
Any idea how to receive more data without the program freezing?
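One way to avoid the freeze (a sketch under assumed settings, not a confirmed fix) is to open the port with a timeout and read in smaller chunks, stopping when a read returns nothing; port name, baud rate and sizes are placeholders:

import serial

ser = serial.Serial('COM3', 115200, timeout=1)   # 1 s read timeout (assumed values)

expected = 15000                  # total number of bytes we hope to receive
chunks = []
received = 0
while received < expected:
    chunk = ser.read(4096)        # returns whatever arrived within the timeout
    if not chunk:                 # timeout with no data: stop instead of hanging
        break
    chunks.append(chunk)
    received += len(chunk)

data = b''.join(chunks)
print(len(data), "bytes received")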

Difficulty in Python reading and writing to Arduinos connected to serial ports

I'm working with a Python program on a Windows PC running under Anaconda that is talking to multiple USB-attached Arduino Megas. I can create a string on the Python side, successfully encode it as bytes, and send it to the Arduino, where it is correctly interpreted. I can read a string from the Arduino into the Python program and correctly interpret it. The issue I'm having is when I try to read from one Arduino and then send that message back out to another Arduino. Example:
Python side code:
response = ser1.readline() #Read from arduino #1
ser2.write(response) #Write the response read from #1 out to #2
Arduino side (for serial 2):
if (Serial.available() > 0) {
    newStr = Serial.readString();
}
The Serial.readString() never completes; the Arduino just hangs at that point.
I'm sure it's something stupidly simple, but Python is a new language for me, so I haven't been able to figure it out.
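For reference, a self-contained sketch of the read-then-forward step described above (port names, baud rate and timeouts are assumptions, not the asker's setup); printing the bytes that were actually read makes it easier to see whether a complete, newline-terminated message is being forwarded:

import serial

ser1 = serial.Serial('COM3', 9600, timeout=2)   # Arduino #1 (placeholder port)
ser2 = serial.Serial('COM4', 9600, timeout=2)   # Arduino #2 (placeholder port)

response = ser1.readline()                 # bytes up to and including b'\n', or less on timeout
print("read from #1:", repr(response))     # check what actually arrived
if response:
    ser2.write(response)                   # forward the same bytes to Arduino #2
    ser2.flush()                           # wait until the bytes have been written out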

PyVISA read closes before transfer has finished

I am writing code in Python to communicate with scopes through PyVISA.
Sometimes, during the transfer of data from the scope to the PC over the Ethernet connection, not all the data is transferred.
I open the connection with the scope as a SOCKET connection, as indicated in the manual:
inst = visa.ResourceManager().open_resource("TCPIP0::<ip_address>::<port>::SOCKET")
Everything runs properly except for data transfer.
I ask for data via the command inst.write('channel1:data?'), as reported in the manual, and then I read the data with inst.read(). But if I compare the number of points indicated in the data header with the length of the data array I obtain from the read() method, I get a different result: not all the data is transferred. I tried enabling termination characters for the read operations and they work, but when I read data I get a warning from VISA saying that the string does not end with a termination character.
Is there a way to tell PyVISA when to stop reading? Is there a way to force the read timeout to be longer?
Thanks
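A minimal sketch of two things that may help, assuming the scope returns the waveform as an IEEE binary block (check the scope's manual; the command and datatype below are assumptions): lengthen the VISA timeout, and let PyVISA read exactly the number of bytes announced in the block header instead of relying on a termination character:

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("TCPIP0::<ip_address>::<port>::SOCKET")  # placeholder address
inst.read_termination = '\n'    # for ordinary text queries
inst.write_termination = '\n'
inst.timeout = 30000            # 30 s (in milliseconds), so long transfers aren't cut off

print(inst.query('*IDN?'))      # text query still uses the termination character
data = inst.query_binary_values('channel1:data?', datatype='B', container=bytes)
print(len(data), "bytes transferred")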

Open a sub-cmd window with Python

I'm making a cmd IRC client in Python. I want to be able to receive data at the same time as I write messages; with my previous code I could only write two messages, and then it would hang and I couldn't write anything until it received some kind of data.
The question is: can I have one cmd window showing the received data and another one with a constant input prompt waiting for me to write something to send, maybe with threads?
I've looked through the subprocess library but I don't really know how to code it.
CMD1:
while Connected:
    print socket.recv(1024)
CMD2:
while Connected:
    text = raw_input("Text to send>> ")
    socket.send(text)
(This is pseudocode, not real code.)
The approach you are proposing could be done by making a server-like application and two client applications that connect via localhost to send and receive events. That way you could have two terminals open, connected to the same session of the server.
On the other hand, you could consider a different design approach using ncurses, which lets you build a terminal UI with input and output in the same terminal (two sections, one above the other). You can reference: http://gnosis.cx/publish/programming/charming_python_6.html
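As an illustration of the threads idea mentioned in the question (a sketch only; the host, port and IRC registration steps are placeholders or omitted), one background thread can print whatever the socket receives while the main thread keeps prompting for text to send:

import socket
import threading

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(("irc.example.net", 6667))     # placeholder server and port

def receive_loop():
    while True:
        data = sock.recv(1024)
        if not data:                        # connection closed by the server
            break
        print(data.decode(errors="replace"))

threading.Thread(target=receive_loop, daemon=True).start()

while True:
    text = input("Text to send>> ")         # raw_input in Python 2
    sock.send((text + "\r\n").encode())     # IRC lines end with CR LF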

Streaming of a log text file with constant updates

I made a program that saves sensor data in a log file (server side).
This file is stored in a temporary directory (RAM disk).
Each line contains a timestamp and a JSON string.
The update rate is dependent on the sensor data, but the fastest is every 0.5s.
What I want to do is, to stream every update in this file to a client application.
I have several approaches in mind:
maybe a shared folder on the server side (Samba) with a script on the client side that just checks the file every 0.5 s
maybe another server program running on the server, checking for updates (but I don't want to do this, because the Raspberry Pi is slow)
Has anyone done something like this before and can share some ideas? Is there maybe a Python module for this already (one that opens a file like a stream and, when something changes, pushes it out)? Is it smart to check a file constantly for updates?
To stream the log file to an application you can use
tail -n 1000000 -f /path/to/logfile | application
(This will continuously check the file for new lines and stream them to the application, then block again until new lines are present.)
But this will of course put load on your server, since the polling for new lines is still executed on the Raspberry Pi. A small program (written in C, with a decent sleep) on the server itself might in fact put less load on it than querying for new lines over the network.
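The same idea can also be done in Python without shelling out to tail (a sketch; the log path and poll interval are placeholders, and it assumes the writer appends whole lines at a time):

import time

def follow(path, poll_interval=0.5):
    with open(path, "r") as f:
        f.seek(0, 2)                        # 2 = seek to end: skip existing content
        while True:
            line = f.readline()
            if line:
                yield line                  # a new line has arrived
            else:
                time.sleep(poll_interval)   # nothing new yet; wait and poll again

for entry in follow("/tmp/sensor.log"):     # placeholder path
    print(entry, end="")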
I'm doing something like that.
I have a server running on my Raspberry Pi plus a client that parses the output of the server and sends it to another server on the web.
What I'm doing is that the local server program writes its data in chunks.
Every time it writes the data (by the way, also on tmpfs), it writes it to a different file, so I don't get errors from trying to parse a file while something else is writing to it.
After it writes the file, it starts the client program to parse and send the data (using subprocess with the name of the file as a parameter).
Works great for me.
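A rough sketch of that pattern (file names, the record format and the client script are placeholders, not the poster's actual code): each chunk goes to its own file under tmpfs, and the file name is then handed to the client program via subprocess:

import json
import subprocess
import time
import uuid

def write_chunk(records):
    path = "/tmp/sensor_{}.log".format(uuid.uuid4().hex)    # unique file per chunk
    with open(path, "w") as f:
        for timestamp, payload in records:
            f.write("{} {}\n".format(timestamp, json.dumps(payload)))
    return path

chunk = [(time.time(), {"temperature": 21.5})]              # example data
path = write_chunk(chunk)
subprocess.Popen(["python", "client.py", path])             # placeholder client script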
