I am trying to set up communication between an STM32 and a laptop: I want to receive, on the laptop, data sent over serial by the STM32. The actual data I am sending is 0x08 0x09 0x0A 0x0B.
I checked on the oscilloscope and I am indeed sending the correct values in the correct order.
What I actually receive is:
b'\n\x0b\x08\t'
I assume that Python is not reading input larger than 3 bits, but I cannot figure out why.
Please find my code below:
import serial
ser = serial.Serial('COM3', 115200, bytesize=8)
while 1:
    if ser.inWaiting() != 0:
        print(ser.read(4))
If someone could help, it would be nice! :)
Check your UART baud rate; keep the Python serial baud rate the same as on the STM32.
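For example, a minimal sketch of opening the port with every frame parameter spelled out; the 115200 8N1 values and COM3 are assumptions taken from the question, so adjust them to whatever the STM32 firmware actually uses:

import serial

# Every parameter here must match the STM32 UART configuration,
# otherwise the received bytes will be garbled.
ser = serial.Serial(
    port='COM3',               # adjust to your port
    baudrate=115200,           # must equal the STM32 baud rate
    bytesize=serial.EIGHTBITS,
    parity=serial.PARITY_NONE,
    stopbits=serial.STOPBITS_ONE,
    timeout=1,                 # seconds; avoids blocking forever on read
)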
What comes to mind when looking at the pySerial library is that when you initialize your COM port, you are not providing a read timeout parameter.
You are then waiting for 4 bytes of data from the serial port.
The problem with this approach is that your code will wait forever until it gets those 4 bytes. Since you keep sending the same array from the STM32, it looks like you received 0x0A, 0x0B from an older packet and 0x08, 0x09 from a newer one: Python printed those 4 bytes, and the 0x0A, 0x0B of the newer packet are already sitting in the buffer, but read(4) will not return them until 2 more bytes arrive.
Setting a read timeout and limiting the read argument to a single byte might solve your problem, as in the sketch below.
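A minimal sketch of that suggestion, reusing the COM3/115200 settings from the question; the one-second timeout is an arbitrary illustrative value:

import serial

# Assumed settings; match them to your STM32 configuration.
ser = serial.Serial('COM3', 115200, bytesize=8, timeout=1)

while True:
    b = ser.read(1)   # returns b'' if nothing arrived within the timeout
    if b:
        print(b)      # one byte at a time, in the order it was received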
Also, for further development, if you would like to create regular communication between a microcontroller and a computer, I would suggest moving the received bytes into a separate buffer and recognizing individual packets in a separate parser thread. In Python it will be painful to create more complex serial communication, as even with a separate thread it will be quite slow.
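A rough sketch of that structure, assuming a hypothetical fixed 4-byte packet and the same port settings as above; the reader function and the packets queue are illustrative names, not anything prescribed by pySerial:

import queue
import threading

import serial

PACKET_SIZE = 4                   # hypothetical fixed packet length
packets = queue.Queue()

def reader(ser):
    """Collect incoming bytes and cut them into fixed-size packets."""
    buf = bytearray()
    while True:
        buf.extend(ser.read(ser.in_waiting or 1))   # blocks for at most `timeout`
        while len(buf) >= PACKET_SIZE:
            packets.put(bytes(buf[:PACKET_SIZE]))
            del buf[:PACKET_SIZE]

ser = serial.Serial('COM3', 115200, timeout=1)
threading.Thread(target=reader, args=(ser,), daemon=True).start()

# The rest of the program consumes whole packets whenever it is ready.
print(packets.get())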
I'm writing something like an SMS gate on Ubuntu. The device is a Huawei E173 Modem.
I use pyserial to write/read to and from the device. Here is my code:
import serial
import time
port = '/dev/ttyUSB0'
ser = serial.Serial(port,
                    stopbits=serial.STOPBITS_ONE,
                    parity=serial.PARITY_NONE,
                    bytesize=serial.EIGHTBITS)
ser.write(b'AT\r\n')
time.sleep(0.1)
print(ser.read(ser.in_waiting))
This code works. But sometimes when I reconnect the device, I find that it cannot read anything (ser.in_waiting is 0, and nothing changes even if I set the read size n larger).
But I can still use minicom to work with that port.
My Question: Why doesn't pyserial work but minicom can? Is there any difference between them?
What I guess is happening is that the delay you are using, together with the timeout you set when you open the port, is conspiring with the time it takes the modem to process the command.
To avoid that, try reading data repeatedly for a certain time with a loop:
...
ser.write(b'AT\r\n')
data = b''
timeout = time.time() + 3.0
while ser.inWaiting() or time.time() - timeout < 0.0:
    if ser.inWaiting() > 0:
        data += ser.read(ser.inWaiting())
        timeout = time.time() + 3.0
print(data)
With minicom or any other terminal you are always listening on the port, so you always get the answer no matter how long the modem takes to process the command. In your code you send the command, wait 100 ms, and then listen on the port for a time period defined by the timeout setting. Since you are not defining a timeout, you get the default, which is to wait forever, but that behavior is overridden by using the number of bytes in the buffer as the read argument: if you happen to check the buffer before any data has arrived (because the command took longer than the 100 ms you gave it), you end up calling read(0), which returns immediately with nothing.
Considering the previous paragraph, and assuming you know the number of bytes to expect, it might be better to define a finite timeout and read with ser.read(n), with n the expected number of bytes.
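A sketch of that approach; the 3-second timeout and the 6-byte expected reply are illustrative guesses, not values from the modem's documentation:

import serial

# Illustrative values; adjust the timeout and expected length for your modem.
ser = serial.Serial('/dev/ttyUSB0', timeout=3,
                    stopbits=serial.STOPBITS_ONE,
                    parity=serial.PARITY_NONE,
                    bytesize=serial.EIGHTBITS)

ser.write(b'AT\r\n')
expected_bytes = 6                 # e.g. b'\r\nOK\r\n'; depends on echo settings
reply = ser.read(expected_bytes)   # returns early only if the timeout expires
print(reply)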
In my case (on the BeagleBone Black), the above answer helped me get some bytes but not all. I realized that minicom was still reading the port and (maybe) flushing it, so pySerial got nothing to read.
I simply closed the minicom terminal and everything is good now :)
I'm trying to receive a byte from an ATmega2560 at an unexpected time (using USART) on my PC. How do I ensure that I don't miss the byte in my Python code (which has many functions running)?
You didn't say what operating system you are using or how the ATmega2560 is connected to the computer, but the drivers in your operating system responsible for receiving the serial data from the ATmega2560 will almost certainly have a buffer for holding incoming bytes, so you don't need to worry about constantly reading from the serial port in your Python program. Just read when you get around to it, and the byte should be waiting for you in the buffer.
It's easy to test that this is the case: send a byte from the AVR, purposely wait a few seconds, then read the byte and make sure it was received correctly.
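A quick sketch of that test, assuming the AVR shows up as /dev/ttyUSB0 at 9600 baud; adjust both to your actual setup:

import time

import serial

# Assumed port and baud rate; change to match the ATmega2560 connection.
ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=5)

# Trigger the AVR to send its byte now (button press, reset, etc.),
# then deliberately do nothing for a few seconds.
time.sleep(5)

# The byte should still be waiting in the driver's receive buffer.
print(ser.read(1))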
Let's say I'm using 1024 as buffer size for my client socket:
recv(1024)
Let's assume the message the server wants to send to me consists of 2024 bytes.
Only 1024 bytes can be received by my socket. What's happening to the other 1000 bytes?
Will the recv-method wait for a certain amount of time (say 2 seconds) for more data to come and stop working after this time span? (I.e., if the rest of the data arrives after 3 seconds, the data will not be received by the socket any more?)
or
Will the recv-method stop working immediately after having received 1024 bytes of data? (I.e. will the other 1000 bytes be discarded?)
In case 1.) is correct: is there a way for me to determine the amount of time recv should wait before returning, or is it determined by the system? (I.e., could I tell the socket to wait for 5 seconds before it stops waiting for more data?)
UPDATE:
Assume I have the following code:
import socket
import sys

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((sys.argv[1], port))   # `port` is defined elsewhere
s.send(b'Hello, world')
data = s.recv(1024)
print("received: {}".format(data))
s.close()
Assume that the server sends data of size > 1024 bytes. Can I be sure that the variable "data" will contain all the data (including those beyond the 1024th byte)?
If I can't be sure about that, how would I have to change the code so that I can always be sure that the variable "data" will contain all the data sent (in one or many steps) from the server?
It depends on the protocol. Some protocols like UDP send messages and exactly 1 message is returned per recv. Assuming you are talking about TCP specifically, there are several factors involved. TCP is stream oriented and because of things like the amount of currently outstanding send/recv data, lost/reordered packets on the wire, delayed acknowledgement of data, and the Nagle algorithm (which delays some small sends by a few hundred milliseconds), its behavior can change subtly as a conversation between client and server progresses.
All the receiver knows is that it is getting a stream of bytes. It could get anything from 1 to the fully requested buffer size on any recv. There is no one-to-one correlation between the send call on one side and the recv call on the other.
If you need message boundaries, it's up to the higher-level protocol to define them. Take HTTP for example. It starts with a \r\n-delimited header that includes a count of the remaining bytes the client should expect to receive. The client knows how to read the header because of the \r\n, and then knows exactly how many bytes are coming next. Part of the charm of RESTful protocols is that they are HTTP-based and somebody else has already figured this stuff out!
Some protocols use NUL to delimit messages. Others may have a fixed-length binary header that includes a count of any variable data to come. I like ZeroMQ, which provides a robust messaging system on top of TCP.
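To make the update's question concrete, one common pattern is to frame each message with a length prefix and loop on recv until the whole payload has arrived. A sketch of such a helper follows; recv_exact and recv_message are illustrative names, not part of the socket API:

import socket
import struct

def recv_exact(sock, n):
    """Keep calling recv until exactly n bytes have been collected."""
    chunks = b''
    while len(chunks) < n:
        chunk = sock.recv(n - len(chunks))
        if not chunk:   # peer closed the connection early
            raise ConnectionError("socket closed before the full message arrived")
        chunks += chunk
    return chunks

def recv_message(sock):
    """Read a message framed as a 4-byte big-endian length followed by the payload."""
    (length,) = struct.unpack('!I', recv_exact(sock, 4))
    return recv_exact(sock, length)

For this to work, the sender has to prepend struct.pack('!I', len(payload)) to every message it sends.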
More details on what happens with receive...
When you do recv(1024), there are 6 possibilities (a short handling sketch follows this list):
1. There is no receive data. recv will wait until there is receive data. You can change that by setting a timeout.
2. There is partial receive data. You'll get that part right away. The rest is either buffered or hasn't been sent yet, and you just do another recv to get more (and the same rules apply).
3. There are more than 1024 bytes available. You'll get 1024 of that data, and the rest is buffered in the kernel waiting for another receive.
4. The other side has shut down the socket. You'll get 0 bytes of data. 0 means you will never get more data on that socket, but if you keep asking for data, you'll keep getting 0 bytes.
5. The other side has reset the socket. You'll get an exception.
6. Some other strange thing has gone on, and you'll get an exception for that.
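A short sketch of how those cases can look in code, for an already-connected TCP socket; the 5-second timeout is arbitrary:

import socket

def read_once(sock: socket.socket) -> None:
    """Demonstrate the possibilities above for a single recv on a connected TCP socket."""
    sock.settimeout(5.0)                                   # case 1: don't block forever
    try:
        data = sock.recv(1024)
    except socket.timeout:
        print("no data arrived within 5 seconds")          # case 1, after the timeout
    except ConnectionResetError:
        print("peer reset the connection")                 # case 5
    else:
        if data:
            print("got {} bytes".format(len(data)))        # cases 2 and 3: up to 1024 bytes
        else:
            print("peer shut down the socket")             # case 4: recv returned b''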
First time poster.
Before I start, I just want to say that I am a beginner programmer so bear with me, but I can still follow along quite well.
I have a wireless device called a Pololu Wixel, which can send and receive data wirelessly. I'm using two of them. One to send and one to receive. It's USB so it can plug straight into my Raspberry Pi or PC, so all I have to do is connect to the COM port through a terminal to read and write data to it. It comes with a testing terminal program that allows me to send 1-16 bytes of info. I've done this and I've sent and received 2 bytes (which is what I need) with no problem.
Now here's my problem: when I open up the Ubuntu terminal and use pySerial to connect to the correct sending Wixel COM port and write a value larger than 255, my receiving COM port, which is connected to another terminal instance also using pySerial, doesn't read the right value. So I don't seem to be able to read and write two bytes, only one. After doing some reading in the pySerial documentation, I believe (not know) that pySerial can only read and write 5, 6, 7, or 8 bits at a time.
I hope my problem is obvious now. How can I write 2 bytes' worth of info to the COM port of one device and send it to the other device, which needs to read those 2 bytes back, all using pySerial?
I hope this all makes sense, and I would greatly appreciate any help.
Thanks
UPDATE
Okay, I think I've got this going now. So I did:
import serial

s = serial.Serial(3)   # device #1 at COM Port 4 (sending)
r = serial.Serial(4)   # device #4 at COM Port 5 (receiving)
s.timeout = 1
r.timeout = 1

s.write('0x80')
r.readline()
# Output said: '0x80'

s.write('hh')
r.readline()
# Output said: 'hh'
Honestly, I think this solves my problem. Maybe there never was a problem to begin with. Maybe I can take my 16-bit binary data from the program, for example "1101101011010101", and turn it into characters (I've seen something called chr(); I think that's it),
then use s.write('WHATEVER'),
then use r.readline() and convert back to binary.
You'll likely need to pull your number apart into multiple bytes, and send the pieces in little endian or big endian order.
E.g.:
low_byte = number % 256
high_byte = number // 256
That should get you up to 65535. You can reconstruct the number on the other side with high_byte * 256 + low_byte.
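A minimal end-to-end sketch of that idea in Python 3; the COM4/COM5 port names are assumptions based on the update above, and struct is just a convenient way to do the same split and reconstruction:

import struct

import serial

s = serial.Serial('COM4', timeout=1)   # sending Wixel (assumed port name)
r = serial.Serial('COM5', timeout=1)   # receiving Wixel (assumed port name)

number = 0b1101101011010101            # 16-bit value to transmit (0xDAD5)

# Split into two bytes and send the high byte first (big-endian).
s.write(struct.pack('>H', number))

# Read exactly two bytes back and reconstruct the number
# (indexing a bytes object in Python 3 gives ints).
raw = r.read(2)
high_byte, low_byte = raw[0], raw[1]
print(high_byte * 256 + low_byte)      # manual reconstruction, as in the answer
print(struct.unpack('>H', raw)[0])     # same result via struct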
I am trying to communicate with a scale that communicates in ASCII format, using Python and pySerial. However, I have no experience using ASCII format, so I have some basic questions.
How would I send a character T, for example, using pySerial, and terminate it with CRLF in ASCII format?
I tried
myserialport.write('TCRLF')
myserialport.write('T\r\n')
myserialport.write('T\n\r')
I am also trying to read data from the scale, which I would expect to be in a form like '208.01 g'. But when I use
myserialport.read(10)
or
myserialport.readline(10)
I get this from the scale
]ëýýÿ]W
ÿ]u_u]ÿ]uÕ
ýWýWë]uÝõW
ÿ½õÿ½WW]Ýý
WýW]Wÿ½ÿ×ë
From googling, it seems as though pySerial should receive data in ASCII format by default, and send it as well... but I am lost as to why it's not working. Any help would be appreciated.
This is the right way to send a character with CRLF to a serial port:
myserialport.write('T\r\n')
Regarding the messy response: make sure that you set the baud rate, the number of data bits, stop bits, and parity correctly. You can find the required values in the scale's datasheet.
For example:
from serial import Serial, SEVENBITS, STOPBITS_ONE, PARITY_EVEN
myserialport = Serial('/dev/ttyS0', baudrate=9600, bytesize=SEVENBITS, parity=PARITY_EVEN, stopbits=STOPBITS_ONE)
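With the port open, sending the command and decoding the reply might look like the sketch below; the 9600/7E1 settings are carried over from the example above, not taken from any particular scale's datasheet:

from serial import Serial, SEVENBITS, STOPBITS_ONE, PARITY_EVEN

# Same illustrative settings as above; check your scale's datasheet.
myserialport = Serial('/dev/ttyS0', baudrate=9600, bytesize=SEVENBITS,
                      parity=PARITY_EVEN, stopbits=STOPBITS_ONE, timeout=2)

myserialport.write(b'T\r\n')           # 'T' followed by CR and LF, as ASCII bytes
line = myserialport.readline()         # reads up to and including the next LF
print(line.decode('ascii').strip())    # e.g. '208.01 g' if the settings are right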
The issue was a ground connection on the USB-to-Serial converter (pin 7 needed to be grounded). If you have the same issue, check your pinout. (Fisher Scientific scales use pin 7 as ground, which is not normal, as pin 5 is usually ground... odd.) Thanks all.