I am writing code in Python to communicate with oscilloscopes through PyVISA.
Sometimes, during the transfer of data from the scope to the PC over an Ethernet connection, not all of the data is transferred.
I open the connection with the scope as a SOCKET connection, as indicated in the manual:
inst = visa.ResourceManager().open_resource("TCPIP0::<ip_address>::<port>::SOCKET")
Everything runs properly except for data transfer.
I ask for data via the command inst.write('channel1:data?'), as reported in the manual, and then I read the data with inst.read(). But if I compare the number of points indicated in the data header with the length of the data array I obtain from the read() method, I get a different result: not all the data is transferred. I tried enabling termination characters for the read operations and they work, but when I read data I get a warning from VISA saying that the string does not end with a termination character.
Is there a way to tell PyVISA when to stop reading? Is there a way to force the read timeout to be longer?
Thanks
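A minimal sketch of two things that may help, assuming PyVISA and a scope that answers channel1:data? with either newline-terminated ASCII or an IEEE 488.2 binary block (the termination character, timeout, and datatype below are assumptions to check against the scope manual):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("TCPIP0::<ip_address>::<port>::SOCKET")

# Option 1: a longer timeout plus an explicit read termination character
inst.timeout = 10000            # milliseconds; large enough for the full waveform
inst.read_termination = '\n'    # assumption: the scope terminates responses with a newline
inst.write_termination = '\n'

# Option 2: let PyVISA parse a definite-length binary block and count the bytes itself
# (datatype 'B' = unsigned bytes; adjust to the scope's actual data format)
data = inst.query_binary_values('channel1:data?', datatype='B', container=list)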
I'm new to using buffers and I've just set up a server and client. When I send data over a socket, my understanding is that it is sent as bytes. I'd like to send an arbitrary list of values between 1 and 4096 by using e.g. _conn.sendall(str(list).encode()). However, I don't know how to handle the data on the client side to put it into a buffer. My first attempt was using the io module, creating a buffer like buffer = io.BytesIO() and then writing to it from the socket:
data = socket.recv(buffer_size)
buffer.write(bytes(data))
However, I'm not sure if this is correct or if the buffer is written to at all. I've tried printing out the buffer using print(data.getbuffer()), but nothing prints on my terminal, not even if I set up an exception handler.
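For what it's worth, a rough sketch of collecting everything the server sends into a BytesIO buffer, assuming the server closes the connection when it has sent all the data (so that recv() eventually returns an empty bytes object); conn stands for the already-connected client socket:

import io

buffer_size = 4096
buffer = io.BytesIO()

while True:
    data = conn.recv(buffer_size)
    if not data:                 # empty result means the peer closed the connection
        break
    buffer.write(data)

payload = buffer.getvalue()      # all the bytes received so far
print(payload.decode())          # back to the string that was sent with sendall()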
I'm trying to send a large stream of data via serial, and using the PySerial library for my purposes. However, the program freezes whenever I need to read more than somewhere between 10,000 and 15,000 bytes.
s = ser.read(10000) works, however s = ser.read(15000) does not.
I've tried flushing the buffer, I've tried reading one byte at a time in a loop, and I've even tried opening and closing the port after calls for fewer bytes to try and receive all the data I'm sending, but no luck.
Any idea how to receive more data without the program freezing?
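One pattern that may help (a sketch; the port name, baud rate, and chunk size are placeholders) is to set a read timeout and accumulate the stream in smaller chunks, so each read() returns whatever has arrived instead of blocking until the full count is available:

import serial

ser = serial.Serial('/dev/ttyUSB0', 115200, timeout=1)   # 1 s read timeout

expected = 15000
received = bytearray()
while len(received) < expected:
    chunk = ser.read(4096)       # returns early (possibly empty) once the timeout expires
    if not chunk:
        break                    # nothing arrived within the timeout; give up or retry
    received.extend(chunk)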
I'm developing a script with PySerial to send data to a microcontroller; the microcontroller then reads the data, processes it, and sends back some debug information to my Python script.
My Python script was working without any issue when I was just reading data from the microcontroller. Now that I need to send data to the microcontroller and start reading just after that, the data I'm reading is incomplete.
I should receive something like [TAG1],10,11,12,[TAG1],13,14,15\n but sometimes I don't receive the beginning of the data, only the end, like 1,12,[TAG1],13,14,15\n
I'm basically doing:
serial.write(dataOut)
dataIn = serial.read(sizeOfTheDataToReceive)
The issue does not come from the microcontroller, I'm sure of that: if I use PuTTY to send/receive my data I always see the full data.
I tried adding some delay in my microcontroller code so it sends the data 10 s after receiving the data from Python, but it still doesn't work every time.
Do you have any idea what can cause it? The COM port is opened when the Python script starts and is closed at the end of the script.
You need to clear your read and write buffers:
serial.reset_input_buffer()    # flushInput() in pyserial versions before 3.0
serial.reset_output_buffer()   # flushOutput() in pyserial versions before 3.0
Then read the data byte-wise:
import time

serial.write(dataOut)
time.sleep(0.3)                 # give the microcontroller time to answer
s = b''                         # bytes, not str, since read() returns bytes
while serial.in_waiting > 0:    # inWaiting() in pyserial versions before 3.0
    b = serial.read(1)
    # time.sleep(0.1)
    s += b
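Since the expected reply ends with a newline, pyserial's read_until() may be a simpler alternative (a sketch; it relies on the port's configured timeout as a safety net if the terminator never arrives):

serial.reset_input_buffer()
serial.write(dataOut)
dataIn = serial.read_until(b'\n')   # read bytes until the newline terminator or the timeout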
I have a problem with receiving data from the server on the client. I have the following client-side function that attempts to receive data from the server. The data sent by the server using socket.sendall(data) is larger than buff_size, so I need a loop to read all the data.
def receiveAll(sock):
    data = ""
    buff_size = 4096
    while True:
        part = sock.recv(buff_size)
        data += part
        if part < buff_size:
            break;
    return data
The problem that occurs to me is that after the first iteration (reading the first 4096 bytes), on the second iteration the program blocks, waiting for more data in part = sock.recv(buff_size). What do I have to do so that recv() can continue reading the other missing data? Thank you.
Your interpretation is wrong. Your code reads all the data that it gets from the server. It just doesn't know that it should stop listening for incoming data; it doesn't know that the server has sent everything it had.
First of all note that these lines
if part < buff_size:
    break;
are very wrong. First of all, you are comparing a string to an int (in Python 3.x that would throw an exception). But even if you meant if len(part) < buff_size:, this is still wrong, because there might be a lag in the middle of the stream and you will read a piece smaller than buff_size. Your code will stop there.
Also, if your server sends content whose size is a multiple of buff_size, then the condition will never be satisfied and the code will hang on .recv() forever.
Side note: don't use semicolons ;. It's Python.
There are several solutions to your problem, but none of them can be used correctly without modifying the server side.
As a client you have to know when to stop reading. The only way to know is if the server does something special that you can recognize. This is called a communication protocol: you have to add meaning to the data you send and receive.
For example if you use HTTP, then a server sends this header Content-Length: 12345 before body so now as a client you know that you only need to read 12345 bytes (your buffer doesn't have to be as big, but with that info you will know how many times you have to loop before reading it all).
Some binary protocols may send the size of the content in first 2 or 4 bytes for example. This can be easily interpreted on the client side as well.
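For illustration, a sketch of that idea with a 4-byte big-endian length prefix (the prefix size and byte order are assumptions; the server would have to send the same framing):

import struct

def recv_exact(sock, n):
    # Keep calling recv() until exactly n bytes have been collected.
    data = b''
    while len(data) < n:
        part = sock.recv(n - len(data))
        if not part:
            raise ConnectionError("socket closed before the full message arrived")
        data += part
    return data

def recv_message(sock):
    header = recv_exact(sock, 4)                # 4-byte length prefix
    (length,) = struct.unpack('>I', header)     # big-endian unsigned int
    return recv_exact(sock, length)             # then exactly that many bytes of body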
An easier solution is this: simply make the server close the connection after it has sent all the data. Then you only need to add a check, if not part: break, in your code.
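With that convention, the receive loop from the question might look like this (a sketch, assuming the server closes the connection when it is finished):

def receiveAll(sock):
    data = b''
    buff_size = 4096
    while True:
        part = sock.recv(buff_size)
        if not part:             # empty result: the server closed the connection
            break
        data += part
    return data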
I am reading data from a microcontroller via serial, at a baud rate of 921600. I'm reading a large amount of ASCII CSV data, and since it comes in so fast, the buffer gets filled and the rest of the data is lost before I can read it. I know I could manually edit the pyserial source code for serialwin32 to increase the buffer size, but I was wondering if there is another way around it?
I can only estimate the amount of data I will receive, but it is somewhere around 200kB of data.
Have you considered reading from the serial interface in a separate thread that is already running before you send the command that tells the uC to send its data?
This would remove some of the delay between the write command and the start of the read. Other SO users have had success with this method, granted they weren't having buffer overruns.
If this isn't clear let me know and I can throw something together to show this.
EDIT
Thinking about it a bit more: if you're trying to read from the buffer and write it out to the file system, even the standalone thread might not save you. To minimize the processing time you might consider reading, say, 100 bytes at a time with serial.read(size=100) and pushing that data into a Queue, then processing it all after the transfer has completed.
Pseudo Code Example
import queue

def thread_main_loop(myserialobj, data_queue):
    # Read chunks off the serial port and queue them; stop once a read times out empty
    while True:
        chunk = myserialobj.read(size=100)
        if not chunk:
            break
        data_queue.put_nowait(chunk)

def process_queue_when_done(data_queue):
    while True:
        try:
            popped_data = data_queue.get_nowait()
        except queue.Empty:
            break
        # Process popped_data as needed
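A rough sketch of how the two pieces could be wired together (the port name and baud rate are placeholders; the read timeout is what lets the reader thread finish once the stream stops):

import queue
import threading
import serial

ser = serial.Serial('COM3', 921600, timeout=2)
data_queue = queue.Queue()

reader = threading.Thread(target=thread_main_loop, args=(ser, data_queue))
reader.start()
# ... trigger the microcontroller to start sending here ...
reader.join()                    # returns once no data has arrived for one timeout period
ser.close()
process_queue_when_done(data_queue)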
There's a "Receive Buffer" slider that's accessible from the com port's Properties Page in Device Manager. It is found by following the Advanced button on the "Port Settings" tab.
More info:
http://support.microsoft.com/kb/131016 under heading Receive Buffer
http://tldp.org/HOWTO/Serial-HOWTO-4.html under heading Interrupts
Try knocking it down a notch or two.
You do not need to manually change pyserial code.
If you run your code on the Windows platform, you simply need to add a line to your code:
ser.set_buffer_size(rx_size=12800, tx_size=12800)
where 12800 is an arbitrary number I chose. You can make the receive (rx) and transmit (tx) buffers as big as 2147483647.
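A minimal sketch of where that call fits (the port name, baud rate, and sizes are placeholders; set_buffer_size() only has an effect with pyserial's Windows backend):

import serial

ser = serial.Serial('COM3', 921600, timeout=5)
ser.set_buffer_size(rx_size=128000, tx_size=128000)   # enlarge the driver's buffers
data = ser.read(200000)                               # roughly the 200 kB expected above
ser.close()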
See also:
https://docs.python.org/3/library/ctypes.html
https://msdn.microsoft.com/en-us/library/system.io.ports.serialport.readbuffersize(v=vs.110).aspx
You might be able to set up the serial port from the DLL:
// Setup serial
mySerialPort.BaudRate = 9600;
mySerialPort.PortName = comPort;
mySerialPort.Parity = Parity.None;
mySerialPort.StopBits = StopBits.One;
mySerialPort.DataBits = 8;
mySerialPort.Handshake = Handshake.None;
mySerialPort.RtsEnable = true;
mySerialPort.ReadBufferSize = 32768;
Property Value
Type: System.Int32
The buffer size, in bytes. The default value is 4096; the maximum value is that of a positive int, or 2147483647
And then open and use it in Python
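One way that might work is the pythonnet package (pip install pythonnet); a rough sketch, assuming .NET Framework on Windows where SerialPort lives in the System assembly, and a placeholder COM3 port name:

import clr
clr.AddReference("System")
from System.IO.Ports import SerialPort, Parity, StopBits, Handshake

mySerialPort = SerialPort("COM3", 9600)
mySerialPort.Parity = getattr(Parity, "None")        # 'None' is a Python keyword, hence getattr
mySerialPort.StopBits = StopBits.One
mySerialPort.DataBits = 8
mySerialPort.Handshake = getattr(Handshake, "None")
mySerialPort.RtsEnable = True
mySerialPort.ReadBufferSize = 32768
mySerialPort.Open()
data = mySerialPort.ReadExisting()                   # read whatever is currently buffered
mySerialPort.Close()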
I am somewhat surprised that nobody has yet mentioned the correct solution to such problems (when available), which is effective flow control through either software (XON/XOFF) or hardware flow control between the microcontroller and its sink. The issue is well described by this web article.
It may be that the source device doesn't honour such protocols, in which case you are stuck with a series of solutions that delegate the problem upwards to where more resources are available (move it from the UART buffer to the driver and upwards towards your application code). If you are losing data, it would certainly seem sensible to try and implement a lower data rate if that's a possibility.
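In pyserial, flow control is just a constructor flag (a sketch; the port name and baud rate are placeholders, and the device on the other end must implement the matching handshake):

import serial

# Hardware flow control (RTS/CTS)
ser = serial.Serial('/dev/ttyUSB0', 921600, rtscts=True, timeout=1)

# Or software flow control (XON/XOFF)
# ser = serial.Serial('/dev/ttyUSB0', 921600, xonxoff=True, timeout=1)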
For me the problem was that the buffer was being overloaded when receiving data from the Arduino.
All I had to do was call mySerialPort.flushInput() and it worked.
I don't know why mySerialPort.flush() didn't work; maybe flush() only flushes the outgoing data?
All I know is that mySerialPort.flushInput() solved my problem.