While reverse-engineering and decoding data on a UART connection, I have arrived at the following conclusions about the format of the received data.
- Data is sent in "packets". Each packet is delimited only by idle time (gaps) between transmissions.
- Packets are of variable length. The length is specified by the third byte in the sequence.
- Data is not framed using any special characters or out-of-band signals, but a packet can be assumed valid based on its final byte, which is a checksum of the frame.
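For reference, a frame validator can be sketched as follows. The layout (address, type, length, payload, checksum) matches the observations above, but two details are my own assumptions: that the length byte counts the whole frame, and that the checksum is a simple 8-bit additive sum, since the actual algorithm is unknown.

```python
def frame_is_valid(frame: bytes) -> bool:
    """Check a candidate frame: [addr, type, length, payload..., checksum]."""
    if len(frame) < 4:
        return False
    # Assumption: the length byte is the total frame length, capped at 0x20.
    if frame[2] != len(frame) or frame[2] > 0x20:
        return False
    # Assumption: 8-bit additive checksum over all preceding bytes.
    return frame[-1] == sum(frame[:-1]) & 0xFF
```

If the real checksum turns out to be XOR or a CRC-8, only the last line needs to change.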
When using a logic analyzer, it is easy to discern packets. However, feeding the data to a program via UART makes delimiting packets impossible: all received data is enqueued by the operating system, and while handlers can be registered to trigger on data-received events, this does not guarantee that the data available in the OS's UART queue is a whole packet.
Are there any best practices for separating such data?
Addendum:
My current solution (which has huge overhead and a large error rate):
Starting from the first byte in the queue, try to parse a frame. If the size specified in the frame is larger than 0x20 (there are no packets larger than 32 bytes, header and checksum included), the current "start byte" is considered invalid and dropped, and recognition continues from the next byte, and so on.
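The resynchronizing scan described above can be sketched like this. It accumulates whatever the OS hands over into a buffer and pulls out frames that pass the checks; the length-byte interpretation and the additive checksum are assumptions, not confirmed parts of the protocol.

```python
def extract_frames(buf: bytearray, max_len: int = 0x20) -> list:
    """Pull all complete, plausible frames out of buf, resyncing on garbage.

    Complete frames are removed from buf; a trailing partial frame is
    left in place so more received bytes can be appended later.
    """
    frames = []
    while len(buf) >= 4:                     # minimum: addr, type, length, checksum
        length = buf[2]                      # assumption: total frame length
        if length < 4 or length > max_len:
            del buf[0]                       # impossible length: resync by one byte
            continue
        if len(buf) < length:
            break                            # partial frame: wait for more data
        candidate = bytes(buf[:length])
        # Assumption: 8-bit additive checksum in the final byte.
        if candidate[-1] == sum(candidate[:-1]) & 0xFF:
            frames.append(candidate)
            del buf[:length]
        else:
            del buf[0]                       # checksum failed: resync by one byte
    return frames
```

Because the checksum is verified before a frame is accepted, a bad start byte costs at most one pass over `max_len` bytes before the scan realigns, which keeps the overhead bounded even on noisy input.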
The other solution I am working on is using a microcontroller to parse the data and frame it properly, either in-band or out-of-band. This is a better solution, as such a time-sensitive protocol really calls for an RTOS. Still, there must be a way to implement this on a normal OS.
Logic Analyzer:
(The first and second bytes are NOT constant. I have deduced that the first byte is an address, or maybe a timeslot, and the second byte is a packet type.)