We were recently implementing a new UDP network protocol, speaking to an existing device, and the documentation merely said "one byte for the station id, one byte for the message type, two bytes for the payload"--and then listed eight possible payloads.
And we built something that packed those numbers into 8-bit and 16-bit fields and sent it, and it didn't work. And we're wondering why the protocol needs two bytes for a value that only ever reaches eight. That's a lot of headroom.
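For concreteness, our first attempt looked roughly like this--a sketch in Python, where the field names, example values, byte order, and address are my own assumptions, not anything from the device's documentation:

```python
import socket
import struct

# Naive binary packing: one unsigned byte for the station id, one for
# the message type, and a 16-bit big-endian integer for the payload.
# STATION_ID, MSG_TYPE, PAYLOAD, and the address below are made-up examples.
STATION_ID = 3
MSG_TYPE = 5
PAYLOAD = 42

message = struct.pack(">BBH", STATION_ID, MSG_TYPE, PAYLOAD)
# -> b'\x03\x05\x00\x2a': four bytes of raw binary, which the device ignored.

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(message, ("192.0.2.10", 5000))  # placeholder address and port
```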
We study and study captures of a successful communication and wonder--what the heck, why is it setting the third and fourth bits of all four bytes of the message? Why would you do that?
And then it hits us. We're not supposed to pack numbers into the payload as binary. We're supposed to pack individual decimal digits into the four-byte message as _ASCII_. So the last two bytes are there so that last field can go to 99, not 65535.
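In other words, what the device actually wanted looked more like this--again a sketch with invented values; the digits-as-ASCII layout is the point:

```python
# ASCII packing: each field is sent as decimal digits in ASCII.
# A digit '0'..'9' encodes as 0x30..0x39, which is why the captures
# showed those 0x30 bits set in every byte of the message.
station_id = 3
msg_type = 5
payload = 42          # two ASCII digits, so the real range is 0..99

message = f"{station_id:1d}{msg_type:1d}{payload:02d}".encode("ascii")
# -> b'3542', i.e. bytes([0x33, 0x35, 0x34, 0x32])

assert len(message) == 4
```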
It takes all kinds to make a network, I guess. 😆