How should I design my message protocol so that the receiver can distinguish between user data and the end of a message?
For example, I take input from a user and send it to a client over TCP/IP. The client side needs to know the length of the message, so I thought I could use some kind of terminator by which the client recognizes that the end of the message has been reached.
How do I do something like this in C++? I'm using Windows sockets.
A reasonably common strategy for application-specific protocols is, instead of using a terminator, to encode the message length before the message itself. The first few bytes (how many varies by protocol) specify the message length, and sometimes a special value (usually all bits set or all bits clear) designates an overlong message that requires continuation.
Edit: An example has been requested. Let’s say we’ve designed our protocol so that the maximum size of a message is 255 bytes, and we want to send the string “hi”. Our protocol message will consist of three bytes:
2, 104, 105. The first byte, 2, tells us the length of the message that follows it. The second and third bytes, 104 and 105, are the ASCII codes for h and i.