Binary would be more compact and could be deserialized in a standard way, so why is text used instead? It seems inefficient, and web frameworks end up doing little more than screwing around with strings. Why isn't there a binary standard? The web would be much faster, and browsers could load binary pages very quickly.
If I were to start a binary protocol (HBP, Hyper Binary Protocol), what sort of standards would I need to define?
The HTTP protocol itself is readable as text. This is useful because you can telnet into any web server and communicate with it by hand.
Being text also allows you to easily watch HTTP communication with a program like Wireshark. You can then diagnose the source of problems easily.
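For instance, a complete request/response exchange as you might see it in a telnet session or a Wireshark capture (using example.com as a placeholder host) looks like this:

```http
GET /index.html HTTP/1.1
Host: example.com

HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Content-Length: 39

<html><body>Hello, world!</body></html>
```

Every header and status line is plain ASCII, which is exactly what makes ad-hoc debugging with generic tools possible.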
HTTP defines a way to work with resources. These resources do not need to be text; they can be images, or anything else. A text resource can be sent as binary by specifying the Content-Encoding header. Your resource type is specified via the Content-Type header. So your question really only applies to the HTTP protocol itself, and not the payload, which is the resources.
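To make this concrete, here is a minimal sketch in Python (using the standard `gzip` module; the HTML string is just an example) of how a text resource becomes binary bytes on the wire when `Content-Encoding: gzip` is used:

```python
import gzip

# A text resource (HTML) that will travel as binary on the wire.
html = b"<html><body>Hello, world!</body></html>"

# gzip-compress the body; the resulting bytes are no longer readable text.
body = gzip.compress(html)

# Assemble the response head. Content-Type says what the resource is;
# Content-Encoding says how its bytes were transformed for transport.
head = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html; charset=utf-8\r\n"
    "Content-Encoding: gzip\r\n"
    f"Content-Length: {len(body)}\r\n"
    "\r\n"
).encode("ascii")

response = head + body

# The client reverses the encoding to recover the original text.
assert gzip.decompress(body) == html
```

Note that only the headers are text; the payload after the blank line is arbitrary binary, which is why the protocol being textual costs so little in practice.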
I don’t think this is true. The slowest part of loading a page is usually connection establishment and TCP slow start, not parsing text headers.
Here is an example of how an HTTP response would send a text resource with a binary representation:
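A plausible sketch of such a response (the body is shown as a placeholder, since gzip-compressed bytes are not printable):

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Content-Encoding: gzip
Content-Length: <size of the compressed body in bytes>

<gzip-compressed bytes of the HTML document>
```

The Content-Type header tells the client the resource is HTML text; the Content-Encoding header tells it the bytes must be gunzipped first.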