I’m a bit curious as to why one would want to use hex encoding over base64. It seems to me that base64 is more efficient. In particular, why do databases seem to always use hex encoding? Is it a historical issue, or am I missing something about hex encoding?
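For what it's worth, the efficiency difference is easy to measure: hex uses 2 characters per byte, while base64 uses roughly 4 characters per 3 bytes. A quick sketch in Python (the sample data is arbitrary, just for illustration):

```python
import base64
import binascii

data = bytes(range(16))  # 16 arbitrary binary bytes

hex_encoded = binascii.hexlify(data)   # 2 characters per byte
b64_encoded = base64.b64encode(data)   # 4 characters per 3 bytes (padded)

print(len(data), len(hex_encoded), len(b64_encoded))  # → 16 32 24
```

So base64 is indeed about 33% larger than the raw bytes versus 100% larger for hex, which is the efficiency argument in the question.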
You must be a real geek to read `BASE64` off the screen. In Oracle, when I run `HEXTORAW`, I can get some idea of what’s in a `RAW` field, but I couldn’t with `BASE64`. Like, when I see lots of `0x3F`s, I know there’s something wrong with the encoding. And internally, these are just binary bytes; there’s no need to encode them at all except to show them to a person on the other side of the screen.
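The `0x3F` point is about readability at a glance: in a hex dump every byte maps to exactly two characters, so a run of `3f` bytes (ASCII `?`, the usual substitution character after a failed charset conversion) jumps out, while in base64 the byte boundaries don't line up with the character boundaries. A small sketch (plain Python, not Oracle; the mangled string is a made-up example of replacement-character damage):

```python
import base64
import binascii

# Hypothetical text mangled by a failed charset conversion, where
# unmappable characters were replaced with '?' (byte 0x3F).
mangled = "na?ve r?sum?".encode("ascii")

# Hex view: each byte is exactly two characters, so the 3f bytes stand out.
print(binascii.hexlify(mangled).decode())

# Base64 view: the same bytes, but no visible byte boundaries.
print(base64.b64encode(mangled).decode())
```

In the hex output you can count the `3f`s directly; in the base64 output the same information is there but not human-scannable, which is the answer's point about showing bytes to a person.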