I'm a bit curious as to why one would want to use hex encoding over base64. It seems to me that base64 is more efficient: it takes roughly 4 characters for every 3 bytes, while hex takes 2 characters per byte. In particular, why do databases seem to always use hex encoding? Is it a historical issue, or am I missing something about hex encoding?
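For concreteness, here is the size difference I mean (a rough sketch in Python; the sample bytes are made up, and binascii/base64 are just the standard-library modules):

```python
import base64
import binascii

# Hypothetical sample: 16 arbitrary binary bytes (not taken from any real database).
raw = bytes(range(240, 256))

hex_form = binascii.hexlify(raw).decode("ascii")   # 2 characters per byte
b64_form = base64.b64encode(raw).decode("ascii")   # ~4 characters per 3 bytes

print(len(raw), len(hex_form), len(b64_form))      # 16 32 24
print(hex_form)                                    # f0f1f2f3f4f5f6f7f8f9fafbfcfdfeff
print(b64_form)                                    # 8PHy8/T19vf4+fr7/P3+/w==
```

So base64 adds only about 33% to the raw size while hex doubles it, which is what makes me wonder why hex is the usual choice.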
You must be a real geek to read BASE64 off the screen.

In Oracle, when I use HEXTORAW, I can get some idea of what's in a RAW field, but I couldn't with BASE64. For example, when I see lots of 0x3F bytes, I know there is something wrong with the encoding.

Internally these are just binary bytes; there is no need to encode them at all except to show them to the person on the other side of the screen.
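A minimal sketch of what I mean, in Python with made-up bytes where a broken character-set conversion has replaced characters with 0x3F ('?'):

```python
import base64
import binascii

# Hypothetical value: text mangled by a bad character-set conversion,
# where the unrepresentable characters became 0x3F ('?').
mangled = b"Caf\x3f\x3f price: 5\x3f\x3f"

print(binascii.hexlify(mangled).decode("ascii"))
# 4361663f3f2070726963653a20353f3f   <- the repeated "3f" jumps out immediately
print(base64.b64encode(mangled).decode("ascii"))
# Q2FmPz8gcHJpY2U6IDU/Pw==           <- the same bytes are much harder to spot here
```

In the hex form every byte keeps its own pair of digits, so runs of 3F are visible at a glance; in base64 each output character mixes bits from up to three different bytes, so you can't eyeball individual byte values.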