Does one byte of zeros mean null in UTF-16 and UTF-32, as it does in UTF-8? Or do we need 2 and 4 bytes of zeros to represent null in UTF-16 and UTF-32 respectively?
In UTF-16 it would be two bytes, and in UTF-32 it would be four bytes.
After all, otherwise you couldn't differentiate between a character whose encoded value just happened to start with a zero byte and a single zero byte representing U+0000.
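A quick way to see this is to compare the encoded bytes directly; here is a small illustrative check using Python's built-in codecs (the specific characters chosen are just examples):

```python
# 'A' is U+0041; in UTF-16-BE its first byte is 0x00, so a lone zero byte
# cannot by itself signal U+0000.
print("A".encode("utf-16-be").hex())     # 0041   (two bytes, starts with 0x00)
print("\x00".encode("utf-16-be").hex())  # 0000   (null is two zero bytes)
print("\x00".encode("utf-32-be").hex())  # 00000000 (null is four zero bytes)
print("\x00".encode("utf-8").hex())      # 00     (null is a single zero byte)
```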
Basically UTF-16 works in blocks of 2 bytes, and UTF-32 works in blocks of 4 bytes. (Admittedly for characters outside the BMP you need two "blocks" of UTF-16, but the principle is still the same.) If you were to implement a UTF-16 decoder, you'd read two bytes at a time.
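As a rough sketch of that idea, here is a minimal, illustrative UTF-16-LE decoder (not production code) that consumes the input two bytes at a time, handling surrogate pairs for characters outside the BMP:

```python
def decode_utf16le(data: bytes) -> str:
    """Minimal UTF-16-LE decoder: read one 16-bit code unit (two bytes) at a time."""
    if len(data) % 2:
        raise ValueError("UTF-16 input must be a multiple of 2 bytes")
    out = []
    i = 0
    while i < len(data):
        unit = data[i] | (data[i + 1] << 8)    # one 16-bit code unit
        i += 2
        if 0xD800 <= unit <= 0xDBFF:           # high surrogate: needs a second unit
            if i >= len(data):
                raise ValueError("truncated surrogate pair")
            low = data[i] | (data[i + 1] << 8)
            i += 2
            if not 0xDC00 <= low <= 0xDFFF:
                raise ValueError("unpaired high surrogate")
            cp = 0x10000 + ((unit - 0xD800) << 10) + (low - 0xDC00)
        elif 0xDC00 <= unit <= 0xDFFF:
            raise ValueError("unpaired low surrogate")
        else:
            cp = unit                          # BMP character, including U+0000
        out.append(chr(cp))
    return "".join(out)

# U+0000 comes out of the two zero bytes 00 00, not a single zero byte.
print(repr(decode_utf16le("A\u0000\U0001F600".encode("utf-16-le"))))  # 'A\x00😀'
```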