I'm just wondering why this is the case. I'm eager to learn more about low-level languages; I'm only into the basics of C, and this is already confusing me.
Do languages like PHP automatically null-terminate strings as they are interpreted and/or parsed?
From Joel's excellent article on the topic:
Remember the way strings work in C: they consist of a bunch of bytes followed by a null character, which has the value 0. This has two obvious implications:
- There is no way to know where the string ends (that is, the string length) without moving through it, looking for the null character at the end.
- Your string can't have any zeros in it, so you can't store an arbitrary binary blob like a JPEG picture in a C string.

Why do C strings work this way? It's because the PDP-7 microprocessor, on which UNIX and the C programming language were invented, had an ASCIZ string type. ASCIZ meant "ASCII with a Z (zero) at the end."
Is this the only way to store strings? No, in fact, it's one of the worst ways to store strings. For non-trivial programs, APIs, operating systems, class libraries, you should avoid ASCIZ strings like the plague.
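To see both implications in practice, here is a minimal C sketch (the variable names are illustrative). It shows that `strlen` has to scan for the terminator, and that an embedded zero byte cuts the string short as far as the string functions are concerned:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    /* A C string is just bytes followed by a terminating '\0'. */
    char greeting[] = "hello";      /* stored as 'h','e','l','l','o','\0' */

    /* strlen() must walk the bytes until it hits the null terminator,
       so its cost grows with the length of the string. */
    printf("strlen(greeting) = %zu\n", strlen(greeting));   /* prints 5 */

    /* An embedded zero byte ends the "string" early: everything after
       the first '\0' is invisible to strlen, printf("%s"), strcpy, etc. */
    char blob[] = { 'J', 'P', 'G', '\0', 0x42, 0x43, '\0' };
    printf("strlen(blob) = %zu\n", strlen(blob));           /* prints 3, not 6 */

    return 0;
}
```

This is why binary data in C is handled with an explicit length (a pointer plus a size, as `fread`/`fwrite` and `memcpy` do) rather than with the null-terminated string functions.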