Is there a historical reason why XORing any alphabetic character with the space character changes the case of the letter? (e.g., 'a' xor ' ' = 'A', 'F' xor ' ' = 'f', etc.)
Or is this just a coincidence?
(Assuming the characters are ASCII- or Unicode-encoded.)
I'm sure it was deliberate that the case could be changed by flipping a single bit; it made early software much more efficient. The fact that the space character is #32 is irrelevant to that design decision.
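To make the single-bit point concrete, here is a small Python sketch (not from the original answer): upper- and lower-case ASCII letters differ only in bit 5, whose value is 0x20, and the space character happens to encode to exactly 0x20.

```python
# Upper- and lower-case ASCII letters differ only in bit 5 (value 0x20).
# Since ' ' encodes to 0x20, XOR with a space toggles exactly that bit.
for ch in "aF":
    toggled = chr(ord(ch) ^ ord(' '))
    print(f"{ch!r} = {ord(ch):07b}  ->  {toggled!r} = {ord(toggled):07b}")

# 'a' = 1100001  ->  'A' = 1000001   (bit 5 cleared)
# 'F' = 1000110  ->  'f' = 1100110   (bit 5 set)
```

The same property is why C's historical trick `c & ~0x20` upper-cases and `c | 0x20` lower-cases an ASCII letter without branching.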
From the Wikipedia entry on ASCII:
The code itself was patterned so that most control codes were together, and all graphic codes were together, for ease of identification. The first two columns (32 positions) were reserved for control characters.[19] The "space" character had to come before graphics to make sorting easier, so it became position 20hex;[20] for the same reason, many special signs commonly used as separators were placed before digits. The committee decided it was important to support upper case 64-character alphabets, and chose to pattern ASCII so it could be reduced easily to a usable 64-character set of graphic codes.[21] Lower case letters were therefore not interleaved with upper case. To keep options available for lower case letters and other graphics, the special and numeric codes were arranged before the letters, and the letter "A" was placed in position 41hex to match the draft of the corresponding British standard.[22] The digits 0–9 were arranged so they correspond to values in binary prefixed with 011, making conversion with binary-coded decimal straightforward.
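The digit layout mentioned at the end of the quote can be checked directly. This is my own illustration, not part of the quoted text: every ASCII digit is the three bits 011 followed by its binary-coded-decimal value, so masking off the high bits recovers the numeric value.

```python
# ASCII digits are 011xxxx, where xxxx is the BCD value of the digit.
# Masking with 0x0F (or subtracting ord('0')) converts char -> number.
for d in "0123456789":
    assert ord(d) >> 4 == 0b011      # common high bits
    assert ord(d) & 0x0F == int(d)   # low nibble is the value

print(ord('5') & 0x0F)  # prints 5
```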
So the single-bit case distinction was deliberate, but the fact that XORing with the space character in particular toggles it seems to be a happy coincidence.