The other day a user reported a bug to me about a toolbar item that was disabled when it should have been enabled. The validation code (simplified for your benefit) looked like:
- (BOOL) validateToolbarItem: (NSToolbarItem *) toolbarItem {
    NSArray *someArray = /* array from somewhere */;
    return [someArray count];
}
It took me a few minutes to realize that -count returns a 32-bit unsigned int, while BOOL is an 8-bit signed char. It just so happened that in this case someArray had 768 elements in it, which meant the lower 8 bits were all 0. When the int is cast to a BOOL upon returning, it resolves to NO, even though a human would expect the answer to be YES.
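Here's a minimal, self-contained sketch of what was going on, assuming a platform where BOOL is the signed char typedef (as it was here); the 768-element array is made up to match the bug report:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // Build an array whose count, 768 (0x300), has all-zero low 8 bits.
        NSMutableArray *someArray = [NSMutableArray array];
        for (NSUInteger i = 0; i < 768; i++) {
            [someArray addObject:@(i)];
        }

        // 768 narrowed to an 8-bit BOOL: only the low byte survives, which is 0, i.e. NO.
        BOOL enabled = [someArray count];
        NSLog(@"count = %lu, enabled = %d",
              (unsigned long)[someArray count], enabled); // count = 768, enabled = 0
    }
    return 0;
}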
I've since changed my code to return [someArray count] > 0; however, now I'm curious why BOOL is really a signed char. Is that really "better" in some way than it being an int?
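For reference, here is a sketch of the corrected validation method; the MyWindowController class and its items property are hypothetical stand-ins for wherever the real validation lives:

#import <Cocoa/Cocoa.h>

@interface MyWindowController : NSObject
@property (nonatomic, copy) NSArray *items; // hypothetical backing store for the toolbar state
@end

@implementation MyWindowController
- (BOOL) validateToolbarItem: (NSToolbarItem *) toolbarItem {
    // The comparison collapses any count into 0 or 1, which survives
    // the narrowing conversion to BOOL regardless of the array's size.
    return [self.items count] > 0;
}
@end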
(For comparison, C's own Boolean type, _Bool, is defined differently: it is unsigned, has the lowest rank among the standard unsigned integer types, and may not be further qualified by the specifiers signed, unsigned, short, or long.)
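_Bool also converts differently: any nonzero value converts to 1, so the truncation above can't happen with it. A quick sketch of the contrast, which compiles as plain C99 or Objective-C:

#include <stdbool.h>
#include <stdio.h>

int main(void) {
    unsigned int count = 768;
    bool b = count;         // conversion to _Bool: nonzero becomes 1 (true)
    signed char c = count;  // out-of-range conversion; on common two's-complement
                            // platforms this keeps the low 8 bits, giving 0
    printf("bool: %d, signed char: %d\n", b, c); // bool: 1, signed char: 0
    return 0;
}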
The unsigned char type can only store nonnegative integer values; the C standard requires it to cover at least the range 0 to 255. The signed char type can store negative, zero, and positive values; the standard requires it to cover at least -127 to 127.
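Those are the guaranteed minimums; you can print the actual ranges on a given platform with limits.h (typical implementations give signed char -128 to 127 and unsigned char 0 to 255):

#include <limits.h>
#include <stdio.h>

int main(void) {
    printf("signed char:   %d to %d\n", SCHAR_MIN, SCHAR_MAX); // typically -128 to 127
    printf("unsigned char: 0 to %d\n", UCHAR_MAX);             // typically 0 to 255
    printf("bits per char: %d\n", CHAR_BIT);                   // at least 8
    return 0;
}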
The answers given (thus far) focus on why BOOL isn't an int. That answer is pretty clear: a char is smaller than an int, and when Objective-C was designed back in the 80s, shaving off a few bytes was always good.
But your question also seems to be asking, "Why is BOOL signed rather than unsigned?" For that, we can look at where BOOL is typedef'ed, in /usr/include/objc/objc.h:
typedef signed char BOOL;
// BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C"
// even if -funsigned-char is used.
So there's an answer: the Objective-C designers didn't want to typedef BOOL to char, because on some systems, under some compilers (and remember that Objective-C predates ANSI C, so C compilers differed), a char was signed, and under some, unsigned. The designers wanted @encode(BOOL) to return a consistent value across platforms, so they included the signedness in the typedef.
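You can check that encoding directly. This sketch assumes a target where BOOL is still the signed char typedef; on Apple's 64-bit ARM platforms BOOL is now a genuine bool and encodes as "B" instead:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // "c" is the type code for signed char; "C" would be unsigned char.
        NSLog(@"@encode(BOOL) = %s", @encode(BOOL)); // "c", pinned down by the explicit 'signed'
        NSLog(@"@encode(char) = %s", @encode(char)); // "c" by default, but "C" under -funsigned-char
    }
    return 0;
}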
But that still begs the question: why signed rather than unsigned? I don't have a definitive answer for that; I imagine the reason is that they had to pick one or the other, and decided to go with signed. If I had to further conjecture, I'd say it's because ints are signed by default (that is, if they don't include a signedness qualifier).