// <windef.h>
typedef int BOOL;
Isn't this a waste of memory since an int is 32 bits?
Just in case I was wrong, I tried passing a normal bool* to a function that required a BOOL*, and it didn't work until I used the typedef'd int.
Wow, slow down a little bit there. First of all, I'm pretty sure programmers have been using 4-byte ints for boolean variables since the beginning of programming on x86. (There used to be no such thing as a bool datatype.) And I'd venture to guess that this same typedef is in the Windows 3.1 <Windows.h>.
Second, you need to understand a bit more about the architecture. You have a 32-bit machine, which means that all of the CPU registers are 4 bytes (32 bits) wide. So for most memory accesses, it is more efficient to store and access 4-byte values than 1-byte values.
If you have four 1-byte boolean variables packed into one 4-byte chunk of memory, three of those are not DWORD (4-byte) aligned. This means the CPU / memory controller actually has to do more work to get the value.
And before you go bashing MS for making that "wasteful" typedef, consider this: under the hood, most compilers (probably) still implement the bool datatype as a 4-byte int for the same reasons I just mentioned. Try it in gcc, and take a look at the map file. I bet I am right.
Firstly, the type used in the system API has to be as language-independent as possible, because that API will be used by a multitude of programming languages. For this reason, any "conceptual" types that might either not exist in some languages or might be implemented differently in other languages are out of the question. For example, bool fits into that category. On top of that, in a system API it is a very good idea to keep the number of interface types to a minimum: anything that can be represented by int should be represented by int.
Secondly, your assertion about this being "a waste of memory" makes no sense whatsoever. In order to become "a waste of memory", one would have to build an aggregate data type that involves an extremely large number of BOOL elements. The Windows API uses no such data types. If you build such a wasteful data type in your program, that is your own fault. Meanwhile, the Windows API does not in any way force you to store your boolean values in BOOL type. You can use bytes and even bits for that purpose. In other words, BOOL is a purely interface type. Objects of BOOL type normally don't occupy any long-term memory at all, if you are using them correctly.
Historically, BOOL was used as an anything-not-0-is-TRUE type. For example, a dialog procedure returned a BOOL, which could carry a lot of information. The signature below is from Microsoft's own documentation:
BOOL CALLBACK DlgProc(HWND hwndDlg, UINT message, WPARAM wParam, LPARAM lParam)
The signature and function result conflated several issues, so in the modern API it's instead
INT_PTR CALLBACK DialogProc(
_In_ HWND hwndDlg,
_In_ UINT uMsg,
_In_ WPARAM wParam,
_In_ LPARAM lParam
);
This newfangled declaration has to remain compatible with the old one. Which means that INT_PTR and BOOL have to be the same size. Which means that, in 32-bit programming, BOOL is 32 bits.
In general, since BOOL can be any value, not just 0 and 1, it's a very ungood idea to compare a BOOL to TRUE. And even though it works to compare it against FALSE, that's generally also bad practice, because it can easily give people the impression that comparing against TRUE would be OK. Also, because it's quite unnecessary.
By the way, there are more boolean types in the Windows API, in particular VARIANT_BOOL, which is 16 bits and where logical TRUE is represented as the all-ones bit pattern, i.e. -1 as a signed value. That's an additional reason why it's not a good idea to compare directly with logical FALSE or TRUE.