I was learning C++ and came across the following question. I'm just a beginner and I got confused. Isn't the sizeof() function supposed to return the size of the data type? How could a data object have a different size from what sizeof() reports? I don't understand the explanation of the answer.
Suppose that on a hypothetical machine the size of char is 32 bits. What would sizeof(char) return?
a) 4
b) 1
c) Implementation dependent
d) Machine dependent
Answer: b
Explanation: The standard does NOT require a char to be 8 bits wide, but it does require that sizeof(char) return 1.
The sizeof operator yields the size of a type in bytes, where a byte is defined to be the size of a char. So sizeof(char) is always 1 by definition, regardless of how many bits char has on a given platform. This applies to both C and C++.
From the C11 standard, 6.5.3.4:

- The sizeof operator yields the size (in bytes) of its operand, which may be an expression or the parenthesized name of a type. The size is determined from the type of the operand....

Then,

- When sizeof is applied to an operand that has type char, unsigned char, or signed char (or a qualified version thereof), the result is 1.

From the C++11 standard, 5.3.3:

- The sizeof operator yields the number of bytes in the object representation of its operand. The operand is either an expression, which is an unevaluated operand (Clause 5), or a parenthesized type-id.... ...sizeof(char), sizeof(signed char) and sizeof(unsigned char) are 1.

(emphasis mine)
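If you want to see this on your own machine, here is a minimal sketch (C++11 or later; the output labels are only illustrative). sizeof(char) is guaranteed to print 1, while CHAR_BIT from <climits> tells you how many bits that one byte actually contains:

    #include <climits>   // CHAR_BIT: number of bits in a byte, i.e. in a char
    #include <iostream>

    int main() {
        // Guaranteed by the C and C++ standards quoted above.
        static_assert(sizeof(char) == 1, "sizeof(char) is 1 by definition");

        std::cout << "sizeof(char) = " << sizeof(char) << '\n'; // always 1
        std::cout << "CHAR_BIT     = " << CHAR_BIT << '\n';     // 8 on typical platforms,
                                                                // 32 on the hypothetical machine
    }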
You're just confused by the difference between bytes and octets.

A byte is the size of one char. That is why sizeof(char) == 1 always holds: sizeof returns the size in bytes.

An octet, by contrast, consists of exactly 8 bits.

On almost all modern platforms, a byte happens to be the same size as an octet. That's why it's a common mistake to mix the two up; even book authors and professors do it.
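If you want to check whether a byte is an octet on your own platform, a short sketch like the following would do (the message wording is just illustrative):

    #include <climits>
    #include <iostream>

    int main() {
        // A byte is CHAR_BIT bits wide (the width of char); an octet is exactly 8 bits.
        std::cout << "bits per byte: " << CHAR_BIT << '\n';
        if (CHAR_BIT == 8)
            std::cout << "A byte is an octet on this platform.\n";
        else
            std::cout << "A byte is NOT an octet on this platform.\n";
    }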