I know that a boolean value is 1 byte (8 bits) long, but what I would like to know is its binary representation, e.g. decimal => binary:
4 => 100 (0000 0100)
8 => 1000 (0000 1000)
bool value => ???
bool is a built-in basic type in C#. Any underlying representation would be an implementation detail.
The C# 4.0 Language Specification states in section 4.1.8:
The bool type represents boolean logical quantities. The possible values of type bool are true and false.

No standard conversions exist between bool and other types. In particular, the bool type is distinct and separate from the integral types, and a bool value cannot be used in place of an integral value, and vice versa.

In the C and C++ languages, a zero integral or floating-point value, or a null pointer can be converted to the boolean value false, and a non-zero integral or floating-point value, or a non-null pointer can be converted to the boolean value true. In C#, such conversions are accomplished by explicitly comparing an integral or floating-point value to zero, or by explicitly comparing an object reference to null.
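To illustrate that last point, here is a minimal sketch (the names count and obj are made up for this example) of the explicit comparisons the specification describes, in place of the implicit conversions C and C++ allow:

using System;

int count = 3;
object obj = new object();

// C or C++ would allow "if (count)" or "if (obj)"; C# requires an explicit comparison.
bool hasItems = count != 0;   // integral value compared against zero
bool hasObject = obj != null; // object reference compared against null

Console.WriteLine(hasItems);  // True
Console.WriteLine(hasObject); // True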
If we take this one level deeper and see how the corresponding type is specified in the Common Intermediate Language (CIL), we will see that a CLI Boolean type occupies 1 byte in memory. The Common Language Infrastructure (CLI) specification says in Partition III, section 1.1.2:
A CLI Boolean type occupies 1 byte in memory. A bit pattern of all zeroes denotes a value of false. A bit pattern with any one or more bits set (analogous to a non-zero integer) denotes a value of true.
However, this is specified on another level, and from within C# you should not have to care; even if a future version of the CLI specification changed the representation of the Boolean type, or the C# compiler decided to map a bool in C# to something different, your C# code would still have the same semantics.
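For what it's worth, you can check the managed size from C# directly; a minimal sketch (this only looks at sizes, not bit patterns, and the interop size shown is the usual default rather than a guarantee):

using System;
using System.Runtime.InteropServices;

// sizeof on the built-in bool type is a compile-time constant: 1 byte,
// matching the CLI wording quoted above.
Console.WriteLine(sizeof(bool));                 // 1

// The default interop size is a separate concern: bool is typically marshalled
// as a 4-byte native BOOL, so this usually prints 4.
Console.WriteLine(Marshal.SizeOf(typeof(bool))); // typically 4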
Here's a quick bit of code that demonstrates the underlying representation of bool on the current platform, wherever it happens to be running:
using System;
using System.Runtime.InteropServices;

var x = new NotAGoodIdea();
x.TheBool = true;
Console.WriteLine(x.TheByte); // 1
x.TheBool = false;
Console.WriteLine(x.TheByte); // 0

// ...

// Overlays a bool and a byte at the same offset so the bool's raw byte can be read.
[StructLayout(LayoutKind.Explicit)]
public struct NotAGoodIdea
{
    [FieldOffset(0)]
    public bool TheBool;
    [FieldOffset(0)]
    public byte TheByte;
}
(Note that although 1 appears to represent true and 0 appears to represent false, this is just an implementation detail. You shouldn't rely on this detail, or assume that it will remain consistent across different versions and/or implementations, or even that the current platform always uses the same consistent representation.)
EDIT...
The ECMA CLI spec (Partition III, section 1.1.2) is pretty clear about the allowable representations of the Boolean type:
1.1.2 Boolean data type
A CLI Boolean type occupies 1 byte in memory. A bit pattern of all zeroes denotes a value of false. A bit pattern with any one or more bits set (analogous to a non-zero integer) denotes a value of true.
It appears that the current Microsoft CLR adheres to the ECMA spec in allowing multiple representations of true. The following example displays a single "False" line (for 0) followed by 255 lines of "True":
// Re-use the NotAGoodIdea struct from the previous example.
var x = new NotAGoodIdea();
for (int i = 0; i < 256; i++)
{
    x.TheByte = (byte)i;          // write every possible bit pattern
    Console.WriteLine(x.TheBool); // only the all-zero pattern reads back as False
}
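A hedged follow-up sketch of why this matters: since any non-zero pattern reads as true, two bools can both print "True" and yet hold different bytes underneath, and whether they compare equal depends on whether the runtime normalizes bool values before comparing them:

// Re-uses the NotAGoodIdea struct again; 1 and 2 are arbitrary non-zero patterns.
var a = new NotAGoodIdea { TheByte = 1 };
var b = new NotAGoodIdea { TheByte = 2 };

Console.WriteLine(a.TheBool);              // True
Console.WriteLine(b.TheBool);              // True
Console.WriteLine(a.TheBool == b.TheBool); // implementation-dependent - don't rely on it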
I'm not contradicting 0xA3's answer, but if you use:
BitConverter.GetBytes(true);
BitConverter.GetBytes(false);
You'll get byte arrays of { 1 } and { 0 } respectively. In other words, the binary values would be 00000001 and 00000000.
This doesn't mean that's how .NET handles booleans in memory - it's just how it converts them to byte arrays.
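If you want to see those bit patterns printed directly, here's a small self-contained sketch (formatting the byte in binary via Convert.ToString is just one way to do it):

using System;

foreach (bool value in new[] { true, false })
{
    byte[] bytes = BitConverter.GetBytes(value);                  // a single-byte array
    string bits = Convert.ToString(bytes[0], 2).PadLeft(8, '0');  // render that byte in binary
    Console.WriteLine($"{value,-5} => {{ {bytes[0]} }} => {bits}");
}
// Expected output:
// True  => { 1 } => 00000001
// False => { 0 } => 00000000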