So I'm writing a program to test the endianness of a machine and print it. I understand the difference between little and big endian; however, from what I've found online, I don't understand why these tests show the endianness of a machine.
This is what I've found online. What does *(char *)&x mean, and how does its equaling 1 prove that a machine is little-endian?
int x = 1;
if (*(char *)&x == 1) {
    printf("Little-Endian\n");
} else {
    printf("Big-Endian\n");
}
Big-endian machine: an int is 4 bytes, and the first byte (at the lowest address) is the most significant. Reading 4 bytes (W X Y Z), W is the most significant; for the value 0x12345678, W would be 0x12. Little-endian machine: an int is still 4 bytes, but the first byte is the least significant.
Big-endian is an order in which the "big end" (most significant value in the sequence) is stored first, at the lowest storage address. Little-endian is an order in which the "little end" (least significant value in the sequence) is stored first.
We can write a small tool to test whether a machine is big endian or little endian in C/C++. First, we declare a 16-bit integer (short int) with the value 0x0001, then we take its address and dereference it. If the LSB (0x01) is stored at the lower address (i.e. the dereferenced value is 1), then the machine is little endian.
In little-endian format, the least significant byte appears first, followed by the most significant byte. The letter 'T' has the value 0x54 and is represented in 16-bit little endian as 54 00.
If we split the expression into parts:
&x : This gets the address of the location where the variable x is, i.e. &x is a pointer to x. Its type is int *.
(char *)&x : This takes the address of x (which is an int *) and converts it to a char *.
*(char *)&x : This dereferences that char *, i.e. reads the byte stored at the lowest address of x.
Now back to x and how the data is stored. On most machines, x is four bytes. Storing 1 in x sets the least significant bit to 1 and all the rest to 0. On a little-endian machine this is stored in memory as 0x01 0x00 0x00 0x00, while on a big-endian machine it's stored as 0x00 0x00 0x00 0x01.
What the expression does is fetch the first of those bytes and check whether it's 1 or not.
Here's what the memory will look like, byte by byte from the lowest address, assuming a 32-bit integer holding the value 1:
Little-endian: 01 00 00 00
Big-endian:    00 00 00 01
Dereferencing a char * gives you one byte. The test fetches the first byte at that memory location by interpreting the address as a char * and then dereferencing it.