```c
#include <stdio.h>

int main(void) {
    unsigned char a[4] = {1, 2, 3, 4};
    int b = *(int *)&a[0];   /* reinterpret the four bytes as an int */
    printf("%d\n", b);
    return 0;
}
```
I just cannot understand why the result of `b` is `0x4030201`. Could someone help me out?
When you tell the compiler to create an array like this:

```c
unsigned char a[4] = {1, 2, 3, 4};
```
these numbers are put somewhere in memory in the following order:

```
MemoryAddress0: 0x01 -> a[0]
MemoryAddress1: 0x02 -> a[1]
MemoryAddress2: 0x03 -> a[2]
MemoryAddress3: 0x04 -> a[3]
```
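If you want to see this for yourself, here is a minimal sketch that prints each element's address next to its value (the concrete addresses will of course differ between runs and machines):

```c
#include <stdio.h>

int main(void) {
    unsigned char a[4] = {1, 2, 3, 4};
    /* The four bytes sit at consecutive, increasing addresses. */
    for (int i = 0; i < 4; i++)
        printf("%p: 0x%02x -> a[%d]\n", (void *)&a[i], a[i], i);
    return 0;
}
```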
`&a[0]` is a `char` pointer whose value is MemoryAddress0; it points to a single byte with the value 0x01.

`(int *)&a[0]` is a cast pointer with the same value, MemoryAddress0, but with type `int *` this time, so it points to four consecutive bytes.
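As an aside, reading a `char` array through an `int *` technically breaks C's strict-aliasing rule and may also violate alignment requirements. A sketch of a well-defined way to perform the same reinterpretation is to copy the bytes with `memcpy`:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned char a[4] = {1, 2, 3, 4};
    unsigned int b;
    /* Well-defined byte reinterpretation; assumes a 4-byte int,
       as on the asker's machine. */
    memcpy(&b, a, sizeof b);
    printf("0x%x\n", b);   /* 0x4030201 on a little-endian machine */
    return 0;
}
```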
Most machines we use in our daily lives are little-endian, which means that they store multi-byte values in memory from the least significant byte to the most significant one. When an `int *` points to a region of four bytes, the first byte it encounters is the least significant byte, the second byte is the second least significant, and so on.
```
MemoryAddress0: 0x01 -> 2^0  term
MemoryAddress1: 0x02 -> 2^8  term
MemoryAddress2: 0x03 -> 2^16 term
MemoryAddress3: 0x04 -> 2^24 term
```
Thus the 4-byte integer value becomes 0x01*2^0 + 0x02*2^8 + 0x03*2^16 + 0x04*2^24, which is equal to 0x04030201.
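The same sum can be written directly in C with shifts. This sketch reconstructs the value byte by byte and prints 0x4030201 on any machine, because the arithmetic (unlike the pointer cast) does not depend on the host's byte order:

```c
#include <stdio.h>

int main(void) {
    unsigned char a[4] = {1, 2, 3, 4};
    /* Assemble the value by hand, taking a[0] as the least
       significant byte, exactly as a little-endian load does. */
    unsigned int b = (unsigned int)a[0]
                   | (unsigned int)a[1] << 8
                   | (unsigned int)a[2] << 16
                   | (unsigned int)a[3] << 24;
    printf("0x%x\n", b);   /* prints 0x4030201 regardless of endianness */
    return 0;
}
```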
You are on a little-endian machine, which means that integers larger than a byte store their least significant bytes first. Note that most architectures these days are little-endian, thanks to the ubiquity of x86.
Because your system is little-endian: on little-endian systems, the first byte of a multi-byte integer is interpreted as its least significant byte.
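If you want to confirm which kind of system you are on, here is a minimal sketch of a runtime check:

```c
#include <stdio.h>

int main(void) {
    unsigned int x = 1;
    /* Examining an object's bytes through an unsigned char pointer
       is always permitted; the byte at the lowest address comes first. */
    unsigned char *p = (unsigned char *)&x;
    printf("%s-endian\n", *p == 1 ? "little" : "big");
    return 0;
}
```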