Consider this little piece of code:
#include <stdio.h>
int main(void) {
    int a = 0x10000001;
    char b;
    b = (char)a;
    printf("%#x\n", b);
}
On my PC it prints 0x01 and I am not surprised.
How would it work on a big-endian machine? I expect that it would print 0x10000001. Am I right?
I browsed books and the web, but I didn't find clear information on how the casting operation really deals with memory.
Broadly speaking, the endianness in use is determined by the CPU. Because there are a number of options, it is unsurprising that different semiconductor vendors have chosen different endianness for their CPUs.
Each byte order has its own uses. According to Wikipedia, big endian is "the most common format in data networking": many network protocols such as TCP, UDP, IPv4 and IPv6 transmit data in big-endian order. Little endian is mainly used on microprocessors.
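As a small aside (assuming a POSIX system, where <arpa/inet.h> provides htonl()), the standard host-to-network conversion makes the two orderings visible side by side:

#include <stdio.h>
#include <arpa/inet.h>   /* htonl() on POSIX systems */

int main(void) {
    unsigned int host = 0x10000001u;
    unsigned int net  = htonl(host);   /* bytes rearranged to big-endian (network) order */

    /* On a little-endian machine the two values print differently;
       on a big-endian machine htonl() changes nothing. */
    printf("host order: %#x  network order: %#x\n", host, net);
    return 0;
}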
The Motorola 6800 / 6801, the 6809 and the 68000 series of processors used the big-endian format.
Little endian and big endian are two ways of storing multibyte data types (int, float, etc.). On a little-endian machine, the least significant byte of the value is stored at the lowest address; on a big-endian machine, the most significant byte is stored at the lowest address.
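A small sketch (not part of the original question, and assuming a 4-byte int) that inspects the bytes of an int through an unsigned char pointer makes this difference visible:

#include <stdio.h>

int main(void) {
    int a = 0x10000001;
    unsigned char *p = (unsigned char *)&a;  /* view the bytes of a in memory order */

    /* Little endian prints: 01 00 00 10
       Big endian prints:    10 00 00 01 */
    for (size_t i = 0; i < sizeof a; i++)
        printf("%02x ", p[i]);
    printf("\n");
    return 0;
}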
No. A cast like the one in the question works on the value (preserving it when possible), not on the bytes in memory, so the result does not depend on endianness: it prints the same thing on a big-endian machine as on your PC.
If you want to reinterpret the memory representation, you need to cast pointers; then the result does depend on endianness:
b = *((char *)&a);
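A self-contained sketch contrasting the two operations might look like this (variable names are illustrative): the value cast gives the same result everywhere, while the pointer cast reads whichever byte sits at the lowest address of a.

#include <stdio.h>

int main(void) {
    int a = 0x10000001;

    char by_value  = (char)a;        /* value conversion: keeps the low-order byte of the value */
    char by_memory = *((char *)&a);  /* reinterpretation: first byte in memory, endian-dependent */

    printf("by value:  %#x\n", (unsigned char)by_value);   /* 0x1 on both byte orders */
    printf("by memory: %#x\n", (unsigned char)by_memory);  /* 0x1 little endian, 0x10 big endian */
    return 0;
}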