
I can't understand this type casting

I have done type casting with int and char but not with pointers so I posted this question.

#include <stdio.h>
int main() {
    int a[4] = { 1, 2, 6, 4 };
    char *p;
    p = (char *)a;   // what does this statement mean?
    printf("%d\n",*p);
    p = p + 1;        
    printf("%d",*p);  // after incrementing it gives 0 why?
}

The first call to printf gives the first element of the array. And after p=p+1 it gives 0. Why?

Asked Mar 05 '26 by Amol Singh

2 Answers

Let's imagine a fairly typical platform in which a byte is eight bits, memory is arranged using little-endian byte ordering, and an int represents four bytes in memory. On this platform, a value of 1 would be laid out like so:

00000001 00000000 00000000 00000000
^
the first element of 'a'

p is declared as a pointer to char (not int) and is initialized to point to the first element of the array a. A char on this platform represents one byte. The int value above, interpreted as a char, would look like so:

00000001 -------- -------- --------
^^^^^^^^
char is only 8 bits wide

So, whether we read one byte or four, i.e., whether we read *p or a[0], the value is 1. However, when you increment p, a pointer to char, it now points to the next char in memory, which is the next byte:

00000001 00000000 00000000 00000000
00000001 00000000 -------- --------
^        ^        ^        ^
p       p+1      p+2      p+3       ...

a[1] is the next int, whose value is 2; p[1] is the next char, i.e., the second byte of a[0], whose value is 0.


On a side note, you've actually stumbled upon a method to determine if a given processor uses little- or big-endian byte order. If the system were big-endian (most significant byte first) then your first printf would have printed 0. This is because the memory layout would have changed:

00000000 00000000 00000000 00000001
^
the first element of 'a'

00000000 -------- -------- --------
^
p

So, given a multi-byte value of 1, reading only its first byte yields 1 on a little-endian machine and 0 on a big-endian one, which lets you test the endianness of the machine:

int n = 1;
if(*(char*)&n == 1)
    // little endian
else
    // big endian
Answered Mar 06 '26 by Ed S.

To be exact, the first printf doesn't give the first element of the array; it gives the first 8 bits of the first element, which just happen to be equal to the first element's numeric value. The second printf gives the next 8 bits of the first element, which are 0 in this case.

1 = 00000000 00000000 00000000 00000001 (32 bits)

Answered Mar 06 '26 by Shamim Hafiz - MSFT


