Here is the code in question:
#include <stdio.h>

struct test {
    unsigned char t;
    unsigned short u;
    unsigned char v;
};

int main()
{
    struct test * a = (void *) 0x1000;
    printf("%x %p %p\n",
           sizeof(struct test),
           a + sizeof(struct test),
           a - sizeof(struct test));
    return 0;
}
The sizeof(struct test) prints 6, so I would expect to see:
6 0xffa 0x1006
Instead I get
6 0x1024 0xfdc
Last time I checked, 0x24, or 36, was not equal to 6. It's not even aligned to anything that I can tell. I am at a complete loss.
Can someone please explain to me why I'm getting these values?
The problem is that pointer arithmetic moves the pointer by a multiple of the size of the pointed-to type. So what you're effectively doing is adding the square of sizeof(struct test): since sizeof(struct test) is 6, you are moving the address by 6 * 6 = 36 (0x24) bytes. That is why you get 0x1024 and 0xfdc instead of 0x1006 and 0xffa. (You also have the + and - switched relative to the output you expected, but that's a small thing.)
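If the goal really were to move by a fixed number of bytes rather than by whole elements, one way is to go through unsigned char * first. Here is a minimal sketch, assuming the same 6-byte struct and the 0x1000 address from the question (how %p formats the address is implementation-defined):

#include <stdio.h>

struct test {
    unsigned char t;
    unsigned short u;
    unsigned char v;
};

int main()
{
    struct test * a = (void *) 0x1000;

    /* Scaled arithmetic: a + sizeof(struct test) moves sizeof(struct test)
       whole elements, i.e. 6 * 6 = 36 (0x24) bytes past a. */
    printf("%p\n", (void *) (a + sizeof(struct test)));                   /* 0x1024 */

    /* Byte-wise arithmetic: cast to unsigned char * so each step is one byte. */
    printf("%p\n", (void *) ((unsigned char *) a + sizeof(struct test))); /* 0x1006 */

    return 0;
}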
Instead, just do this:
printf("%x %p %p\n",
sizeof(struct test),
a + 1,
a - 1);
When you do pointer arithmetic like this, you move forward or back by that many elements, as though the pointer pointed into an array of struct test. So you really want just a + 1 and a - 1, each of which moves the address by one element, i.e. by sizeof(struct test) = 6 bytes.
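To see the "as though it were in an array" behaviour directly, here is a small sketch; the array arr is just for illustration and isn't in the original code:

#include <stdio.h>

struct test {
    unsigned char t;
    unsigned short u;
    unsigned char v;
};

int main()
{
    struct test arr[3];
    struct test * a = arr;

    /* a + 1 is the same address as &arr[1]: exactly one whole struct test
       (sizeof(struct test) bytes) past the start of the array. */
    printf("%d\n", a + 1 == &arr[1]);                /* 1 */
    printf("%td\n", (char *) (a + 1) - (char *) a);  /* 6 if the struct is 6 bytes */

    return 0;
}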
IMPORTANT: Keep in mind that the compiler can add padding inside your struct to satisfy alignment requirements. Don't just assume that two one-byte chars and a two-byte short give you a 4-byte struct; as your own output shows, it is 6 bytes here. (And don't assume you know the width of these types in general: sizeof(char) is 1 by definition, but a byte can be wider than 8 bits on some platforms, and short is only required to be at least 16 bits.)
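If you want to see where that padding ends up, offsetof from <stddef.h> reports the actual layout. On a platform where this struct is 6 bytes, the sketch below presumably prints offsets 0, 2, 4 (one padding byte after t and one after v), but the exact numbers are entirely up to your compiler:

#include <stdio.h>
#include <stddef.h>

struct test {
    unsigned char t;
    unsigned short u;
    unsigned char v;
};

int main()
{
    /* offsetof reports where each member really lives inside the struct;
       any gaps between members (or at the end) are compiler-inserted padding. */
    printf("t at %zu, u at %zu, v at %zu, sizeof = %zu\n",
           offsetof(struct test, t),
           offsetof(struct test, u),
           offsetof(struct test, v),
           sizeof(struct test));

    return 0;
}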