I am in the process of learning C, and have begun exploring the world of pointers and pointer arithmetic. For example, in the following code snippet:
int nums[] = {1, 2, 3};
nums is an array variable and acts like a pointer to the first memory location of the array. I wrote the following sample code and am trying to understand the results I am getting:
#include <stdio.h>
#include <stdlib.h>

int main()
{
    int nums[] = {1, 2, 3};

    if (nums == &nums)
        puts("nums == &nums");
    else
        puts("nums != &nums");

    if ((nums + 1) == (&nums + 1))
        puts("(nums + 1) == (&nums + 1)");
    else
        puts("(nums + 1) != (&nums + 1)");

    printf("nums: %i\n", nums);
    printf("&nums: %i\n", &nums);
    printf("nums + 1: %i\n", nums + 1);
    printf("&nums + 1: %i\n", &nums + 1);

    return 0;
}
I am getting that nums == &nums is true, as expected; however, when I apply pointer arithmetic and add 1 to nums, the result does not equal &nums + 1. In other words, (nums + 1) != (&nums + 1) even though nums == &nums.
This is the output of the program that I get:
nums == &nums
(nums + 1) != (&nums + 1)
nums: 2345600
&nums: 2345600
nums + 1: 2345604
&nums + 1: 2345612
It appears that nums and nums + 1 are offset by 4 bytes; however, &nums and &nums + 1 are offset by 12 bytes. Why is this offset 12 bytes and not 4?
The confusion stems from the fact that, in C, arrays implicitly decay into pointers in certain contexts: nums and &nums refer to the same address, but they have different types, and pointer arithmetic always steps by the size of the pointed-to type.
The easier case first: nums + 1 effectively means &nums[0] + 1. nums[0] has type int, which is 4 bytes per element on your platform, so &nums[0] + 1 is 4 bytes after &nums.
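A minimal sketch of this first case (assuming a 4-byte int, as on your platform, and using %p, the portable format specifier for printing pointers):

#include <stdio.h>

int main(void)
{
    int nums[] = {1, 2, 3};

    /* nums decays to &nums[0], so nums + 1 advances by sizeof(int) bytes */
    printf("sizeof(int):  %zu\n", sizeof(int));
    printf("nums:         %p\n", (void *)nums);
    printf("nums + 1:     %p\n", (void *)(nums + 1));
    printf("&nums[0] + 1: %p\n", (void *)(&nums[0] + 1));

    return 0;
}

The last two lines print the same address, and the gap from nums is sizeof(int) bytes.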
As for &nums + 1: &nums has type int (*)[3], a pointer to the whole three-element array, which is 12 bytes per element (3 * sizeof(int)). Thus &nums + 1 is 12 bytes after &nums.
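To see both strides together, here is a small sketch (again assuming a 4-byte int; the addresses will differ on your machine, but the 4-byte and 12-byte gaps will not):

#include <stdio.h>

int main(void)
{
    int nums[] = {1, 2, 3};

    int *p = nums;       /* element pointer: steps by sizeof(int)    */
    int (*q)[3] = &nums; /* array pointer:   steps by sizeof(int[3]) */

    printf("sizeof *p = %zu, sizeof *q = %zu\n", sizeof *p, sizeof *q);
    printf("p: %p    p + 1: %p\n", (void *)p, (void *)(p + 1));
    printf("q: %p    q + 1: %p\n", (void *)q, (void *)(q + 1));

    return 0;
}

Here p + 1 lands 4 bytes past p, while q + 1 lands 12 bytes past q, which is exactly the difference you observed.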