I wrote the following program to determine the size of a static array. When I ran it, I got a result I can't explain. I've done some searching on Stack Exchange and Google, but nothing I've read has given me a hint.
#include <stdio.h>
int main()
{
  int arrSize, intSize, elemSize;
  int input[9][9];
  arrSize = sizeof(input);
  intSize = sizeof(int);
  elemSize = sizeof(input[0]);
  printf("Array: %d, Element: %d, Int: %d\n", arrSize, elemSize, intSize);
  return sizeof(input);
}
When I compile and run this program, I get the following result (on Linux):
./a.out ; echo $?
Array: 324, Element: 36, Int: 4
68
I see from http://c-faq.com/malloc/sizeof.html that sizeof is computed at compile time. If I change the return statement to return sizeof(input[0]), I get 36 as expected, and return sizeof(input[0][0]) gives 4 as expected. So why does return sizeof(input) give 68, while storing the same value in arrSize gives the expected 324?
The exit status on your system is limited to 8 bits, so its maximum value is 255, and 324 % 256 = 68.
After a child process terminates, its parent process can get the status information of that child process with
waitpid(-1, &status, 0);
The exit status can then be extracted from status with WEXITSTATUS(status). According to waitpid(2), this macro
returns the exit status of the child. This consists of the least significant 8 bits of the status argument that the child specified in a call to
exit(3) or _exit(2) or as the argument for a return statement in main().
Therefore, if your main() returns 324, the exit status you get from the shell is 324 % 256, i.e. 68.
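You can observe the same truncation without the shell by having a parent process read the child's wait status itself. Here is a minimal sketch, assuming a POSIX system; the child exits with 324, just like the question's return sizeof(input):
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
  pid_t pid = fork();
  if (pid == 0) {
    /* Child: terminate with 324, matching return sizeof(input) in the question. */
    exit(324);
  } else if (pid > 0) {
    int status;
    waitpid(pid, &status, 0);
    if (WIFEXITED(status))
      /* Only the least significant 8 bits survive: prints 68, not 324. */
      printf("child exit status: %d\n", WEXITSTATUS(status));
  } else {
    perror("fork");
    return 1;
  }
  return 0;
}
Running this prints "child exit status: 68", which is exactly what the shell's $? shows, since the shell obtains the value through the same waitpid/WEXITSTATUS mechanism.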