
Count multi-dimensional array's first dimension

Right now I can count the first dimension using var_num = sizeof(var)/sizeof(var[0]);, which I have researched and am using in my code, but the problem is I don't know how it works. My output shows 20 / 4 = 5, and I can't figure out where that 20 and that 4 came from. How are these values produced by sizeof(var) and sizeof(var[0])? What does that zero mean, and is it referring to the 1st or the 2nd dimension?

#include <stdio.h>
#include <conio.h>   /* non-standard Turbo C header, for clrscr() and getch() */

int main(void) {
    char *var[][2] = {
        {"var0-0", "var0-1"},
        {"var1-0", "var1-1"},
        {"var2-0", "var2-1"},
        {"var3-0", "var3-1"},
        {"var4-0", "var4-1"},
    };
    int var_num = sizeof(var) / sizeof(var[0]);

    clrscr();
    /* sizeof yields size_t, so cast for a portable printf format */
    printf("%u / %u = %d", (unsigned)sizeof(var), (unsigned)sizeof(var[0]), var_num);
    getch();
    return 0;
}
asked Jan 26 '26 by Aesthetic
1 Answer

Let's take a complicated example for fun. Say we have

char a[10][20][30] = { 0 };

The size of a will be sizeof(char) * 10 * 20 * 30, so sizeof(a) = 6000 (sizeof(char) is defined to be 1 by the C99 standard). a can be seen as an array of (10) arrays of (20) arrays of (30) characters. Now

  • a[0] is one dimension less, giving us an array of (20) arrays of (30) characters
  • a[0][0] gives us an array of (30) characters
  • a[0][0][0] is a single character.

In all these examples the index 0 simply selects the first element at the respective array level.

Now, finding the length of an array by doing sizeof(a)/sizeof(a[0]) is a trick rooted in this logic. sizeof(a[0]) is simply the size of an array of (20) arrays of (30) characters, which is 600. sizeof(a) / sizeof(a[0]) = 6000 / 600 = 10, giving back the length of the first dimension. The same arithmetic works for the higher dimensions too.

Since in your question the element type is a pointer, char *, sizeof(char *) is the base factor that gets multiplied by the length of each dimension. The size of a pointer type depends on the machine/architecture and the compiler you use.


Each of us will have a different machine and a different compiler running on it, so we need a common reference for explanation. Running your program in an online compiler gives the result 40 / 8 = 5. As I've stated above, the size of a pointer type varies with the platform and compiler.

Like you've written in the comment, your array is of type char *[5][2]. Indexing with [0] removes one level, leaving char *[2]; thus sizeof(var[0]) = sizeof(char *[2]) = 8, i.e. two char pointers occupy 8 bytes, which implies that sizeof(char *) is 4 on that online machine. On this basis sizeof(var) = 4 * 5 * 2 = 40, which is what we see in the output, correctly giving the first dimension's length as 5.

Now your output, as glglgl mentioned, is a bit different (perhaps your machine or your compiler's data model is 16-bit): your platform, in combination with your compiler, gives 2 as the size of a char pointer, i.e. sizeof(char *) = 2. So var[0] is a char *[2], whose size is sizeof(char *) * 2 = 2 * 2 = 4. Likewise sizeof(var) = 2 * 5 * 2 = 20. Thus you get 20 / 4 = 5 as the output.

How could I know how many bytes there are per element of the 1st dimension, I mean through calculation?

In char *[5][2], each element of the first dimension has the type char *[2], thus its size is 2 * sizeof(char *).

Hope it helps!

answered Jan 27 '26 by legends2k


