Why can't compiler derive string length for array of strings?

Note: This question was influenced by this answer.

The following is valid C code:

char myString[] = "This is my string";

This declares an array of 18 chars on the stack (17 characters plus the terminating \0) and initializes it with the specified value.
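A quick sizeof check (just an illustrative sketch) confirms the size the compiler deduces:

```c
#include <stdio.h>

int main(void)
{
    char myString[] = "This is my string";

    /* The compiler completes the type to char[18]: 17 characters plus '\0'. */
    printf("%zu\n", sizeof myString); /* prints 18 */
    return 0;
}
```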

However, the following:

char myStrings[][] = {"My 1st string", "My 2nd string", "My 3rd string"};

is not valid, giving the error "array type has incomplete element type".

So I have to specify the array like this:

char myStrings[][20] = {"My 1st string", "My 2nd string", "My 3rd string"};

Where 20 is a number large enough to hold my longest string plus its terminating \0.

This compiles and works as expected.
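With the fixed inner dimension, each row occupies the full 20 bytes regardless of the actual string length, which sizeof makes visible (a small illustrative sketch):

```c
#include <stdio.h>

int main(void)
{
    char myStrings[][20] = {"My 1st string", "My 2nd string", "My 3rd string"};

    /* Each of the 3 rows is padded out to 20 bytes, so the whole array is 60 bytes. */
    printf("total: %zu, per row: %zu\n", sizeof myStrings, sizeof myStrings[0]);
    return 0;
}
```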

If the compiler can dynamically sense the string length when allocating a single string on the stack, why can't it do so for an array of strings?

Edit:

Just to clarify, this is not a real life programming problem I am experiencing - this is just morbid curiosity.

asked Dec 09 '22 by LeopardSkinPillBoxHat


1 Answer

It is one thing to "sense" the length of a single string; it is quite another to compute the maximum of the lengths of many strings. There is an intuitive qualitative difference between the two, so the language authors presumably decided that the former is simple and useful, while the latter is too complex and less useful.
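A minimal sketch of the usual workaround, assuming the strings do not need to be modified: an array of pointers to string literals lets each literal keep its own length, and only the outer dimension is left for the compiler to deduce.

```c
#include <stdio.h>

int main(void)
{
    /* Each element is a pointer to a string literal of its own length;
       only the outer array dimension is deduced by the compiler. */
    const char *myStrings[] = {"My 1st string", "My 2nd string", "My 3rd string"};
    size_t count = sizeof myStrings / sizeof myStrings[0];

    for (size_t i = 0; i < count; ++i)
        printf("%s\n", myStrings[i]);
    return 0;
}
```

The trade-off is that the strings are stored as read-only literals rather than as writable rows of a single contiguous array.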

answered Mar 05 '23 by AnT