Why does C not define a minimum size for an array?

The C standard defines many lower/upper limits (translation limits) that an implementation must satisfy for every translation. Why is there no such minimum limit defined for an array size? The following program will compile fine but will likely produce a runtime error/segfault, i.e. invoke undefined behaviour.

int main()
{
   int a[99999999];
   int i;

   for (i = 0; i < 99999999; i++)
      a[i] = i;

   return 0;
}

A possible reason could be that local arrays are allocated in automatic storage, and whether they fit depends on the size of the stack frame that can be allocated. But why not a minimum limit, like the other limits defined by C?
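
To illustrate, a minimal sketch (assuming a typical hosted implementation): giving the same array static storage duration moves it out of main()'s stack frame, typically into the zero-initialised BSS segment, and the program then usually runs fine, although the standard still guarantees nothing:

int main()
{
   /* static storage duration: on a typical implementation the array
      is reserved in the BSS segment, not on the stack */
   static int a[99999999];
   int i;

   for (i = 0; i < 99999999; i++)
      a[i] = i;

   return 0;
}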

Let's forget about undefined cases like the one above. Consider the following:

int main()
{
   int a[10];
   int i;

   for (i = 0; i < 10; i++)
      a[i] = i;

   return 0;
}

In the above, what guarantees that the local array (albeit a very small one) will work as expected and won't cause undefined behaviour due to allocation failure?

It's unlikely that the allocation of such a small array would fail on any modern system, but the C standard doesn't define any requirement to satisfy, and compilers don't (at least GCC doesn't, by default) report allocation failures; a runtime error/undefined behaviour is the only possibility. (GCC can warn when a function's frame exceeds a given size via -Wframe-larger-than=, but that's opt-in and not mandated by the standard.) The hard part is that nobody can tell whether an arbitrarily sized array will cause undefined behaviour due to allocation failure.

Note that I am aware I can use dynamic allocation (via malloc & friends) for this purpose and get better control over allocation failures. I am more interested in why there's no such limit defined for local arrays. Also, global arrays are stored in static storage and increase the executable's size, which compilers can handle.
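
For completeness, a sketch of the malloc route just mentioned, where an allocation failure is reported through a null return and can be handled instead of manifesting as a silent crash:

#include <stdio.h>
#include <stdlib.h>

int main()
{
   size_t n = 99999999;
   int *a = malloc(n * sizeof *a);   /* failure is observable... */

   if (a == NULL)
   {
      fprintf(stderr, "allocating %zu ints failed\n", n);
      return EXIT_FAILURE;           /* ...and can be handled */
   }

   for (size_t i = 0; i < n; i++)
      a[i] = (int)i;

   free(a);
   return 0;
}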

asked Nov 30 '22 by P.P

2 Answers

Because C, the language, should not impose limitations on your available stack size. C operates in many (many) different environments. How could it possibly come up with a reasonable number? Hell, automatic storage duration != stack; a stack is an implementation detail. C, the language, says nothing about a "stack".

The environment decides this stuff, and for good reason. What if a certain environment implements automatic storage duration via an alternative method which imposes no such limitation? What if a breakthrough in hardware occurs and all of a sudden modern machines do not require such a limitation?

Should we rev the standard in such an event? We would have to if C, the language, specified such implementation details.

answered Dec 07 '22 by Ed S.


You've already answered your own question; it's due to stack limitation.* Even this might not work:

void foo(void) {
    int a;

    ...
}

if the ... is actually a recursive call to foo.
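
To make that concrete, here's a hedged sketch with the ... filled in by unbounded recursion. The depth at which it crashes, if it crashes at all, is entirely implementation-dependent (an optimising compiler may even turn the recursion into a loop):

#include <stdio.h>

unsigned long foo(unsigned long depth)
{
    int a = (int)depth;          /* the same small local as above */
    (void)a;
    return 1 + foo(depth + 1);   /* not a tail call, so every level
                                    needs its own live frame; a finite
                                    stack is eventually exhausted */
}

int main(void)
{
    printf("%lu\n", foo(0));     /* on typical systems this crashes
                                    long before printf is reached */
    return 0;
}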

In other words, this has nothing to do with arrays; the same problem affects all local variables. The standard couldn't enforce such a requirement, because in practice that would translate into a requirement for an infinite-sized stack.


* Yes, I know the C standard(s) don't talk about stacks. But that's the implicit model, in the sense that the standard was really a formalisation of the implementations that existed at the time.

answered Dec 07 '22 by Oliver Charlesworth