With regards to the ANSI C function declaration, how is this an improvement from the old K&R style? I know the differences between them, I just want to know what problems could arise from using the old style and how the new style is an improvement.
Old-style function declarations, in particular, don't allow for compile-time checking of calls.
For example:
int func(x, y)
char *x;
double y;
{
/* ... */
}
...
func(10, 20);
When the compiler sees the call, it doesn't know the types of the parameters of the function func, so it can't diagnose the error.
By contrast:
int better_func(char *x, double y) {
/* ... */
}
...
better_func(10, 20);
will result in a compiler error message (or at least a warning).
Another improvement: prototypes make it possible to have functions with parameters of type float, and of integer types narrower than int (the three char types and the two short types). Without a prototype, a float argument is promoted to double, and narrow integer types are promoted to int or to unsigned int. With a prototype, a float argument is passed as a float (unless the function is variadic, like printf, in which case the old promotion rules still apply to the variadic arguments).
The C Rationale document discusses this in section 6.7.5.3, probably better than I have:
The function prototype mechanism is one of the most useful additions to the C language. The feature, of course, has precedent in many of the Algol-derived languages of the past 25 years. The particular form adopted in the Standard is based in large part upon C++.
Function prototypes provide a powerful translation-time error detection capability. In traditional C practice without prototypes, it is extremely difficult for the translator to detect errors (wrong number or type of arguments) in calls to functions declared in another source file. Detection of such errors has occurred either at runtime or through the use of auxiliary software tools.
In function calls not in the scope of a function prototype, integer arguments have the integer promotions applied and float arguments are widened to double. It is not possible in such a call to pass an unconverted char or float argument. Function prototypes give the programmer explicit control over the function argument type conversions, so that the often inappropriate and sometimes inefficient default widening rules for arguments can be suppressed by the implementation.
There's more; go read it.
A non-defining function declaration in K&R C looks as follows:
int foo();
and introduces a function that accepts an unspecified number of arguments. The problem with such a declaration style is obvious: it specifies neither the number of parameters nor their types. There's no way for the compiler to check the correctness of the call with respect to the number of arguments or their types at the point of the call. There's no way for the compiler to perform argument type conversion or issue an error message in situations where an argument type does not match the expected parameter type.
A function declaration used as part of a function definition in K&R C looks as follows:
int foo(a, b)
int a;
char b;
{ ...
It specifies the number of parameters, but still does not specify their types. Moreover, even though the number of parameters appears to be exposed by this declaration, it still formally declares foo the same way as int foo(); does, meaning that calling it as foo(1, 2, 3, 4, 5) still does not constitute a constraint violation.
The new style, i.e. a declaration with a prototype, is better for obvious reasons: it exposes both the number and the types of parameters. It forces the compiler to check the validity of the call (with regard to the number and the types of parameters). And it allows the compiler to perform implicit type conversions from argument types to parameter types.
There are other, less obvious benefits provided by prototype declarations. Since the number and types of function parameters are known precisely to both the caller and the function itself, it is possible to choose the most efficient method of passing the arguments (the calling convention) at the point of the call without seeing the function definition. Without that information K&R implementations were forced to follow a single pre-determined "one size fits all" calling convention for all functions.