
What's the behavior of an uninitialized variable used as its own initializer?

I noticed just now that the following code compiles with clang/gcc/clang++/g++ under the C99, C11, and C++11 standards:

int main(void) {
    int i = i;
}

and even with -Wall -Wextra, none of the compilers reports a warning.

If I modify the code to int i = i + 1; then with -Wall they may report:

why.c:2:13: warning: variable 'i' is uninitialized when used within its own initialization [-Wuninitialized]
    int i = i + 1;
        ~   ^
1 warning generated.

My questions:

  • Why do compilers even allow this?
  • What do the C and C++ standards say about it? Specifically, is the behavior undefined or implementation-defined?
Hongxu Chen asked Jan 15 '19 14:01


People also ask

What happens when you try to use an uninitialized variable?

An uninitialized variable is a variable that has not been given a value by the program (generally through initialization or assignment). Using the value stored in an uninitialized variable will result in undefined behavior.

What does it mean when a variable is uninitialized?

An uninitialized variable has an undefined value, often corresponding to the data that was already in the particular memory location that the variable is using. This can lead to errors that are very hard to detect since the variable's value is effectively random, different values cause different errors or none at all.

What happens in Java if you try to use an uninitialized variable?

In Java, accessing an uninitialized local variable results in a compile-time error. Class and instance fields, by contrast, are automatically given a default value based on their declared type.

When a variable is declared it is automatically also initialized?

When you declare a variable, you should also initialize it. Two types of variable initialization exist: explicit and implicit. Variables are explicitly initialized if they are assigned a value in the declaration statement. Implicit initialization occurs when variables are assigned a value during processing.


2 Answers

Because i is uninitialized when used to initialize itself, it has an indeterminate value at that time. An indeterminate value can be either an unspecified value or a trap representation.

If your implementation supports padding bits in integer types and if the indeterminate value in question happens to be a trap representation, then using it results in undefined behavior.

If your implementation does not have padding in integers, then the value is simply unspecified and there is no undefined behavior.

EDIT:

To elaborate further, the behavior can still be undefined if i never has its address taken at some point. This is detailed in section 6.3.2.1p2 of the C11 standard:

If the lvalue designates an object of automatic storage duration that could have been declared with the register storage class (never had its address taken), and that object is uninitialized (not declared with an initializer and no assignment to it has been performed prior to use), the behavior is undefined.

So if you never take the address of i, then you have undefined behavior. Otherwise, the statements above apply.

dbush answered Oct 25 '22 10:10


This is about a warning; it's not something the standard governs.

Warnings are heuristic, with an "optimistic" approach: a warning is issued only when the compiler is sure that there is a problem. In cases like this you have better luck with clang or recent versions of gcc, as noted in the comments (see a related question of mine: why am I not getting a "used uninitialized" warning from gcc in this trivial example?).

Anyway, in the first case:

int i = i;

does nothing, since i == i already. The assignment may well be optimized out entirely as useless. With compilers that don't "see" self-initialization as a problem, you can do this without a warning:

int i = i;
printf("%d\n",i);

Whereas this triggers a warning all right:

int i;
printf("%d\n",i);

Still, the missing warning is unfortunate, because from that point on i is treated as initialized.

In the second case:

int i = i + 1;

A computation between an indeterminate value and 1 must be performed, and that is where the undefined behaviour happens.

Jean-François Fabre answered Oct 25 '22 08:10