 

Initializing variable with itself: how come it is not a compile-time error? [duplicate]

Tags:

c

syntax

I am a bit baffled that I managed to accidentally write code equivalent to this:

int a = a; // a is not declared anywhere before this line

and the compiler happily compiled it - both gcc and clang, which are quite standard-conforming and have good diagnostics. (With -Wall, gcc 4.8 warns about an uninitialized variable; clang does not.)

I thought the RHS of the initialization would be evaluated before the LHS, causing a to be undefined on the RHS. Can I have some simple clarification about why this is legal?

asked Jan 16 '14 by eudoxos



1 Answer

It will be a compile-time error if you tell GCC to make it so:

gcc -Wall -Winit-self -Werror

Note that sadly this diagnostic is not enabled by the usual suspects like -Wall; per the GCC documentation, -Winit-self also only takes effect in combination with -Wuninitialized (which -Wall turns on).
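As a quick sketch of the flag in action (assuming gcc is on PATH; the file name self.c is just a placeholder):

```shell
# Write a minimal translation unit containing the self-initialization.
cat > self.c <<'EOF'
int f(void) {
    int i = i;   /* initialized with itself */
    return i;
}
EOF

# Without -Winit-self the file compiles (exit status 0).
gcc -Wall -c self.c -o self.o

# With -Winit-self (plus -Wuninitialized via -Wall) and -Werror,
# the same code is rejected at compile time.
if gcc -Wall -Winit-self -Werror -c self.c -o self.o 2>errors.txt; then
    echo "unexpectedly accepted"
else
    echo "rejected as expected"
fi
```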

answered Nov 10 '22 by John Zwinck