
Why are these constructs using pre and post-increment undefined behavior?

#include <stdio.h>

int main(void)
{
    int i = 0;
    i = i++ + ++i;
    printf("%d\n", i); // 3

    i = 1;
    i = (i++);
    printf("%d\n", i); // 2 Should be 1, no ?

    volatile int u = 0;
    u = u++ + ++u;
    printf("%d\n", u); // 1

    u = 1;
    u = (u++);
    printf("%d\n", u); // 2 Should also be one, no ?

    register int v = 0;
    v = v++ + ++v;
    printf("%d\n", v); // 3 (Should be the same as u ?)

    int w = 0;
    printf("%d %d\n", ++w, w); // shouldn't this print 1 1

    int x[2] = { 5, 8 }, y = 0;
    x[y] = y ++;
    printf("%d %d\n", x[0], x[1]); // shouldn't this print 0 8? or 5 0?
}
asked Jun 04 '09 by PiX


People also ask

What causes undefined behavior in C++?

In C/C++, bitwise-shifting a value by a number of bits that is either negative or greater than or equal to the width (in bits) of that value results in undefined behavior.
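For illustration only (this sketch is not part of the question or answer, and the variable names are invented), here is the difference between an allowed and a disallowed shift count:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    unsigned int value = 1u;
    int width = sizeof(unsigned int) * CHAR_BIT; // typically 32

    // Undefined: shift count equal to (or greater than) the width of the type
    // unsigned int bad1 = value << width;

    // Undefined: negative shift count
    // unsigned int bad2 = value << -1;

    // Well-defined: shift count in the range [0, width - 1]
    unsigned int ok = value << (width - 1);
    printf("%u\n", ok); // 2147483648 if unsigned int is 32 bits
    return 0;
}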

What is undefined behavior in programming?

In computer programming, undefined behavior is 'the result of executing computer code whose behavior is not prescribed by the specs of the programming language in which it is written'.

Why does an increment operation like a[i] = i++; result in undefined behavior?

Why doesn't this code: a[i] = i++; work? The subexpression i++ causes a side effect (it modifies i's value), which leads to undefined behavior because i is also referenced elsewhere in the same expression.
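As a hedged sketch (mine, not part of the question or answer), one well-defined way to express what a[i] = i++; presumably intends is to move the side effect into its own statement:

#include <stdio.h>

int main(void)
{
    int a[2] = { 5, 8 };
    int i = 0;

    // Undefined: i is both read and modified without sequencing
    // a[i] = i++;

    // Well-defined: the increment happens in a separate, sequenced statement
    a[i] = i;
    i++;

    printf("%d %d\n", a[0], a[1]); // 0 8
    return 0;
}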

What is pre-increment and post increment operator explain with example?

Pre-increment (++i): the variable is incremented by one before its value is used in the expression. Post-increment (i++): the variable's current value is used in the expression, and it is incremented afterwards. The syntax of pre- and post-increment is: ++variable_name; // Pre-increment and variable_name++; // Post-increment.
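A minimal, well-defined example of the difference (the variable names here are made up for illustration):

#include <stdio.h>

int main(void)
{
    int i = 5;
    int pre = ++i;  // i is incremented to 6 first, then pre gets 6

    int j = 5;
    int post = j++; // post gets 5, then j is incremented to 6

    printf("pre = %d, i = %d\n", pre, i);   // pre = 6, i = 6
    printf("post = %d, j = %d\n", post, j); // post = 5, j = 6
    return 0;
}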


1 Answer

C has the concept of undefined behavior, i.e. some language constructs are syntactically valid but you can't predict the behavior when the code is run.

As far as I know, the standard doesn't explicitly say why the concept of undefined behavior exists. In my mind, it's simply because the language designers wanted some leeway in the semantics: instead of requiring, for example, that all implementations handle integer overflow in exactly the same way, which would very likely impose serious performance costs, they just left the behavior undefined. If you write code that causes integer overflow, anything can happen.
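To make that concrete (this example is mine, not the answerer's): signed integer overflow is undefined, while unsigned arithmetic is required to wrap modulo 2^N, so the compiler has far more freedom in the signed case.

#include <limits.h>
#include <stdio.h>

int main(void)
{
    // Undefined: signed integer overflow; the compiler may assume it never happens
    // int n = INT_MAX;
    // n = n + 1;

    // Well-defined: unsigned arithmetic wraps around modulo 2^N
    unsigned int u = UINT_MAX;
    u = u + 1;
    printf("%u\n", u); // 0
    return 0;
}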

So, with that in mind, why are these "issues"? The language clearly says that certain things lead to undefined behavior. There is no problem, there is no "should" involved. If the undefined behavior changes when one of the involved variables is declared volatile, that doesn't prove or change anything. It is undefined; you cannot reason about the behavior.

Your most interesting-looking example, the one with

u = (u++); 

is a textbook example of undefined behavior (see Wikipedia's entry on sequence points).
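As a sketch (again my own, not part of the answer), here is how the two plausible intents behind u = (u++); can be written in a well-defined way:

#include <stdio.h>

int main(void)
{
    int u = 1;

    // Undefined: u is modified twice without an intervening sequence point
    // u = (u++);

    // If the intent was "increment u", just write:
    u++;               // u is now 2

    // If the intent was "assign u's old value back to u", the statement is a no-op:
    // u = u;

    printf("%d\n", u); // 2
    return 0;
}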

answered Sep 29 '22 by unwind