Is over/underflow an undefined behavior at execution time?

I was reading about undefined behavior, and I'm not sure if it's a compile-time-only concept or if it can also occur at execution time.

I understand this example well (this is extracted from the Undefined Behavior page of Wikipedia):

An example for the C language:

int foo(unsigned x)
{
    int value = 5;
    value += x;
    if (value < 5)
        bar();
    return value;
}

The value of x cannot be negative and, given that signed integer overflow is undefined behavior in C, the compiler can assume that value >= 5 at the line of the if check. Thus the if and the call to the function bar can be ignored by the compiler, since the if has no side effects and its condition will never be satisfied. The code above is therefore semantically equivalent to:

int foo(unsigned x)
{
     int value = 5;
     value += x;
     return value;
}

But this happens at compile time.

What if I write, for example:

#include <iostream>

void bar(); // assumed to be defined elsewhere

void foo(int x) {
    if (x + 150 < 5)
         bar();
}

int main() {
    int x;
    std::cin >> x;
    foo(x);
}

and then the user types in INT_MAX - 100 ("2147483547" with 32-bit integers).

There will be an integer overflow, but AFAIK it is the arithmetic logic unit of the CPU that overflows, so the compiler is not involved here.

Is it still undefined behavior?

If yes, how does the compiler detect the overflow?

The best I can imagine is the overflow flag of the CPU. If that is the case, does it mean that the compiler can do anything it wants whenever the overflow flag of the CPU is set at execution time?
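A minimal sketch of what "at execution time" means here, assuming a GCC or Clang toolchain where UndefinedBehaviorSanitizer (-fsanitize=undefined) is available; with it, the signed overflow is reported while the program runs rather than rejected at compile time:

// Sketch only: build with
//   g++ -fsanitize=undefined overflow.cpp && ./a.out
// and enter a value close to INT_MAX (e.g. 2147483547).
#include <iostream>

void bar() { std::cout << "bar called\n"; }

void foo(int x) {
    if (x + 150 < 5)   // UBSan reports the signed overflow here at run time
        bar();
}

int main() {
    int x;
    std::cin >> x;
    foo(x);
}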

Jules Lamur asked Jan 09 '17


People also ask

What is overflow and underflow condition?

Simply put, overflow and underflow happen when we assign a value that is out of range of the declared data type of the variable. If the (absolute) value is too big, we call it overflow; if the value is too small, we call it underflow.

What is undefined behavior in programming?

In C/C++ programming, undefined behavior means that the program may fail to compile, may execute incorrectly (crashing or generating incorrect results), or may fortuitously do exactly what the programmer intended.

Is int overflow undefined?

In contrast, the C standard says that signed integer overflow leads to undefined behavior where a program can do anything, including dumping core or overrunning a buffer. The misbehavior can even precede the overflow. Such an overflow can occur during addition, subtraction, multiplication, division, and left shift.

Is unsigned overflow undefined behavior?

-fsanitize=unsigned-integer-overflow : Unsigned integer overflow, where the result of an unsigned integer computation cannot be represented in its type. Unlike signed integer overflow, this is not undefined behavior, but it is often unintentional.


1 Answer

Yes, but not necessarily in the way I think you might have meant it. If the machine code contains an addition and at runtime that addition wraps (or otherwise overflows, but on most architectures it would wrap), that is not UB by itself. The UB is solely in the domain of C (or C++). That addition may have been adding unsigned integers, or it may be part of some optimization the compiler can make because it knows the semantics of the target platform and can safely rely on wrapping (but you cannot, unless of course you do it with unsigned types).
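To make that distinction concrete, here is a small sketch (not from the original answer): the unsigned addition below is defined to wrap modulo 2^N, while the signed one is UB as far as C/C++ are concerned, even though both typically compile to the same add instruction.

#include <climits>
#include <iostream>

int main() {
    unsigned int u = UINT_MAX;
    u += 1u;                 // well-defined: wraps around to 0
    std::cout << u << '\n';  // prints 0

    int s = INT_MAX;
    // s += 1;               // undefined behavior in C/C++, even though the
    //                       // generated add would simply wrap on most CPUs
    std::cout << s << '\n';
    return 0;
}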

Of course that does not at all mean that it is safe to use constructs that "wrap only at runtime", because those code paths are poisoned at compile time as well. For instance, your example

extern void bar(void);

void foo(int x) {
    if (x + 150 < 5)
         bar();
}

is compiled by GCC 6.3 targeting x64 to

foo:
        cmp     edi, -145
        jl      .L4
        ret
.L4:
        jmp     bar

which is the equivalent of

void foo(int x) {
    if (x < -145)
         bar(); // with tail call optimization
}

... which is the same as the original if you assume that signed integer overflow is impossible (in the sense that the compiler treats it as an implicit precondition on the inputs that overflow will not happen).
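A sketch of how the check could be written without relying on signed overflow (only an illustration, not part of the original answer; foo_checked uses GCC/Clang's __builtin_add_overflow, which is not standard C++):

extern void bar(void);

// Equivalent comparison with no arithmetic on x: 5 - 150 is folded to the
// constant -145 at compile time, so nothing can overflow.
void foo_no_overflow(int x) {
    if (x < 5 - 150)
        bar();
}

// Alternative: let the compiler report whether the mathematical result of
// x + 150 fits in an int. When it does not fit, x is huge and the sum is
// certainly not < 5, so skipping bar() matches the mathematical intent.
void foo_checked(int x) {
    int sum;
    if (!__builtin_add_overflow(x, 150, &sum) && sum < 5)
        bar();
}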

harold answered Nov 15 '22