I was just wondering how disastrous integer overflow really is. Take the following example program:
#include <iostream>

int main()
{
    int a = 46341;
    int b = a * a;
    std::cout << "hello world\n";
}
Since a * a overflows on 32-bit platforms, and integer overflow triggers undefined behavior, do I have any guarantees at all that "hello world" will actually appear on my screen?
I removed the "signed" part from my question based on the following standard quotes:
(§5/5 C++03, §5/4 C++11) If during the evaluation of an expression, the result is not mathematically defined or not in the range of representable values for its type, the behavior is undefined.
(§3.9.1/4) Unsigned integers, declared unsigned, shall obey the laws of arithmetic modulo 2^n where n is the number of bits in the value representation of that particular size of integer. This implies that unsigned arithmetic does not overflow because a result that cannot be represented by the resulting unsigned integer type is reduced modulo the number that is one greater than the largest value that can be represented by the resulting unsigned integer type.
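Not part of the question, but a minimal sketch of the distinction those quotes draw (assuming a 32-bit int): unsigned arithmetic wraps modulo 2^n and is well-defined, while the same operation on signed operands is undefined behavior.

#include <cstdint>
#include <iostream>

int main()
{
    // Well-defined: unsigned arithmetic is reduced modulo 2^32
    // (assuming a 32-bit unsigned int), so 65536u * 65536u yields 0.
    std::uint32_t u = 65536u * 65536u;
    std::cout << u << '\n';              // prints 0

    // Undefined behavior: 46341 * 46341 exceeds INT_MAX when int is
    // 32 bits, so the compiler is allowed to assume it never happens.
    // int s = 46341 * 46341;            // UB on a 32-bit int
}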
An integer overflow occurs when you attempt to store in an integer variable a value that is larger than the maximum value the variable can hold. For signed integers, the C and C++ standards define this situation as undefined behavior (meaning that anything might happen).
An integer overflow can lead to data corruption, unexpected behavior, infinite loops, and system crashes.
Integer overflows are also a significant security threat. In 2021, they ranked 12th in the updated Common Weakness Enumeration (CWE) list of the most common flaws, bugs, faults, and other errors in hardware and software.
An integer overflow is a type of arithmetic overflow that occurs when the result of an integer operation does not fit in the destination type. Rather than producing an error in the program, it usually just yields an unexpected result.
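Because the overflowing operation itself is already undefined, any check has to happen before the arithmetic. A minimal sketch of such a pre-check (the helper name mul_would_overflow is mine, not from the question; GCC and Clang also provide __builtin_mul_overflow for the same purpose):

#include <climits>
#include <iostream>

// Hypothetical helper: report whether a * b would overflow int,
// tested with divisions so the overflow never actually occurs.
bool mul_would_overflow(int a, int b)
{
    if (a == 0 || b == 0) return false;
    if (a > 0 && b > 0) return a > INT_MAX / b;   // both positive
    if (a < 0 && b < 0) return a < INT_MAX / b;   // both negative, product positive
    // mixed signs, product negative
    return (a > 0) ? (b < INT_MIN / a) : (a < INT_MIN / b);
}

int main()
{
    int a = 46341;
    if (mul_would_overflow(a, a))
        std::cout << "a * a would overflow\n";
    else
        std::cout << a * a << '\n';
}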
You may trigger some hardware safety feature. So no, you don't have any guarantee.
Edit: Note that gcc has the -ftrapv option (but it doesn't seem to work for me).
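For reference, this is roughly how -ftrapv is meant to be used, assuming the question's program is saved as overflow.cpp (a filename I made up). With the flag, signed overflow is supposed to abort the program at run time rather than continue silently, though as noted above it does not fire reliably in every setup (for instance, it does not trap when the compiler constant-folds the overflowing expression):

g++ -ftrapv overflow.cpp -o overflow
./overflow    # the overflowing a * a is supposed to abort the program
              # (typically SIGABRT) instead of continuing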
As pointed out by @Xeo in the comments (I actually brought it up in the C++ chat first):
Undefined behavior really means it, and it can hit you when you least expect it.
The best example of this is here: Why does integer overflow on x86 with GCC cause an infinite loop?
On x86, signed integer overflow is just a simple wrap-around. So normally, you'd expect the same thing to happen in C or C++. However, the compiler can intervene - and use undefined behavior as an opportunity to optimize.
In the example taken from that question:
#include <iostream>
using namespace std;

int main() {
    int i = 0x10000000;
    int c = 0;
    do {
        c++;
        i += i;
        cout << i << endl;
    } while (i > 0);
    cout << c << endl;
    return 0;
}
When compiled with GCC with optimizations enabled, GCC assumes that signed overflow cannot happen, optimizes out the loop test, and turns this into an infinite loop.
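For contrast, a small sketch (not from the linked question): making the wraparound well-defined, for example by switching to unsigned arithmetic, removes the compiler's licence to assume i stays positive, and the loop terminates. Compiling the original signed version with -fwrapv, which forces wraparound semantics for signed overflow, has a similar effect.

#include <iostream>
using namespace std;

int main() {
    // unsigned arithmetic wraps modulo 2^32 (assuming 32-bit unsigned int)
    unsigned int i = 0x10000000;
    int c = 0;
    do {
        c++;
        i += i;   // 0x20000000, 0x40000000, 0x80000000, then wraps to 0
    } while (i > 0);
    cout << c << endl;   // prints 4: the wrap to 0 ends the loop
    return 0;
}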