I write the following code:
#include <iostream>
using namespace std;
int main() {
    unsigned int i = 1;
    i = i - 3;
    cout << i;
    return 0;
}
The output is a garbage value, which is understandable.
Now I write the following code:
#include <iostream>
using namespace std;
int main() {
    unsigned int i = 1;
    i = i - 3;
    i = i + 5;
    cout << i;
    return 0;
}
Now the output is 3. What's happening here? How does adding 5 to that garbage value produce 3?
Think of the values of unsigned int as being drawn on a large clock face, with the largest possible value (UINT_MAX) sitting right next to zero. Subtracting 3 from 1 moves you 3 places back on the clock (which gives you UINT_MAX - 1), and adding 5 to this moves you 5 places forward. The net effect is to add 2 to 1, but it's important to know that the intermediate value is perfectly well defined by the C++ standard. It is not garbage; it is determined by the value of UINT_MAX on your platform.
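A minimal sketch of this, assuming a typical platform where unsigned int is 32 bits (so UINT_MAX is 4294967295), printing the intermediate value and checking that it is exactly UINT_MAX - 1 rather than garbage:

#include <climits>
#include <iostream>
using namespace std;

int main() {
    unsigned int i = 1;
    i = i - 3;                           // wraps around: result is 1 - 3 modulo (UINT_MAX + 1)
    cout << i << '\n';                   // 4294967294 on a 32-bit unsigned int
    cout << (i == UINT_MAX - 1) << '\n'; // 1: the "garbage" is exactly UINT_MAX - 1
    i = i + 5;                           // 5 places forward on the clock, wrapping past zero
    cout << i << '\n';                   // 3
    return 0;
}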
Note that this well-defined wrap-around is not true for signed types: the behaviour on overflowing a signed type is undefined in C++.
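For contrast, a short sketch showing the difference: the unsigned wrap-around is fully specified, while the analogous signed operation is left commented out because evaluating it would be undefined behaviour.

#include <climits>
#include <iostream>
using namespace std;

int main() {
    unsigned int u = UINT_MAX;
    u = u + 1;            // well defined: wraps around to 0
    cout << u << '\n';    // 0

    // int s = INT_MAX;
    // s = s + 1;         // undefined behaviour in C++: no result can be relied upon
    return 0;
}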