
Why does dividing int.MinValue by -1 throw an OverflowException in an unchecked context?

int y = -2147483648; int z = unchecked(y / -1); 

The second line causes an OverflowException. Shouldn't unchecked prevent this?

For example:

int y = -2147483648; int z = unchecked(y * 2); 

doesn't cause an exception.
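The difference between the two cases can be sketched in Python (used here only as an illustration, not C#): unchecked arithmetic wraps the result to 32 bits, so y * 2 has a representable wrapped answer, while the true quotient of y / -1 is +2147483648, one past int.MaxValue.

```python
# Illustration in Python (not C#): 32-bit wraparound arithmetic.
INT_MIN = -2**31          # -2147483648, same as int.MinValue
INT_MAX = 2**31 - 1       #  2147483647, same as int.MaxValue

def wrap32(n):
    """Reduce an arbitrary integer to a signed 32-bit value (two's complement)."""
    n &= 0xFFFFFFFF                      # keep the low 32 bits
    return n - 2**32 if n > INT_MAX else n

# unchecked(y * 2): the product -4294967296 wraps to a valid int (0).
print(wrap32(INT_MIN * 2))               # 0

# y / -1: the true quotient is +2147483648, which no int can hold.
quotient = INT_MIN // -1
print(quotient, quotient > INT_MAX)      # 2147483648 True
```

So the multiply always has *some* 32-bit bit pattern to fall back on; the divide does not, which is why the hardware traps instead of wrapping.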

Marcel Niehüsener asked Oct 27 '14 18:10


1 Answer

This is not an exception that the C# compiler or the jitter has any control over. It is specific to Intel/AMD processors: the CPU generates a #DE trap (Divide Error) when the IDIV instruction fails. The operating system handles the processor trap and reflects it back into the process as a STATUS_INTEGER_OVERFLOW exception. The CLR dutifully translates it to the matching managed exception.

The Intel Processor Manual is not exactly a gold mine of information about it:

Non-integral results are truncated (chopped) towards 0. The remainder is always less than the divisor in magnitude. Overflow is indicated with the #DE (divide error) exception rather than with the CF flag.

In English: the result of the signed division is +2147483648, which is not representable in an int since it is Int32.MaxValue + 1. It is an inevitable side effect of the way the processor represents negative values: it uses two's-complement encoding, which has a single value to represent 0, leaving an odd number of remaining encodings for the negative and positive values. There is one more encoding for negative values. It is the same kind of overflow as -Int32.MinValue, except that the processor doesn't trap on the NEG instruction and just produces a garbage result.
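The asymmetry and the NEG wraparound can be verified with a small Python sketch (again an illustration; the bit patterns match what the processor produces):

```python
# Two's complement, 32 bits: one zero, 2**31 negatives, 2**31 - 1 positives.
INT_MIN, INT_MAX = -2**31, 2**31 - 1

negatives = 2**31          # encodings 0x80000000 .. 0xFFFFFFFF
positives = 2**31 - 1      # encodings 0x00000001 .. 0x7FFFFFFF
print(negatives - positives)         # 1 -> one more negative value

# The overflowing quotient is exactly one past INT_MAX.
print(-INT_MIN == INT_MAX + 1)       # True

# NEG on INT_MIN: negate, keep the low 32 bits, reinterpret as signed.
bits = (-INT_MIN) & 0xFFFFFFFF       # 0x80000000 again
as_signed = bits - 2**32 if bits > INT_MAX else bits
print(as_signed == INT_MIN)          # True -> the "garbage result": unchanged
```

That last line is the point about NEG: negating Int32.MinValue silently hands back Int32.MinValue, while IDIV in the same situation traps.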

The C# language is of course not the only one with this problem. The C# Language Spec makes it implementation-defined behavior (chapter 7.8.2) by noting the special case. There was no other reasonable thing they could do with it; generating the code to check for the overflow was surely considered too impractical, producing undiagnosably slow code. Not the C# way.
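The check the compiler declines to emit is easy to state, which shows why the cost is per-division rather than conceptual. A Python sketch of such a hypothetical checked 32-bit divide (also matching IDIV's truncation toward zero, unlike Python's own // which rounds toward negative infinity):

```python
def checked_div32(a, b):
    """32-bit signed division with an explicit overflow check,
    roughly the test a compiler would have to emit before every IDIV.
    Hypothetical helper for illustration only."""
    INT_MIN, INT_MAX = -2**31, 2**31 - 1
    if b == 0:
        raise ZeroDivisionError("division by zero")
    if a == INT_MIN and b == -1:     # the single overflowing case
        raise OverflowError("quotient does not fit in 32 bits")
    # Truncate toward zero, matching IDIV (Python's // rounds toward -inf).
    q = abs(a) // abs(b)
    return q if (a < 0) == (b < 0) else -q

print(checked_div32(-7, 2))          # -3 (toward zero, not -4)
try:
    checked_div32(-2**31, -1)
except OverflowError as e:
    print("overflow:", e)
```

Two compares and a branch on every division: cheap to write, but paid on every divide in the program, which is why letting the hardware trap wins.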

The C and C++ language specs up the ante by making it undefined behavior. That can truly get ugly, as in a program compiled with the gcc or g++ compiler, typically with the MinGW toolchain. It has imperfect runtime support for SEH: it swallows the exception and allows the processor to restart the division instruction. The program hangs, burning 100% of a core with the processor constantly generating #DE traps. Turning division into the legendary Halt and Catch Fire instruction :)

Hans Passant answered Oct 05 '22 23:10