
Why does Decimal.Divide(int, int) work, but not (int / int)?

Tags: c#, int, math, divide


int is an integer type; dividing two ints performs integer division, i.e. the fractional part is truncated because it cannot be stored in the result type (which is also int). Decimal, by contrast, has a fractional part. When you invoke Decimal.Divide, your int arguments are implicitly converted to Decimal.

You can enforce non-integer division on int arguments by explicitly casting at least one of the arguments to a floating-point type, e.g.:

int a = 42;
int b = 23;
double result = (double)a / b;
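The same pair of values run through both forms shows the difference directly. A minimal sketch, assuming a plain .NET console project:

```csharp
using System;

class Program
{
    static void Main()
    {
        int a = 42;
        int b = 23;

        // int / int: integer division, the fraction is truncated.
        Console.WriteLine(a / b);                // 1

        // Decimal.Divide: both ints are implicitly converted to decimal first.
        Console.WriteLine(Decimal.Divide(a, b)); // 1.8260869...
    }
}
```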

In the first case, you're doing integer division, so the result is truncated (the decimal part is chopped off) and an integer is returned.

In the second case, the ints are converted to decimals first, and the result is a decimal. Hence they are not truncated and you get the correct result.
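To make "chopped off" concrete: C# integer division truncates toward zero, for negative operands as well, while a cast to decimal keeps the fraction. A small sketch:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Integer division truncates toward zero in both directions.
        Console.WriteLine(7 / 2);           // 3
        Console.WriteLine(-7 / 2);          // -3

        // Converting one operand to decimal first preserves the fraction.
        Console.WriteLine((decimal)7 / 2);  // 3.5
    }
}
```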


The following code:

int a = 1, b = 2;
object result = a / b;

...will be performed using integer arithmetic. Decimal.Divide, on the other hand, takes two parameters of type Decimal, so the division will be performed on decimal values rather than integer values. It is equivalent to this:

int a = 1, b = 2;
object result = (Decimal)a / (Decimal)b;

To examine this, you can add the following code lines after each of the above examples:

Console.WriteLine(result.ToString());
Console.WriteLine(result.GetType().ToString());

The output in the first case will be

0
System.Int32

..and in the second case:

0.5
System.Decimal

I reckon Decimal.Divide(decimal, decimal) implicitly converts its two int arguments to decimal before dividing and returns a precise decimal value, whereas 4/5 is treated as integer division and returns 0.
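A quick check of that claim with the same operands:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Integer division: 4 / 5 truncates to 0.
        Console.WriteLine(4 / 5);                // 0

        // Decimal.Divide converts both ints to decimal, so the fraction survives.
        Console.WriteLine(Decimal.Divide(4, 5)); // 0.8
    }
}
```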